- Issue created by @a.dmitriiev
I am not quite sure which model should be used as the default for chat. I have set

llama-3.3-70b-versatile

but only because it appears to be the latest version.

Automatically closed - issue fixed for 2 weeks with no activity.