💬 | LMStudio Provider | No valid stream events were sent. Make sure the events are using {text: string} or {html: string} format. You can also augment them using the responseInterceptor property: https://deepchat.dev/docs/interceptors#responseInterceptor
🇺🇸 United States sushichris
sushichris → created an issue.
Ah, I am using Ollama. I was going to file a bug against the Ollama provider, but I thought it might be more closely associated with the AI Assistant sub-module, since the assistant module appears to be responsible for the output format. The only Ollama model that behaves this way is deepseek-r1; phi4, llama3.1, and llama3.2 don't produce the "thinking" output, so they behave as expected.
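The error message above points at Deep Chat's `responseInterceptor` as a workaround: events must arrive as `{text: string}` or `{html: string}`. A minimal sketch of such an interceptor is below. It assumes (this is an assumption based on the report, not confirmed behavior) that deepseek-r1's "thinking" output arrives as raw text wrapped in `<think>...</think>` tags; `normalizeEvent` is a hypothetical helper name, not part of the Deep Chat API.

```javascript
// Hypothetical sketch: normalize a stream event into Deep Chat's
// expected {text: string} shape, stripping deepseek-r1-style
// <think>...</think> blocks (assumed format) before display.
function normalizeEvent(event) {
  // If the provider already sent a valid shape, pass it through unchanged.
  if (event && (typeof event.text === 'string' || typeof event.html === 'string')) {
    return event;
  }
  // Otherwise assume a raw string payload (or a {content} wrapper)
  // and remove any "thinking" segments.
  const raw = typeof event === 'string' ? event : String(event?.content ?? '');
  const text = raw.replace(/<think>[\s\S]*?<\/think>/g, '').trim();
  return { text };
}

// Usage (assumption: `chatElement` is a <deep-chat> element):
// chatElement.responseInterceptor = normalizeEvent;
console.log(normalizeEvent('<think>chain of thought</think>Hello!'));
```

Models without the "thinking" output (phi4, llama3.1, llama3.2) would pass through the first branch untouched, which matches the behavior described above.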