- Issue created by @sarvjeetsingh
- 🇮🇳India Ishani Patel
I've been facing the same error. It occurs when I enable the OpenAI provider and test in the chat explorer with the "streamed" option checked. See the screenshot below. Thank you!
- 🇮🇳India prashant.c Dharamshala
Not sure, but https://www.drupal.org/project/ai/issues/3519196 (🐛 Handle errors for API explorers when a provider is not configured) might be related.
- First commit to issue fork.
This issue specifically occurs when you do not set the Stop Sequences field on the right side. To ensure the UI doesn't break, a default value for the stop sequence might solve this.
In my case, I am using the Gemini provider.
I think the cause is that error messages are not handled properly for streamed responses. The code expects the response in a specific format and treats an error message the same way, which breaks the UI.
- Merge request !666Issue #3521390: solve UI breaking when stream is True and Stop Sequences not provided → (Merged) created by Unnamed author
The issue was in the stream.js file in the ai_api_explorer module: it was injecting whole DOM content into the $message field in ChatExplorer.php. I added validation to ensure the response sent back is not HTML (which indicates an error), and to show a proper message telling the user to check the Stop Sequences, since that is why the UI breaks.
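A minimal sketch of the kind of validation described above. The function and element names are illustrative only, not the actual ai_api_explorer code; the assumption is that a streamed chat chunk should be plain text, so an HTML document arriving in the stream signals an error page:

```javascript
// Illustrative sketch, not the actual stream.js implementation.
// Detect whether a streamed chunk looks like an HTML error page
// rather than plain chat text.
function looksLikeHtmlError(chunk) {
  return /^\s*<(!DOCTYPE|html|body|div)/i.test(chunk);
}

// Append a chunk to the message element, or replace it with a
// helpful message instead of injecting raw DOM content.
function appendChunk(messageEl, chunk) {
  if (looksLikeHtmlError(chunk)) {
    messageEl.textContent =
      'Invalid response received. Please check your configuration (e.g. Stop Sequences).';
    return false;
  }
  messageEl.textContent += chunk;
  return true;
}
```

Using `textContent` instead of `innerHTML` for the fallback also avoids injecting any markup from the error response into the page.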
- 🇮🇳India anjaliprasannan
I don't get the issue with the required fields filled and "streamed" checked. But when I leave the required fields empty and check "streamed", the issue is reproduced in both the issue branch and 1.2.x. Please check the screenshot and update the steps to reproduce if I missed a step.
- 🇩🇪Germany marcus_johansson
Thanks @techmantejas. I found one issue in my code review; once anyone checks it, I'll merge it after.
With the message "Invalid response received. Please check your configuration.", I guess this much should be fine.
- Status changed to Needs work
9:51am 10 July 2025 - 🇵🇹Portugal bbruno Poland
I could replicate the issue, and can confirm that checking out this MR also fixes the layout issue on my side.
However, the streaming functionality itself does not seem to be working for me. I'm unsure whether this is something wrong with my local configuration or an existing problem. I'm using the gpt-4o model.
- marcus_johansson → committed 58b720b6 on 1.2.x, authored by techmantejas → Issue #3521390: solve UI breaking when stream is True and Stop Sequences...
- marcus_johansson → committed 336d3c59 on 1.1.x, authored by techmantejas → Issue #3521390: solve UI breaking when stream is True and Stop Sequences...