- Issue created by @sahaj
- 🇨🇦 Canada mandclu
Not sure if it's possible, but it would be ideal if there were some way to set this only for the calls that the chatbot needs.
- 🇩🇪 Germany marcus_johansson
I think we should split this up into setting it up in DDEV and setting it up in production. The markdown for the DDEV changes needed for streaming while developing we can push here: https://project.pages.drupalcode.org/ai/developers/ddev/
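For the DDEV side, a minimal sketch of what that documentation could describe, assuming DDEV's extra-nginx-snippet mechanism and that disabling FastCGI buffering for the whole local site is acceptable in development:

```nginx
# .ddev/nginx/streaming.conf
# Extra snippet that DDEV includes in the project's nginx server block.
# With FastCGI buffering off, PHP's flush() calls reach the browser
# immediately; fine for local development, too broad for production.
fastcgi_buffering off;
```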
In production, what @mandclu writes is important: you don't want to turn off webserver buffering everywhere, since that affects performance, specifically CPU usage and network congestion. It can also affect error logging in apache and nginx.
The biggest problem is that nginx and apache generally have to decide in the request phase whether or not to buffer, so we can't rely on setting that from some PHP application rule. (nginx is a partial exception: its docs say an X-Accel-Buffering response header can disable buffering per response, unless fastcgi_ignore_headers is configured; apache has no equivalent.)
You could of course base this on something like a query string or a request header, but that would be easy to spoof and abuse for DDoS.
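If the nginx per-response header route holds up, the application side would be a small change; a sketch using Symfony's StreamedResponse (bundled with Drupal core), where the callback and chunk source are illustrative, not the AI module's actual API:

```php
<?php

use Symfony\Component\HttpFoundation\StreamedResponse;

// Illustrative streamed controller response that asks nginx not to
// buffer it. Each flush() pushes accumulated output toward the SAPI.
$response = new StreamedResponse(function () {
  foreach (['Hello', ' ', 'world'] as $chunk) {
    echo $chunk;
    flush();
  }
});
// Per the nginx fastcgi_buffering docs, this header disables buffering
// for this response only, unless fastcgi_ignore_headers hides it.
$response->headers->set('X-Accel-Buffering', 'no');
$response->headers->set('Content-Type', 'text/event-stream');
```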
I'll research how BigPipe does this, because it should use similar flush() or ob_flush() calls to send partial buffers to the webserver. If they have it working even while the webserver is buffering, we should just copy their solution.
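The PHP side of that is straightforward; a minimal sketch of what BigPipe-style partial delivery boils down to (the helper name is mine, not BigPipe's API):

```php
<?php

/**
 * Sends one chunk of output toward the webserver immediately.
 *
 * Illustrative helper: flush PHP's own output buffers first, then ask
 * the SAPI to forward what it has. Whether the bytes actually reach
 * the browser still depends on webserver and proxy buffering.
 */
function stream_chunk(string $chunk): void {
  echo $chunk;
  if (ob_get_level() > 0) {
    ob_flush();
  }
  flush();
}
```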
Another option would be for all streamed responses to forward the request to a specific known endpoint, with a session, that could be set up not to buffer in nginx and apache, but that would need a lot of rewriting in the AI module. It would also require pretty complex nginx or apache setups, though I think that can be filed under "don't implement it if you don't know how to set it up".
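For nginx, that dedicated-endpoint setup could look roughly like the sketch below; the location path and fastcgi_pass target are placeholders, and an equivalent apache setup would need its own mod_proxy_fcgi tuning:

```nginx
# Hypothetical production vhost fragment: disable buffering only for a
# dedicated streaming endpoint, leaving the rest of the site buffered.
location = /ai/chat/stream {
  fastcgi_buffering off;   # stream chunks as PHP flushes them
  gzip off;                # gzip would otherwise re-buffer the response
  include fastcgi_params;
  fastcgi_param SCRIPT_FILENAME $document_root/index.php;
  fastcgi_pass unix:/run/php/php-fpm.sock;  # placeholder socket path
}
```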
-
marcus_johansson committed 1ab10e57 on 3479605-real-time-feedback-in
Issue #3479605: Real-time Feedback in Chatbot: showing streaming answer...
-
marcus_johansson committed 31cd5ab9 on 1.0.x
Issue #3479605: Real-time Feedback in Chatbot: showing streaming answer...