- Issue created by @andrewbelcher
- Merge request !792 Issue #3538027: Support Fibers for collaborative multitasking on LLM io waiting → (Open) created by andrewbelcher
- 🇬🇧United Kingdom andrewbelcher
Needs a provider extending `OpenAiBasedProviderClientBase` that supports streaming to test. Can be tested with `phpunit --configuration phpunit.ai.xml --filter FiberTest`, where `phpunit.ai.xml` has a correctly configured OpenAI key.
- 🇬🇧United Kingdom andrewbelcher
Also ran a straight Fibers vs non-Fibers comparison for 5 LLM calls, each generating 15 haiku:
```
$ ddev drush scr scripts/fibers.php

Without fibers
--------------

 5/5 [▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓] 100%

 [OK] Without fibers: 56.567 seconds

With fibers
-----------

 5/5 [▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓] 100%

 [OK] With fibers: 14.601 seconds

Comparison
----------

 [OK] Performance improvement: 3.9
```
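For reference, the pattern the script exercises looks roughly like the sketch below. This is a minimal, self-contained illustration rather than the actual `scripts/fibers.php` or the module's executor: the hypothetical `fakeLlmCall()` stands in for the provider's streaming request, and a simple round-robin loop stands in for the real scheduler.

```php
<?php

// Minimal sketch of Fiber-based cooperative multitasking on IO waits.
// fakeLlmCall() is a hypothetical stand-in for a streaming LLM request.
function fakeLlmCall(string $prompt): string {
  // Pretend the response takes ~2 seconds of IO waiting.
  $deadline = microtime(TRUE) + 2.0;
  while (microtime(TRUE) < $deadline) {
    // Hand control back to the scheduler instead of blocking.
    Fiber::suspend();
  }
  return "Haiku about $prompt";
}

// Start one fiber per LLM call; each runs until its first suspend.
$prompts = ['spring', 'summer', 'autumn', 'winter', 'rain'];
$fibers = [];
foreach ($prompts as $prompt) {
  $fiber = new Fiber(fn () => fakeLlmCall($prompt));
  $fiber->start();
  $fibers[] = $fiber;
}

// Round-robin scheduler: resume suspended fibers until all have finished.
$results = [];
while ($fibers) {
  foreach ($fibers as $key => $fiber) {
    if ($fiber->isTerminated()) {
      $results[] = $fiber->getReturn();
      unset($fibers[$key]);
      continue;
    }
    $fiber->resume();
  }
  // Small sleep so the demo does not spin a CPU core.
  usleep(10000);
}

print_r($results);
```

In the real code the suspend happens while waiting on the HTTP stream from the LLM, which is where the roughly 3.9x improvement for five concurrent calls comes from.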
- 🇬🇧United Kingdom andrewbelcher
This also includes a fix for #3531134-31: Create Base Class for OpenAI based clients →, though I'm not sure if `TRUE` is the correct default...
- 🇩🇪Germany marcus_johansson
I added some comments. I think it's also good if we add a flag for this, so anyone setting up a third-party app can check if the provider supports fibers.
We need documentation for this as well, but let me know if you need help with writing that.
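A rough sketch of what such a flag could look like; the interface and method names here are illustrative assumptions, not the module's actual API:

```php
<?php

// Hypothetical capability flag; the real name and location would be
// defined by the AI module's provider plugin API.
interface FiberAwareProviderInterface {

  /**
   * Whether the provider can suspend inside a Fiber while waiting on LLM IO.
   */
  public function supportsFibers(): bool;

}

// Example provider client opting in.
class ExampleStreamingProvider implements FiberAwareProviderInterface {

  public function supportsFibers(): bool {
    return TRUE;
  }

}

// Third-party code checking the flag before wrapping a call in a Fiber.
$provider = new ExampleStreamingProvider();
if ($provider instanceof FiberAwareProviderInterface && $provider->supportsFibers()) {
  // Safe to run the request inside a Fiber and interleave other calls.
}
else {
  // Fall back to a plain, blocking request.
}
```

That way third-party code can opt into Fiber-based execution only when the provider advertises support, and keep the plain blocking path otherwise.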