- Issue created by @marcus_johansson
- First commit to issue fork.
- 🇪🇸Spain gxleano Cáceres
Once this issue is ready to go, we would also need to create follow-up tickets for every provider in order to clean up the openai-php/client library from them.
- 🇩🇪Germany marcus_johansson
Great work - I added some comments. The one about the API key being optional needs to be fixed; look at the others and see if you agree. The streamed message could be a follow-up issue, if needed.
- First commit to issue fork.
- 🇩🇪Germany marcus_johansson
Looks good to me - one comment - Dan will have a look as well, since the Amazee provider depends on it.
- 🇪🇸Spain gxleano Cáceres
Then let's wait until Dan reviews and tests it.
Thanks for checking it, Marcus.
- 🇩🇪Germany marcus_johansson
Looks great now - I'll set this to RTBC and merge tomorrow if Dan hasn't had a look. I think this might be one of those issues where it's easier to continue developing and finding missing scope/bugs by having it already in 1.2.x-dev, so you can target that with the providers under development.
The upgrade path to this should be safe - the providers can't be updated without 1.2.0 of the AI module, and updating the AI module to 1.2.0 while keeping 1.1.0 providers is fine.
- 🇨🇭Switzerland dan2k3k4 Zurich
We've provided changes to the other issue / MR - https://www.drupal.org/node/3532631 → https://git.drupalcode.org/project/ai_provider_amazeeio/-/merge_requests/15
So this MR should be good from our side.
- 🇩🇪Germany marcus_johansson
No one complained in the AI Contrib meeting, so in it goes.
- marcus_johansson committed 6bdb6567 on 1.2.x, authored by gxleano:
Resolve #3531134 "Create base class"
- 🇬🇧United Kingdom andrewbelcher
I think there is an issue with the implementation of \Drupal\ai\Base\OpenAiBasedProviderClientBase::canChatStream. Currently it simply returns $this->streamed. That property records whether or not we want a streamed response, not whether streaming is supported (which is what both the method name and the documentation say the method should do).
In Support Fibers for collaborative multitasking on LLM io waiting (Active), we are using streaming as a proxy for async requests, which means we need to know whether streaming is supported separately from whether it is requested for a given API call.
Not sure whether we want the default to be TRUE or FALSE? Specific providers can then override.
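As a minimal PHP sketch of that distinction, assuming a FALSE default: only canChatStream() and the $this->streamed property come from this thread; the class and setter names below are hypothetical, not the module's actual API.

```php
<?php

/**
 * Illustrative sketch only; not the real OpenAiBasedProviderClientBase.
 */
abstract class ExampleProviderClientBase {

  /**
   * Whether the caller asked for a streamed response on the current call.
   */
  protected bool $streamed = FALSE;

  /**
   * Whether this provider supports streaming at all.
   *
   * This is a capability check, so it must not simply return
   * $this->streamed (a per-request flag). Defaulting to FALSE means
   * providers have to opt in explicitly.
   */
  public function canChatStream(): bool {
    return FALSE;
  }

  /**
   * Hypothetical setter for the per-request streaming flag.
   */
  public function setStreamed(bool $streamed): void {
    $this->streamed = $streamed;
  }

}
```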
- Automatically closed - issue fixed for 2 weeks with no activity.
- 🇨🇭Switzerland dan2k3k4 Zurich
@andrewbelcher perhaps we should default to FALSE? Providers should explicitly return TRUE if they support it?
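Under that default-FALSE assumption, a provider that does support streaming would opt in by overriding the method; the class name below is hypothetical and it extends the sketch above.

```php
<?php

/**
 * Hypothetical provider that supports streaming, so it opts in explicitly.
 */
final class ExampleStreamingProviderClient extends ExampleProviderClientBase {

  public function canChatStream(): bool {
    return TRUE;
  }

}
```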