Problem/Motivation
Currently, out of the 15 provider clients we have, four are based on the pure OpenAI client with only some sprinkles on top:
* OpenAI
* LiteLLM
* LmStudio
* AmazeeAI
Outside of that, we have four providers that use it, but with heavy modifications:
* Anthropic
* Ollama
* Mistral
* Groq
It should be noted that Mistral will move away from this, but for the other seven it was obvious, when we added function calling for instance, that we were adding the same piece of code in at least four places. That is hard to maintain and just creates noise.
We already have an AiProviderClientBase that serves as the base class for all clients. We should build on that and create an OpenAiBasedProviderClientBase that sets sensible defaults and implements the operation type methods in their most vanilla form.
Proposed resolution
* Require openai-php/client in AI Core
* Create an OpenAiBasedProviderClientBase that derives from AiProviderClientBase and (see the sketch after this list):
  * Defines an endpoint property that defaults to empty.
  * Provides a default (OpenAI-based) isUsable() implementation.
  * Provides a default setAuthentication() method.
  * Provides a default getClient() method that does not depend on moderation being available.
  * Provides a default loadClient() method: if the endpoint is set, configure it in the OpenAI factory (see GroqProvider); otherwise leave the factory default untouched.
  * Provides a vanilla chat() method (based on Groq's implementation).
  * Provides a vanilla moderation() method (based on OpenAI's, without the default omni-moderation-latest model).
  * Provides a vanilla textToImage() method from OpenAI, minus the provider-specific exceptions.
  * Provides a vanilla textToSpeech() method from OpenAI, minus the provider-specific exceptions.
  * Provides a vanilla speechToText() method from OpenAI, minus the provider-specific exceptions.
  * Provides a vanilla embeddings() method from OpenAI, minus the provider-specific exceptions.
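To make the intent concrete, here is a minimal sketch of how such a base class could tie the pieces above together, built on the openai-php/client factory. The namespace, the assumption that $this->configuration holds the runtime settings, and helpers such as getSupportedOperationTypes() and the ChatInput/ChatMessage/ChatOutput value objects are taken from the existing providers or assumed, so treat this as an illustration rather than final signatures:

```php
<?php

namespace Drupal\ai\Base;

use Drupal\ai\OperationType\Chat\ChatInput;
use Drupal\ai\OperationType\Chat\ChatMessage;
use Drupal\ai\OperationType\Chat\ChatOutput;
use OpenAI\Client;

/**
 * Sketch of a shared base class for OpenAI-compatible providers (not final).
 */
abstract class OpenAiBasedProviderClientBase extends AiProviderClientBase {

  // Custom API endpoint; empty means the default api.openai.com.
  protected string $endpoint = '';

  // API key set through setAuthentication().
  protected string $apiKey = '';

  // Lazily created openai-php client.
  protected ?Client $client = NULL;

  // Default authentication: the credential is a plain API key.
  public function setAuthentication(mixed $authentication): void {
    $this->apiKey = (string) $authentication;
    // Force the client to be rebuilt with the new key.
    $this->client = NULL;
  }

  // Default usability check: usable as soon as an API key is present and
  // the requested operation type is supported.
  public function isUsable(?string $operation_type = NULL, array $capabilities = []): bool {
    if (!$this->apiKey) {
      return FALSE;
    }
    return !$operation_type || in_array($operation_type, $this->getSupportedOperationTypes(), TRUE);
  }

  // Default client getter that does not depend on moderation support.
  public function getClient(string $api_key = ''): Client {
    if ($api_key !== '') {
      $this->setAuthentication($api_key);
    }
    return $this->loadClient();
  }

  // Loads the client, overriding the base URI only when an endpoint is set.
  protected function loadClient(): Client {
    if (!$this->client) {
      $factory = \OpenAI::factory()->withApiKey($this->apiKey);
      if ($this->endpoint !== '') {
        $factory = $factory->withBaseUri($this->endpoint);
      }
      $this->client = $factory->make();
    }
    return $this->client;
  }

  // Vanilla chat, roughly what the Groq provider does today: normalize the
  // input, call the OpenAI-compatible chat endpoint, wrap the first choice.
  public function chat(array|string|ChatInput $input, string $model_id, array $tags = []): ChatOutput {
    $messages = $input;
    if ($input instanceof ChatInput) {
      $messages = [];
      foreach ($input->getMessages() as $message) {
        $messages[] = ['role' => $message->getRole(), 'content' => $message->getText()];
      }
    }
    $response = $this->loadClient()->chat()->create([
      'model' => $model_id,
      'messages' => $messages,
    ] + $this->configuration);
    $message = new ChatMessage(
      $response->choices[0]->message->role,
      $response->choices[0]->message->content ?? '',
    );
    return new ChatOutput($message, $response->toArray(), []);
  }

}
```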
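The payoff for a concrete OpenAI-compatible provider is that it would mostly only have to set its endpoint and supply its model metadata. A hypothetical example (namespace, class body and plugin annotation are illustrative, not the actual Groq provider code):

```php
<?php

namespace Drupal\ai_provider_groq\Plugin\AiProvider;

use Drupal\ai\Base\OpenAiBasedProviderClientBase;

/**
 * Hypothetical Groq provider on top of the shared base class.
 *
 * Plugin annotation and model discovery omitted for brevity.
 */
class GroqProvider extends OpenAiBasedProviderClientBase {

  // Groq exposes an OpenAI-compatible API, so only the endpoint differs.
  protected string $endpoint = 'https://api.groq.com/openai/v1';

}
```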
In AiProviderClientBase itself we can also improve things by (see the sketch below):
* Setting a default getConfig() that loads the provider's {modulename}.settings configuration.
* Doing the same for getApiDefinition().
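A hedged sketch of what those two defaults could look like, assuming the config factory and module handler are already injected into the base class and that providers keep shipping their API defaults as a YAML file; getModuleDataName() is a hypothetical helper for resolving the provider module's machine name:

```php
<?php

namespace Drupal\ai\Base;

use Drupal\Core\Config\ConfigFactoryInterface;
use Drupal\Core\Config\ImmutableConfig;
use Drupal\Core\Extension\ModuleHandlerInterface;
use Symfony\Component\Yaml\Yaml;

abstract class AiProviderClientBase {

  // Assumed to already be injected by the existing constructor.
  protected ConfigFactoryInterface $configFactory;
  protected ModuleHandlerInterface $moduleHandler;

  // Hypothetical helper that resolves the machine name of the module
  // providing this plugin (e.g. "ai_provider_groq").
  abstract protected function getModuleDataName(): string;

  // Default config loader: every provider reads {modulename}.settings.
  public function getConfig(): ImmutableConfig {
    return $this->configFactory->get($this->getModuleDataName() . '.settings');
  }

  // Default API definition loader: parse the YAML definition file shipped
  // with the provider module instead of repeating this in every provider.
  public function getApiDefinition(): array {
    $path = $this->moduleHandler->getModule($this->getModuleDataName())->getPath();
    // The definitions/api_defaults.yml location is an assumption based on
    // how existing providers ship their defaults.
    return Yaml::parseFile($path . '/definitions/api_defaults.yml');
  }

}
```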