- Issue created by @marcus_johansson
- 🇮🇳India Akhil Babu Chengannur
Not sure if this is the correct way:
To show the warning, first add
```php
if (isset($this->agentsManager->getDefinition($default_agent)['custom_type']) && $this->agentsManager->getDefinition($default_agent)['custom_type'] === 'config') {
  if (empty($this->providerManager->getSimpleProviderModelOptions('chat', FALSE, TRUE, [AiModelCapability::ChatTools]))) {
    $this->messenger->addError($this->t('This agent will not work with the selected model as function calling is not supported.'));
  }
}
```
to
Drupal\ai_agents_explorer\Form\AiAgentExplorerForm::buildForm
But for this to work,
```php
if (in_array(AiModelCapability::ChatTools, $capabilities)) {
  continue;
}
```
should be added to
Drupal\ai_provider_openai\Plugin\AiProvider\OpenAiProvider::getModels
in the
1.0.2 version code.
- 🇬🇧United Kingdom MrDaleSmith
The alternative would be a hook_requirements() check in the 1.1 modules, verifying that any other AI-family modules are also on 1.1 versions. @Marcus, did you have an idea of which way you want to go?
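A minimal sketch of what such a check could look like. The module name (`ai_agents`), the requirement key, and the 1.1.0 threshold are assumptions for illustration, not the actual implementation; the version-comparison logic is kept in a standalone helper so it is easy to reason about outside Drupal:

```php
<?php

/**
 * Standalone helper: is a provider release new enough for function calling?
 *
 * Kept free of Drupal APIs; the 1.1.0 threshold is an assumption based on
 * this thread, not a confirmed release boundary.
 */
function provider_supports_tools(?string $version): bool {
  return $version !== NULL && version_compare($version, '1.1.0', '>=');
}

/**
 * Implements hook_requirements() (hypothetical module name "ai_agents").
 */
function ai_agents_requirements(string $phase): array {
  $requirements = [];
  if ($phase === 'runtime') {
    // Look up the installed version of the OpenAI provider module.
    $info = \Drupal::service('extension.list.module')
      ->getExtensionInfo('ai_provider_openai');
    if (!provider_supports_tools($info['version'] ?? NULL)) {
      $requirements['ai_agents_provider_version'] = [
        'title' => t('AI provider version'),
        'description' => t('ai_provider_openai must be on a 1.1.x release for function calling to work.'),
        'severity' => REQUIREMENT_ERROR,
      ];
    }
  }
  return $requirements;
}
```

The helper deliberately treats a missing version string as "unsupported", which fails safe for dev checkouts without version info.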
- 🇩🇪Germany marcus_johansson
It's a little bit complex because of a dumb architecture choice in some provider modules. We assumed the OpenAI provider would be at the forefront of all development, so it works with an opt-out mechanism for capabilities instead of an opt-in.
This means that when 1.0.0 is installed, it will still report that it has models for function calling, which is true in the sense that the model supports it, but since the provider itself won't, it will not work. And since it has a graceful fallback, it will run the call but say that it doesn't have any tools (which is true).
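To illustrate the opt-out vs. opt-in difference in plain PHP (function and model names here are invented for the example, not the provider's real API): an opt-out filter returns every model unless it is explicitly excluded, so an older provider with an empty exclusion list wrongly looks tool-capable, while an opt-in filter starts from nothing and only returns what the provider explicitly declares:

```php
<?php

// Opt-out: every model passes unless explicitly excluded. A 1.0.0-style
// provider with no exclusion list for tools reports all models as capable.
function models_opt_out(array $models, array $excluded): array {
  return array_values(array_diff($models, $excluded));
}

// Opt-in: only models the provider explicitly declares are returned. A
// provider that predates tool support correctly reports none.
function models_opt_in(array $models, array $declared): array {
  return array_values(array_intersect($models, $declared));
}

$models = ['gpt-4o', 'gpt-4-turbo'];

// Opt-out with an empty exclusion list: both models look tool-capable,
// even though the provider code cannot actually pass tools along.
$opt_out = models_opt_out($models, []);

// Opt-in with nothing declared: nothing is reported, which is accurate.
$opt_in = models_opt_in($models, []);
```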
And since OpenAI is by far the most used provider, this will be a common problem.
The other issue is that there is no clear connection between a provider being on 1.1.x and it supporting function calling. We can guarantee that for each provider we control, but we do not control all of them.
The option I can think of right off the bat is to do something similar to the return array in AI core that controls whether an operation type is supported: each provider gives back a similar structure for capabilities, and if that method is missing we can assume a 1.0.0 version. This is technically a breaking change, though with a default implementation in the base class it should be fine.
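A sketch of that idea under stated assumptions: the class and method names below are hypothetical, not the actual AI module API. The base class returns an empty list so providers built against 1.0.x keep working, and callers treat an empty declaration as "assume 1.0.0, no declared capability support":

```php
<?php

// Hypothetical sketch: capability reporting similar to how AI core reports
// supported operation types. Names are invented for illustration.
abstract class ProviderBase {

  /**
   * Capabilities the provider supports for a given operation type.
   *
   * The default implementation keeps the change effectively non-breaking:
   * 1.0.x-era providers inherit an empty list, which callers interpret as
   * "no declared capability support".
   */
  public function supportedCapabilities(string $operation_type): array {
    return [];
  }

}

// A 1.1.x-style provider opts in explicitly for chat tool calling.
class ToolAwareProvider extends ProviderBase {

  public function supportedCapabilities(string $operation_type): array {
    return $operation_type === 'chat' ? ['chat_tools'] : [];
  }

}
```

A caller can then gate the function-calling warning on `supportedCapabilities('chat')` instead of guessing from the model list alone.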
We can also do hook_requirements checks on the two by far most popular providers (OpenAI and Anthropic). Ollama is also popular, but we can assume its users are more technical people who will ask on Slack or in the issue queue when issues happen.
Neither is an optimal solution, of course.
- 🇬🇧United Kingdom MrDaleSmith
OK, it sounds to me like what we're saying is this:
- It's complicated, because decisions made in the past mean there's no easy way for this module to know if functionCalls are available in a provider;
- Really it should be fixed at the provider level, but we don't control every provider;
- Anything we do could introduce breaking changes into either the 1.0 or 1.1 versions of this module.
In lieu of a better solution, I think we should look at option 2 as a quick fix: Composer has a mechanism for telling people that certain combinations of versions don't work together, and it's very easy to implement. I've created an MR adding a composer conflict to the OpenAI Provider at https://www.drupal.org/project/ai_provider_openai/issues/3519419 (🐛 Version of modules can go out of alignment, Active). If that's approved, we can start rolling out similar changes for the other providers the maintainers here control, and leave this ticket open for discussion about a more generic solution and when it might be best to implement it.
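For reference, the Composer mechanism in question is the `conflict` key. A fragment along these lines in the provider's composer.json makes Composer refuse to resolve mismatched versions; the package name and constraint below are illustrative assumptions, and the actual change is in the linked MR:

```json
{
  "name": "drupal/ai_provider_openai",
  "conflict": {
    "drupal/ai": "<1.1.0"
  }
}
```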