- Issue created by @AlfTheCat
- 🇹🇭Thailand AlfTheCat
Ah I see my typo. I'm using OpenAI 4o, when switching to OpenAI o1, the error is thrown.
I'm on Drupal 10.4.2 and AI module 1.0.4.
- 🇬🇧United Kingdom MrDaleSmith
@marcus Would I be right in thinking this needs to be resolved in https://git.drupalcode.org/project/ai_provider_openai/-/blob/1.0.x/src/P... rather than this module?
- Merge request !22: Add event subscriber to resolve incorrect config key. #3508042 → (Open) created by MrDaleSmith
- 🇩🇪Germany marcus_johansson
I think in this case the method getModelSettings would do the job, instead of having to invoke an event.
By using unset and setting a new value it should be possible: https://git.drupalcode.org/project/ai_provider_openai/-/blob/1.1.x/src/P...
If that is not true, just set back to RTBC.
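To illustrate the approach suggested above, here is a minimal sketch of how a provider could remap the key with unset() and re-assignment. This is a hypothetical standalone function, not the actual ai_provider_openai code; the function name `remapMaxTokens` and the assumption that every model matching the `o1`/`gpt-5` prefixes needs `max_completion_tokens` are mine.

```php
<?php

/**
 * Hypothetical sketch: remap 'max_tokens' to 'max_completion_tokens'
 * for models that reject the legacy key, using unset() and a new value.
 *
 * Assumption: any model whose ID starts with "o1" or "gpt-5" requires
 * the newer 'max_completion_tokens' parameter.
 */
function remapMaxTokens(string $model_id, array $configuration): array {
  if (preg_match('/^(o1|gpt-5)/i', $model_id) && isset($configuration['max_tokens'])) {
    // Copy the value to the new key, then drop the old one.
    $configuration['max_completion_tokens'] = $configuration['max_tokens'];
    unset($configuration['max_tokens']);
  }
  return $configuration;
}
```

The same remap could live inside the provider's model-settings handling so no event subscriber is needed, which is the point being made here.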
- Status changed to Needs work
7:22am 26 August 2025
- 🇫🇮Finland merilainen
The same issue exists with gpt-5 models, so I think the patch won't work:

```php
if (str_starts_with($model_id, 'o1')) {
```

This needs to be updated to work with any model that uses `max_completion_tokens` instead of `max_tokens`.
- 🇮🇳India pinesh
Convert max_tokens to max_completion_tokens for newer OpenAI models to prevent API errors
```php
// Only process OpenAI provider requests.
if ($event->getProviderId() !== 'openai') {
  return;
}
$model_id = $event->getModelId();
$configuration = $event->getConfiguration();

// Handle o1 models, which require max_completion_tokens instead of max_tokens.
if (preg_match('/^o1/i', $model_id) && isset($configuration['max_tokens'])) {
  $configuration['max_completion_tokens'] = $configuration['max_tokens'];
  unset($configuration['max_tokens']);
  $event->setConfiguration($configuration);
}

// Handle gpt-5 models, which may also require max_completion_tokens.
if (preg_match('/^gpt-5/i', $model_id) && isset($configuration['max_tokens'])) {
  $configuration['max_completion_tokens'] = $configuration['max_tokens'];
  unset($configuration['max_tokens']);
  $event->setConfiguration($configuration);
}
```
- 🇩🇪Germany marcus_johansson
Oldie, but getting merged now. Also fixed gpt-5. Backporting to 1.1.x.
Now that this issue is closed, please review the contribution record.
As a contributor, attribute any organization that helped you, or note if you volunteered your own time.
Maintainers, please credit the people who helped resolve this issue.
- marcus_johansson → committed 3e17efda on 1.2.x, authored by mrdalesmith →
  Use correct setting for 01 models. #3508042
- marcus_johansson → committed e1d57aaf on 1.1.x, authored by mrdalesmith →
  Use correct setting for 01 models. #3508042