Modify max_tokens for AI calls

Created on 15 September 2025

Problem/Motivation

Things have been hunky dory when using Gemini as a provider, but when testing with Anthropic/Claude, we started receiving truncated responses for some calls.

It turns out that max_tokens for the Anthropic provider is somehow set to 2048, so responses longer than 2048 tokens get truncated. See https://www.drupal.org/project/ai_migration/issues/3545198 (πŸ“Œ Test multiple AI LLMs).

Please investigate / implement a way to increase this number.

Perhaps add a config value to the migration YAML so devs can set this per migration.
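A minimal sketch of what such a per-migration override could look like. Note that the key names here (`ai_prompt`, `max_tokens`, `provider`) are assumptions for illustration, not settings the ai_migration module currently defines:

```yaml
# Hypothetical example: per-migration override in the migration YAML.
# 'ai_prompt', 'provider', and 'max_tokens' are assumed names, not an
# existing ai_migration configuration schema.
id: articles_ai
label: 'Migrate articles with AI assistance'
process:
  body:
    plugin: ai_prompt        # hypothetical process plugin
    prompt: 'Summarize the source body'
    provider: anthropic
    max_tokens: 8192         # raise the cap so long responses are not truncated
```

A sensible fallback would be to keep the provider's current default (2048 for Anthropic) when the key is absent, so existing migrations are unaffected.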

πŸ“Œ Task
Status

Active

Version

1.0

Component

Code

Created by

πŸ‡ΊπŸ‡ΈUnited States majorrobot

