Things have been hunky-dory when using Gemini as a provider, but when testing with Anthropic/Claude we started receiving truncated responses for some calls.
It turns out that max_tokens for the Anthropic provider is somehow set to 2048, so any response longer than 2048 tokens gets cut off. See https://www.drupal.org/project/ai_migration/issues/3545198 (Test multiple AI LLMs).
Please investigate and implement a way to raise this limit.
Perhaps add a config value to the migration YML so developers can set it per migration, as in the sketch below.
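As a rough illustration only (the plugin ID and key names here are assumptions, not the module's current API), the per-migration override could look something like this in the migration YML:

```yaml
# Hypothetical sketch: plugin id and keys are placeholders for whatever
# ai_migration actually exposes; only the idea of a per-migration
# max_tokens override is being proposed.
process:
  body/value:
    plugin: ai_migration_prompt      # assumed plugin id
    source: body
    prompt: 'Clean up and summarize the source body text.'
    provider: anthropic              # provider selection as configured today
    max_tokens: 8192                 # proposed override; current effective default appears to be 2048
```

The idea is that migrations which don't set the key keep the current default, while migrations that routinely produce long responses can raise it without a code change.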