I set up the AWS Bedrock provider and tested the new models with the AI Chat Explorer. Everything worked as expected. I then configured the AI Content module to use one of the new models, edited a node, and tried to suggest taxonomy. That failed with:
Drupal\ai\Exception\AiRequestErrorException: Error invoking model response: Error executing "Converse" on "https://bedrock-runtime.us-east-1.amazonaws.com/model/amazon.titan-text-..."; AWS HTTP error: Client error: `POST https://bedrock-runtime.us-east-1.amazonaws.com/model/amazon.titan-text-...` resulted in a `400 Bad Request` response: {"message":"1 validation error detected: Value 'system' at 'messages.1.member.role' failed to satisfy constraint: Member (truncated...) ValidationException (client): 1 validation error detected: Value 'system' at 'messages.1.member.role' failed to satisfy constraint: Member must satisfy enum value set: [user, assistant] - {"message":"1 validation error detected: Value 'system' at 'messages.1.member.role' failed to satisfy constraint: Member must satisfy enum value set: [user, assistant]"} in Drupal\ai\Plugin\ProviderProxy->wrapperCall() (line 190 of /app/web/modules/contrib/ai/src/Plugin/ProviderProxy.php).
The Converse API doesn't accept the system prompt as a message: roles in the `messages` array are restricted to `user` and `assistant`. If I filter the system message out of the messages array (it is already passed separately via the top-level `system` key), it works.
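For reference, a minimal sketch of what a valid Converse request body looks like under that constraint (the model ID, prompt text, and `maxTokens` value are placeholders, not taken from my setup): the system prompt lives in the top-level `system` array, and every entry in `messages` uses only `user` or `assistant`:

```json
{
  "modelId": "<your-bedrock-model-id>",
  "system": [
    { "text": "You are a helpful assistant." }
  ],
  "messages": [
    {
      "role": "user",
      "content": [ { "text": "Suggest taxonomy terms for this node." } ]
    }
  ],
  "inferenceConfig": { "maxTokens": 512 }
}
```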
public function chat(array|string|ChatInput $input, string $model_id, array $tags = []): ChatOutput {
  $this->loadClient();
  // Normalize the input if needed.
  $chat_input = $input;
  $system_message = $this->chatSystemRole;
  if ($input instanceof ChatInput) {
    $chat_input = [];
    /** @var \Drupal\ai\OperationType\Chat\ChatMessage $message */
    foreach ($input->getMessages() as $key => $message) {
      $content = [
        [
          'text' => $message->getText(),
        ],
      ];
      if (count($message->getImages())) {
        foreach ($message->getImages() as $image) {
          $content[] = [
            'image' => [
              'format' => $image->getFileType() == 'jpg' ? 'jpeg' : $image->getFileType(),
              'source' => [
                'bytes' => $image->getBinary(),
              ],
            ],
          ];
        }
      }
      // The Converse API only accepts the 'user' and 'assistant' roles in
      // the messages array, so skip system messages here; the system
      // prompt is passed via the top-level 'system' key below instead.
      if (in_array($message->getRole(), ['user', 'assistant'])) {
        $chat_input[] = [
          'role' => $message->getRole(),
          'content' => $content,
        ];
      }
    }
  }
  // Normalize the configuration.
  $this->normalizeConfiguration('chat', $model_id);
  $payload = [
    'modelId' => $model_id,
    'messages' => $chat_input,
    'inferenceConfig' => $this->configuration,
  ];
  // Set the system message via the dedicated top-level 'system' key.
  if ($system_message) {
    $payload['system'] = [['text' => $system_message]];
  }
  if ($this->streamed) {
    $response = $this->client->converseStream($payload);
    $message = new BedrockChatMessageIterator($response->get('stream'));
  }
  else {
    $response = $this->client->converse($payload);
    $message = new ChatMessage($response['output']['message']['role'], $response['output']['message']['content'][0]['text']);
  }
  return new ChatOutput($message, $response, $response['usage']);
}