The generic ECA actions provided by this module (e.g., "AI Chat") are an excellent bridge for interacting with various AI provider backends. They successfully abstract the core functionality of sending a prompt and receiving a text-based answer.
However, this abstraction is currently a "lossy" one. AI provider APIs return a wealth of valuable metadata alongside the text response, most critically the usageMetadata object, which contains token counts. The current implementation of the generic actions discards this metadata and returns only the string answer.
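For illustration, a Gemini generateContent response carries usage metadata alongside the generated text; the field names below follow the public Gemini API, and other providers return similar structures under different names:

```json
{
  "candidates": [
    {"content": {"parts": [{"text": "..."}]}}
  ],
  "usageMetadata": {
    "promptTokenCount": 12,
    "candidatesTokenCount": 348,
    "totalTokenCount": 360
  }
}
```

Today the generic actions extract only the text part and drop everything else.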
This limitation prevents site builders and developers from implementing essential production-level features, such as:
Cost Tracking: It's impossible to log the totalTokenCount for each call, which is necessary for budgeting and cost analysis.
Usage Monitoring: Without access to token data, we cannot monitor usage against API rate limits or build custom throttling logic.
Advanced Workflows: We cannot create ECA workflows that branch based on metadata. For example: "IF token_usage > 4000, THEN send a notification to an administrator."
A parallel issue has been created for the gemini_provider module to expose this data at the service level (see https://www.drupal.org/project/gemini_provider/issues/3549099: "Expose API response metadata (like usageMetadata) in ApiClient service for token tracking"). However, even if an individual provider exposes this data, the generic actions in ai_integration_eca need a mechanism to receive and forward it into the ECA workflow.
It is proposed that the generic AI actions in this module be enhanced to provide the full API response and specific metadata as optional outputs. This would require establishing a new, richer contract between ai_integration_eca and the AI provider plugins it calls.
A pragmatic, backward-compatible approach would be:
Establish a New Convention: Define a new, optional method that AI provider services can implement, such as getLastResponse(): ?array. This method would return the full, decoded JSON response from the most recent API call. The gemini_provider issue linked above proposes exactly this implementation.
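A minimal sketch of what such a convention could look like. The interface name and the fake client below are hypothetical, used only to illustrate the contract; they are not part of any existing module's API:

```php
<?php

/**
 * Hypothetical opt-in contract for AI provider services.
 */
interface LastResponseAwareInterface {

  /**
   * Returns the full, decoded JSON body of the most recent API call.
   *
   * @return array|null
   *   The decoded response, or NULL if no call has been made yet.
   */
  public function getLastResponse(): ?array;

}

/**
 * Illustrative provider; a real client would make an HTTP request here.
 */
final class FakeProviderClient implements LastResponseAwareInterface {

  private ?array $lastResponse = NULL;

  public function generateText(string $prompt): string {
    // Record a canned response body in place of a real API reply.
    $this->lastResponse = [
      'usageMetadata' => ['totalTokenCount' => 360],
    ];
    return 'A canned answer.';
  }

  public function getLastResponse(): ?array {
    return $this->lastResponse;
  }

}
```

Because the method is purely additive, providers that never implement it remain fully compatible.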
Enhance the Generic Actions: The generic actions within ai_integration_eca (like the "AI Chat" action) should be updated. After calling the primary method (e.g., generateText()), they should check if the provider's service has a getLastResponse() method.
Provide New Outputs: If the getLastResponse() method exists, the action should call it and make the data available to the ECA workflow as new, optional output tokens. For example:
full_response: A JSON string of the complete API response.
token_usage: The integer value of totalTokenCount if it exists in the response.
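The three steps above can be sketched as follows. This is illustrative, not actual ai_integration_eca code: the anonymous $provider class stands in for a real AI provider service, and the $outputs array stands in for ECA's token-forwarding machinery:

```php
<?php

// Illustrative stand-in for an AI provider service; a real client
// would perform an HTTP call and store the decoded response body.
$provider = new class {
  public ?array $last = NULL;

  public function generateText(string $prompt): string {
    $this->last = [
      'candidates' => [['content' => ['parts' => [['text' => 'Hello!']]]]],
      'usageMetadata' => ['totalTokenCount' => 360],
    ];
    return 'Hello!';
  }

  public function getLastResponse(): ?array {
    return $this->last;
  }
};

// Sketch of the generic action's logic after the primary call.
$answer = $provider->generateText('Say hello');
$outputs = ['response' => $answer];

// Opt-in metadata: only forward it when the provider implements
// the new, optional getLastResponse() convention.
if (method_exists($provider, 'getLastResponse')) {
  $response = $provider->getLastResponse();
  if ($response !== NULL) {
    $outputs['full_response'] = json_encode($response);
    if (isset($response['usageMetadata']['totalTokenCount'])) {
      $outputs['token_usage'] = (int) $response['usageMetadata']['totalTokenCount'];
    }
  }
}

var_dump($outputs['token_usage']); // int(360)
```

The method_exists() check is what keeps the contract backward-compatible: providers that predate the convention simply never populate the new tokens.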
This approach allows modules that don't need this feature to continue working as-is, while enabling advanced functionality for those that opt-in by implementing the new method. This would make the entire AI integration ecosystem in Drupal significantly more powerful and ready for production use cases.
None directly in the action's configuration form. However, users will see the new output tokens ([task_id:full_response], [task_id:token_usage]) become available in the token browser for all subsequent tasks in a workflow, so the tokens should be documented in the action's help text.
The generic ECA actions will gain new, optional output contexts.
None.
Status: Needs work
Version: 1.0
Component: Code