Do we need to add toArray() to \Drupal\Core\TypedData\DataDefinitionInterface? I need either toArray() or offsetGet(), but the former isn't under contract.
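For context, a minimal sketch of where this bites (assuming nothing beyond core's typed data classes):

```php
<?php

use Drupal\Core\TypedData\DataDefinition;
use Drupal\Core\TypedData\DataDefinitionInterface;

// Fine against the concrete class, which defines toArray().
$definition = DataDefinition::create('string');
$values = $definition->toArray();

// But when all we have is the interface, toArray() isn't guaranteed,
// so static analysis (and any non-core implementation) can break here.
function definition_to_array(DataDefinitionInterface $definition): array {
  return $definition->toArray();
}
```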
michaellander → made their first commit to this issue’s fork.
This fixes this specific issue, but we should probably add test coverage and try more versions of this (maps, for example) with constraints.
I would like to opt in this module: https://www.drupal.org/project/tool →
It has 48 total issues, and all maintainers agree we are ready to try GitLab issues. We understand that we cannot revert this change and that, as early adopters, we may run into some unexpected issues.
There are a few PHPStan flags that I'm not sure how best to fix. The one about using \Drupal to get the tempstore service is intentional: I would like to replace the token handling with a common API that works across modules, and I don't want to tie the tempstore service into dependency injection until a unified solution is in place.
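For reference, this is the shape of the flagged call and how I'm leaving it for now (a sketch; the collection name is hypothetical):

```php
<?php

// Intentionally fetching the tempstore statically rather than injecting
// it, so it can be swapped for a shared cross-module API later without
// unwinding a constructor dependency.
// @phpstan-ignore-next-line
$tempstore = \Drupal::service('tempstore.private')->get('tool');
```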
This branch depends on 📌 Determine path forward for turning Attributes into JSON schema Active, which I'm hoping to have fixed shortly.
I pushed up the normalizers I'm testing; they are working but not complete.
Worth noting, one of the changes is that the exhaustive tool list would not live on the page but in a modal/tray, so the tools displayed here are only the ones added to the agent. While the total number of tools will grow, the number used on an agent will remain a small subset of that total. I'm open to putting them on a tab; I just don't think it's going to be a significant issue either way.
I was running into issues where the Gemini CLI's schema validation is stricter than other clients'. In my case, there were additional schema properties, such as 'name', that I had to remove recursively, and in some cases a missing parameters property.
This was happening when using MCP with AI Function Calls; I haven't had a chance to dig further than that yet.
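Roughly what the workaround does (a simplified sketch, not the actual patch; 'name' is just the property I hit, and the backfilled parameters shape is an assumption):

```php
<?php

/**
 * Recursively strips schema keys that stricter validators reject.
 */
function clean_schema(array $schema): array {
  // Naive: this would also drop a real input property literally named
  // 'name'; the actual fix needs to be more targeted.
  unset($schema['name']);
  foreach ($schema as $key => $value) {
    if (is_array($value)) {
      $schema[$key] = clean_schema($value);
    }
  }
  return $schema;
}

// Hypothetical tool definition with a stray 'name' inside its schema.
$tool = [
  'name' => 'field_set_value',
  'parameters' => [
    'type' => 'object',
    'properties' => [
      'entity' => ['type' => 'string', 'name' => 'entity'],
    ],
  ],
];

// Backfill an empty object schema when parameters is missing entirely,
// then strip the properties Gemini CLI rejects.
$tool['parameters'] ??= ['type' => 'object', 'properties' => new \stdClass()];
$tool['parameters'] = clean_schema($tool['parameters']);
```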
I have a version of this for Tool API. It was put on the back burner temporarily, but I'm delving back into it. I think we need to connect it in a way that leaves it up to the tool plugin to determine the configuration form for per-instance configuration. For example, when adding an existing function call, we can use a form similar to what we have currently; when adding an agent tool, the experience may differ; and when adding a Tool API tool, we can inherit the refinement form from that module. If there are specific form elements that should persist across all tools, we can embed the per-tool-type form within a shared form.
That's my opinion right now, at least. I'll work on getting the Tool API version back up and running so you can see it.
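To make the shared-form idea concrete, a rough sketch (the form class is hypothetical; buildConfigurationForm() is core's PluginFormInterface):

```php
<?php

use Drupal\Core\Form\FormBase;
use Drupal\Core\Form\FormStateInterface;
use Drupal\Core\Plugin\PluginFormInterface;

/**
 * Hypothetical shared tool-instance form; the embedding is the point.
 */
class ToolInstanceForm extends FormBase {

  public function getFormId() {
    return 'tool_instance_form';
  }

  public function buildForm(array $form, FormStateInterface $form_state, $tool_plugin = NULL) {
    // Elements that should persist across every tool type.
    $form['label'] = [
      '#type' => 'textfield',
      '#title' => $this->t('Label'),
    ];

    // Each plugin decides its own per-instance configuration form: a
    // function call can reuse the current form, a Tool API tool can
    // inherit that module's refinement form, and so on. (Production
    // code would wrap $form_state in a SubformState.)
    if ($tool_plugin instanceof PluginFormInterface) {
      $form['settings'] = $tool_plugin->buildConfigurationForm([], $form_state);
    }
    return $form;
  }

  public function submitForm(array &$form, FormStateInterface $form_state) {}

}
```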
I've updated the original topic to reflect the solution that went into the module. Overall I think it's a win-win: we can lean on data type validations while also communicating those dependencies to the UI, AI, etc.
Also worth noting: we are currently using the AI normalizer in a few places without the Tool API dependency in place, and I don't intend to add it. That makes it extra important to solve this independently of any one tool caller.
Fixed. The solution I landed on varies slightly from the original post. Will get that updated shortly.
michaellander → changed the visibility of the branch 3545828-introduce-dynamic-tools to hidden.
Added an initial set of tools, though some of them will depend on 🌱 Introduce Dynamic Tools Active. We can still start building them, as it won't cause much backtracking, just additional refinement.
If these go quickly or we get more help, we can definitely consider doing even more from:
https://docs.google.com/spreadsheets/d/18knLUFa2uUll_nOe4yGFPjDBQmDvqJtM...
michaellander → created an issue. See original summary → .
I've made some changes to the original issue. The main thing is that the approach won't really change how tool calls happen now; it just gives callers flexibility when they want to take advantage of the added metadata. If you were to use tool calling exactly as it is now, it simply provides better validation on inputs.
michaellander → created an issue.
Maybe we should try measuring how often the JsonDeserializer is even being used. Originally I added it because I assumed most of the incorrect data coming through would still be JSON-encoded, or double-encoded, but maybe that's not the case. Ideally we wouldn't need the converter at all.
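For reference, the converter's intent boils down to something like this (simplified sketch, not the actual class):

```php
<?php

/**
 * Attempts to recover values that arrive JSON-encoded or double-encoded.
 */
function maybe_json_decode(mixed $value): mixed {
  // Repeatedly decode while the value is still a valid JSON string,
  // which also unwraps double-encoded payloads. Plain strings fail the
  // decode and are returned unchanged.
  while (is_string($value)) {
    $decoded = json_decode($value, TRUE);
    if (json_last_error() !== JSON_ERROR_NONE) {
      break;
    }
    $value = $decoded;
  }
  return $value;
}
```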
michaellander → created an issue.
I've pushed up a commit that automatically handles 'refining' definitions as input values are set. This means validation should occur against a more well-defined definition than the original generic one.
This change is currently only reflected in the \Drupal\tool_content\Plugin\tool\Tool\FieldSetValue tool. The tool accepts three values, 'entity', 'field_name', and 'value', with 'value' typed as any. After values are set for 'entity' and 'field_name', the 'value' property becomes a map whose definition matches the actual field definition (multiple, required, property definitions, etc.).
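Roughly what the refinement does for field_set_value (a sketch: the function name is illustrative, the entity and typed-data APIs are core; a multi-valued field would additionally wrap this in a list definition):

```php
<?php

use Drupal\Core\Entity\FieldableEntityInterface;
use Drupal\Core\TypedData\DataDefinitionInterface;
use Drupal\Core\TypedData\MapDataDefinition;

/**
 * Derives a concrete 'value' definition once 'entity' and 'field_name'
 * are known, replacing the generic 'any' definition.
 */
function refine_value_definition(FieldableEntityInterface $entity, string $field_name): DataDefinitionInterface {
  $field_definition = $entity->getFieldDefinition($field_name);
  $item_definition = $field_definition->getItemDefinition();

  // Mirror the field item's properties (e.g. 'value', 'format',
  // 'target_id') so validation runs against the real shape. Core's
  // property definitions are DataDefinition instances in practice.
  $map = MapDataDefinition::create();
  foreach ($item_definition->getPropertyDefinitions() as $name => $property) {
    $map->setPropertyDefinition($name, $property);
  }
  $map->setRequired($field_definition->isRequired());
  return $map;
}
```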
We could additionally add a constraint to confirm the field actually exists on the entity, but that introduces the next challenge.
How do we best communicate, prior to execution, that a tool (and specific properties) is dynamic?
Using the same field_set_value as an example, we could technically decide to show 'entity' as the only starting input definition. Then, after an entity is added, we would refine the tool to append the 'field_name' input definition, with all available fields as 'options'. Then, after a 'field_name' is selected, we would append the 'value' definition. This would make the tool truly dynamic, but it means multiple tool calls from AI to fully understand the tool, and generating a form for the tool (in the case of ECA) would be impossible when using tokens.

We could alternatively present all top-level inputs to start, and only allow existing inputs to be 'refined'. This helps with the form challenge, since we have some sort of form element to display that can be refined as additional values are provided. It also makes clear to AI all the inputs a tool expects, which may reduce the total calls required. However, it leaves definitions in a somewhat ambiguous state, where it isn't clear to a form or AI whether an input definition is complete or still waiting to be refined.
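To make that ambiguity concrete, here is the 'value' input in its two states under the second approach (hypothetical shapes for a body field):

```php
<?php

// Before refinement: all top-level inputs are declared, but 'value' is
// still a generic placeholder. Nothing here tells a form builder or an
// AI caller whether the definition is final or waiting on other inputs.
$inputs_before = [
  'entity' => ['type' => 'string', 'required' => TRUE],
  'field_name' => ['type' => 'string', 'required' => TRUE],
  'value' => ['type' => 'any'],
];

// After 'entity' and 'field_name' are provided, 'value' is refined to
// match the real field.
$inputs_after = [
  'entity' => ['type' => 'string', 'required' => TRUE],
  'field_name' => ['type' => 'string', 'required' => TRUE],
  'value' => [
    'type' => 'map',
    'required' => TRUE,
    'properties' => [
      'value' => ['type' => 'string', 'required' => TRUE],
      'format' => ['type' => 'string'],
    ],
  ],
];
```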
This part is still TBD.
michaellander → created an issue.