I have implemented streamed outputs and it's fixed in the dev release. I have tested with AI tools (CKEditor, summaries, suggestions, etc.) and it looks fine.
@merilainen I'd appreciate it if you have time to test it.
I'll soon release next alpha.
The reason is that streaming is not yet supported, and AI tools request streamed outputs. We are working on this and it will be part of the next release. Will notify here.
Hi @ejb503
Once again, thanks for your thoughtful input and for pushing the conversation forward. Here's how I propose we proceed with the module implementation: let's build both approaches.
Here is my thought process:
Currently, we have two potential paths:
- Make Drupal itself the MCP server.
- Build a separate MCP server binary that works with Drupal.
Option 1: Drupal as the MCP Server
This approach is definitely the most elegant and user-friendly. It eliminates the need for an intermediate server, enabling users to simply install the module, point their MCP application's settings to the Drupal URL, and start using it immediately. There's no additional tech stack or hosting required, making it a natural and low-effort choice for users. It's a no-brainer.
That said, it's well known that PHP's limitations pose challenges, particularly with real-time communication. HTTP SSE requires persistent processes, which PHP and Drupal aren't designed for. Workarounds with while loops are functional but hacky (to me), and implementing a complete MCP server (initiation, negotiation, caching, etc.) in PHP/Drupal introduces complexity and potential stability issues.
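To illustrate why a persistent process matters here: SSE is just a long-lived HTTP response that keeps emitting framed events, and the server must hold the connection open indefinitely. A minimal sketch of that wire framing (the helper name is mine, not from any SDK):

```typescript
// Sketch of the SSE wire framing (illustrative helper, not a real API).
// Each event is an "event:" line plus "data:" line(s), terminated by a
// blank line. The connection itself must stay open between events, which
// is exactly what PHP's one-request-one-response model struggles with.
function formatSseEvent(event: string, data: string): string {
  return `event: ${event}\ndata: ${data}\n\n`;
}

// Example frame an MCP endpoint would push for a JSON-RPC message:
const frame = formatSseEvent("message", JSON.stringify({ jsonrpc: "2.0", id: 1 }));
console.log(frame);
```

In PHP the equivalent is a while loop that flushes frames like this until the client disconnects, tying up a worker the whole time, which is the hacky part.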
Despite these challenges, I think the usability and simplicity of this solution make it worth pursuing for the community.
Option 2: Separate Executable MCP Server
This is the approach we've tested and published so far in our initial implementation. The idea is to create an independent server binary built on the official MCP SDK, fully supporting the protocol. Drupal becomes a source of data and actions (additional context for models), exposing the necessary information to the server.
This method relies on a stable, pre-built transport layer for real-time communication (SSE or stdio), while Drupal focuses on providing context, tools, resources, and prompts without needing to handle low-level protocol details.
While this setup definitely adds complexity (e.g., requiring experienced users to host and configure the binary server), it aligns well with the way other MCP integrations work: for example, MCP servers built in TypeScript or Python that interact with PostgreSQL or Google Drive without making those systems MCP servers themselves.
Viewing Drupal as a "data source" rather than the MCP server isn't inherently wrong; it's simply a different perspective.
At the same time, making Drupal the server itself is also a perfectly valid view.
--
So I think both views are right :))
Solution
Rather than committing to a single approach, we can develop both options in a modular way. This ensures flexibility and allows us to adapt to future changes, such as new transport mechanisms (e.g., HTTP REST) proposed by Anthropic.
We can adopt a clean, modular architecture to separate concerns and allow different server implementations to integrate.
Here's how we can structure it (this is a draft; we can dig into it more deeply later):
First, we have the core part:
mcp_core module
- Discovers and provides tools, resources, and prompts.
- Handles authentication and user configuration.
- Acts as the foundation for all server implementations.
and then we have sub-modules for server implementations.
mcp_server_ts module
- Supports the TypeScript-based server binary.
- Relies on `mcp_core` for data and actions.
- Focuses on exposing endpoints for communication with the binary.
mcp_server_native module
- Implements a full-fledged MCP server directly in Drupal.
- Handles all MCP communication while leveraging `mcp_core` for tools, resources, and prompts.
Other developers can create their own server implementations that integrate with `mcp_core`. This design ensures server modules are like adapters, interchangeable and independent of the core module.
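To make the "server modules as adapters" idea concrete, here is a rough TypeScript sketch of the contract. All names here are hypothetical illustrations, not the actual module APIs: the core exposes tools, resources, and prompts, and every server implementation consumes that same contract through its own transport.

```typescript
// Hypothetical sketch of the adapter pattern described above;
// none of these names are real mcp_core APIs.

// What mcp_core conceptually provides, regardless of transport.
interface McpCore {
  listTools(): string[];
  listResources(): string[];
  listPrompts(): string[];
}

// Every server implementation (TS binary bridge, native PHP server, ...)
// is an interchangeable adapter over the same core.
interface McpServerAdapter {
  readonly transport: "sse" | "stdio" | "http";
  start(core: McpCore): void;
}

// Example adapter that just logs what it would expose.
class LoggingAdapter implements McpServerAdapter {
  readonly transport = "stdio" as const;
  start(core: McpCore): void {
    console.log(`exposing tools: ${core.listTools().join(", ")}`);
  }
}

// A stand-in core with made-up tool/resource names.
const core: McpCore = {
  listTools: () => ["create_node", "search_content"],
  listResources: () => ["node/1"],
  listPrompts: () => [],
};

new LoggingAdapter().start(core);
```

The point of the shape is that `mcp_server_ts` and `mcp_server_native` differ only in the transport half; swapping one for the other never touches the core.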
Let me know your thoughts, and I'd be happy to iterate further. Here's a diagram to illustrate the architecture:
jibla created an issue. See original summary.
Module's front page is updated.
Initial documentation is here: https://project.pages.drupalcode.org/mcp
and the module front page is also updated.
@marcus_johansson @mrdalesmith
Thanks, it makes sense. Moved the module under docs/examples.
Thank you for the feedback @mrdalesmith
The included dropai_provider is not actually a provider, but an example module for the fictional provider explained in the documentation; the files are linked there. My motivation was that it can help other developers and be used as starter code for building new providers.
Alternatively, I can create a separate project and put that module there.
Thank you for reporting. Will check and push the fix.
I haven't decided yet. When I release the first alpha, I will choose one :)
Fixed in alpha4, hence closing this issue.
This is a known issue, already fixed in Gemini Provider: https://www.drupal.org/project/gemini_provider/issues/3467778 🐛 Error: The role system is not supported by Gemini Provider (RTBC).
You can check the dev version. Once the AI module releases new updates, I will also release Gemini, and it will be fixed in the next release.
I will add this info to README for next release.
Hi
@mErilainen
Thanks for testing.
I have noted this on the project page. The reason is this:
Please note that in the API explorer, before sending a request, the default value of the system role (`system`) needs to be changed to `model`, as Gemini only supports the `user` and `model` roles.
Perhaps we need to change the behavior of the AI module's API explorer later, so that after selecting a model in the text-to-text explorer, the default value of the system role is changed if necessary.
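The explorer-side fix could boil down to a per-provider role remap applied before the request is sent. A hedged sketch (provider keys and the mapping table are illustrative, not the AI module's real API):

```typescript
// Illustrative per-provider role normalization; not the AI module's API.
// Gemini only accepts the "user" and "model" roles, so the explorer's
// default "system" role must become "model" before the request goes out.
const roleMap: Record<string, Record<string, string>> = {
  gemini: { system: "model", assistant: "model" },
};

function normalizeRole(provider: string, role: string): string {
  // Fall back to the original role for providers with no remap entry.
  return roleMap[provider]?.[role] ?? role;
}

console.log(normalizeRole("gemini", "system")); // "model"
console.log(normalizeRole("openai", "system")); // unchanged: "system"
```

With something like this in place, selecting a model in the text-to-text explorer could trigger the remap automatically instead of relying on the user to edit the role by hand.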
Changed and released.