Account created on 2 November 2010, over 14 years ago

Recent comments

🇬🇪Georgia jibla

It makes sense. I will merge this into the dev release and make sure it's part of the next release. Thank you @ishani patel @mgifford

🇬🇪Georgia jibla

jibla made their first commit to this issue’s fork.

🇬🇪Georgia jibla

Closing this as the new website is launched and serves as the new documentation.

🇬🇪Georgia jibla

Closing this as the new website is launched and serves as the new documentation.

🇬🇪Georgia jibla

Can you provide the Drupal and PHP versions to reproduce the bug?

🇬🇪Georgia jibla

Thanks for reporting.

Next week we are releasing the first stable version of the MCP module and will deliver updated docs, where this will be fixed too.

🇬🇪Georgia jibla

@nelo_drup - This is merged and is part of the latest release.

🇬🇪Georgia jibla

@lekso_surameli - we need to check whether this model loads and works in the recent beta release. If yes, we can mark this as fixed; otherwise we can identify what else needs to be done.

🇬🇪Georgia jibla

Assigning to Lekso for review.

🇬🇪Georgia jibla

@yautja_cetanu - Thank you for the list of test cases. I was thinking of asking for something like this. We can try to automate these e2e tests for other providers later. I will work on this.

@pameeela – I understand your perspective. I believe this requires further discussion to determine the best approach. From an end-user standpoint, when downloading and installing Drupal CMS, enabling AI, and finding only two major providers (OpenAI and Anthropic) while Google's Gemini is absent, it may be difficult to justify.

I also agree that there should be clear principles guiding provider selection, as including all available providers may not be feasible. However, given that OpenAI and Anthropic are included, it seems reasonable to consider Gemini as well—especially since many agencies partner with GCP and are likely to use Gemini as their primary LLM provider.

For now, I’ll continue working on this PR, ensuring all tests pass and addressing any necessary fixes while this decision is being discussed.

P.S. If the decision is to exclude Gemini from the default setup, I’m happy to create a separate recipe for it.

🇬🇪Georgia jibla

This is definitely solved in https://www.drupal.org/project/gemini_provider/issues/3504309 (Add Embedding capabilities) and is already merged.

Closing this issue.

🇬🇪Georgia jibla

Merged! Thank you @vasike!

🇬🇪Georgia jibla

This is related to the module name.

When I started the module, there was no naming convention at all; even all the existing providers were part of the AI module. I agree it's better if the module is renamed to ai_provider_gemini.

I am going to work on the renaming after we release these changes. After that I will see if it's possible - we need to rename the namespace in the URL/Composer without affecting already installed instances. I will take care of this after the release; meanwhile I am setting this issue to postponed.

🇬🇪Georgia jibla

jibla made their first commit to this issue’s fork.

🇬🇪Georgia jibla

@a.dmitriiev - I am trying to merge it but get a strange error and it won't merge. I will try again later; if we can't merge from here, let's create a separate issue for this and merge separately.

🇬🇪Georgia jibla

@vasike Thanks a lot for the proposal. It looks very good.

However, I think, for now, it's better to simply go with the provider's official logo (as other providers do).

For the future, I have an idea to propose a general style for all the official providers so that they look consistent.

@lekso_surameli I am uploading a properly cropped Gemini logo that you can use.

🇬🇪Georgia jibla

@vasike - Yes, you are absolutely right. We definitely need to set up proper CI/CD with GitLab.

If you want, you can work on this; @lekso-surameli will review it, and we can merge it as soon as it's done so that we benefit from it.

🇬🇪Georgia jibla

I need to check if we need to do more here.

🇬🇪Georgia jibla

Yes, you are right. The reason this was not working (with the model actually returning dummy output) was that the actual image blob was not sent with the request.
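For anyone hitting the same thing: the fix boils down to attaching the image bytes to the request body. A rough, hypothetical sketch of the idea (not the provider's actual code; buildImagePart() is made up for illustration, and the inline_data/mime_type/base64 shape follows Gemini's REST convention for image input):

```php
<?php

// Hypothetical helper: turn a local image file into an "inline_data" part.
function buildImagePart(string $path): array {
  return [
    'inline_data' => [
      'mime_type' => mime_content_type($path) ?: 'image/png',
      'data' => base64_encode(file_get_contents($path)),
    ],
  ];
}

// The request body then carries both the text prompt and the image blob,
// instead of the prompt alone (which was the bug).
$contents = [
  [
    'role' => 'user',
    'parts' => [
      ['text' => 'Describe this image.'],
      buildImagePart('/tmp/example.png'),
    ],
  ],
];
```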

Tested, and it works perfectly. I tested it with Drupal CMS and it also works with Flash 2.

Thanks to everyone; it's merged and credits are given to all.

🇬🇪Georgia jibla

jibla made their first commit to this issue’s fork.

🇬🇪Georgia jibla

@merilainen @vasike

Which branch are you looking at? With 1.x I can't reproduce the error.

🇬🇪Georgia jibla

@vasike

Which branch are you looking at? With 1.x I can't reproduce the error.

🇬🇪Georgia jibla

Yes that makes sense. Thanks @vasike.

🇬🇪Georgia jibla

jibla made their first commit to this issue’s fork.

🇬🇪Georgia jibla

@vasike - There is a typo in the file name in your commit, but since I already merged it, I will fix it directly in 1.x now.

🇬🇪Georgia jibla

@a.dmitriiev
@vasike

Thank you guys, I think this is enough for this issue. I will merge this into 1.x and this will be part of the next release. You both will be credited.

🇬🇪Georgia jibla

Hi @a.dmitriiev

This issue needs more work. If you test this MR, you still cannot install Gemini with recipes.

First, please move the install/gemini_provider.settings file directly under config. Right now in your MR it is under config/scheme/install.

Second, and most important, for this to work from recipes, you must implement the getSetupData() method in the provider plugin implementation. You can check ai_provider_openai or ai_provider_anthropic for reference.
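For illustration only, this is roughly the shape such a method could take; the class, namespace, keys, and values below are assumptions, so please mirror ai_provider_openai or ai_provider_anthropic for the real contract:

```php
<?php

namespace Drupal\gemini_provider\Plugin\AiProvider;

// Illustrative fragment: only the method name getSetupData() comes from this
// issue. The class/namespace and the returned keys are assumptions - copy the
// actual structure from the OpenAI or Anthropic providers.
class GeminiProvider {

  public function getSetupData(): array {
    return [
      // Which settings field the recipe should populate (e.g. the API key).
      'key_config_name' => 'api_key',
      // Sensible defaults the recipe can pre-select (model name assumed).
      'default_models' => [
        'chat' => 'gemini-1.5-flash',
      ],
    ];
  }

}
```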

Thanks for the contribution.

🇬🇪Georgia jibla

@phenaproxima

I have tested it and it works fine with this branch.

🇬🇪Georgia jibla

Closing this as we have released the first alpha and updated the README as promised in the initial comment.

🇬🇪Georgia jibla

This approach looks well thought out and aligns well with the Drupal way of abstracting things. I just want to clarify a few points to make sure my understanding is correct:

  1. From what I gather, the approach involves creating plugins to expose functions that can be used in AI providers as function calls. These plugins essentially act as the lowest level of actions that can be executed as AI functions. I think this is a great opportunity to not only use them for AI providers but also extend them for other use cases, like agents, MCP tools, or even other integrations.
  2. Using the ChatTrait's getTools() and setTools(), it seems all provider implementations will have these methods (as the base class uses this trait). If a provider doesn't implement function calling, this shouldn't lead to any breaking changes - it is important to make sure nothing breaks.
  3. Finally, it seems that AI providers themselves are responsible for implementing the actual function-calling logic. So, if getTools() returns functions, the provider handles their execution in its own way (load the plugin and then execute). What they have to take care of is 1) registering the functions and 2) if the response has function calls, executing them correctly (a rough sketch follows after this list).

If all of the above is accurate, I think this is an excellent, Drupal-friendly abstraction that provides flexibility. The only concern would be ensuring it doesn't introduce any breaking changes for providers that either don't implement or don't support function calling.
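To make point 3 concrete, here is a rough, hypothetical sketch of that flow; only getTools() comes from the proposal, everything else (tool accessors, request sender, plugin manager) is a placeholder, not the AI module's real API:

```php
<?php

// Illustrative only - placeholder names throughout.
function chatWithTools(object $provider, object $functionCallManager, array $payload): array {
  // 1) Register the exposed functions with the outgoing request.
  foreach ($provider->getTools() as $tool) {
    $payload['tools'][] = [
      'name' => $tool->getName(),
      'description' => $tool->getDescription(),
      'parameters' => $tool->getInputSchema(),
    ];
  }

  $response = $provider->sendChatRequest($payload);

  // 2) If the model returned function calls, load the matching plugins,
  //    execute them, and feed the results back into the conversation.
  foreach ($response['function_calls'] ?? [] as $call) {
    $plugin = $functionCallManager->createInstance($call['name']);
    $payload['messages'][] = [
      'role' => 'tool',
      'content' => $plugin->execute($call['arguments']),
    ];
  }

  return $payload;
}
```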

🇬🇪Georgia jibla

Thanks. We will release the initial version soon and update everything.

🇬🇪Georgia jibla

jibla created an issue.

🇬🇪Georgia jibla

What is the status of this issue? I see an MR, but the issue status is "Active".

This weekend (25-26 Jan), I am organizing a contribution weekend in Tbilisi and I am looking for issues to work on. If we can contribute here (review, test, or code), we would be happy.

🇬🇪Georgia jibla

I have implemented streamed outputs and it's fixed in the dev release. I have tested it with AI tools (CKEditor, summaries, suggestions, etc.) and it looks fine.

@merilainen I'd appreciate it if you have time to test it.

I'll release the next alpha soon.

🇬🇪Georgia jibla

The reason is that streaming is not yet supported, while AI tools request streamed outputs. We are working on this and it will be part of the next release. We will notify here.

🇬🇪Georgia jibla

Hi @ejb503

Once again, thanks for your thoughtful input and for pushing the conversation forward. Here's how I propose we proceed with the module implementation: let’s build both approaches.

Here is my thought process:

Currently, we have two potential paths:

  1. Make Drupal itself the MCP server.
  2. Build a separate MCP server binary that works with Drupal.

Option 1: Drupal as the MCP Server

This approach is definitely the most elegant and user-friendly. It eliminates the need for an intermediate server, enabling users to simply install the module, set their MCP application's settings to the Drupal URL, and start using it immediately. There's no additional tech stack or hosting required, making it a natural and low-effort choice for users. It's a no-brainer.

That said, it is well known that PHP's limitations pose challenges, particularly with real-time communication. HTTP SSE requires persistent processes, which PHP and Drupal aren't designed for. Workarounds with while loops are functional but hacky (to me), and implementing a complete MCP server (initiation, negotiation, caching, etc.) in PHP/Drupal introduces complexity and potential stability issues.
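To make the "while loop" workaround concrete, here is a minimal sketch of an SSE endpoint in plain PHP; it works, but it keeps a PHP worker busy for the lifetime of the connection, which is exactly the stability concern above:

```php
<?php

// Minimal SSE endpoint: the process stays alive and keeps pushing events,
// which is what PHP-FPM/Drupal are not designed for.
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

while (!connection_aborted()) {
  // In a real MCP server this would be a protocol message, not a timestamp.
  echo 'data: ' . json_encode(['time' => time()]) . "\n\n";
  if (ob_get_level() > 0) {
    ob_flush();
  }
  flush();
  sleep(1);
}
```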

Despite these challenges, I still think the usability and simplicity of this solution make it worth pursuing for the community.

Option 2: Separate Executable MCP Server

This is the approach we’ve tested and published so far in our initial implementation. The idea is to create an independent server binary built on the official MCP SDK, fully supporting the protocol. Drupal becomes a source of data and actions (additional context for models), exposing the necessary information to the server.

This method relies on a stable, pre-built transport layer for real-time communication (SSE or stdio), and Drupal focuses on providing context, tools, resources, and prompts without needing to handle low-level protocol details.

While this setup definitely adds complexity (e.g., requiring experienced users to host and configure the binary server), it aligns well with the way other MCP integrations work—for example, MCP servers built on TS or Python that interact with PostgreSQL or Google Drive without making those systems MCP servers themselves.

Viewing Drupal as a "data source" rather than the MCP server isn’t inherently wrong—it’s simply a different perspective.

At the same time, making Drupal the server itself is also a perfectly valid view.

--

So, I think, both views are right :))

Solution

Rather than committing to a single approach, we can develop both options in a modular way. This ensures flexibility and allows us to adapt to future changes, such as new transport mechanisms (e.g., HTTP REST) proposed by Anthropic.

We can adopt a clean, modular architecture to separate concerns and allow different server implementations to integrate.

Here's how we can structure it (this is a draft; we can go into it more deeply later):

First, we have the core part

mcp_core module
- Discovers and provides tools, resources, and prompts.
- Handles authentication and user configuration.
- Acts as the foundation for all server implementations.

and then we have sub-modules for server implementations.

mcp_server_ts module
- Supports the TypeScript-based server binary.
- Relies on `mcp_core` for data and actions.
- Focuses on exposing endpoints for communication with the binary.

mcp_server_native module
- Implements a full-fledged MCP server directly in Drupal.
- Handles all MCP communication while leveraging `mcp_core` for tools, resources, and prompts.

Other developers can create their own server implementations that integrate with `mcp_core`. This design keeps server modules as adapters: interchangeable and independent of the core module (a rough sketch follows below).
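To illustrate the adapter idea, the contract in `mcp_core` could look something like the sketch below; the interface and method names are placeholders, not a committed API:

```php
<?php

namespace Drupal\mcp_core;

// Sketch of the adapter contract: mcp_core owns discovery of tools,
// resources and prompts, while each server sub-module decides how to
// transport them (TS binary bridge, native PHP server, ...).
interface McpServerAdapterInterface {

  /**
   * Returns the tools, resources and prompts discovered by mcp_core.
   */
  public function getCapabilities(): array;

  /**
   * Handles an incoming MCP request using this implementation's transport.
   */
  public function handleRequest(array $request): array;

}
```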

Let me know your thoughts, and I’d be happy to iterate further. Here's a diagram to illustrate the architecture:

🇬🇪Georgia jibla

The module's front page is updated.

🇬🇪Georgia jibla

Initial documentation is here https://project.pages.drupalcode.org/mcp

and the module front page is also updated.
