- Issue created by @mindaugasd
- 🇩🇪Germany marcus_johansson
This will not go into core, as per https://www.drupal.org/project/ai/issues/3483356 ✨ Move out all provider modules to contrib modules Active. We can keep requests like this open, though, if someone wants to create a third-party provider.
And in this case the core would probably need to change in some respects to facilitate this.
Looking at the way the API is implemented (in the 1.1.x branch; I haven't checked the MRs), it seems to assume that models are always either proxied through the server or executed on the server. The AI Chatbot submodule, for example, has JavaScript that always submits prompts to the DeepChat API endpoint and streams responses through another AJAX query; it doesn't try to "collect" alternative implementations that might come from other scripts.
It might be possible to create a Service Worker script that caches certain flagged models in browser storage, intercepts AJAX requests whose assistant IDs match those models, and mocks a response by executing the models within the browser. This is not necessarily the best solution, however, because of the overhead of request parsing and response mocking (both in terms of maintainability and performance).
Alternatively, the existing JavaScript could be refactored to decouple the AJAX queries from the form actions, for example by dispatching a custom event and moving the default AJAX-based implementation into a corresponding event listener. This would allow contrib modules to add their own JavaScript libraries that respond to the same event and provide a client-side model implementation.
In my opinion, beyond cost and convenience, another reason to support client-side models is privacy. With certain user bases, it can be difficult to convince users that data processed by cloud-hosted or server-side models remains private; client-side models do not need to transmit any user data to a server.
- 🇬🇧United Kingdom MrDaleSmith
As the maintainer has detailed, the core AI module does not do anything directly with an AI and always interacts through a provider. AIUI, the appropriate way to support a WebGPU AI would be through a new provider module that the other AI functionality can interact with, as per https://project.pages.drupalcode.org/ai/developers/writing_an_ai_provider/
As such, I'm going to close this as won't fix.
AIUI, the appropriate way to support a WebGPU AI would be through a new provider module that the other AI functionality can interact with, as per
Unfortunately, as I have already stated, this is not possible because of limitations in the AI Provider API.
Implementing a WebGPU provider using the documented method would result in multiple round trips in the processing pipeline. For example, when the chat window in the AI Chatbot is in use, the pipeline would look like this:
- Browser: webgpu_provider/webgpu.js: opens an SSE stream to /api/webgpu-proxy
- Browser: ai/ai_chatbot/form-stream.js: AJAX POST to /api/deepchat
- Drupal: ai/AIProviderPluginManager: calls webgpu_provider
- Drupal: webgpu_provider: forwards the prompt to the /api/webgpu-proxy stream
- Browser: webgpu_provider/webgpu.js: receives the prompt from the /api/webgpu-proxy stream
- Browser: webgpu_provider/webgpu.js: generates the response
- Browser: webgpu_provider/webgpu.js: AJAX POST back to /api/webgpu-proxy
- Drupal: webgpu_provider: returns the browser-generated response to AIProviderPluginManager
- Drupal: ai/ai_chatbot/DeepChatApi: streams the response to the AJAX POST on /api/deepchat
- Browser: ai/ai_chatbot/form-stream.js: streams the response into the chat window
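To make the cost concrete, the browser side of that loop (the SSE stream, the in-browser generation, and the POST back) could be sketched like this. Endpoint paths, event names, and the message shape are all assumptions:

```javascript
// Parse one proxied prompt, run the model, and post the completion back.
async function handleProxyMessage(data, runModel, postBack) {
  const { id, prompt } = JSON.parse(data);
  const completion = await runModel(prompt); // WebGPU inference happens here
  await postBack({ id, completion });
}

// Hypothetical webgpu.js entry point: one extra SSE stream plus one extra
// AJAX POST for every prompt, on top of the chatbot's own /api/deepchat call.
function startProxyLoop(runModel) {
  const stream = new EventSource('/api/webgpu-proxy'); // open the SSE stream
  stream.addEventListener('prompt', (event) =>
    handleProxyMessage(event.data, runModel, (body) =>
      fetch('/api/webgpu-proxy', { // return the result to Drupal
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(body),
      })));
}
```

Every prompt thus crosses the network three times even though the model runs in the same browser tab that asked the question.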
- 🇬🇧United Kingdom MrDaleSmith
Nevertheless, the way to do it is through a provider, so you will need to raise a ticket that details the changes required to support it. I'd suggest going step by step, as any one change could easily break the system for every other plugin. The basis of the core AI module is that everything is abstracted so that it can be used by any provider: any changes will need to respect this and work regardless of how the AI is ultimately accessed.
I'm leaving this issue closed unless one of the maintainers wants to reopen it, because this doesn't feel like something the current code can support in a single issue. It may well be that the AI family of modules cannot support an AI that works so differently from the others, in which case it may be easier to implement a separate contrib module for WebGPU.