[meta] Discussion: what LLM providers to include

Created on 10 June 2024
Updated 24 June 2024

Problem/Motivation

This meta issue was created to group similar issues for central discussion, and so that this question can be referenced/hyperlinked:
πŸ“Œ Ollama LLM Provider Active
πŸ“Œ MistralAI LLM Provider Active
πŸ“Œ Groq LLM Provider Active
πŸ“Œ Huggingface LLM Provider Active
πŸ“Œ LM Studio LLM Provider Active
πŸ“Œ Anthropic LLM Provider Active
πŸ“Œ OpenAI LLM Provider Active
✨ Discussion: Add Dreamstudio Provider into core Fixed

Steps to reproduce

Proposed resolution

Remaining tasks

User interface changes

API changes

Data model changes

πŸ“Œ Task

Status: Active
Version: 1.0
Component: Discussion
Created by: πŸ‡±πŸ‡ΉLithuania mindaugasd


Comments & Activities

  • Issue created by @mindaugasd
  • πŸ‡±πŸ‡ΉLithuania mindaugasd

    I see the AI module as a Drupal CMS module, so maybe include or endorse none of the providers, and focus only on Drupal CMS functionality, as described here: ✨ Create AI ecosystem "add-ons" page Active
    But we could create a composer dependency on some important modules like openai v2 β†’, or on something more open (in the spirit of open Drupal) like HuggingFace, to ensure an out-of-the-box experience (a rough sketch follows below).
    Most people will use roughly one service provider to get the job done.
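
    A minimal sketch of that out-of-the-box idea, assuming hypothetical provider package names (the real project names may differ):

        # Bundle one default provider with the AI module via a composer dependency
        composer require drupal/huggingface_provider
        # Swap the default out later without changing the AI module itself
        composer remove drupal/huggingface_provider
        composer require drupal/openai_provider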

  • πŸ‡±πŸ‡ΉLithuania mindaugasd

    Another option is to bundle all LLM providers under one sub-module, so that we don't bloat the /admin/modules page.

  • πŸ‡±πŸ‡ΉLithuania mindaugasd

    Assuming project_browser enables installing modules with a few clicks (no composer terminal required), then no service would need to be included.

    There could be a specialized UI which hooks into Drupal's update mechanism (I have yet to learn how that magic works).

  • πŸ‡¬πŸ‡§United Kingdom yautja_cetanu
  • πŸ‡©πŸ‡ͺGermany Marcus_Johansson

    I don't think we can put them all under one submodule. If you want some specific functionality that happens to come from one provider, it should be possible to require just that provider.

    Fireworks AI, for instance, has an ImageToImage model that creates creative QR codes from an image. If my module does QR codes and I want that and only that functionality, I should be able to require just that provider (see the sketch below).
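
    A rough sketch of that granularity, assuming a hypothetical package name for the provider:

        # A QR-code module pulls in only the one provider it actually needs,
        # rather than a bundle of every provider
        composer require drupal/fireworksai_provider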

  • πŸ‡©πŸ‡ͺGermany Marcus_Johansson

    All the discussed provider modules are now in review!

  • πŸ‡©πŸ‡°Denmark ressa Copenhagen

    Nice work @Marcus_Johansson, thanks! I'll try to test Ollama LLM Provider later today.

  • πŸ‡±πŸ‡ΉLithuania mindaugasd
  • πŸ‡©πŸ‡ͺGermany Marcus_Johansson

    Should we close this ticket, so that anything that should go into core has to be a follow-up request?

  • πŸ‡±πŸ‡ΉLithuania mindaugasd

    I would like this issue to stay open for some time.
    I don't have a good understanding of why providers exist within core in the first place.

  • πŸ‡±πŸ‡ΉLithuania mindaugasd

    1) There are many AI solutions; many will die off (e.g., go bankrupt), many new ones will be invented, and some will grow to take market share and live on. How will the AI core deal with deprecations once it turns out a module is no longer useful/usable, or is inferior for most people, or better alternatives emerge (alternatives to Dreamstudio ✨ Discussion: Add Dreamstudio Provider into core Fixed for example, or Ollama πŸ’¬ Can a module be provided for use with Jan.ai as a provider? Closed: won't fix )? Is it a good idea to keep maintaining this add/remove question in core over a long time?

    2) Drupal Core has never shipped integrations by default, so it is important to know what Drupal CMS leadership thinks about this. By extension of being part of Drupal CMS, the maintenance of services bundled with the AI module would become their responsibility (or headache?) as well, because the services are tightly bundled within the AI module.

    As an alternative, integrations could be bundled out of the box by adding or removing a composer dependency. It is the same experience from the user's perspective, and it is easier to add or remove. The difference for developers would be more namespaces to develop/track/maintain, whereas today it is only one project.
