AI chat user interface module support

Created on 6 November 2023
Updated 22 June 2024

Problem/Motivation

Wouldn't it be nice to have Ollama as a backend option for the AI chat user interface?

Feature request
Status

Active

Version

1.0

Component

Code

Created by

🇹🇷Turkey orkut murat yılmaz Istanbul


Comments & Activities

  • Issue created by @orkut murat yılmaz
  • 🇱🇹Lithuania mindaugasd

    Hi Orkut,

    OpenChat looks like an open-source models library:

    "OpenChat: an innovative library of open-source language models"

    Therefore it belongs within your module https://www.drupal.org/project/ollama :)

    So I changed the issue title and description and moved the issue to the Ollama issue queue :)
    Also referenced similar issues in the Augmentor and OpenAI issue queues.

    On the other hand, if Augmentor integrates with AIChat, then Ollama does not need to integrate with AIChat.

    Because we would have this integration chain of three: OpenChat -> Ollama -> Augmentor -> AIChat.
    1. OpenChat integrates with Ollama library.
    2. Ollama integrates with Augmentor.
    3. Augmentor integrates with AIChat.

    Maybe that is the way to go?

  • 🇱🇹Lithuania mindaugasd

    But the Augmentor code will be within the Ollama module, so the AIChat backend would also be within this Ollama module.

  • 🇱🇹Lithuania mindaugasd

    Ollama could have a submodule called "ollama_aichat", which would provide a backend for the AI Chat user interface.

    This is a temporary solution; longer term we will figure out a better one. For example, Ollama could provide plugins for the https://www.drupal.org/project/aiwork module instead, but since aiwork is not developed yet, we can use this temporary solution and have a more elegant one later in the future.
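    A submodule like that could be declared with a minimal info file following Drupal's usual `.info.yml` convention. This is only a sketch: the module names, the `aichat` dependency key, and the core version are assumptions, not an existing release.

```yaml
# modules/ollama_aichat/ollama_aichat.info.yml (hypothetical sketch)
name: Ollama AI Chat
type: module
description: 'Provides an Ollama backend for the AI Chat user interface.'
package: AI
core_version_requirement: ^10
dependencies:
  # Both machine names below are assumed, not verified against released projects.
  - ollama:ollama
  - aichat:aichat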

  • 🇹🇷Turkey orkut murat yılmaz Istanbul

    Thanks for your suggestions, we can do it this way, I guess.

  • 🇩🇰Denmark ressa Copenhagen

    Thanks @Orkut Murat Yılmaz! I just tried the Ollama Docker image as described in Spring AI with Ollama and Meta's Llama2, and after a few minutes I was presented with a prompt. And the description is spot on :)

    >>> Please describe the country Denmark in 50 words, in the style of the dude.

    Oh, man, like, Denmark is this totally chill Nordic country, you know? It's got a super low crime rate and, like, the people are so laid back they're practically horizontal. They're always talking about "hygge," which is like this whole vibe of coziness and warmth and stuff. And, dude, the food? Like, whoa, it's all about pastries and cheese and beer. It's like, the ultimate hangout spot, you know? Just chillin' with your buds, eatin' good food, drinkin' cold brews... it's like, the dream, man.

    Would it be possible to have a release of the module, so that we can try it out?
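    For anyone who wants to script against the container rather than type into the interactive prompt, Ollama also exposes a local HTTP API. A minimal sketch in Python: the endpoint, fields, and `response` key follow Ollama's documented `/api/generate` API, while the helper names are my own, and it assumes a local Ollama with the llama2 model already pulled.

```python
import json
from urllib import request

# Ollama's default local endpoint when the container maps port 11434.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_payload(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the JSON body that Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": stream}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama and return the response text."""
    body = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        # With stream=False, Ollama returns one JSON object with a "response" key.
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama with the model pulled):
# print(generate("llama2", "Please describe the country Denmark in 50 words."))
```

    The same request shape should carry over to a Drupal backend: the module would only need to POST that payload and read the `response` field.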

  • 🇹🇷Turkey orkut murat yılmaz Istanbul

    This weekend I'll have some more time, and I'll publish the first release, I guess. We may start planning this feature next week. Thanks for the reminder :)

  • 🇩🇰Denmark ressa Copenhagen

    Sounds fantastic @Orkut Murat Yılmaz, I look forward to trying it, and will give it a go, as soon as it is available. Have a nice weekend :)

  • 🇱🇹Lithuania mindaugasd

    Hi,
    I just received an issue from a person asking for Ollama support, so I am referencing it here: 💬 Ollama Fixed

    @Orkut Murat Yılmaz, also have a look at this issue Collaboration with existing projects Active

    Making integrations with many different modules can be quite impractical, so it would be great to integrate the AI chat user interface with the LLM provider only, and this LLM provider would provide all the LLMs, including Ollama.

    Like this:
    Ollama -> LLM provider -> AI Chat user interface.

    So Ollama would integrate with the LLM provider only, which simplifies all the integrations for you as well: because the LLM provider could also integrate with Augmentor and Interpolator, you would only need to do one LLM provider integration.

    But we can continue to explore having a direct (temporary) integration as well, in case we need one, because the LLM provider is not ready as of now.

    Also, I recently released quite big updates for the AI chat and AI prompt modules; you can find the changes in the release notes of both modules here:

  • 🇱🇹Lithuania mindaugasd

    Related parent issue: Rebase on AI module's abstraction layer (Active)
    When that issue is completed ✅, this issue will be automatically completed as well ✅
