Created on 22 April 2024, 7 months ago
Updated 6 May 2024, 6 months ago

Hello,

It would be very convenient to have these models available on Ollama, so that we could install them on our local laptops and interact with the model directly.

Is this possible?

Thank you.

💬 Support request
Status: Fixed
Version: 1.0
Component: Code
Created by: 🇨🇭Switzerland sir_squall


Comments & Activities

  • Issue created by @sir_squall
  • Status changed to Fixed 7 months ago
  • 🇱🇹Lithuania mindaugasd

    Hi,
    The AI developer assistant does not communicate with AIs directly; it will do so through other modules. The AI chat user interface, on the other hand, does interact with AIs.

    First item of the aichat module roadmap (available on module description) is:

    • "1. Integrate more AIs and APIs (issues 1, 2, 3, 4, 5)"

    We discuss Ollama support in the first issue of the "LLM provider" module, here: Collaboration with existing projects (Active)

    The "AI chat user interface" should connect to the "LLM provider", which in turn should connect to the https://www.drupal.org/project/ollama module.
    That way you could have Ollama support.

    We also have the issue AI chat user interface module support (Active) on the Ollama module, but I will update it to note that we plan to integrate Ollama through "LLM provider".

    The Ollama module is also in development and not ready yet, so I would encourage you to contact the Ollama module maintainer and collaborate on making it happen.

  • Automatically closed - issue fixed for 2 weeks with no activity.
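
For context on what "Ollama support" would mean in practice: a locally running Ollama instance exposes a small REST API, so a connecting module would typically just make an HTTP call to it. A minimal sketch in Python (the helper name is hypothetical; the endpoint and payload follow Ollama's public `/api/generate` API, which listens on port 11434 by default):

```python
import json
import urllib.request

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a request for a local Ollama instance.

    Hypothetical helper for illustration; endpoint and payload shape
    follow Ollama's documented /api/generate call.
    """
    payload = json.dumps({
        "model": model,    # e.g. "llama3"
        "prompt": prompt,
        "stream": False,   # ask for one JSON response instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("llama3", "Hello!")
print(req.full_url)  # http://localhost:11434/api/generate
```

Sending the request with `urllib.request.urlopen(req)` would return the model's completion as JSON, assuming an Ollama server is actually running locally.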
