Unable to configure ollama on fresh D11 site

Created on 29 April 2025

Problem/Motivation

I'm using DDEV and am unable to configure the provider to use a model I've installed in the ollama container.

When I try to configure it via /admin/config/ai/providers/ollama, I get the following error:

Failed to get models from Ollama: Client error: `GET http://host.docker.internal:11434/api/tags` resulted in a `404 Not Found` response: 404 page not found

This happens regardless of whether I set the Host Name to "http://ollama" or "http://host.docker.internal" (both with port 11434).

From the command line, if I run "ddev exec curl host.docker.internal:11434", I get "404 page not found".

Oddly, though, I can use the API Explorer against the installed model and it works fine.
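
For reference, here is a sketch of the checks from inside the web container (assuming the "ollama" service name from the docker-compose file; /api/tags is the endpoint the provider settings form calls to list models):

  # A running Ollama instance answers its root endpoint with "Ollama is running"
  ddev exec curl http://ollama:11434
  # List installed models; this is the request that returns 404 via host.docker.internal
  ddev exec curl http://ollama:11434/api/tags
  ddev exec curl http://host.docker.internal:11434/api/tags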

Steps to reproduce

  • Fresh Drupal 11 site.
  • Dev versions of both AI and AI Provider Ollama modules.
  • Using the docker-compose.ollama.yaml file from the AI docs folder (see the sketch after this list).
  • Install the gemma3:1b model via "ollama run gemma3:1b"
  • Configure the provider with either "http://ollama" or "http://host.docker.internal" with port 11434.
  • I can successfully use /admin/config/ai/explorers/chat_generator to get a valid response.
  • When I try to use the "AI Content Suggestions" module's "AI Suggestions" form widget settings (to suggest a title), I get "There was an error obtaining a response from the LLM." in the UI, and the following in Recent log messages: "Error invoking model response: POST predict: Post "http://127.0.0.1:43741/completion": EOF" (I have no idea where 43741 is coming from.)
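
For context, here is a minimal sketch of the kind of .ddev/docker-compose.ollama.yaml file in use (not the exact file from the AI docs folder; the service name, volume, and port handling are assumptions based on typical DDEV add-on conventions):

  services:
    ollama:
      image: ollama/ollama:latest
      container_name: ddev-${DDEV_SITENAME}-ollama
      labels:
        com.ddev.site-name: ${DDEV_SITENAME}
        com.ddev.approot: ${DDEV_APPROOT}
      expose:
        - "11434"
      volumes:
        - ollama:/root/.ollama
  volumes:
    ollama:

With a setup like this, the web container reaches the service at http://ollama:11434, while http://host.docker.internal points at the host machine and only works if the port is also published there.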

-mike

πŸ› Bug report
Status

Active

Version

1.1

Component

Code

Created by

πŸ‡ΊπŸ‡ΈUnited States ultimike Florida, USA
