Port not taken into account when configuring ai_provider_ollama

Created on 1 August 2025

I have an ollama instance running on an extra pc which is available at http://192.168.0.20:11434 from within my network.

I install the ai module and ai_provider_ollama on a fresh Drupal 11, enable them, and try to set up Ollama.

On admin/config/ai/providers/ollama I enter http://192.168.0.20 as Hostname and 11434 as Port. When I press "Save configuration" I get the following error message:

Failed to get models from Ollama: cURL error 28: Connection timed out after 5001 milliseconds (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for http://192.168.0.20/api/tags

Notice that the port does not appear in the URL in the error message. I then changed the Hostname to http://192.168.0.20:11434 and everything works.

Conclusion: the port number is not used in the Ollama configuration. Either drop the Port field and require the port to be part of the Hostname, or construct the URL properly from both fields. Preference for the latter; a sketch of it follows below.
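To illustrate the preferred fix, here is a minimal, hypothetical sketch (the function name and details are mine, not the module's actual code) of how the base URL could be built from the Hostname and Port fields so the port is no longer dropped:

    // Hypothetical helper, assuming the provider receives the two form
    // values separately; names are illustrative only.
    function build_ollama_base_url(string $host, ?string $port): string {
      // Drop any trailing slash from the configured hostname.
      $host = rtrim(trim($host), '/');
      // Append the port only when one is set and the hostname does not
      // already contain one; 80 and 443 can stay implicit in the URL.
      if (!empty($port) && !preg_match('/:\d+$/', $host) && !in_array($port, ['80', '443'], TRUE)) {
        $host .= ':' . $port;
      }
      return $host;
    }

    // build_ollama_base_url('http://192.168.0.20', '11434')
    // returns 'http://192.168.0.20:11434', which the /api/tags request would then use.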

🐛 Bug report
Status

Active

Version

1.1

Component

Code

Created by


Comments & Activities

  • Issue created by @sboden
  • Found it. It looks like "11434" is prefilled, but of course it is not (it's just a placeholder).

    I would remove the placeholder "11434" from the text field and add to the field description: "The port number for the API. Ollama normally uses 11434. The port can be left empty if it is 80 or 443."

    Or, alternatively, actually prefill the Port field with "11434"; a sketch of that option follows below.
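    As an illustration (a minimal sketch, assuming the settings form uses the Drupal Form API inside a ConfigFormBase::buildForm(); the field name and config key are mine, not necessarily the module's), the difference between a placeholder and an actual prefill would look like this:

        // Hypothetical Port element in the provider's settings form.
        $form['port'] = [
          '#type' => 'number',
          '#title' => $this->t('Port'),
          // '#placeholder' => 11434 would only show grey hint text and submit an empty value.
          // '#default_value' actually prefills the field with 11434.
          '#default_value' => $this->config('ai_provider_ollama.settings')->get('port') ?: 11434,
          '#description' => $this->t('The port number for the API. Ollama normally uses 11434. The port can be left empty if it is 80 or 443.'),
        ];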
