White screen, Failed to get models from Ollama: cURL error 28: Connection timed out after

Created on 27 October 2024

Problem/Motivation

Hi,

After installing the latest dev release and going to edit an ECA action (not related to AI in any way),
the site hangs with a white screen, and the logs show the following (Ollama was not running at the time):

Oct 27 04:38:57 R6800HS my_site: https://local.my_site2.net|1729996737|d_log_cat:provider_ollama|192.168.50.45|https://local.my_site2.net/admin/config/workflow/eca/process_vnelwbe/edit|https://local.my_site2.net/admin/config/workflow/eca|1||Failed to get models from Ollama: cURL error 28: Connection timed out after 120001 milliseconds (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for http://192.168.1.254:11434/api/tags

When Ollama is started on the other server (ollama run llama3), the site comes back online.
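
A quick way to confirm that this is purely a reachability/timeout issue is to probe the same endpoint from the log line with short timeouts. This is a minimal standalone sketch, assuming Guzzle is available (it ships with Drupal core); it is not part of the module:

```php
<?php
// Hypothetical probe script: check whether the Ollama endpoint from the log
// above is reachable, using short timeouts so the check fails fast instead of
// hanging for the full ~120 seconds.
require 'vendor/autoload.php';

$client = new \GuzzleHttp\Client();
try {
  $response = $client->get('http://192.168.1.254:11434/api/tags', [
    'connect_timeout' => 2,
    'timeout' => 5,
  ]);
  echo "Ollama reachable:\n" . $response->getBody() . "\n";
}
catch (\GuzzleHttp\Exception\GuzzleException $e) {
  echo "Ollama not reachable: " . $e->getMessage() . "\n";
}
```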

Steps to reproduce

Proposed resolution

Remaining tasks

User interface changes

API changes

Data model changes

🐛 Bug report
Status

Active

Version

1.0

Component

Other Submodules

Created by

🇫🇮Finland anaconda777

Comments & Activities

  • Issue created by @anaconda777
  • 🇩🇪Germany marcus_johansson

    I think what we will have to do in this case is use a lower timeout for the get-models call. The problem is that ECA has to see that the provider is working and what models/operation types it can trigger, and part of that is loading the models.

    So for loading the models we should make the timeout a separate configuration value that is much shorter (< 5 seconds), since such a call should complete almost instantaneously; see the sketch after this comment.

    We are moving the providers out, so I will add this as a bug report there.
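
    A minimal sketch of that idea, assuming a Guzzle HTTP client and a hypothetical models_timeout configuration key (the function and key names below are illustrative, not the provider's actual API):

```php
<?php
// Hypothetical sketch of the proposed fix: a separate, much shorter timeout
// for the "list models" call. The 'models_timeout' key and function name are
// illustrative, not the provider's actual API.
use GuzzleHttp\Client;
use GuzzleHttp\Exception\GuzzleException;

function ollama_get_models(Client $client, string $host, array $config): array {
  // Dedicated short timeout (default 5 s) for model discovery, independent of
  // the long timeout used for generation requests.
  $timeout = $config['models_timeout'] ?? 5;
  try {
    $response = $client->get(rtrim($host, '/') . '/api/tags', [
      'connect_timeout' => 2,
      'timeout' => $timeout,
    ]);
    $data = json_decode((string) $response->getBody(), TRUE);
    return $data['models'] ?? [];
  }
  catch (GuzzleException $e) {
    // Fail fast with an empty list instead of hanging pages (such as the ECA
    // edit form) for the full request timeout when Ollama is unreachable.
    return [];
  }
}
```

    The point is that model discovery fails fast and returns an empty list rather than blocking the page for the full 120-second request timeout reported in the log.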
