Support multiple models

Created on 25 July 2025

Problem/Motivation

Support multiple servers.

vLLM cannot run multiple models in the same server instance, so supporting multiple
models requires the ability to define multiple vLLM servers.

Does Drupal AI even support this? If it does, the provider configuration should gain
an additional field pair per server: one for the base URL and one for the model name.
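
To illustrate the constraint: each model would run as its own vLLM server process on
its own port, and the client routes requests by model name. The sketch below is only
illustrative, not the module's implementation; the base URLs, ports, and model names
are hypothetical, and it assumes each server exposes vLLM's OpenAI-compatible
/v1/chat/completions endpoint.

import requests

# Hypothetical mapping of model name -> base URL. In the Drupal provider this
# would come from configuration (the proposed base-URL + model-name fields).
# Each entry corresponds to a separate vLLM server process, e.g. started with:
#   vllm serve mistralai/Mistral-7B-Instruct-v0.2 --port 8000
#   vllm serve meta-llama/Llama-3.1-8B-Instruct --port 8001
SERVERS = {
    "mistralai/Mistral-7B-Instruct-v0.2": "http://localhost:8000",
    "meta-llama/Llama-3.1-8B-Instruct": "http://localhost:8001",
}

def chat(model: str, prompt: str) -> str:
    """Route a chat completion to the vLLM server that hosts `model`."""
    base_url = SERVERS[model]  # one server per model
    response = requests.post(
        f"{base_url}/v1/chat/completions",
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("mistralai/Mistral-7B-Instruct-v0.2", "Hello!"))

The SERVERS mapping is exactly the data the proposed extra fields would capture: each
configured server pairs a base URL with the single model it serves.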

Steps to reproduce

Proposed resolution

Remaining tasks

User interface changes

API changes

Data model changes

Category

Feature request

Status

Active

Version

1.0

Component

Code

Created by

🇫🇮 Finland anaconda777
