- Issue created by @anaconda777
Support multiple servers.

A single vLLM server can only serve one model at a time, so running several models requires defining multiple vLLM servers. Does Drupal AI even support this? There should be a separate field for the base URL and for the model name, so each configured server can point at its own endpoint and model.
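For context, a minimal sketch of the deployment scenario this request describes: two vLLM instances, each serving one model on its own port, which is why a per-server base URL field is needed. The model names and ports here are just illustrative assumptions, not part of the original issue.

```shell
# Hypothetical setup: one model per vLLM server, on separate ports.
# Drupal AI would then need a distinct base URL per configured server,
# e.g. http://localhost:8000/v1 and http://localhost:8001/v1.
vllm serve meta-llama/Llama-3.1-8B-Instruct --port 8000 &
vllm serve mistralai/Mistral-7B-Instruct-v0.3 --port 8001 &
```

Each instance exposes its own OpenAI-compatible endpoint, so a single hard-coded base URL in the module cannot reach both models.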
Status: Active
Version: 1.0
Component: Code