- Issue created by @olegiv
- 🇱🇹 Lithuania mindaugasd
Hi @olegiv,
based on "Steps to reproduce", the proxy should already work if it is configured.
You observe that the setup fails because the test request is not routed through the configured proxy.
How did you configure it?
Here I pushed a code change that plugged the Drupal HTTP client into the OpenAI library.
- https://www.drupal.org/project/ai/issues/3453589#comment-15641623 (OpenAI LLM Provider, Active)
- https://git.drupalcode.org/project/ai/-/merge_requests/1/diffs?commit_id...
So that middleware can be used, which makes it possible to change or log anything about the HTTP request.
So with OpenAI you can already set up a proxy by writing some middleware code:
* https://docs.guzzlephp.org/en/stable/request-options.html#proxy
Or set it up some other way (depending on how you did it); maybe some modules exist for that?
BUT for all other providers (Anthropic etc.) you have to study, case by case, how the request is made in code, and if it does not use the Drupal HTTP client, change it to use it if possible (maybe push a merge request like I did). Let me know what you find.
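For reference, a minimal sketch of what the Guzzle `proxy` request option looks like when applied to a whole client (the proxy hostnames below are placeholders, not values from this issue):

```php
<?php
// Sketch: routing all requests from a Guzzle client through a proxy.
// See the "proxy" request option in the Guzzle docs linked above.
use GuzzleHttp\Client;

$client = new Client([
  'proxy' => [
    'http'  => 'http://proxy.example.com:3128', // proxy for plain-HTTP requests
    'https' => 'http://proxy.example.com:3128', // proxy for HTTPS requests
    'no'    => ['localhost', '.internal'],      // hosts that bypass the proxy
  ],
]);

// Every request made with this client is now sent via the proxy.
$response = $client->get('https://api.openai.com/v1/models');
```

The same array can also be passed per request as a request option instead of at client construction time.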
- 🇱🇹 Lithuania mindaugasd
If you really did configure the proxy, another thing to try: turn off the streaming feature.
Streaming may not be supported by the proxy.
It is a limitation of Guzzle and/or PHP.
- 🇱🇹 Lithuania mindaugasd
By default Guzzle uses https://curl.se/, but for streaming it uses PHP's native stream wrappers https://www.php.net/manual/en/wrappers.php, which have limited functionality; as for proxied requests, AI just told me they are not supported there.
Example issue on GitHub: https://github.com/guzzle/guzzle/issues/2616
- 🇱🇹 Lithuania mindaugasd
Or maybe it does support it; proxy is listed as a configuration option at https://www.php.net/manual/en/context.http.php
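Indeed, the PHP HTTP stream wrapper accepts a proxy in its context options, so streamed requests are not necessarily unproxyable. A minimal sketch (the proxy address is a placeholder):

```php
<?php
// Sketch: proxying a request made through PHP's native HTTP stream wrapper,
// using the context options documented at php.net/manual/en/context.http.php.
$context = stream_context_create([
  'http' => [
    'proxy' => 'tcp://proxy.example.com:3128', // stream-wrapper proxy syntax
    'request_fulluri' => true,                 // many HTTP proxies require the full URI
  ],
]);

// file_get_contents() over HTTP now goes through the proxy.
$body = file_get_contents('http://example.com/', false, $context);
```

Whether Guzzle's stream handler forwards its `proxy` request option into this context is the question the linked GitHub issue is about.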
- 🇩🇪 Germany marcus_johansson
Is this issue still of interest? I would move it to the specific provider. The OpenAI provider has a host variable now, so you can set a custom host. Otherwise, I will close this on the 4th of February.
- 🇧🇪 Belgium wouters_f Leuven
Any outgoing request using \Drupal::httpClient() will normally take the proxy settings into account.
So if we are using Guzzle directly and not the Drupal client, your issue might indeed be true. We should use the Drupal wrapper for every outgoing request in order to have everything work over a proxy.
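Those proxy settings are the ones Drupal reads from settings.php and merges into every client built by the http_client_factory service. A minimal settings.php fragment (addresses are placeholders):

```php
<?php
// settings.php fragment: values under 'http_client_config' are merged into
// every Guzzle client that Drupal's http_client_factory constructs,
// including the one returned by \Drupal::httpClient().
$settings['http_client_config']['proxy'] = [
  'http'  => 'http://proxy.example.com:3128', // proxy for HTTP requests
  'https' => 'http://proxy.example.com:3128', // proxy for HTTPS requests
  'no'    => ['localhost', '127.0.0.1'],      // hosts that bypass the proxy
];
```

This is why requests that go around the Drupal client factory silently lose the proxy.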
- 🇧🇪 Belgium wouters_f Leuven
If we use provider modules and we want them to support the proxy, we'll indeed need to add a proxy to the providers if they are using something else.
- 🇺🇸 United States kevinquillen
So if we are using guzzle directly and not the drupal client your issue might indeed be true.
If I read this right, specifically with OpenAI, the OpenAI client library uses Guzzle and not the one from \Drupal::httpClient, which would preconfigure it properly.
https://github.com/openai-php/client/blob/main/src/Factory.php#L180
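The openai-php/client factory only builds its own bare Guzzle client when none is supplied; handing it a preconfigured client overrides that. A sketch, assuming `$drupalClient` is a client from Drupal's http_client_factory (and therefore already proxy-aware) and `$apiKey` is set elsewhere:

```php
<?php
// Sketch: passing a preconfigured HTTP client to the openai-php/client
// factory so it does not construct its own default Guzzle client.
// $drupalClient and $apiKey are assumed to be provided by the caller.
$client = \OpenAI::factory()
  ->withApiKey($apiKey)
  ->withHttpClient($drupalClient) // our client, with proxy settings applied
  ->make();
```

This is exactly the hand-off the provider's base class performs, which is why the injected client's options matter.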
But the `AiProviderClientBase` class is injecting an instance of a client from `http_client_factory`, with options from the provider configuration:

```php
public static function create(ContainerInterface $container, array $configuration, $plugin_id, $plugin_definition) {
  $client_options = $configuration['http_client_options'] ?? [];
  return new static(
    $plugin_id,
    $plugin_definition,
    $container->get('http_client_factory')->fromOptions($client_options + [
      'timeout' => 60,
    ]),
```
Which is then passed to the OpenAI wrapper to use as the Client:
https://git.drupalcode.org/project/ai_provider_openai/-/blob/1.1.x/src/P...
So I think the AI module(s) are doing the right thing here. One question, then: where is this configuration set, or does more need to be added to the provider to pass it along?
`$client_options = $configuration['http_client_options'] ?? [];`
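If that array is read from the provider configuration, a proxy could in principle be passed through it. A hypothetical configuration fragment (the `proxy` value is a placeholder; the only key the source confirms is `http_client_options`, and anything under it goes to `fromOptions()`, which accepts Guzzle request options):

```php
<?php
// Hypothetical provider configuration fragment: everything under
// 'http_client_options' is handed to http_client_factory->fromOptions(),
// so Guzzle's 'proxy' option should flow through to the injected client.
$configuration['http_client_options'] = [
  'proxy'   => 'http://proxy.example.com:3128', // placeholder proxy address
  'timeout' => 60,
];
```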
- 🇺🇸 United States kevinquillen
I found an area where OpenAI\Client is used directly:
https://git.drupalcode.org/project/ai_provider_openai/-/blob/1.1.x/src/F...
- 🇬🇧 United Kingdom MrDaleSmith
That would need to be an issue in the OpenAI provider's queue.
- 🇬🇧 United Kingdom MrDaleSmith
That would be https://www.drupal.org/project/ai_provider_openai; the providers are no longer part of the AI core module.