Zurich
Account created on 3 January 2013, almost 13 years ago

Recent comments

🇨🇭Switzerland dan2k3k4 Zurich

Seems good to me, merging

🇨🇭Switzerland dan2k3k4 Zurich

Agree with @b_sharpe - it would make sense to move provider code out of AI Core and use another smaller module for the "OpenAI SDK / base provider" which all other "OpenAI-like" providers can extend and depend on.

🇨🇭Switzerland dan2k3k4 Zurich

Added the "Try Drupal AI experiences" to this issue

Working on the repos / initial databases for them

🇨🇭Switzerland dan2k3k4 Zurich

Closing in favour of #3540794

https://www.drupal.org/node/3540794

🇨🇭Switzerland dan2k3k4 Zurich

The code looks good to me.

Asked in Slack for testing instructions:

  1. Install AI Agents and AI Agents Explorer as well.
  2. Go to the AI Test provider: /admin/config/ai/providers/ai-test
  3. Check both checkboxes and save.
  4. Visit the Agents Explorer for the Field Agent: /admin/config/ai/agents/explore?agent_id=field_agent_triage
  5. Try a prompt against some provider like Amazee, e.g. "What is the max resolution on the image fields on the article?"
  6. It will now do some turns, making several calls using your provider.
  7. Now go to the mock provider results: /admin/content/ai-mock-provider-result
  8. Click edit on each of the saved responses, and check "Mock Enabled". For fun, you can also set the sleep time to 1. Save them.
  9. Go back to the explorer page, but pick EchoAI - GPT Test.
  10. You should now see exactly the same result as before - and much faster if you changed the sleep time.

Step #10 uses the changes here to replicate the results from an array.

Works as expected for me.

🇨🇭Switzerland dan2k3k4 Zurich

Thanks @vasi1186

I've created an MR from your patch:
https://git.drupalcode.org/project/ai_provider_amazeeio/-/merge_requests/20

Will test/review shortly

🇨🇭Switzerland dan2k3k4 Zurich

Pushed a commit on dev that should fix the issue

🇨🇭Switzerland dan2k3k4 Zurich

dan2k3k4 created an issue.

🇨🇭Switzerland dan2k3k4 Zurich

dan2k3k4 created an issue.

🇨🇭Switzerland dan2k3k4 Zurich

dan2k3k4 created an issue.

🇨🇭Switzerland dan2k3k4 Zurich

dan2k3k4 created an issue.

🇨🇭Switzerland dan2k3k4 Zurich

dan2k3k4 created an issue.

🇨🇭Switzerland dan2k3k4 Zurich

dan2k3k4 created an issue.

🇨🇭Switzerland dan2k3k4 Zurich

dan2k3k4 created an issue.

🇨🇭Switzerland dan2k3k4 Zurich

dan2k3k4 created an issue.

🇨🇭Switzerland dan2k3k4 Zurich

Thanks for reporting, having a look into this now!

🇨🇭Switzerland dan2k3k4 Zurich

RTBC / unassigned

Marcus plans to test this too

🇨🇭Switzerland dan2k3k4 Zurich

Also tested with the Extended Logger as per #10.

Downloaded and enabled Extended Logger (specifically https://www.drupal.org/project/extended_logger/releases/1.3.0-beta2).

Also enabled extended_logger_db.

Tested the provider again with the AI Chat Explorer.

The output on the Extended Logger DB page at /admin/reports/extended-logs looks like this (for the call and response):

Seems to be working as expected.

🇨🇭Switzerland dan2k3k4 Zurich

Based on #10, I created a new Drupal project, added the MR from this issue, and added the amazee.ai provider.

I installed the provider, along with all AI-related modules and dblog, and also enabled the ai_observability module from this MR.

After enabling, I played around with the Chat Explorer, and these actions get logged to dblog like so:

This seems to work as expected.

I will test the "Extended Logger" version shortly.

🇨🇭Switzerland dan2k3k4 Zurich

I did the full review in #16 but forgot to RTBC.

As for the quote in #17, it would be a feature to add to the amazee.ai provider, though it could potentially be ported to other providers or to AI core. However, it can wait for 1.3.x or later.

🇨🇭Switzerland dan2k3k4 Zurich

Hmm, I'm actively maintaining the AI-related modules for amazee.ai / LiteLLM and accompanying recipes. Recipes don't have any PHP code though.
- https://www.drupal.org/project/ai_provider_amazeeio

Recently merged some open MRs into these modules:
- https://www.drupal.org/project/graphql_extras
- https://www.drupal.org/project/npm

Created this old module based on a similar D8 module:
- https://www.drupal.org/project/responsive_image_formatter_link_to_image_...

🇨🇭Switzerland dan2k3k4 Zurich

dan2k3k4 created an issue.

🇨🇭Switzerland dan2k3k4 Zurich

I quite like Vercel's suggestion to use a new script tag with a new type in the header of a page, so that it doesn't impact client browsers.

https://vercel.com/blog/a-proposal-for-inline-llm-instructions-in-html

It would allow fine-tuning for LLMs per page, instead of one global llms.txt file.
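As a rough sketch of the idea (the `text/llms.txt` type value is my recollection of the blog post, and the instruction text is purely illustrative): browsers skip script tags with unknown types, so the content never executes or renders, while LLM crawlers can still read per-page instructions.

```html
<head>
  <!-- Unknown script type: browsers ignore it, so no impact on rendering. -->
  <!-- The type value and instruction text here are illustrative only. -->
  <script type="text/llms.txt">
    This page documents the AI provider settings form.
    Prefer the steps under "Configuration" when summarizing.
  </script>
</head>
```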

🇨🇭Switzerland dan2k3k4 Zurich

Having watched the video in #21 by @b_sharpe, it makes it quite clear and easy to set up multiple Automators. I am not sure a non-technical person will easily understand the UI/UX behind it, but I imagine they would receive some training beforehand to better understand the setup.

If we can get this in for 1.2.x, wouldn't that make the demos for DrupalCon Vienna look better for marketing purposes?

🇨🇭Switzerland dan2k3k4 Zurich

Rewrote the provider class a bit, removing OpenAI-related code, as we don't need to depend on that since the main base OpenAI provider is in core.

1.2.x works again, models show up and the API explorer worked for me.

I pushed a new RC release just to have a working version.

The code could be refactored a bit, but I needed a working solution in time for the Drupal AI hackathon.

🇨🇭Switzerland dan2k3k4 Zurich

Approved the MR - left one comment about fixing "Th" to "The".

Tested; the warning shows up like this on the LiteLLM provider settings page:

And once the OpenAI provider was configured with a key, the warning no longer showed up, as expected.

🇨🇭Switzerland dan2k3k4 Zurich

Tested this with a new Drupal site.

I used this issue's MR for the AI module, grabbed the ai_provider_amazeeio_recipe, then edited the recipe.yml to include a check for the provider.

Test Steps

# drop the DB
drush sql-drop -y

# install drupal based on config, note none of the ai modules/providers are in core.extension.yml
drush si --existing-config -y

# test the recipe
drush recipe ../recipes/ai_provider_amazeeio_recipe

For a failing test, I checked the case where the recipe.yml expects openai to be set up:
(note: minified recipe.yml; config.actions has other entries, not just ai.settings)

config:
  actions:
    ai.settings:
      verifySetupAi:
        provider_is_setup:
          - openai

For a passing test, I checked the case where the recipe.yml expects amazeeio to be set up:

config:
  actions:
    ai.settings:
      verifySetupAi:
        provider_is_setup:
          - amazeeio

🇨🇭Switzerland dan2k3k4 Zurich

I read through the MR. I like the idea and agree that we need a way to check that a provider is set up and/or that the different models are set up.

One thing we're considering for the amazee.ai provider is a hidden feature flag, something like "redirect_to_auth_if_not_setup", which would do as described: redirect the user (after checking they have access to configure the provider) to the provider settings page after login.

In the sense that if you install the module via a recipe and then boot up the site, you would get redirected (as admin) to the provider settings page to log in / authenticate the provider.

I will try to test the MR with a modified recipe (of the ai_provider_amazeeio_recipe) to test out how it looks and feels.
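To make the idea concrete, a hypothetical settings fragment for the flag might look like this (the key name is the proposal above; everything else is illustrative and not the provider's real schema):

```yaml
# Illustrative sketch of ai_provider_amazeeio.settings.yml (not the real schema).
# Hidden feature flag: when true and the provider is not yet authenticated,
# redirect users who may configure the provider to its settings page on login.
redirect_to_auth_if_not_setup: true
```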

🇨🇭Switzerland dan2k3k4 Zurich

@Marcus, should this be "Needs review" or RTBC? It was "Active" when I tested it, but I wasn't sure if you were planning further changes to the MR.

🇨🇭Switzerland dan2k3k4 Zurich

dan2k3k4 made their first commit to this issue’s fork.

🇨🇭Switzerland dan2k3k4 Zurich

Ported the VDB provider code into the amazee.ai provider module.

Released these fully stable versions for these providers/recipe:

  • ai_provider_amazeeio versions 1.0.1 and 1.1.1
  • ai_provider_litellm versions 1.0.0 and 1.1.0
  • ai_provider_amazeeio_recipe versions 1.0.2 and 1.1.0

Marking this as fixed now.

🇨🇭Switzerland dan2k3k4 Zurich

Code looks good to me. Works as expected.

RTBC

🇨🇭Switzerland dan2k3k4 Zurich

Thanks, merged MR and credited everyone using the new Contribution Record system
https://new.drupal.org/contribution-record/11415156

🇨🇭Switzerland dan2k3k4 Zurich

dan2k3k4 made their first commit to this issue’s fork.

🇨🇭Switzerland dan2k3k4 Zurich

Marcus asked me to test this alongside https://www.drupal.org/project/ai_provider_dreamstudio/issues/3531285 (Add ImageToImage capabilities).

Marking this as RTBC.

Testing Steps

Set up a new local Drupal, then added modules with source via:

composer require 'drupal/ai:1.2.x-dev' -W --prefer-source
composer require 'drupal/ai_provider_dreamstudio:1.0.x-dev' -W --prefer-source

Enabled modules via (AI gets enabled as it's a dependency):

drush en -y ai_provider_dreamstudio
drush en -y ai_api_explorer

Added a new Key with my API key from DreamStudio / Stability AI.
Set up the provider for Image-to-Image.

Went to the Image-to-Image Explorer page at: /admin/config/ai/explorers/image_to_image_generator

I tried a few options.

The Outpaint does not seem to generate a new image.
The Erase sort of works; at least the image is edited, but the combination of the image and the mask image might have produced the weird overlay effect.
The Remove Background worked as expected.

One thing I noticed: when I first loaded the explorer, I think I clicked on Upscale, but the "prompt" field was not showing up on first load, so when I clicked generate I got this error (I think it is unrelated to this provider issue):
Error invoking model response: The Dreamstudio API returned an error: {"errors":["prompt: string must contain at least 1 character(s)"],"id":"f470c1eeea8ba89398fc93769c9adfaa","name":"bad_request"} with code 400

🇨🇭Switzerland dan2k3k4 Zurich

Marcus asked me to test this alongside https://www.drupal.org/project/ai/issues/3531212 (Create Image-To-Image operation type).

Testing Steps

Set up a new local Drupal, then added modules with source via:

composer require 'drupal/ai:1.2.x-dev' -W --prefer-source
composer require 'drupal/ai_provider_dreamstudio:1.0.x-dev' -W --prefer-source

Enabled modules via (AI gets enabled as it's a dependency):

drush en -y ai_provider_dreamstudio
drush en -y ai_api_explorer

Added a new Key with my API key from DreamStudio / Stability AI.
Set up the provider for Image-to-Image.

Went to the Image-to-Image Explorer page at: /admin/config/ai/explorers/image_to_image_generator

I tried a few options.

The Outpaint does not seem to generate a new image.
The Erase sort of works; at least the image is edited, but the combination of the image and the mask image might have produced the weird overlay effect.
The Remove Background worked as expected.

One thing I noticed: when I first loaded the explorer, I think I clicked on Upscale, but the "prompt" field was not showing up on first load, so when I clicked generate I got this error (I think it is unrelated to this provider issue):
Error invoking model response: The Dreamstudio API returned an error: {"errors":["prompt: string must contain at least 1 character(s)"],"id":"f470c1eeea8ba89398fc93769c9adfaa","name":"bad_request"} with code 400

🇨🇭Switzerland dan2k3k4 Zurich

Will take a look into testing this today.

It also requires this issue/MR from the ai_provider_dreamstudio module:
https://www.drupal.org/project/ai_provider_dreamstudio/issues/3531285 (Add ImageToImage capabilities)

🇨🇭Switzerland dan2k3k4 Zurich

Tested locally using a key for amazee.ai as the LiteLLM provider.

Works as expected, able to select models and run chat generation.

🇨🇭Switzerland dan2k3k4 Zurich

I believe we need to push a 1.2.x branch and target 1.2.x of the AI module, the same as the other provider modules.

🇨🇭Switzerland dan2k3k4 Zurich

Unsure if this is because I'm testing with the latest 1.2.x version of the AI module (SHA: 302ba82a8d89).

On route `/admin/config/ai/ai-external-moderation`

TypeError: set_error_handler(): Argument #1 ($callback) must be a valid callback or null, class Drupal\ai_provider_mistral\Plugin\AiProvider\MistralProvider does not have a method "errorCatcher" in set_error_handler() (line 35 of modules/contrib/ai_provider_mistral/src/Plugin/AiProvider/MistralProvider.php).

Also, on the default settings route `/admin/config/ai/settings`, I have no models listed for Mistral AI.

🇨🇭Switzerland dan2k3k4 Zurich

It appears to work; at least, if I use the amazee.ai endpoint as the LiteLLM endpoint and use a key from there, I can see LiteLLM as an available provider.

However, I don't see models, but I think that's an issue on the amazee.ai provider (API) side and not the LiteLLM module (as I can reproduce the issue with the amazee.ai provider module).

We may need to bump this to a 1.2.x branch, as with ai_provider_openai.

🇨🇭Switzerland dan2k3k4 Zurich

Works locally for me.

The chat() method that @admitriiev mentioned can be reworked at a later stage if needed.

🇨🇭Switzerland dan2k3k4 Zurich

@andrewbelcher perhaps we should default to FALSE, and providers should explicitly pass TRUE if they support it?


🇨🇭Switzerland dan2k3k4 Zurich

We've opted into Security Coverage, and I've pushed up a stable 1.0.0.

I'm in touch with the maintainer of ai_vdb_provider_postgres to push forward with getting a stable release for that for 1.1.x / 1.2.x.

I think we can push up a stable 1.1.x too.

🇨🇭Switzerland dan2k3k4 Zurich

Tests passed, merged the MR, and backported to 1.1.x too.

🇨🇭Switzerland dan2k3k4 Zurich

The team_id should work now; however, the MR fails on phpunit tests, which seems unrelated to the code.

🇨🇭Switzerland dan2k3k4 Zurich

dan2k3k4 made their first commit to this issue’s fork.
