wouters_f → created an issue.
wouters_f → created an issue.
wouters_f → created an issue.
Looks good!
Merged
wouters_f → made their first commit to this issue’s fork.
bramdriesen → credited wouters_f.
There is a way to skip moderations for your content:
https://www.drupal.org/project/ai/issues/3510599#comment-16028722
Allow skipping of moderations for some embeddings (controlled input)
(also updated branch)
Yes, valid feedback; added your proposed changes.
Thanks for this.
vrancje → credited wouters_f.
Quick local test seemed ok.
Oh yeah, that absolutely makes sense.
The spell-fix issue was in limbo for a long time and other changes went in first; that explains the mismatch.
Thanks for pointing this out!
I am not a maintainer; I can not merge this.
Merged
You are absolutely right and I already created a ticket for this:
https://www.drupal.org/project/ai_search_block/issues/3510772
Debounce the submit button.
I'll close the other one as your description is more complete.
Merged
wouters_f → created an issue.
Check also the related issue in openai provider.
https://www.drupal.org/project/ai_provider_openai/issues/3510601
allow skipping of moderations for some embeddings (not all)
wouters_f → created an issue.
I have seen these logs and it's absolutely better than nothing.
Even better would be a dead letter queue for these messages but that would be a search_api or core queue thing.
wouters_f → created an issue.
@stanzin: OK!
wouters_f → created an issue.
borisson_ → credited wouters_f.
wouters_f → created an issue.
Basic logging of queries and answers (and the nodes) is in there.
merged and fixed
Merged!
wouters_f → made their first commit to this issue’s fork.
Mr. Dale Smith,
I'd be curious to learn how to do that (maybe to demo for the AI workshop in davos).
Could you describe to me a little more verbosely how that should happen?
That would be awesome.
This always starts from a node right?
You can't start from a view where you select an automator and add it on top (thus, for example, summarise the data)?
@dotist if this is sufficient you may close this issue.
Relevant links to the documentation from Marcus:
https://project.pages.drupalcode.org/ai/developers/develop_third_party_module/
https://project.pages.drupalcode.org/ai/developers/call_chat/
(and pulled in 1.0.x so that we don't bump into other merge conflicts)
Resolved the comments that made sense; the hyperlink one, I think, should not happen.
I've added an example below.
You can also find working code
- here (backend)
- and here (javascript)
The (slightly simplified) example:
Backend:
$output = $provider->chat($input, $ai_model_to_use, ['my_module_tag']);
$response = $output->getNormalized();
return new StreamedResponse(function () use ($response) {
  $log_output = '';
  foreach ($response as $part) {
    $item = [];
    $item['answer_piece'] = $part->getText();
    $out = Json::encode($item);
    echo $out . '|§|';
    ob_flush();
    flush();
  }
}, 200, [
  'Cache-Control' => 'no-cache, must-revalidate',
  'Content-Type' => 'text/event-stream',
  'X-Accel-Buffering' => 'no',
]);
And in the frontend you want to catch it like so:
try {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', drupalSettings.ai_search_block.submit_url, true);
  xhr.setRequestHeader('Content-Type', 'application/json');
  xhr.setRequestHeader('Accept', 'application/json');
  // Cache variables to hold the full output.
  var lastResponseLength = 0;
  var joined = '';
  xhr.onprogress = function () {
    var responseText = xhr.responseText || '';
    // Get only the new part of the response.
    var newData = responseText.substring(lastResponseLength);
    lastResponseLength = responseText.length;
    // Split new data using the delimiter.
    var chunks = newData.trim().split('|§|').filter(Boolean);
    // Parse each chunk and accumulate the answer pieces.
    chunks.forEach(function (chunk) {
      try {
        var parsed = JSON.parse(chunk);
        joined += parsed.answer_piece || '';
      } catch (e) {
        console.error('Error parsing chunk:', e, chunk);
      }
    });
    // Overwrite the full output (letting browsers fix broken HTML)
    // and re-append the loader.
    $resultsBlock.html(joined).append($loader);
  };
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {
      if (xhr.status === 200) {
        // Remove the loader upon successful completion.
        $loader.remove();
        if ($suffixText.length) {
          $suffixText.html(drupalSettings.ai_search_block.suffix_text);
          Drupal.attachBehaviors($suffixText[0]);
          $suffixText.show();
        }
        // (Optional) If needed, update log Id from final response here.
      } else if (xhr.status === 500) {
        $resultsBlock.html('An error happened.');
        console.error('Error response:', xhr.responseText);
        try {
          var parsedError = JSON.parse(xhr.responseText);
          if (parsedError.response && parsedError.response.answer_piece) {
            $resultsBlock.html(parsedError.response.answer_piece);
          }
          Drupal.attachBehaviors($resultsBlock[0]);
        } catch (e) {
          console.error('Error parsing 500 response:', e);
        }
      }
    }
  };
  // Send the streaming request.
  xhr.send(
    JSON.stringify({
      query: queryVal,
      stream: streamVal,
      block_id: blockIdVal
    })
  );
} catch (e) {
  console.error('XHR error:', e);
}
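One caveat with the onprogress parsing above: a JSON chunk can arrive split across two progress events, in which case JSON.parse fails and that piece is silently dropped in the catch. A small buffering helper would keep the trailing partial chunk around until the rest arrives. This is just a sketch (makeChunkBuffer is a hypothetical name, not something in the module):

```javascript
// Returns a push() function that accumulates stream deltas,
// splits them on the delimiter, and only parses complete chunks.
// Any trailing partial chunk stays buffered for the next call.
function makeChunkBuffer(delimiter) {
  var buffer = '';
  return function push(newData) {
    buffer += newData;
    var parts = buffer.split(delimiter);
    // The last element is either '' (the delta ended exactly on a
    // delimiter) or an incomplete chunk; keep it for the next delta.
    buffer = parts.pop();
    return parts.filter(Boolean).map(function (part) {
      return JSON.parse(part);
    });
  };
}

// Usage: feed each onprogress delta through push().
var push = makeChunkBuffer('|§|');
var pieces = push('{"answer_piece":"Hel"}|§|{"answer_piece"');
// The first call yields only the complete chunk; the split one waits
// in the buffer until the second delta completes it.
pieces = pieces.concat(push(':"lo"}|§|'));
```

Inside onprogress you would then accumulate `answer_piece` from whatever push() returns, instead of parsing raw splits.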
I think if you use the OpenAI module you should be able to switch the backend calls pretty seamlessly.
I can imagine you don't want to rebuild the AI keys and connectors in the frontend so having them as an API seems obvious to me.
I'm not aware of any plans to expose them as generic APIs at the moment.
There are some (specific) implementations that are not bound to a vendor and work with all AI providers:
- the controller for translations in ai_translate,
- the controller for completion in ai_ckeditor,
- the controller for image generation in ai_image.
Having them as a separate module could make sense.
I'd call it ai_api, which you can then use in your frontends.
If someone built that I would consider switching those modules to use the generic one.
You could ask Marcus what the plans or ideas are, since he's in the lead at the moment.
These two are boilerplate; I don't even remember from where. They can be removed.
Merged
Manually tested. Good to go.
wouters_f → made their first commit to this issue’s fork.
Merged.
Strange that this sneaked in.
This fix looks pretty straightforward to me. Good for me.
bramdriesen → credited wouters_f.
Tested and merged!
I think after that's fixed, it looks good.
I tested it manually:
- If I disable streaming, it works.
- If I enable streaming, the server streams back the responses, but the JS doesn't seem to do anything anymore.
I've debugged it a bit and see that you now test "streamVal", which used to be
const $stream = $form.find('[data-drupal-selector="edit-stream"]').val() === 'true';
but you now test it with
if (streamVal === '1') {
So I guess that's why it stopped working.
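For what it's worth, a defensive check that accepts both the old string value and the new one would avoid this class of regression. A sketch (isStreamingEnabled is a hypothetical helper, not code from the module):

```javascript
// Hypothetical helper: treat any representation the hidden "stream"
// field has carried over time as "streaming enabled".
function isStreamingEnabled(streamVal) {
  return streamVal === true ||
    streamVal === 1 ||
    streamVal === '1' ||
    streamVal === 'true';
}
```

Then `if (isStreamingEnabled(streamVal)) { … }` works with either form of the field value.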
Hey Sirclickalot, great idea.
If you're interested I'm willing to set something up.
Maybe there's already a solution for you with automators, or with ECA and the ECA AI plugins.
With ECA you could use the submission event, and then trigger an AI node with your fancy prompt (and the assignment as input).
The output can then be put in a separate field (or wherever).
But obviously we can also make a separate module with a CKEditor plugin that triggers a prompt. So many options :D
If you are really blocked, you can try to create an ai_content folder in ai/modules and add an ai_content.info.yml with some contents:
name: AI Content
description: Placeholder.
package: AI
type: module
core_version_requirement: ^10.2 || ^11
dependencies:
- ai:ai
That will allow you to uninstall it, and after that you can remove it again.
What happened is the following.
At some point the submodule ai_content was renamed to ai_content_suggestions.
I propose to clear caches, uninstall that module (if enabled), clear caches again, and then install the ai_content_suggestions submodule.
So two changes in this one:
- All fields selected by default.
- Added some JavaScript for a cleaner interface (hide the selects and toggle on click).
Goal: editors have an easier, less cluttered experience using AI.
Added an implementation with JS.
(Everything keeps working without JS.)
wouters_f → created an issue.
I did not test it but the code looks sound.
I'm not sure: should we also check $account->hasPermission('translate any entity'), or will that happen automatically in $handler->getTranslationAccess($entity, 'create')->isAllowed()?
wouters_f → created an issue.
wouters_f → changed the visibility of the branch 1.0.x to hidden.
Fixed coding standards and used the renamed branch (1.0.x was a strange branch name there).
wouters_f → made their first commit to this issue’s fork.
If we use provider modules and we want them to support the proxy, we'll indeed need to add a proxy to the providers if they are using something else.
Any outgoing request using \Drupal::httpClient() will normally take the proxy settings into account.
So if we are using Guzzle directly and not the Drupal client, the issue you describe might indeed occur.
We should use the Drupal wrapper for every outgoing request in order to have everything working over a proxy.
Removed the hardcoded colors.
wouters_f → made their first commit to this issue’s fork.
We won't be fixing this here, then.
@lammensj I trust you'll make a ticket for deprecating the internal ECA module in favour of the separate spinoff one?
(There is also a merge conflict which might prevent it from getting merged; might be best to resolve that too.)
Tested this:
Gave me this error:
NOTICE: PHP message: Uncaught PHP Exception AssertionError: "Failed to assert that "ai_content_suggestions_plugins, ai_providers, entity_types, user.roles:authenticated" are valid cache contexts." at /var/www/html/web/core/lib/Drupal/Core/Cache/Cache.php line 31
If I removed the line
$form['#cache']['contexts'][] = 'entity_types';
from the settings form, it worked as expected.
wouters_f → made their first commit to this issue’s fork.
Manually tested. Looks good to me.
Merged
wouters_f → made their first commit to this issue’s fork.
wouters_f → created an issue.
The new interface allows for better model selection.
Sorry failure on my part.
wouters_f → created an issue.
Tested with Japanese Stable Diffusion, DALL-E 2, DALL-E 3, and normal Stable Diffusion.
wouters_f → created an issue.
wouters_f → made their first commit to this issue’s fork.
Looks good to me.
Change looks good to me.
Please merge; this looks good to me.