- Issue created by @wouters_f
- 🇬🇧United Kingdom yautja_cetanu
I'm going to put this into Automators for now. This is very easy to do with Automators manually for each kind of content.
But there is a design pattern, perhaps more similar to what Augmenters does, where it's a reusable design pattern you apply to all content, that might be worthwhile.
Also, correct me if I'm wrong Marcus, but it seems like validation with Automators could prevent something being published, but not prevent it being uploaded (maybe it could, if we expose it as a widget or something).
100% this should be supported functionality somewhere, as it will be used SO much! It possibly overlaps with the proposed AI_Security module.
- 🇩🇪Germany Marcus_Johansson
I/we have looked into this already for form validation via YOLO. I think it would make sense to have validators for certain fields. I think it is not for Automators, but rather for the editorial experience.
In a text field you can have a form widget add-on that adds a validator that scans the text against a prompt - "Is this text leaking any information about our future projects?" and it stops and gives you a validation message if it does.
Or, for image validation on a Mercedes site - "Is this image a Mercedes car? If not, do not validate."
So it's outside the scope of the Automator.
We should look at whether there is a module with a plugin system that already does this today without AI. Then we can build AI into it. That would take less than a day.
Building a stable framework for text and image fields would take 2-4 days.
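The prompt-based validator idea above boils down to two small pieces: composing a yes/no question for the model, and interpreting its answer. A minimal, framework-free sketch of those two pieces (all function names hypothetical, for illustration only):

```php
<?php

/**
 * Composes a yes/no question for the model from an editor-defined rule
 * (e.g. "Is this text leaking any information about our future projects?")
 * and the submitted field text.
 */
function build_validation_prompt(string $rule, string $text): string {
  return "You are a content validator. Answer only YES or NO.\n"
    . "Question: {$rule}\n"
    . "Text to check:\n{$text}";
}

/**
 * Interprets the model's reply: YES means the rule matched and the
 * content should be rejected with a validation message.
 */
function verdict_is_violation(string $response): bool {
  return str_starts_with(strtoupper(trim($response)), 'YES');
}
```

The form widget add-on would then send the composed prompt to the configured LLM and raise a validation error whenever the verdict comes back as a violation.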
- 🇧🇪Belgium wouters_f Leuven
I'm also not sure if automators is the place to be.
Inspiration
I know a safe search has been built in https://www.drupal.org/project/google_vision →
I have integrated it and have some code snippets in a slideshow I did in 2019:
https://drive.google.com/file/d/1dNVaFdjeEnMceFC-SEcUwQfk55FXXE-O/view?u...
(skip to slides 55 and further)
Might serve as inspiration?

The simplest version
- Widget: only shows an error after inputting wrong content; shows nothing if correct. Later we could provide custom widgets.
- Overview of AI validations (entity?) (could be a bit like Metatag: have a separate place).
- Add validation form:
  1. Select field (could later even be multiple fields using the same validation).
  2. If text: show an input field for the prompt (this is the MVP case, I think).
  2. If image: select the vision API and the output evaluation "rule" (vision modules should provide these, e.g. "nudity detection", "person is smiling", based on what the API allows). ai_validation would then just call the validation rule in the vision module.
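As a purely hypothetical illustration of what the "add validation" form above might persist per field (all keys invented, not an actual schema):

```yaml
# Hypothetical ai_validation config entry - illustration only.
id: body_no_future_projects
field: node.article.body
type: text
prompt: 'Is this text leaking any information about our future projects?'
error_message: 'This text may reveal unreleased project details.'
```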
Some more examples

Rules (for images) that should be provided to the validation module (by the vision module):
- "image has label [inputfield]"
- "image should not have label [inputfield]"
- "image main color [inputfield]"
- "image without color [inputfield]"
- "image (ocr) contains text [inputfield]"
- "image (ocr) does not contain text [inputfield]"
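Most of these image rules reduce to matching whatever labels or OCR text the vision API returns against the editor-supplied value, independent of which API is used. A small framework-free sketch of that matching step (function names hypothetical):

```php
<?php

/**
 * Case-insensitive check whether a vision API's label list contains a
 * label. Covers "image has label [inputfield]"; negate the result for
 * "image should not have label [inputfield]".
 */
function image_has_label(array $labels, string $wanted): bool {
  foreach ($labels as $label) {
    if (strcasecmp($label, $wanted) === 0) {
      return true;
    }
  }
  return false;
}

/**
 * Case-insensitive check whether OCR output contains a text fragment,
 * for the "image (ocr) contains text [inputfield]" rules.
 */
function ocr_contains_text(string $ocr, string $fragment): bool {
  return stripos($ocr, $fragment) !== false;
}
```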
- 🇩🇪Germany Marcus_Johansson
A use case for our module bot :D
This module has a plugin system for this: https://www.drupal.org/project/field_validation →
We could do an AI field validation module in the AI module, but since Field Validation is a contributed module, I would opt for this being its own module, using the Field Validation and AI modules together.
No need to reinvent the wheel.
- 🇧🇪Belgium wouters_f Leuven
I've created the module and a plugin for textual validation.
You can now configure the AI text validator, and then select a prompt and an error message for this field.
If you then submit the form, you will see the validation being triggered.
One task left: in AiTextConstraintValidator.php there is one TODO remaining.
I was not able to use the AI with the example code.
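For reference, a rough sketch of what that missing piece might look like. This assumes a chat-capable AI client is injected as $this->aiProvider; the real service name and call signature must come from the AI module itself, not from this sketch:

```php
<?php

// Hypothetical sketch of AiTextConstraintValidator::validate() - the
// $constraint is assumed to carry the configured prompt and error message.
public function validate($value, Constraint $constraint): void {
  $text = (string) $value;
  if ($text === '') {
    return;
  }

  $prompt = "Answer only YES or NO. {$constraint->prompt}\nText:\n{$text}";

  // Hypothetical injected chat client; replace with the AI module's
  // actual provider service and its chat() signature.
  $response = $this->aiProvider->chat($prompt);

  if (str_starts_with(strtoupper(trim($response)), 'YES')) {
    $this->context->addViolation($constraint->errorMessage);
  }
}
```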
So if you could replace this with the call to the LLM, I'd be happy to test this.

- Status changed to Needs review
7 days ago 1:23am 22 June 2024

- 🇬🇧United Kingdom yautja_cetanu
One thing we should do with this is documentation. If we use this for moderating potentially harmful content (e.g. suicidal ideation), we could write some documentation, after doing some research, on how to handle it with the various models, as some won't do it and others will as long as you get the prompt right.
We could explore whether or not we are OK recommending jailbroken open source models for things like this too.
- 🇧🇪Belgium wouters_f Leuven
Should I make a separate issue for the visual validation?
I created this ticket in google_vision https://www.drupal.org/project/google_vision/issues/3456401#comment-1565... ✨ Add google vision validators to AI_validators (ai submodule) Active to plug that one in.