Support for AI Automators finding specific metadata to provide as context alongside RAG

Created on 2 July 2024, 11 months ago

Problem/Motivation

When answering a user's query, the system may want to use RAG. However, there may be a number of services that could supply extra metadata alongside the retrieved chunks to help the LLM provide an answer.

For example, given the query "I want to recreate Reddit in Drupal", the system could find the module attached to each chunk, then use AI Automators to look up its issue queue health or number of downloads. The LLM would then be prompted to take this metadata into account before answering (for example, if the problem is quite niche, then a low number of downloads might not be a problem).

Proposed resolution

Instead of just a "Pre-prompt" before asking, maybe it's possible to insert whole Automator chains before and after the vector search?
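A minimal sketch of what that flow could look like, with a chain hooked in before the search (e.g. query rewriting) and after it (e.g. metadata lookup per chunk). All names here are hypothetical illustrations, not the module's actual API:

```python
def run_rag_query(query, vector_search, pre_chain=None, post_chain=None):
    """Answer a query with RAG, optionally enriched by automator chains.

    Hypothetical sketch: pre_chain runs before the vector search,
    post_chain runs on each retrieved chunk to attach extra metadata.
    """
    if pre_chain:
        query = pre_chain(query)  # e.g. rewrite or expand the query
    chunks = vector_search(query)  # plain vector search returns chunks
    if post_chain:
        # e.g. look up issue queue health or download counts per chunk
        chunks = [dict(chunk, metadata=post_chain(chunk)) for chunk in chunks]
    return chunks


# Toy stand-ins for the real services (illustrative only)
search = lambda q: [{"module": "forum"}, {"module": "comment"}]
downloads = lambda c: {"downloads": 1000 if c["module"] == "forum" else 50}

result = run_rag_query("recreate Reddit in Drupal", search, post_chain=downloads)
```

The enriched chunks (module plus its download count) would then be passed to the LLM as context, so the prompt can instruct it to weigh the metadata when answering.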

Remaining tasks

User interface changes

API changes

Data model changes

Feature request
Status

Active

Version

1.0

Component

AI Search

Created by

🇬🇧United Kingdom yautja_cetanu


Comments & Activities
