Store resulting embedding from AI into local k/v for later lookup

Created on 27 August 2025

Problem/Motivation

In Search API, every facet, filter, or sort request triggers another search, which in turn regenerates the embedding for the same text input. Re-calling the embedding model on each request is costly and hurts performance.

Proposed resolution

Create a local table that stores id, text, and vector at a minimum. In the event subscriber, look up the key for the incoming text and return the cached vector if a record is found. Otherwise, generate the vector, store the result, and return it so it is available to future searches.
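The lookup-or-generate flow above is a standard cache-aside pattern. A minimal sketch in Python, purely illustrative (the actual module would be a PHP event subscriber; the function and parameter names here are assumptions, not the module's API):

```python
import hashlib

def get_embedding(text, cache, generate):
    """Return the embedding for `text`, generating it only on a cache miss.

    `cache` is any dict-like store keyed by a hash of the text;
    `generate` is the expensive call to the AI embedding provider.
    """
    # Hash the input so arbitrarily long text maps to a fixed-size key.
    key = hashlib.sha256(text.encode("utf-8")).hexdigest()
    if key in cache:
        return cache[key]    # hit: skip the embedding call entirely
    vector = generate(text)  # miss: one call to the provider
    cache[key] = vector      # store so later facet/sort requests reuse it
    return vector
```

With this in place, repeated searches for the same text (each facet click, sort change, etc.) trigger exactly one provider call; every subsequent request is served from the local store.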

Data model changes

Introduce table/schema.
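One possible shape for that table, sketched with SQLite for illustration (the real module would declare this via Drupal's Schema API; the table name, the extra `text_hash` lookup column, and JSON serialization of the vector are all assumptions beyond the issue's minimum of id, text, and vector):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE search_api_ai_embeddings (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        text_hash TEXT NOT NULL UNIQUE,  -- fast, fixed-size lookup key
        text TEXT NOT NULL,              -- original input sent to the provider
        vector TEXT NOT NULL             -- embedding serialized as JSON
    )
""")

def store(text_hash, text, vector):
    # INSERT OR IGNORE keeps the first stored vector if two requests race.
    conn.execute(
        "INSERT OR IGNORE INTO search_api_ai_embeddings "
        "(text_hash, text, vector) VALUES (?, ?, ?)",
        (text_hash, text, json.dumps(vector)),
    )

def lookup(text_hash):
    # Return the cached vector for this key, or None on a miss.
    row = conn.execute(
        "SELECT vector FROM search_api_ai_embeddings WHERE text_hash = ?",
        (text_hash,),
    ).fetchone()
    return json.loads(row[0]) if row else None
```

Keeping the raw text alongside the hash makes the cache auditable and allows re-embedding later if the model changes.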

✨ Feature request

Status: Active
Version: 1.0
Component: Code
Created by: πŸ‡ΊπŸ‡ΈUnited States kevinquillen


