Hi @drupals.user,
There's a rough roadmap in the description for what I believe should happen before this module is considered stable:
- Handle the changes to the table schema better:
  - Work out where we can make these schema changes while keeping user data for as long as possible.
  - Potentially look at search_api_db to see how it solves this problem.
  - Currently, tables may have to be dropped by clearing the index for big schema changes.
- Investigate whether this can use core's db connection (a minimal sketch follows this list).
- Lots of calls are made to sanitise what could be user input; can these calls be cached for performance? (A caching sketch also follows the list.)
- Tests.
- Support for facets.

The main parts, I think, are the use of Drupal's core db connection, the integration of that with search_api (which can take inspiration from search_api_db), and the addition of a comprehensive test suite.
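To make the db connection piece concrete, here is a minimal sketch, assuming a hypothetical service class (the class name and method are illustrative, not the module's actual code): the service receives core's `database` service, so core's query builder handles identifier escaping and value placeholders instead of a hand-rolled PDO layer.

```php
<?php

namespace Drupal\ai_vdb_provider_postgres;

use Drupal\Core\Database\Connection;

/**
 * Illustrative sketch only: a service using core's db connection
 * rather than managing its own PDO connection to Postgres.
 */
class VectorDatabase {

  public function __construct(
    protected Connection $database,
  ) {}

  /**
   * Counts rows in an index table via core's query builder.
   */
  public function countRows(string $table): int {
    // The query builder escapes identifiers and uses placeholders
    // for values, covering much of the manual sanitisation.
    return (int) $this->database
      ->select($table, 't')
      ->countQuery()
      ->execute()
      ->fetchField();
  }

}
```

The service would be wired up with `arguments: ['@database']` in the module's services.yml.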
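On the sanitisation point, since the same identifiers recur on every query, the cheapest win may be memoising the escaped values per request. A sketch, assuming escaping goes through core's `Connection::escapeField()` (the wrapper class itself is hypothetical):

```php
<?php

namespace Drupal\ai_vdb_provider_postgres;

use Drupal\Core\Database\Connection;

/**
 * Illustrative sketch: memoise escaped identifiers so repeated
 * sanitisation of the same field name is a cheap array lookup.
 */
class IdentifierCache {

  /**
   * Escaped field names, keyed by raw input.
   *
   * @var string[]
   */
  protected array $fields = [];

  public function __construct(protected Connection $database) {}

  public function escapeField(string $field): string {
    // Escape once per unique name per request; later hits are free.
    return $this->fields[$field] ??= $this->database->escapeField($field);
  }

}
```

Whether this is worth doing depends on profiling; it only helps if the escaping calls actually show up as hot.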
There is currently no estimated timeline for this. I am working on the refactor of the logic to use core's db connection, but it is a significant refactor and is taking time, of which I unfortunately don't have much. I'm pushing for more dedicated time to work on this.
The current version is very much in alpha and I would not consider it production ready.
The basics are there, and you could likely run a production site with a vector db search. However, the code has no tests, so it could be missing something and is prone to regressions.
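To make the testing gap concrete, even a minimal kernel test would catch install-time regressions. A sketch, assuming PHPUnit with core's KernelTestBase (any modules this module depends on would also need listing in $modules):

```php
<?php

namespace Drupal\Tests\ai_vdb_provider_postgres\Kernel;

use Drupal\KernelTests\KernelTestBase;

/**
 * Illustrative sketch of a first smoke test for the module.
 */
class InstallSmokeTest extends KernelTestBase {

  /**
   * {@inheritdoc}
   */
  protected static $modules = ['ai_vdb_provider_postgres'];

  public function testModuleIsEnabled(): void {
    // If this passes, the module's services and hooks at least load
    // without fatals in a minimal Drupal kernel.
    $module_handler = $this->container->get('module_handler');
    $this->assertTrue($module_handler->moduleExists('ai_vdb_provider_postgres'));
  }

}
```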
I'd also not consider the module data safe at this point: there isn't a good set of events or hooks I can use, in its current state, to check when I need to rebuild the index. See https://www.drupal.org/project/ai_vdb_provider_postgres/issues/3535073 (Clearing search index deletes any extra filter fields). I'm working on this alongside the core db connection refactor. A search index is naturally something that can be lost and rebuilt quite safely, but if you have a large index, the rebuild could take time and potentially add cost for transforming content to vectors at index time.
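For context on what "lost and rebuilt" looks like in practice, a sketch using the standard Search API entity API (the index machine name is a placeholder):

```php
<?php

use Drupal\search_api\Entity\Index;

// Illustrative: rebuild a Search API index from scratch.
$index = Index::load('my_vector_index');

// Remove all indexed data from the server and mark every item for
// reindexing. On this backend, this is where tables may get dropped.
$index->clear();

// Items are then re-indexed on cron or via `drush search-api:index`,
// which re-runs the embedding step: the time/cost concern above.
```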