- Issue created by @jan kellermann
- 🇩🇪Germany jan kellermann
I added the invoke call.
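To illustrate the idea (all names below are placeholders for illustration, not the actual code from the MR), a provider module could then register its own operation type roughly like this:

```php
<?php

/**
 * Implements hook_ai_operationtype_alter() (hypothetical hook name).
 *
 * Lets a provider module register an operation type that only this
 * provider supports, without adding it to the AI module itself.
 */
function mymodule_ai_operationtype_alter(array &$operation_types): void {
  // Key and structure are illustrative assumptions only.
  $operation_types['embeddings_and_store'] = [
    'label' => 'Embeddings and vector storage',
    'provider' => 'mymodule',
  ];
}
```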
Please review and give feedback.
Thank you!
- 🇮🇳India akulsaxena
Hey, the cspell pipeline is failing due to the word 'operationtype'
Can you please fix that?
- 🇩🇪Germany marcus_johansson
It was actually intentional from the beginning to keep the operation types tied to the AI module, so that things like Chat with Vision, Chat with Image, Chat with Picture etc. that only work on one specific provider wouldn't sprout up and kill the whole idea of having an abstraction layer.
The idea was to sooner or later open it up as a plugin system, once there were enough "normal" operation types in the AI module. But your idea of using a hook might actually be a really smart solution, since the operation type stays specific to the installed module - and if two or more providers turn out to need similar or the same operation types, it goes into the AI module.
I'll loop in the other maintainers - thanks as always, Jan!
- 🇩🇪Germany jan kellermann
@akulsaxena: Thank you for the feedback. I added operationtype to cspell because the hook name "operation_type" would be misleading.
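For reference, assuming the project keeps its word list in a cspell.json file (the exact file name and keys may differ), the addition is a single extra dictionary word:

```json
{
  "words": [
    "operationtype"
  ]
}
```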
@marcus_johansson: Thank you again for the feedback and the strategic thinking. We need this for search-api because AnythingLLM directly provides an endpoint for the indexing (embedding including the vector database).
- 🇮🇳India akulsaxena
Thanks, 'operationtype' was added to the cspell file and all the pipelines are green now.
Changes LGTM
Moving to RTBC+
- 🇬🇧United Kingdom yautja_cetanu
Could you provide any insights into why you wanted a new operation type?
It's possible that the hook alter is good, but the abstraction layer stops being one if every provider just has its own operation type.
- 🇬🇧United Kingdom yautja_cetanu
From the sounds of things, that operation type looks like it would be good in the AI module? An embedding-with-LLM endpoint?
Similarly, the reranking endpoint seems like a good one for the AI module.
Maybe it's worth including anyway though, because would it be that bad if, in the future, endpoints lived in their own API modules?
Like you could have a translation endpoint that could swap between LLMs and ML or something.
Actually, I do agree with this change, even though I don't love it - I'd prefer if we thought this through and had some plan for operation types, but it's sometimes good to do things organically and see what happens. I do wonder if operation types are going to morph into something where there are tons of different specific common functions that you want to swap out (an alt text generation endpoint, for example).
- 🇩🇪Germany jan kellermann
> Could you provide any insights into why you wanted a new operation type?
AnythingLLM provides ONE endpoint for embedding AND storing the embedded data in the vector database. AnythingLLM is also an abstraction layer (you can choose which LLM to use for embedding and which VDB for storing the data). This means we cannot separate embedding from saving into the vector database.
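To make that concrete, a provider-specific operation type for this combined call could be sketched as an interface along these lines (all names are illustrative assumptions, not the AI module's actual API):

```php
<?php

declare(strict_types=1);

/**
 * Hypothetical operation type for embedding and storing in one call,
 * mirroring the single AnythingLLM endpoint described above.
 */
interface EmbeddingsAndStoreInterface {

  /**
   * Embeds the given text chunks and persists them in the vector database.
   *
   * @param string[] $chunks
   *   The text chunks to index.
   * @param string $collection
   *   The target collection/workspace in the vector database.
   *
   * @return string[]
   *   Identifiers of the stored vectors, as returned by the provider.
   */
  public function embedAndStore(array $chunks, string $collection): array;

}
```

A search-api backend could then call embedAndStore() per indexed item and let AnythingLLM take care of both the embedding and the vector storage.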