- Issue created by @seogow
The current Text Automator in the AI module handles many-to-many relationships well using prompts like “Provide output row per each input row.” However, it struggles with long texts because of the token limits of Large Language Models (LLMs).
Introduce an “Advanced Mode (Token, Chunked)” option under the “Automator Input Mode” settings. This mode will:
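One way the chunked mode could work is sketched below. This is a hypothetical illustration, not the module's actual API: the function names (`chunk_text`, `run_chunked`) and the whitespace word count standing in for real tokenization are assumptions; a real implementation would use the provider's tokenizer to count tokens.

```python
# Hypothetical sketch of token-chunked processing: split a long input
# into chunks that fit under a token budget, run the automator prompt
# once per chunk, then merge the per-chunk outputs.
# NOTE: word count approximates token count here; a real build would
# use the LLM provider's tokenizer instead.

def chunk_text(text: str, max_tokens: int = 512) -> list[str]:
    """Split text into chunks of at most max_tokens whitespace tokens."""
    words = text.split()
    return [" ".join(words[i:i + max_tokens])
            for i in range(0, len(words), max_tokens)]

def run_chunked(text: str, prompt: str, call_llm, max_tokens: int = 512) -> str:
    """Apply an LLM call to each chunk and join the per-chunk outputs."""
    outputs = [call_llm(prompt, chunk) for chunk in chunk_text(text, max_tokens)]
    return "\n".join(outputs)

if __name__ == "__main__":
    long_text = " ".join(f"word{i}" for i in range(1200))
    # Stub LLM: reports how many tokens each chunk contained.
    fake_llm = lambda prompt, chunk: f"processed {len(chunk.split())} tokens"
    print(run_chunked(long_text, "Provide output row per each input row.", fake_llm))
```

A design question this raises is how to merge chunk outputs: simple concatenation works for row-per-row prompts, but prompts that summarize or aggregate would need a second pass over the combined results.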
Status: Needs work
Version: 1.0
Component: AI Automators