- Issue created by @seogow
- First commit to issue fork.
- Merge request !524 "Issue #3489273: Implement Advanced Input Mode with Token Chunking for Text Automator" (Open) created by Unnamed author
The current Text Automator in the AI module handles many-to-many relationships effectively using prompts like “Provide output row per each input row.” However, it struggles with long input texts because of the token limits of Large Language Models (LLMs).
Introduce an “Advanced Mode (Token, Chunked)” option under the “Automator Input Mode” settings. This mode will split long input texts into token-counted chunks so that each request stays within the model's context limit.
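The chunking idea proposed here can be sketched as follows. This is an illustrative Python sketch, not the module's PHP implementation; the function name `chunk_by_tokens` and the whitespace tokenizer are assumptions (a real implementation would use the model's own tokenizer to count tokens).

```python
def chunk_by_tokens(text, max_tokens):
    """Split text into chunks of at most max_tokens tokens.

    Uses whitespace splitting as a stand-in tokenizer; a production
    version would count tokens with the target LLM's tokenizer.
    """
    tokens = text.split()
    return [
        " ".join(tokens[i:i + max_tokens])
        for i in range(0, len(tokens), max_tokens)
    ]


# Each chunk can then be sent to the LLM as a separate request,
# keeping every prompt under the model's context limit.
chunks = chunk_by_tokens("alpha beta gamma delta epsilon", 2)
```

In practice the chunk size would be the model's context window minus the prompt and expected output length, and chunk boundaries would ideally respect sentence or row boundaries so each LLM call receives coherent input.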
Status: Needs work
Version: 1.0
Component: AI Automators