Allow chunking for openai_eca

Created on 10 January 2024

Problem/Motivation

Hi,

Very exciting to see ECA support in the openai module suite :)
On my first try, I ran into an issue where processing long prompts/outputs (rewriting texts, for instance) returns an incomplete result. I understand the module does not address token limits, but I wonder whether it could work around them via a chunking mechanism: split long input into pieces that fit within the limit, process each piece separately, and stitch the results back together.

Hope this can be useful to others as well and can be an addition to the module in a future release.
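To illustrate the idea, here is a minimal sketch of such a chunking mechanism. It is shown in Python for brevity (the module itself would implement this in PHP), all function names are hypothetical, and token counts are crudely approximated by whitespace-separated words; a real implementation would use a proper tokenizer matching the model.

```python
def chunk_text(text, max_tokens=100):
    """Split text into chunks of at most max_tokens (approximated as words),
    breaking on paragraph boundaries where possible. A single paragraph
    longer than the budget still becomes its own oversized chunk; a real
    implementation would split it further (e.g. by sentence)."""
    chunks, current, count = [], [], 0
    for paragraph in text.split("\n\n"):
        words = len(paragraph.split())
        if count + words > max_tokens and current:
            chunks.append("\n\n".join(current))
            current, count = [], 0
        current.append(paragraph)
        count += words
    if current:
        chunks.append("\n\n".join(current))
    return chunks


def rewrite_long_text(text, rewrite_fn, max_tokens=100):
    """Rewrite each chunk with a separate API call (rewrite_fn stands in
    for the OpenAI request) and join the results, so no single request
    exceeds the model's token limit."""
    return "\n\n".join(rewrite_fn(chunk) for chunk in chunk_text(text, max_tokens))
```

The trade-off is that the model loses cross-chunk context, so chunk boundaries should fall on natural breaks (paragraphs or sentences) to keep each rewrite coherent.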

Steps to reproduce

Proposed resolution

Remaining tasks

User interface changes

API changes

Data model changes

Feature request
Status

Active

Version

1.0

Component

Code

Created by

🇹🇭Thailand AlfTheCat


