Add an event that triggers on chat iterator consumption

Created on 25 November 2024

Problem/Motivation

Currently, logging of responses is not possible for streamed responses, since the logging event is triggered at the beginning of the response rather than at the end. This causes issues for modules, such as the logging module, that want to interact with and store the actual response.

There is already an issue for this, but that approach might be overkill: it would require passing custom iterators into the OpenAI client, instead of just managing the parts we already take care of. That pattern might also not be available in all clients.

Proposed resolution

Figure out whether the __destruct() magic method or a forced interface is the best path forward.
Add an event that is dispatched when the whole stream has been consumed, exposing the response and its metadata (see the sketch after this list).
Force the streaming interfaces to fill out this data.
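A minimal sketch of what such an event could look like, assuming Symfony-style event dispatching as used in Drupal core. The class name PostStreamingResponseEvent, the event name, the $metadata property, and the getText() call on the chunk object are illustrative assumptions, not the module's actual API.

<?php

namespace Drupal\ai\Event;

use Drupal\Component\EventDispatcher\Event;

/**
 * Hypothetical event dispatched once a streamed chat response is consumed.
 *
 * All names are illustrative; the real event would live wherever the
 * AI Core module keeps its events.
 */
class PostStreamingResponseEvent extends Event {

  public const EVENT_NAME = 'ai.post_streaming_response';

  public function __construct(
    protected string $fullResponse,
    protected array $metadata,
  ) {}

  public function getFullResponse(): string {
    return $this->fullResponse;
  }

  public function getMetadata(): array {
    return $this->metadata;
  }

}

The streamed iterator could then accumulate the chunks while yielding them and dispatch the event only after the caller has consumed the whole stream, for example:

// Hypothetical wrapper inside a streamed chat message iterator.
public function getIterator(): \Generator {
  $fullResponse = '';
  foreach ($this->iterator as $chunk) {
    // getText() and $this->metadata are assumed accessors for this sketch.
    $fullResponse .= $chunk->getText();
    yield $chunk;
  }
  // The stream is fully consumed: let listeners (e.g. a logging module)
  // store the complete response and its metadata.
  \Drupal::service('event_dispatcher')->dispatch(
    new PostStreamingResponseEvent($fullResponse, $this->metadata),
    PostStreamingResponseEvent::EVENT_NAME,
  );
}

Whether this lives in getIterator(), in __destruct(), or behind a dedicated interface is exactly the open question in the list above.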

Remaining tasks

User interface changes

API changes

Data model changes

📌 Task

Status: Active
Version: 1.0
Component: AI Core module
Created by: marcus_johansson (Germany)

