- Issue created by @yautja_cetanu
The current prompt is hardcoded. This doesn't give people much control over what is sent to ChatGPT and leaves little room for experimentation.
- Add an advanced configuration option to the block so the prompt sent to ChatGPT can be customized.
- Make it similar to views_chatgpt, but with a few hard-coded tokens: [user-prompt] and [context], where [user-prompt] is what the user has typed verbatim and [context] is what is passed in from Pinecone.
- Make it so the debug page shows the prompt that will be used.
- If we do message history, make it output the exact prompt used for each message.
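The token substitution proposed above could work roughly as follows. This is a minimal illustrative sketch (in Python rather than the module's PHP, purely for brevity); only the token names [user-prompt] and [context] come from the issue, while the template text and function name are hypothetical.

```python
def build_prompt(template: str, user_prompt: str, context: str) -> str:
    """Substitute the two hard-coded tokens into an admin-configurable template.

    Hypothetical helper: the real block would read `template` from its
    advanced configuration form and log the result for the debug page.
    """
    return (template
            .replace("[user-prompt]", user_prompt)
            .replace("[context]", context))

# Example template an admin might configure:
template = (
    "Answer the question using only the context below.\n"
    "Context: [context]\n"
    "Question: [user-prompt]"
)
print(build_prompt(template, "What is Drupal?", "Drupal is an open-source CMS."))
```

Exposing the template this way also makes the debug output straightforward: the block can simply record the return value of the substitution for each request.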
Status: Active · Version: 1.0 · Component: Code