Change default fallback for token counting to cl100k_base

Created on 3 September 2024
Updated 17 September 2024

Problem/Motivation

Currently we fall back to gpt-3.5-turbo for token counting when tiktoken-php does not recognise the provided model. This is confusing for developers, because error messages then imply OpenAI is in use even when it may not be.
Since cl100k_base is the encoding gpt-3.5-turbo uses anyway, change the fallback to the cl100k_base encoding directly.

Steps to reproduce

Use a model that is not supported by tiktoken-php

Proposed resolution

Fall back to the cl100k_base encoding instead of gpt-3.5-turbo.
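
For illustration, a minimal sketch of what the fallback could look like with the tiktoken-php (yethee/tiktoken) EncoderProvider. The helper function name is hypothetical and this is not the module's actual code; it assumes getForModel() throws an InvalidArgumentException for models the library does not know about.

```php
<?php

use Yethee\Tiktoken\EncoderProvider;

/**
 * Counts tokens for a piece of text (hypothetical helper, for illustration).
 */
function count_tokens(string $text, string $model): int {
  $provider = new EncoderProvider();

  try {
    // Use the model-specific encoding when tiktoken-php knows the model.
    $encoder = $provider->getForModel($model);
  }
  catch (\InvalidArgumentException $e) {
    // Unknown model: fall back to the cl100k_base encoding directly,
    // rather than pretending the input came from gpt-3.5-turbo.
    $encoder = $provider->get('cl100k_base');
  }

  return count($encoder->encode($text));
}
```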

Remaining tasks

MR

User interface changes

N/A

API changes

N/A

Data model changes

N/A

📌 Task

Status: Fixed
Version: 1.0
Component: AI Core module
Created by: 🇬🇧 United Kingdom scott_euser


