chat-splitter
version | 0.1.1 |
source | src |
created_at | 2023-07-14 17:04:09.673204 |
updated_at | 2023-09-04 17:55:41.268937 |
description | Never exceed OpenAI's chat models' maximum number of tokens when using the async_openai Rust crate |
homepage | |
repository | https://github.com/schneiderfelipe/chat-splitter |
max_upload_size | |
id | 916498 |
size | 60,144 |
For more information, please refer to the blog announcement.
When using the async_openai Rust crate, it is crucial not to exceed the maximum number of tokens allowed by OpenAI's chat models. chat-splitter splits chat messages into 'outdated' and 'recent' groups, letting you cap both the number of messages and the number of chat completion tokens kept in the recent group. Token counting is provided by tiktoken_rs.
Here's a basic example:
```rust
// Get all your previously stored chat messages...
let mut stored_messages = /* get_stored_messages()? */;

// ...and split into 'outdated' and 'recent',
// where 'recent' always fits the context size.
let (outdated_messages, recent_messages) =
    ChatSplitter::default().split(&stored_messages);
```
For a more detailed example, see examples/chat.rs.
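To make the splitting policy concrete, here is a minimal, self-contained sketch of the idea. This is not the crate's actual API: the `split_at_limits` function, the `(content, token_count)` message representation, and the hard-coded token counts are all illustrative stand-ins (the real crate counts tokens with tiktoken_rs).

```rust
// Illustrative sketch (NOT chat-splitter's real API): walk backwards
// from the newest message, accumulating token counts, and stop once
// either the message limit or the token budget would be exceeded.
// Everything before the stopping point is 'outdated'; the rest is
// 'recent', so 'recent' always fits within both limits.
fn split_at_limits(
    messages: &[(String, usize)], // (content, token_count)
    max_messages: usize,
    max_tokens: usize,
) -> (Vec<String>, Vec<String>) {
    let mut total = 0;
    let mut take = 0;
    for (_, tokens) in messages.iter().rev() {
        if take == max_messages || total + tokens > max_tokens {
            break;
        }
        total += tokens;
        take += 1;
    }
    let split = messages.len() - take;
    let outdated = messages[..split].iter().map(|(m, _)| m.clone()).collect();
    let recent = messages[split..].iter().map(|(m, _)| m.clone()).collect();
    (outdated, recent)
}
```

Scanning from the newest message backwards ensures the most recent context is always the part that survives truncation, which is what a chat completion request needs.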
Contributions to chat-splitter are welcome! If you find a bug or have a feature request, please submit an issue. If you'd like to contribute code, please feel free to submit a pull request.
License: MIT