Struct chat_splitter::ChatSplitter

pub struct ChatSplitter { /* private fields */ }
Chat splitter for OpenAI’s chat models when using async_openai.

For more detailed information, see the crate documentation.
Implementations

impl ChatSplitter
pub fn new(model: impl Into<String>) -> Self

Create a new ChatSplitter for the given model.
Panics

If for some reason tiktoken_rs gives a context size twice as large as what would fit in a u16. If this happens, it should be considered a bug, but this behaviour might change in the future, as models with larger context sizes are released.
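The guard described above can be sketched as follows. Here `context_size_for` is a hypothetical stand-in for the per-model lookup that tiktoken_rs performs, and the exact check inside the crate may differ:

```rust
/// Hypothetical stand-in for the per-model context-size lookup that
/// tiktoken_rs performs; the real values come from the library.
fn context_size_for(model: &str) -> usize {
    match model {
        "gpt-3.5-turbo" => 4096,
        _ => 4096,
    }
}

fn main() {
    let ctx = context_size_for("gpt-3.5-turbo");
    // The token limit is stored as a u16, so a context size that cannot
    // fit would make construction panic rather than silently truncate.
    let as_u16 = u16::try_from(ctx).expect("context size does not fit in a u16");
    println!("{as_u16}");
}
```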
pub fn max_messages(self, max_messages: impl Into<usize>) -> Self
Set the maximum number of messages to have in the chat.
Splits will have at most that many messages, never more.
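As a sketch of this count guarantee (not the crate’s actual implementation, which also enforces a token budget), the split index that keeps at most `max_messages` recent messages is:

```rust
/// Keep at most `max_messages` at the tail; everything before the cut
/// is 'outdated'. Simplified sketch of the count limit only.
fn split_by_count<T>(messages: &[T], max_messages: usize) -> (&[T], &[T]) {
    let cut = messages.len().saturating_sub(max_messages);
    messages.split_at(cut)
}

fn main() {
    let msgs = vec!["a", "b", "c", "d", "e"];
    let (outdated, recent) = split_by_count(&msgs, 3);
    assert_eq!(outdated, ["a", "b"]);
    assert_eq!(recent, ["c", "d", "e"]);
}
```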
pub fn max_tokens(self, max_tokens: impl Into<u16>) -> Self
Set the maximum number of tokens to leave for chat completion. This is the same as in the official API and is given to async_openai.

The total length of input tokens and generated tokens is limited by the model’s context size.

Splits will have at least that many tokens available for chat completion, never less.
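The arithmetic behind this reservation is simple; the numbers below are illustrative, not values taken from the crate:

```rust
fn main() {
    let context_size: usize = 4096; // e.g. gpt-3.5-turbo's context window
    let max_tokens: usize = 1024;   // reserved for the generated completion
    // Tokens left over for the prompt messages themselves.
    let prompt_budget = context_size - max_tokens;
    assert_eq!(prompt_budget, 3072);
}
```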
pub fn model(self, model: impl Into<String>) -> Self

Set the model to use for tokenization, e.g., gpt-3.5-turbo. It is passed to tiktoken_rs to select the correct tokenizer.
pub fn split<'a, M>(&self, messages: &'a [M]) -> (&'a [M], &'a [M])
where
    M: IntoChatCompletionRequestMessage + Clone,
Split the chat into two groups of messages, the ‘outdated’ and the ‘recent’ ones. The ‘recent’ messages are guaranteed to satisfy the given limits, while the ‘outdated’ ones contain everything that came before them.
For a detailed usage example, see examples/chat.rs.

Panics

If the tokenizer for the specified model is not found, or the model is not a supported chat model.
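A simplified sketch of the splitting strategy, under the assumption that messages are scanned from the most recent backwards until either limit would be exceeded. `count_tokens` is a hypothetical stand-in; the crate uses tiktoken_rs for real counts:

```rust
/// Hypothetical token counter (~4 characters per token); the crate
/// uses tiktoken_rs for accurate, model-specific counts.
fn count_tokens(message: &str) -> usize {
    message.len() / 4 + 1
}

/// Walk backwards from the newest message, keeping messages while both
/// the message-count and token-budget limits hold. Returns
/// (outdated, recent), mirroring the documented signature's shape.
fn split<'a>(
    messages: &'a [String],
    max_messages: usize,
    token_budget: usize,
) -> (&'a [String], &'a [String]) {
    let mut used = 0;
    let mut cut = messages.len();
    for (i, m) in messages.iter().enumerate().rev() {
        if messages.len() - i > max_messages {
            break; // message-count limit reached
        }
        used += count_tokens(m);
        if used > token_budget {
            break; // token budget exhausted
        }
        cut = i;
    }
    messages.split_at(cut)
}

fn main() {
    let msgs: Vec<String> = ["hi", "how are you?", "fine, thanks!"]
        .iter()
        .map(|s| s.to_string())
        .collect();
    let (outdated, recent) = split(&msgs, 2, 100);
    assert_eq!(outdated.len(), 1); // "hi" falls outside the 2-message limit
    assert_eq!(recent.len(), 2);
}
```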
Trait Implementations

impl Clone for ChatSplitter

fn clone(&self) -> ChatSplitter
1.0.0 · fn clone_from(&mut self, source: &Self)