Struct aleph_alpha_client::Stopping
pub struct Stopping<'a> {
pub maximum_tokens: u32,
pub stop_sequences: &'a [&'a str],
}
Controls the conditions under which the language model stops generating text.
Fields
maximum_tokens: u32
The maximum number of tokens to be generated. Completion will terminate once this number of tokens is reached. Increase this value to allow for longer outputs. A text is split into tokens; usually there are more tokens than words. The permitted total number of tokens across the prompt and maximum_tokens depends on the model.
stop_sequences: &'a [&'a str]
List of strings which will stop generation if they are generated. Stop sequences are helpful in structured texts. E.g.: In a question answering scenario a text may consist of lines starting with either “Question: “ or “Answer: “ (alternating). After producing an answer, the model is likely to generate “Question: “ next. “Question: “ may therefore be used as a stop sequence, so that the model does not generate further questions and text generation is restricted to the answers.
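A minimal sketch of constructing a Stopping value for the question-answering scenario above, assuming the struct can be built directly from its two public fields; the maximum_tokens value of 64 is an arbitrary illustration, not a recommended default:

use aleph_alpha_client::Stopping;

// Stop after at most 64 tokens (arbitrary example value), or as soon as the
// model starts a new “Question: “ line after finishing its answer.
let stopping = Stopping {
    maximum_tokens: 64,
    stop_sequences: &["Question: "],
};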