pub struct GenerateOptions {
pub temperature: Option<f64>,
pub top_p: Option<f64>,
pub top_k: Option<u32>,
pub num_predict: Option<u32>,
pub stop: Option<Vec<String>>,
pub seed: Option<u64>,
}
Options for generating text with Ollama
This struct contains parameters that control how the Ollama model generates text, including temperature, top-p sampling, and other generation parameters.
§Examples
use projets_indexer::ollama::GenerateOptions;
let options = GenerateOptions {
temperature: Some(0.7),
top_p: Some(0.9),
top_k: Some(40),
num_predict: Some(100),
stop: Some(vec!["\n".to_string()]),
seed: Some(42),
};
§Fields
temperature: Option<f64>
Temperature for text generation
Controls the randomness of the output. Higher values make the output more random, while lower values make it more deterministic.
top_p: Option<f64>
Top-p sampling parameter
Controls diversity via nucleus sampling. Higher values allow more diverse outputs, while lower values make the output more focused.
top_k: Option<u32>
Top-k sampling parameter
Controls diversity by limiting the number of tokens considered for each step of text generation.
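To make the effect concrete, here is a hypothetical top-k filter over a logit vector. This is an illustration of what the model does with a `top_k` value, not code from the crate:

```rust
// Illustration only: keep the indices of the k highest-scoring tokens.
fn top_k_indices(logits: &[f64], k: usize) -> Vec<usize> {
    let mut idx: Vec<usize> = (0..logits.len()).collect();
    // Sort indices by descending logit value.
    idx.sort_by(|&a, &b| logits[b].partial_cmp(&logits[a]).unwrap());
    idx.truncate(k);
    idx
}

fn main() {
    let logits = [0.1, 2.3, -0.5, 1.7, 0.9];
    // With k = 3, only the tokens scoring 2.3, 1.7, and 0.9 stay in play.
    let kept = top_k_indices(&logits, 3);
    assert_eq!(kept, vec![1, 3, 4]);
    println!("{kept:?}");
}
```

Sampling then proceeds over this reduced candidate set, so a smaller `top_k` yields more conservative output.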
num_predict: Option<u32>
Maximum number of tokens to generate
The maximum length of the generated text in tokens.
stop: Option<Vec<String>>
Stop sequences
A list of strings that, when encountered, will stop the generation.
seed: Option<u64>
Random seed for generation
A seed value for the random number generator used in text generation. This allows for reproducible outputs.
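For reproducible runs, send the same seed (typically together with a low temperature) on every request. A hedged sketch of a helper that does this — the struct is mirrored locally so the example is self-contained, and the helper name is hypothetical:

```rust
// Local mirror of the documented struct for a self-contained example.
#[derive(Debug, PartialEq)]
struct GenerateOptions {
    temperature: Option<f64>,
    top_p: Option<f64>,
    top_k: Option<u32>,
    num_predict: Option<u32>,
    stop: Option<Vec<String>>,
    seed: Option<u64>,
}

// Hypothetical helper: fixed seed plus temperature 0.0 gives the best
// chance of identical outputs across runs.
fn reproducible(seed: u64) -> GenerateOptions {
    GenerateOptions {
        temperature: Some(0.0),
        top_p: None,
        top_k: None,
        num_predict: Some(128),
        stop: None,
        seed: Some(seed),
    }
}

fn main() {
    // Two requests built with the same seed carry identical options.
    assert_eq!(reproducible(42), reproducible(42));
}
```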