Struct llm_base::model::ModelParameters
pub struct ModelParameters {
    pub prefer_mmap: bool,
    pub n_context_tokens: usize,
    pub inference_params: InferenceParameters,
    pub inference_prompt_params: InferenceWithPromptParameters,
}
Parameters for tuning model instances
Fields

prefer_mmap: bool
For GGML formats that support it, mmap is the default. Although mmap typically improves performance, setting this value to false may be preferred in resource-constrained environments.

n_context_tokens: usize
The context size ("memory") the model should use when evaluating a prompt. A larger context consumes more resources but produces more consistent and coherent responses.

inference_params: InferenceParameters
Default InferenceParameters to use when evaluating a prompt with this model.

inference_prompt_params: InferenceWithPromptParameters
Default InferenceWithPromptParameters to use when evaluating a prompt with this model.
Trait Implementations

Auto Trait Implementations
impl RefUnwindSafe for ModelParameters
impl Send for ModelParameters
impl Sync for ModelParameters
impl Unpin for ModelParameters
impl UnwindSafe for ModelParameters
Blanket Implementations

impl<T> BorrowMut<T> for T
where
    T: ?Sized,

fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.