pub struct OllamaConfig {
pub base_url: String,
pub default_model: String,
pub max_context_tokens: usize,
pub retry_policy: RetryPolicy,
}
Configuration for Ollama local models.
Ollama is a tool for running open-source LLMs locally. It provides an OpenAI-compatible API and doesn’t require an API key.
§Example
use multi_llm::OllamaConfig;
let config = OllamaConfig {
    base_url: "http://localhost:11434".to_string(),
    default_model: "llama2".to_string(),
    max_context_tokens: 4096,
    ..Default::default()
};
§Environment Variables
None required (local service).
§Popular Models
llama2: Meta’s Llama 2
mistral: Mistral AI’s model
codellama: Code-specialized Llama
phi: Microsoft’s Phi model
Install models with: ollama pull <model-name>
Fields§
§base_url: String
Base URL for the Ollama server (default: http://localhost:11434).
§default_model: String
Default model to use (must be pulled with ollama pull).
§max_context_tokens: usize
Maximum context window size (depends on model).
§retry_policy: RetryPolicy
Retry policy for transient failures.
Trait Implementations§
impl Clone for OllamaConfig
fn clone(&self) -> OllamaConfig
Returns a duplicate of the value.
fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.
impl Debug for OllamaConfig
impl Default for OllamaConfig
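The Default impl is what makes the ..Default::default() struct-update syntax in the example above work. The sketch below mirrors the documented fields in a self-contained, standalone form; the default values for default_model, max_context_tokens, and RetryPolicy are assumptions (only the base_url default is stated in the docs), and RetryPolicy's fields are hypothetical.

```rust
// Local mirror of OllamaConfig so this sketch compiles on its own.
// Only the base_url default is documented; the others are assumed.
#[derive(Debug, Clone)]
struct RetryPolicy {
    max_retries: u32, // hypothetical field; the real RetryPolicy may differ
}

impl Default for RetryPolicy {
    fn default() -> Self {
        RetryPolicy { max_retries: 3 } // assumed default
    }
}

#[derive(Debug, Clone)]
struct OllamaConfig {
    base_url: String,
    default_model: String,
    max_context_tokens: usize,
    retry_policy: RetryPolicy,
}

impl Default for OllamaConfig {
    fn default() -> Self {
        OllamaConfig {
            base_url: "http://localhost:11434".to_string(), // documented default
            default_model: "llama2".to_string(),            // assumed default
            max_context_tokens: 4096,                       // assumed default
            retry_policy: RetryPolicy::default(),
        }
    }
}

fn main() {
    // Struct-update syntax overrides only the fields you care about;
    // everything else comes from Default.
    let config = OllamaConfig {
        default_model: "mistral".to_string(),
        ..Default::default()
    };
    println!("{} @ {}", config.default_model, config.base_url);
}
```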
impl<'de> Deserialize<'de> for OllamaConfig
fn deserialize<__D>(__deserializer: __D) -> Result<Self, __D::Error>
where
    __D: Deserializer<'de>,
Deserialize this value from the given Serde deserializer.
impl ProviderConfig for OllamaConfig
fn provider_name(&self) -> &'static str
Get the provider identifier (e.g., “openai”, “anthropic”).
fn max_context_tokens(&self) -> usize
Get the maximum context window size in tokens.
fn validate(&self) -> LlmResult<()>
Validate that the configuration is complete and valid.
fn default_model(&self) -> &str
Get the default model name for this provider.
fn retry_policy(&self) -> &RetryPolicy
Get the retry policy for transient failures.
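ProviderConfig is what lets calling code stay provider-agnostic: it can validate and inspect any provider's config through one trait object. The sketch below uses local stand-ins for the crate's types (LlmResult is assumed to be a Result alias over a crate error type, and the validate checks shown are plausible ones, not the crate's actual rules):

```rust
// Simplified stand-ins so this sketch compiles on its own.
#[derive(Debug)]
struct RetryPolicy;

type LlmResult<T> = Result<T, String>; // the real crate uses its own error type

trait ProviderConfig {
    fn provider_name(&self) -> &'static str;
    fn max_context_tokens(&self) -> usize;
    fn default_model(&self) -> &str;
    fn retry_policy(&self) -> &RetryPolicy;
    fn validate(&self) -> LlmResult<()>;
}

struct OllamaConfig {
    base_url: String,
    default_model: String,
    max_context_tokens: usize,
    retry_policy: RetryPolicy,
}

impl ProviderConfig for OllamaConfig {
    fn provider_name(&self) -> &'static str {
        "ollama"
    }
    fn max_context_tokens(&self) -> usize {
        self.max_context_tokens
    }
    fn default_model(&self) -> &str {
        &self.default_model
    }
    fn retry_policy(&self) -> &RetryPolicy {
        &self.retry_policy
    }
    fn validate(&self) -> LlmResult<()> {
        // Plausible checks: a usable base URL and a non-empty model name.
        if !self.base_url.starts_with("http://") && !self.base_url.starts_with("https://") {
            return Err(format!("invalid base_url: {}", self.base_url));
        }
        if self.default_model.is_empty() {
            return Err("default_model must not be empty".to_string());
        }
        Ok(())
    }
}

fn main() {
    let config = OllamaConfig {
        base_url: "http://localhost:11434".to_string(),
        default_model: "llama2".to_string(),
        max_context_tokens: 4096,
        retry_policy: RetryPolicy,
    };
    // Callers that only need the trait can take &dyn ProviderConfig.
    let p: &dyn ProviderConfig = &config;
    assert!(p.validate().is_ok());
    println!("{} defaults to {}", p.provider_name(), p.default_model());
}
```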
Auto Trait Implementations§
impl Freeze for OllamaConfig
impl RefUnwindSafe for OllamaConfig
impl Send for OllamaConfig
impl Sync for OllamaConfig
impl Unpin for OllamaConfig
impl UnwindSafe for OllamaConfig
Blanket Implementations§
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.