Ollama provider for the llm-stack SDK.
This crate implements Provider for Ollama’s Chat API, supporting both non-streaming and streaming generation with tool calling.
Ollama runs locally and requires no authentication by default.
§Quick start
use llm_stack::{ChatMessage, ChatParams, Provider};
use llm_stack_ollama::{OllamaConfig, OllamaProvider};

// An async runtime is needed to drive generate(); tokio is assumed here.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let provider = OllamaProvider::new(OllamaConfig::default());
    let params = ChatParams {
        messages: vec![ChatMessage::user("Hello!")],
        ..Default::default()
    };
    let response = provider.generate(&params).await?;
    println!("{}", response.text().unwrap_or("no text"));
    Ok(())
}
Structs§
- OllamaConfig - Configuration for the Ollama provider; a configuration sketch follows this list.
- OllamaFactory - Factory for creating OllamaProvider instances from configuration.
- OllamaProvider - Ollama provider implementing Provider.
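OllamaConfig implements Default (the quick start relies on it); the sketch below points the provider at a non-default host. The base_url field name and the struct-update construction are assumptions for illustration, not confirmed fields of this crate.

use llm_stack_ollama::{OllamaConfig, OllamaProvider};

// base_url is a hypothetical field name; check OllamaConfig's docs for the real one.
let config = OllamaConfig {
    base_url: "http://192.168.1.50:11434".into(),
    ..Default::default()
};
let provider = OllamaProvider::new(config);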
Functions§
- register_global - Registers the Ollama factory with the global registry.
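To use the provider through llm-stack's generic plumbing rather than constructing it directly, register the factory once at startup. A minimal sketch, assuming register_global takes no arguments:

// Assumed zero-argument signature; consult the function's docs for the exact form.
llm_stack_ollama::register_global();
// From here, registry-driven code can construct OllamaProvider instances
// from configuration without a direct dependency on this crate.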