Crate llm_stack_ollama


Ollama provider for the llm-stack SDK.

This crate implements the llm-stack Provider trait against Ollama’s Chat API, supporting both non-streaming and streaming generation as well as tool calling.

Ollama runs locally and requires no authentication by default.
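
Because the server runs locally, configuration typically amounts to a base URL and a model name. A minimal sketch of a non-default setup, assuming OllamaConfig exposes `base_url` and `model` fields (the field names are assumptions, not confirmed by this page; consult the OllamaConfig docs):

```rust
use llm_stack_ollama::{OllamaConfig, OllamaProvider};

// Hypothetical field names: `base_url` and `model` are assumptions.
// Ollama's default listen address is http://localhost:11434.
let config = OllamaConfig {
    base_url: "http://localhost:11434".into(),
    model: "llama3.1".into(),
    ..Default::default()
};
let provider = OllamaProvider::new(config);
```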

§Quick start

use llm_stack_ollama::{OllamaConfig, OllamaProvider};
use llm_stack::{ChatMessage, ChatParams, Provider};

let provider = OllamaProvider::new(OllamaConfig::default());

let params = ChatParams {
    messages: vec![ChatMessage::user("Hello!")],
    ..Default::default()
};

// `generate` is async; call it from an async context (e.g. a Tokio runtime).
let response = provider.generate(&params).await?;
println!("{}", response.text().unwrap_or("no text"));
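
The crate also advertises streaming generation. A hedged sketch of consuming a stream, assuming a `generate_stream` method that yields fallible chunks with a `text()` accessor (both names are assumptions, not confirmed by this page):

```rust
use futures::StreamExt;
use llm_stack::{ChatMessage, ChatParams, Provider};
use llm_stack_ollama::{OllamaConfig, OllamaProvider};

// Hypothetical: `generate_stream` and the chunk's `text()` accessor
// are assumed names; check the Provider trait docs for the real API.
let provider = OllamaProvider::new(OllamaConfig::default());
let params = ChatParams {
    messages: vec![ChatMessage::user("Tell me a story.")],
    ..Default::default()
};

let mut stream = provider.generate_stream(&params).await?;
while let Some(chunk) = stream.next().await {
    // Print each text delta as it arrives.
    if let Some(text) = chunk?.text() {
        print!("{text}");
    }
}
```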

Structs§

OllamaConfig
Configuration for the Ollama provider.
OllamaFactory
Factory for creating OllamaProvider instances from configuration.
OllamaProvider
Ollama provider implementing Provider.

Functions§

register_global
Registers the Ollama factory with the global registry.
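
Registering the factory presumably lets llm-stack construct an OllamaProvider by name from configuration rather than by direct instantiation. A hedged usage sketch; the lookup call shown in the comment is an assumption, not an API confirmed by this page:

```rust
// Register the Ollama factory with the global registry once at startup.
llm_stack_ollama::register_global();

// Afterwards, providers can presumably be resolved by name through
// llm-stack's registry. The exact lookup API is an assumption:
// let provider = llm_stack::registry::create("ollama", &config)?;
```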