siumai-provider-ollama
Ollama provider implementation + shared Ollama protocol standard.
This crate owns:
- the Ollama provider implementation (client + builder + extensions)
- the Ollama protocol mapping and streaming helpers used by the provider
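The client + builder split mentioned above can be illustrated with a minimal, self-contained sketch. Note that every name here (`OllamaBuilder`, `OllamaClient`, `base_url`, `model`, `build`) is an illustrative assumption about the pattern, not this crate's actual API; see the `builder` and `client` modules for the real types.

```rust
// Illustrative sketch of a client + builder pattern like the one this
// crate follows. NOTE: all type and method names are hypothetical.

#[derive(Debug, Clone)]
struct OllamaClient {
    base_url: String,
    model: String,
}

#[derive(Default)]
struct OllamaBuilder {
    base_url: Option<String>,
    model: Option<String>,
}

impl OllamaBuilder {
    fn base_url(mut self, url: &str) -> Self {
        self.base_url = Some(url.to_string());
        self
    }

    fn model(mut self, model: &str) -> Self {
        self.model = Some(model.to_string());
        self
    }

    // Fall back to Ollama's conventional local endpoint when unset.
    fn build(self) -> OllamaClient {
        OllamaClient {
            base_url: self
                .base_url
                .unwrap_or_else(|| "http://localhost:11434".into()),
            model: self.model.unwrap_or_else(|| "llama3".into()),
        }
    }
}

fn main() {
    let client = OllamaBuilder::default().model("llama3").build();
    println!("{} @ {}", client.model, client.base_url);
}
```

The builder consumes `self` on each call, so configuration chains naturally and an unfinished builder cannot be reused by accident.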
Modules§
- auth
- Authentication helpers and token providers. This module defines a minimal trait to supply Bearer tokens (e.g., for Vertex AI).
- builder
- Builder utilities shared across provider crates.
- client
- Client Module
- core
- Core Abstractions
- defaults
- Default Configuration Values
- error
- Error handling (re-export).
- execution
- Execution Layer - Unified Public API
- hosted_tools
- Provider-defined tools (core crate).
- observability
- Observability entrypoint: unified namespace for tracing and telemetry.
- params
- Parameter Management Module
- provider_metadata
- Provider-owned typed response metadata.
- provider_options
- Provider-owned typed option structs (Ollama-specific).
- providers
- retry
- Retry module (ergonomic namespace)
- retry_api
- Public Retry API Facade
- standards
- streaming
- Streaming Module
- traits
- Core Trait Definitions
- types
- Core data type definitions (re-export).
- utils
- Utility modules for siumai
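The `auth` module's "minimal trait to supply Bearer tokens" can be sketched as follows. The trait and method names (`TokenProvider`, `bearer_token`) and the helper are hypothetical stand-ins for illustration; the crate's real trait lives in its `auth` module.

```rust
// Sketch of the Bearer-token trait pattern described for the `auth` module.
// NOTE: `TokenProvider`, `bearer_token`, and `authorization_header` are
// hypothetical names, not the crate's actual API.

trait TokenProvider {
    /// Return the current Bearer token, if any.
    fn bearer_token(&self) -> Option<String>;
}

/// A static token, e.g. read from an environment variable at startup.
struct StaticToken(String);

impl TokenProvider for StaticToken {
    fn bearer_token(&self) -> Option<String> {
        Some(self.0.clone())
    }
}

/// Build the `Authorization` header value from any provider.
fn authorization_header(p: &dyn TokenProvider) -> Option<String> {
    p.bearer_token().map(|t| format!("Bearer {t}"))
}

fn main() {
    let provider = StaticToken("abc123".into());
    println!("{:?}", authorization_header(&provider));
}
```

Keeping token acquisition behind a trait lets callers plug in anything from a fixed string to a refreshing credential source (the Vertex AI case mentioned above) without changing the HTTP layer.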
Structs§
- ChatResponse
- Chat response from the provider
- CommonParams
- Common AI parameters
Enums§
- LlmError
- The primary error type for the LLM library.
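A library-wide error enum like `LlmError` is typically consumed by matching on its variants. The variants below (`Http`, `Provider`, `Parse`) and the `is_retryable` helper are hypothetical stand-ins chosen for illustration; see the crate's `error` module for the real definition.

```rust
// Sketch of consuming a library-wide error enum such as `LlmError`.
// NOTE: the variants and helper shown here are hypothetical.

use std::fmt;

#[derive(Debug)]
enum LlmError {
    Http(u16),        // transport-level failure with status code
    Provider(String), // error reported by the provider
    Parse(String),    // malformed response payload
}

impl fmt::Display for LlmError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            LlmError::Http(code) => write!(f, "HTTP error {code}"),
            LlmError::Provider(msg) => write!(f, "provider error: {msg}"),
            LlmError::Parse(msg) => write!(f, "parse error: {msg}"),
        }
    }
}

impl std::error::Error for LlmError {}

// A caller decides whether a failure is worth retrying, e.g. only
// server-side (5xx) transport errors.
fn is_retryable(err: &LlmError) -> bool {
    matches!(err, LlmError::Http(code) if *code >= 500)
}

fn main() {
    let err = LlmError::Http(503);
    println!("{err} (retryable: {})", is_retryable(&err));
}
```

Implementing `Display` and `std::error::Error` lets the enum interoperate with `?` and generic error handling in calling code, which pairs naturally with the `retry` and `retry_api` modules listed above.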