LLM client — hybrid provider with a trait-based adapter.

The LlmProvider trait has multiple implementations:

- OllamaProvider — local Ollama server
- OpenAiProvider — OpenAI-compatible APIs
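The adapter pattern described above can be sketched as one trait with an implementation per backend. The method names and struct fields below are assumptions for illustration; the crate's real LlmProvider trait may be async and talks to a live server.

```rust
// Sketch of the trait-based adapter: one trait, two backends.
// Names follow the documented items; signatures are assumed.
trait LlmProvider {
    fn name(&self) -> &'static str;
    fn complete(&self, prompt: &str) -> String;
}

struct OllamaProvider {
    base_url: String,
}

struct OpenAiProvider {
    base_url: String,
}

impl LlmProvider for OllamaProvider {
    fn name(&self) -> &'static str {
        "ollama"
    }
    fn complete(&self, prompt: &str) -> String {
        // A real implementation would POST to the local Ollama server.
        format!("[{}] {}", self.base_url, prompt)
    }
}

impl LlmProvider for OpenAiProvider {
    fn name(&self) -> &'static str {
        "openai"
    }
    fn complete(&self, prompt: &str) -> String {
        // A real implementation would call an OpenAI-compatible
        // /chat/completions endpoint with an API key.
        format!("[{}] {}", self.base_url, prompt)
    }
}
```

Because both backends implement the same trait, callers can hold a `Box<dyn LlmProvider>` and stay agnostic about which server answers.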
Structs

- Message - A message in the conversation.
- OllamaProvider - Ollama LLM provider.
- OpenAiProvider - OpenAI-compatible provider (works with OpenAI, OpenRouter, etc.).
- ProviderConfig - Configuration for LLM provider selection.
- Response - Complete LLM response.
- ResponseChunk - LLM response chunk (for streaming).
- Usage - Token usage statistics.
Enums
Traits

- LlmProvider - Trait for LLM providers.
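A common way to wire this trait to ProviderConfig and create_provider is a factory that returns a boxed trait object. This sketch assumes a simplified config holding only a backend selector; the crate's real config surely carries more (URLs, keys, model names).

```rust
// Factory selecting an implementation behind the trait.
// All names are modeled on the documented items, not real signatures.
trait LlmProvider {
    fn name(&self) -> &'static str;
}

struct OllamaProvider;
struct OpenAiProvider;

impl LlmProvider for OllamaProvider {
    fn name(&self) -> &'static str {
        "ollama"
    }
}

impl LlmProvider for OpenAiProvider {
    fn name(&self) -> &'static str {
        "openai"
    }
}

// Assumed config shape: which backend to use.
enum ProviderKind {
    Ollama,
    OpenAi,
}

struct ProviderConfig {
    kind: ProviderKind,
}

fn create_provider(cfg: &ProviderConfig) -> Box<dyn LlmProvider> {
    match cfg.kind {
        ProviderKind::Ollama => Box::new(OllamaProvider),
        ProviderKind::OpenAi => Box::new(OpenAiProvider),
    }
}
```

Returning `Box<dyn LlmProvider>` lets the rest of the application depend only on the trait, so adding a backend means adding one impl and one match arm.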
Functions

- create_provider - Create an LLM provider from configuration.
- extract_json_from_response - Extract a JSON object from an LLM response string.
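extract_json_from_response presumably scans the reply for the first balanced JSON object, since models often wrap JSON in prose or code fences. A standalone sketch of that idea (not the crate's actual implementation):

```rust
// Find the first balanced {...} object in a string, respecting string
// literals and escapes so braces inside JSON strings don't miscount.
fn extract_json_from_response(s: &str) -> Option<&str> {
    let start = s.find('{')?;
    let mut depth = 0usize;
    let mut in_str = false;
    let mut escaped = false;
    for (i, c) in s[start..].char_indices() {
        if in_str {
            if escaped {
                escaped = false;
            } else if c == '\\' {
                escaped = true;
            } else if c == '"' {
                in_str = false;
            }
            continue;
        }
        match c {
            '"' => in_str = true,
            '{' => depth += 1,
            '}' => {
                depth -= 1;
                if depth == 0 {
                    return Some(&s[start..start + i + c.len_utf8()]);
                }
            }
            _ => {}
        }
    }
    None // unbalanced or no object found
}
```

Brace counting with string-literal tracking handles nested objects and stray braces inside quoted values, which a simple regex would not.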