Module llm

LLM client: a hybrid provider layer built on a trait-based adapter.

The LlmProvider trait has multiple implementations:

  • OllamaProvider — a local Ollama server
  • OpenAiProvider — OpenAI-compatible APIs
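The adapter pattern described above can be sketched as follows. This is a minimal, hypothetical sketch: the method name `complete`, the struct fields, and the synchronous signature are assumptions for illustration, not the crate's actual API.

```rust
// Hypothetical sketch of the trait-based adapter; names and fields
// are assumptions, not the real API of this module.
#[derive(Debug)]
enum LlmError {
    Http(String), // a real error type would cover more cases
}

struct Response {
    text: String,
}

// Each backend implements the same trait, so callers are backend-agnostic.
trait LlmProvider {
    fn complete(&self, prompt: &str) -> Result<Response, LlmError>;
}

struct OllamaProvider {
    base_url: String,
}

struct OpenAiProvider {
    api_key: String,
}

impl LlmProvider for OllamaProvider {
    fn complete(&self, prompt: &str) -> Result<Response, LlmError> {
        // A real implementation would POST to the Ollama server at base_url.
        Ok(Response {
            text: format!("[ollama@{}] {}", self.base_url, prompt),
        })
    }
}

impl LlmProvider for OpenAiProvider {
    fn complete(&self, prompt: &str) -> Result<Response, LlmError> {
        let _ = &self.api_key; // would be sent as a Bearer token
        Ok(Response {
            text: format!("[openai] {}", prompt),
        })
    }
}

// Callers hold a boxed trait object so the backend can be chosen at runtime.
fn create_provider(kind: &str) -> Box<dyn LlmProvider> {
    match kind {
        "ollama" => Box::new(OllamaProvider {
            base_url: "http://localhost:11434".into(),
        }),
        _ => Box::new(OpenAiProvider {
            api_key: "sk-...".into(),
        }),
    }
}
```

Returning `Box<dyn LlmProvider>` is one plausible design for `create_provider`; the crate could equally use an enum of providers or generics.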

Structs§

Message
A message in the conversation.
OllamaProvider
Ollama LLM provider.
OpenAiProvider
OpenAI-compatible provider (works with OpenAI, OpenRouter, etc.)
ProviderConfig
Configuration for LLM provider selection.
Response
Complete LLM response.
ResponseChunk
LLM response chunk (for streaming).
Usage
Token usage statistics.

Enums§

LlmError
Errors from the LLM layer.
Role
Message roles.
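A conversation is built from `Message` values tagged with a `Role`. The shapes below are a guess at what these types might look like; the variant and field names are assumptions, not taken from the crate.

```rust
// Hypothetical shapes for Role and Message; names are assumptions.
#[derive(Debug, Clone, PartialEq)]
enum Role {
    System,
    User,
    Assistant,
}

#[derive(Debug, Clone)]
struct Message {
    role: Role,
    content: String,
}

// Small helper to build a conversation in order.
fn conversation(system: &str, user: &str) -> Vec<Message> {
    vec![
        Message { role: Role::System, content: system.into() },
        Message { role: Role::User, content: user.into() },
    ]
}
```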

Traits§

LlmProvider
Trait for LLM providers.

Functions§

create_provider
Create an LLM provider from configuration.
extract_json_from_response
Extract a JSON object from an LLM response string.
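Models often wrap JSON in prose or code fences, so a helper like `extract_json_from_response` is useful for pulling out just the object. One way such a function might work is to scan for the first balanced `{...}` span; this sketch is an assumption about the approach, not the crate's implementation, and it deliberately ignores the corner case of braces inside JSON string literals.

```rust
// Hypothetical sketch: find the first balanced {...} object in a reply.
// Note: does not account for braces inside JSON string values.
fn extract_json_from_response(text: &str) -> Option<&str> {
    let start = text.find('{')?;
    let mut depth = 0usize;
    for (i, c) in text[start..].char_indices() {
        match c {
            '{' => depth += 1,
            '}' => {
                depth -= 1;
                if depth == 0 {
                    // Include the closing brace in the returned slice.
                    return Some(&text[start..start + i + c.len_utf8()]);
                }
            }
            _ => {}
        }
    }
    None // no balanced object found
}
```

The returned slice can then be handed to a JSON parser such as `serde_json`.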