LLM Provider and Model Configuration
This module provides the configuration types for LLM providers and models.
§Provider Configuration
Providers are configured in a providers HashMap where the key becomes the
model prefix for routing requests to the correct provider.
§Built-in Providers
- openai - OpenAI API
- anthropic - Anthropic API (supports OAuth via access_token)
- gemini - Google Gemini API
For built-in providers, you can use the model name directly without a prefix:
- claude-sonnet-4-5 → auto-detected as Anthropic
- gpt-4 → auto-detected as OpenAI
- gemini-2.5-pro → auto-detected as Gemini
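For illustration, a minimal declaration of a built-in provider might look like the following. The openai entry and its type value are assumptions modelled on the anthropic entry in the example configuration below; credentials are resolved from auth.toml or the provider's environment variable rather than being written into the profile.

[profiles.default.providers.anthropic]
type = "anthropic"
# api_key from auth.toml or ANTHROPIC_API_KEY env var

[profiles.default.providers.openai]
type = "openai"  # assumed: built-in provider types mirror the provider name
# api_key from auth.toml or the corresponding env var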
§Custom Providers
Any OpenAI-compatible API can be configured using type = "custom".
The provider key becomes the model prefix.
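As a sketch (reusing the offline key from the example configuration below), a custom provider entry needs only a type and an OpenAI-compatible endpoint; the key offline then becomes the prefix used in model names such as offline/llama3.

[profiles.default.providers.offline]
type = "custom"
api_endpoint = "http://localhost:11434/v1"  # any OpenAI-compatible API
# models on this provider are referenced as "offline/<model>", e.g. "offline/llama3"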
§Model Routing
Models can be specified with or without a provider prefix:
- claude-sonnet-4-5 → auto-detected as the anthropic provider
- anthropic/claude-sonnet-4-5 → explicit anthropic provider
- offline/llama3 → routes to the offline custom provider, sends llama3 to the API
- custom/anthropic/claude-opus → routes to the custom provider, sends anthropic/claude-opus to the API
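Applied to the smart_model and eco_model keys from the example configuration below, these routing rules read as follows (the commented lines are alternative values, not additional keys):

smart_model = "claude-sonnet-4-5"             # no prefix: auto-detected as the anthropic provider
# smart_model = "anthropic/claude-sonnet-4-5" # explicit anthropic provider
eco_model = "offline/llama3"                  # "offline" custom provider; "llama3" is sent to the API
# eco_model = "custom/anthropic/claude-opus"  # "custom" provider; "anthropic/claude-opus" is sent to the API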
§Example Configuration
[profiles.default]
provider = "local"
smart_model = "claude-sonnet-4-5" # auto-detected as anthropic
eco_model = "offline/llama3" # custom provider

[profiles.default.providers.anthropic]
type = "anthropic"
# api_key from auth.toml or ANTHROPIC_API_KEY env var

[profiles.default.providers.offline]
type = "custom"
api_endpoint = "http://localhost:11434/v1"

Structs§
- GenerationDeltaToolUse
- LLMAnthropicOptions - Anthropic-specific options
- LLMChoice
- LLMCompletionResponse
- LLMCompletionStreamResponse
- LLMGoogleOptions - Google/Gemini-specific options
- LLMInput
- LLMMessage
- LLMMessageImageSource
- LLMOpenAIOptions - OpenAI-specific options
- LLMProviderConfig - Aggregated provider configuration for LLM operations
- LLMProviderOptions - Provider-specific options for LLM requests
- LLMStreamChoice
- LLMStreamDelta
- LLMStreamInput
- LLMThinkingOptions - Thinking/reasoning options
- LLMTokenUsage
- LLMTool
- PromptTokensDetails
- SimpleLLMMessage
Enums§
- GenerationDelta
- LLMMessageContent
- LLMMessageTypedContent
- LLMModel
- ProviderConfig - Unified provider configuration enum
- SimpleLLMRole
- TokenType