Configuration types for LLM providers.
This module provides configuration structures for all supported LLM providers.
Each provider has its own config type implementing ProviderConfig, plus
shared types for default parameters and dual-path setups.
§Quick Start
```rust
use multi_llm::{LLMConfig, OpenAIConfig, DefaultLLMParams, UnifiedLLMClient};

// Create config programmatically
let config = LLMConfig {
    provider: Box::new(OpenAIConfig {
        api_key: Some("sk-...".to_string()),
        ..Default::default()
    }),
    default_params: DefaultLLMParams::default(),
};
let client = UnifiedLLMClient::from_config(config)?;
```
§From Environment Variables
```rust
use multi_llm::{LLMConfig, UnifiedLLMClient};

// Uses AI_PROVIDER and provider-specific env vars
let config = LLMConfig::from_env()?;
let client = UnifiedLLMClient::from_config(config)?;
```
§Provider-Specific Configs
| Provider | Config Type | Required Env Vars |
|---|---|---|
| OpenAI | OpenAIConfig | OPENAI_API_KEY |
| Anthropic | AnthropicConfig | ANTHROPIC_API_KEY |
| Ollama | OllamaConfig | (none, local) |
| LM Studio | LMStudioConfig | (none, local) |
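Per the table above, hosted providers read their API key from the environment, and `LLMConfig::from_env()` selects the backend via `AI_PROVIDER`. A minimal shell setup for the OpenAI path might look like this (variable names taken from the table and the `from_env` example; the accepted `AI_PROVIDER` values are assumed to be lowercase provider names):

```shell
# Select the backend for LLMConfig::from_env(); AI_PROVIDER is named in
# the example above, and OPENAI_API_KEY comes from the provider table.
export AI_PROVIDER=openai
export OPENAI_API_KEY="sk-..."

# Local providers (Ollama, LM Studio) require no API key, so no
# additional exports are needed when AI_PROVIDER points at them.
```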
Structs§
- AnthropicConfig - Configuration for Anthropic Claude models.
- DefaultLLMParams - Default parameters for LLM generation.
- LLMConfig - System-wide LLM configuration.
- LMStudioConfig - Configuration for LM Studio local models.
- OllamaConfig - Configuration for Ollama local models.
- OpenAIConfig - Configuration for OpenAI GPT models.
Traits§
- ProviderConfig - Trait for provider-specific configuration.