Module config

Configuration types for LLM providers.

This module provides configuration structures for all supported LLM providers. Each provider has its own config type implementing the ProviderConfig trait, plus shared types for default generation parameters and dual-path setups.

§Quick Start

use multi_llm::{LLMConfig, OpenAIConfig, DefaultLLMParams, UnifiedLLMClient};

// Create config programmatically
let config = LLMConfig {
    provider: Box::new(OpenAIConfig {
        api_key: Some("sk-...".to_string()),
        ..Default::default()
    }),
    default_params: DefaultLLMParams::default(),
};

let client = UnifiedLLMClient::from_config(config)?;

§From Environment Variables

use multi_llm::{LLMConfig, UnifiedLLMClient};

// Uses AI_PROVIDER and provider-specific env vars
let config = LLMConfig::from_env()?;
let client = UnifiedLLMClient::from_config(config)?;
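
As a sketch of the environment-variable path above (the accepted AI_PROVIDER values are assumptions here, not confirmed by this page; the provider-specific variables are listed in the table below):

```shell
# Select the provider; "openai" is an illustrative value
export AI_PROVIDER=openai
# Provider-specific credential, per the table of required env vars
export OPENAI_API_KEY=sk-...
```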

§Provider-Specific Configs

| Provider  | Config Type     | Required Env Vars |
|-----------|-----------------|-------------------|
| OpenAI    | OpenAIConfig    | OPENAI_API_KEY    |
| Anthropic | AnthropicConfig | ANTHROPIC_API_KEY |
| Ollama    | OllamaConfig    | (none, local)     |
| LM Studio | LMStudioConfig  | (none, local)     |
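
By analogy with the Quick Start example, an Anthropic setup might look like the following sketch. It assumes AnthropicConfig, like OpenAIConfig, exposes an `api_key: Option<String>` field and implements Default; that is an extrapolation, not something this page states.

```rust
use multi_llm::{AnthropicConfig, DefaultLLMParams, LLMConfig, UnifiedLLMClient};

// Assumed: AnthropicConfig has `api_key: Option<String>` and derives Default,
// mirroring the OpenAIConfig shape shown in the Quick Start.
let config = LLMConfig {
    provider: Box::new(AnthropicConfig {
        // Read the key from the environment rather than hard-coding it
        api_key: std::env::var("ANTHROPIC_API_KEY").ok(),
        ..Default::default()
    }),
    default_params: DefaultLLMParams::default(),
};

let client = UnifiedLLMClient::from_config(config)?;
```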

Structs§

AnthropicConfig
Configuration for Anthropic Claude models.
DefaultLLMParams
Default parameters for LLM generation.
LLMConfig
System-wide LLM configuration.
LMStudioConfig
Configuration for LM Studio local models.
OllamaConfig
Configuration for Ollama local models.
OpenAIConfig
Configuration for OpenAI GPT models.

Traits§

ProviderConfig
Trait for provider-specific configuration.