Model crate — LLM provider implementations, enum dispatch, configuration, construction, and runtime management.
Merges all provider backends (OpenAI, Claude, Local) with the Provider
enum, ProviderManager, and ProviderConfig into a single crate. Config
uses ApiStandard (OpenAI or Anthropic) to select the API protocol.
All OpenAI-compatible providers route through the OpenAI backend.
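The protocol selection described above can be sketched as a plain Rust match; the variant names mirror the crate's `ApiStandard`, but the `backend_for` helper and the backend labels are assumptions for illustration only:

```rust
// Sketch (assumed helper): ApiStandard picks the wire protocol a provider
// speaks; every OpenAI-compatible provider shares the single OpenAI backend.
#[derive(Debug, PartialEq)]
enum ApiStandard {
    OpenAi,
    Anthropic,
}

fn backend_for(standard: &ApiStandard) -> &'static str {
    match standard {
        // All OpenAI-compatible providers route through the OpenAI backend.
        ApiStandard::OpenAi => "openai",
        ApiStandard::Anthropic => "claude",
    }
}

fn main() {
    assert_eq!(backend_for(&ApiStandard::OpenAi), "openai");
    assert_eq!(backend_for(&ApiStandard::Anthropic), "claude");
}
```

Because the protocol (not the vendor) is the discriminant, adding another OpenAI-compatible vendor needs no new backend code.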
Re-exports
pub use config::ApiStandard;
pub use config::HfModelConfig;
pub use config::ModelConfig;
pub use config::ProviderConfig;
pub use manager::ProviderManager;
Modules
- config
- Provider configuration.
- manager
- `ProviderManager` — concurrent-safe named provider registry with model routing and active-provider swapping.
- remote
- Remote providers.
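The registry-with-active-swap pattern that `manager` describes can be sketched in self-contained Rust; the `Manager` struct, its method names, and the `String` stand-in for a provider value are all assumptions, not the crate's actual API:

```rust
use std::collections::HashMap;
use std::sync::RwLock;

// Hypothetical sketch: names map to providers, one name is marked active,
// and both maps are guarded by RwLocks for concurrent access.
struct Manager {
    providers: RwLock<HashMap<String, String>>, // name -> provider (stubbed as String)
    active: RwLock<String>,
}

impl Manager {
    fn new() -> Self {
        Manager {
            providers: RwLock::new(HashMap::new()),
            active: RwLock::new(String::new()),
        }
    }

    fn register(&self, name: &str, provider: &str) {
        self.providers
            .write()
            .unwrap()
            .insert(name.to_string(), provider.to_string());
    }

    // Swapping only succeeds for a registered name, so the active
    // provider can never point at a missing entry.
    fn set_active(&self, name: &str) -> bool {
        if self.providers.read().unwrap().contains_key(name) {
            *self.active.write().unwrap() = name.to_string();
            true
        } else {
            false
        }
    }

    fn active(&self) -> String {
        self.active.read().unwrap().clone()
    }
}

fn main() {
    let m = Manager::new();
    m.register("fast", "openai");
    assert!(m.set_active("fast"));
    assert!(!m.set_active("missing"));
    assert_eq!(m.active(), "fast");
}
```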
Structs
- Client
- An asynchronous `Client` to make Requests with.
Enums
- Provider
- Unified LLM provider enum.
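A unified provider enum typically dispatches calls with a `match` over its variants. This is a minimal sketch of that pattern; the variant shapes and the `model_name` method are illustrative assumptions, not the crate's real definition:

```rust
// Illustrative enum-dispatch sketch; variants and fields are assumed.
enum Provider {
    OpenAi { model: String },
    Claude { model: String },
    Local { path: String },
}

impl Provider {
    // Static dispatch via match: no trait objects or vtable indirection.
    fn model_name(&self) -> &str {
        match self {
            Provider::OpenAi { model } | Provider::Claude { model } => model,
            Provider::Local { path } => path,
        }
    }
}

fn main() {
    let p = Provider::Claude { model: "claude-3".into() };
    assert_eq!(p.model_name(), "claude-3");
    let l = Provider::Local { path: "/models/llama".into() };
    assert_eq!(l.model_name(), "/models/llama");
}
```

Enum dispatch keeps every backend behind one concrete type, which is what lets `ProviderManager` store heterogeneous providers in a single registry.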
Functions
- build_provider
- Construct a remote `Provider` from config and a shared HTTP client.
- default_model
- Default model name when none is configured.
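The interplay between construction and the default fallback can be sketched as follows; the config field, the `build_model` helper, and the placeholder model names are assumptions, since the crate's real `build_provider` also takes a shared HTTP client and returns a full `Provider`:

```rust
// Hypothetical sketch: config carries an optional model name, and
// construction falls back to default_model() when none is configured.
struct ProviderConfig {
    model: Option<String>,
}

fn default_model() -> String {
    "default-model".to_string() // placeholder, not the crate's actual default
}

fn build_model(cfg: &ProviderConfig) -> String {
    cfg.model.clone().unwrap_or_else(default_model)
}

fn main() {
    let unset = ProviderConfig { model: None };
    assert_eq!(build_model(&unset), "default-model");
    let set = ProviderConfig { model: Some("gpt-x".into()) };
    assert_eq!(build_model(&set), "gpt-x");
}
```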