Model crate — LLM provider implementations via crabtalk, enum dispatch, configuration, construction, and runtime management.
Uses crabtalk-provider for the actual LLM backends (OpenAI, Anthropic,
Google, Bedrock, Azure). Wraps them behind wcore’s Model trait with
type conversion and retry logic.
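The retry logic mentioned above can be sketched as a small wrapper that re-invokes a fallible call until it succeeds or a budget is exhausted. This is a minimal illustration only — the function name, signature, and synchronous closure are assumptions for brevity; the crate itself wraps async crabtalk calls.

```rust
// Hypothetical sketch of the retry pattern (not the crate's actual API).
// Re-runs `op` until it succeeds or `max_attempts` calls have failed.
fn call_with_retry<T, E>(
    max_attempts: u32,
    mut op: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut attempt = 0;
    loop {
        match op() {
            Ok(v) => return Ok(v),
            Err(e) => {
                attempt += 1;
                if attempt >= max_attempts {
                    return Err(e);
                }
                // A real implementation would back off (e.g. exponentially)
                // before retrying; omitted here for brevity.
            }
        }
    }
}

fn main() {
    // Fails twice, then succeeds: three attempts are enough.
    let mut calls = 0;
    let result = call_with_retry(3, || {
        calls += 1;
        if calls < 3 { Err("transient") } else { Ok(42) }
    });
    assert_eq!(result, Ok(42));
    assert_eq!(calls, 3);
}
```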
Re-exports

pub use config::ModelConfig;
pub use manager::ProviderRegistry;
Modules

- config
- Model configuration.
- manager
- ProviderRegistry — concurrent-safe named provider registry with model routing and active-provider swapping.
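A concurrent-safe registry with active-provider swapping, as described above, can be sketched with an `RwLock`-guarded map plus an active-name slot. All names and fields here are assumptions, not the crate's actual API; `String` stands in for the real provider type to keep the example self-contained.

```rust
use std::collections::HashMap;
use std::sync::RwLock;

// Hypothetical shape of a named provider registry (illustrative only).
struct ProviderRegistry {
    providers: RwLock<HashMap<String, String>>, // name -> provider (stand-in type)
    active: RwLock<Option<String>>,             // name of the active provider
}

impl ProviderRegistry {
    fn new() -> Self {
        Self {
            providers: RwLock::new(HashMap::new()),
            active: RwLock::new(None),
        }
    }

    fn register(&self, name: &str, provider: String) {
        self.providers.write().unwrap().insert(name.to_string(), provider);
    }

    // Swap which named provider is active; returns false for unknown names.
    fn set_active(&self, name: &str) -> bool {
        if self.providers.read().unwrap().contains_key(name) {
            *self.active.write().unwrap() = Some(name.to_string());
            true
        } else {
            false
        }
    }

    fn active_provider(&self) -> Option<String> {
        let active = self.active.read().unwrap();
        let name = active.as_ref()?;
        self.providers.read().unwrap().get(name).cloned()
    }
}

fn main() {
    let reg = ProviderRegistry::new();
    reg.register("openai", "openai-provider".to_string());
    assert!(reg.set_active("openai"));
    assert!(!reg.set_active("missing"));
    assert_eq!(reg.active_provider(), Some("openai-provider".to_string()));
}
```

`RwLock` lets many readers route requests concurrently while swaps take a brief exclusive lock; an `ArcSwap` or `DashMap` would be a common alternative under heavier contention.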
Structs

- Client
- An asynchronous Client to make Requests with.
- Provider
- Unified LLM provider wrapping a crabtalk provider instance.
- ProviderDef
- Configuration for a single LLM provider.
Enums
- ApiStandard
- Which provider implementation to use.
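Enum dispatch means selecting the backend with a `match` on a variant rather than through a trait object. A minimal sketch, assuming variant names based on the backends listed in the crate description (the real variants and methods may differ):

```rust
// Hypothetical variants; the crate description lists these five backends.
enum ApiStandard {
    OpenAi,
    Anthropic,
    Google,
    Bedrock,
    Azure,
}

impl ApiStandard {
    // Static dispatch: each arm would call into the matching crabtalk
    // provider in the real crate; here we just return an identifier.
    fn name(&self) -> &'static str {
        match self {
            ApiStandard::OpenAi => "openai",
            ApiStandard::Anthropic => "anthropic",
            ApiStandard::Google => "google",
            ApiStandard::Bedrock => "bedrock",
            ApiStandard::Azure => "azure",
        }
    }
}

fn main() {
    assert_eq!(ApiStandard::OpenAi.name(), "openai");
    assert_eq!(ApiStandard::Bedrock.name(), "bedrock");
}
```

Compared with `Box<dyn Trait>`, an enum keeps the set of providers closed and visible at compile time, and the `match` is exhaustive, so adding a variant forces every dispatch site to handle it.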
Functions

- build_provider
- Construct a Provider from a provider definition and model name.
- default_model
- Default model name when none is configured.
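How `build_provider` and `default_model` might fit together can be sketched as follows. The `ProviderDef` fields, the `Provider` shape, and the fallback model name are all assumptions for illustration, not the crate's actual definitions.

```rust
// Hypothetical stand-ins for the crate's types (illustrative only).
struct ProviderDef {
    api_standard: String,
    api_key: String,
}

struct Provider {
    model: String,
}

// Assumed fallback used when the configuration names no model.
fn default_model() -> String {
    "default-model".to_string()
}

// Build a Provider from a definition, falling back to the default model.
fn build_provider(def: &ProviderDef, model: Option<&str>) -> Provider {
    let _ = (&def.api_standard, &def.api_key); // selection/auth elided
    Provider {
        model: model.map(str::to_string).unwrap_or_else(default_model),
    }
}

fn main() {
    let def = ProviderDef {
        api_standard: "openai".into(),
        api_key: "example-key".into(),
    };
    assert_eq!(build_provider(&def, None).model, default_model());
    assert_eq!(build_provider(&def, Some("fast-model")).model, "fast-model");
}
```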