
Crate walrus_model


Model crate — LLM provider implementations, enum dispatch, configuration, construction, and runtime management.

Merges all provider backends (OpenAI, Claude, Local) with the Provider enum, ProviderManager, and ProviderConfig into a single crate. Config uses ApiStandard (OpenAI or Anthropic) to select the API protocol. All OpenAI-compatible providers route through the OpenAI backend.
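The enum-dispatch pattern described above can be sketched as follows. This is a minimal standalone illustration, not the crate's actual code: the `Provider` and `ApiStandard` names mirror the items documented here, but the variants, fields, and method signatures are assumptions.

```rust
// Illustrative sketch of enum dispatch over provider backends.
// Real walrus_model types have different signatures; names are borrowed only.

#[derive(Clone, Copy, PartialEq, Debug)]
enum ApiStandard {
    OpenAi,
    Anthropic,
}

enum Provider {
    OpenAi { base_url: String },
    Claude { base_url: String },
    Local,
}

impl Provider {
    // Enum dispatch: one match routes each call to the right backend,
    // with no trait objects or dynamic dispatch involved.
    fn endpoint(&self) -> String {
        match self {
            // OpenAI-compatible providers share the OpenAI code path.
            Provider::OpenAi { base_url } => format!("{base_url}/v1/chat/completions"),
            Provider::Claude { base_url } => format!("{base_url}/v1/messages"),
            Provider::Local => "http://localhost:8080/v1/chat/completions".to_string(),
        }
    }

    // Which API protocol the backend speaks, per the description above.
    fn api_standard(&self) -> ApiStandard {
        match self {
            Provider::Claude { .. } => ApiStandard::Anthropic,
            _ => ApiStandard::OpenAi,
        }
    }
}

fn main() {
    let p = Provider::OpenAi { base_url: "https://api.example.com".into() };
    println!("{} {:?}", p.endpoint(), p.api_standard());
}
```

The payoff of this design is that a single `Provider` value can be stored, cloned, and matched on uniformly, while `ApiStandard` decides only the wire protocol.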

Re-exports§

pub use config::ApiStandard;
pub use config::HfModelConfig;
pub use config::ModelConfig;
pub use config::ProviderConfig;
pub use manager::ProviderManager;

Modules§

config
Provider configuration.
manager
ProviderManager — concurrent-safe named provider registry with model routing and active-provider swapping.
remote
Remote providers.

Structs§

Client
An asynchronous Client to make Requests with.

Enums§

Provider
Unified LLM provider enum.

Functions§

build_provider
Construct a remote Provider from config and a shared HTTP client.
default_model
Default model name when none is configured.
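The manager pattern documented above (a concurrent-safe named provider registry with active-provider swapping) can be sketched with standard-library primitives. This is an illustration of the pattern only; `ProviderManager`'s real API is not reproduced here, and the `Registry` type and its methods are invented for the example (a `String` stands in for `Provider`).

```rust
use std::collections::HashMap;
use std::sync::RwLock;

// Hypothetical registry sketch: named providers behind an RwLock,
// with one "active" slot that can be swapped at runtime.
struct Registry {
    inner: RwLock<RegistryInner>,
}

struct RegistryInner {
    providers: HashMap<String, String>, // name -> provider (String stands in for Provider)
    active: Option<String>,
}

impl Registry {
    fn new() -> Self {
        Registry {
            inner: RwLock::new(RegistryInner { providers: HashMap::new(), active: None }),
        }
    }

    // Register a provider under a name, replacing any previous entry.
    fn register(&self, name: &str, provider: String) {
        self.inner.write().unwrap().providers.insert(name.to_string(), provider);
    }

    // Swap the active provider; returns false if the name is unknown.
    fn set_active(&self, name: &str) -> bool {
        let mut guard = self.inner.write().unwrap();
        if guard.providers.contains_key(name) {
            guard.active = Some(name.to_string());
            true
        } else {
            false
        }
    }

    // Clone out the currently active provider, if any.
    fn active(&self) -> Option<String> {
        let guard = self.inner.read().unwrap();
        guard.active.as_ref().and_then(|n| guard.providers.get(n).cloned())
    }
}

fn main() {
    let registry = Registry::new();
    registry.register("openai", "OpenAI backend".to_string());
    registry.set_active("openai");
    println!("{:?}", registry.active());
}
```

An `RwLock` fits this shape because lookups (many readers) vastly outnumber registrations and active-provider swaps (rare writers).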