pub struct ProviderConfigs;
Predefined provider configurations for multiple AI services
Each provider configuration includes default models for chat and optional multimodal support. Developers can override these defaults or build custom model managers for advanced use cases.
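As a sketch of the "start from a predefined config, then override" workflow described above (the `ProviderConfig` field names `base_url` and `default_chat_model`, and the default model string, are illustrative assumptions, not the crate's actual layout):

```rust
// Illustrative stand-ins for the crate's types; real field names and
// defaults are assumptions, not the crate's actual API.
#[derive(Debug, Clone)]
pub struct ProviderConfig {
    pub base_url: String,
    pub default_chat_model: String,
}

pub struct ProviderConfigs;

impl ProviderConfigs {
    /// Mirrors the shape of the documented constructors.
    pub fn openai() -> ProviderConfig {
        ProviderConfig {
            base_url: "https://api.openai.com/v1".into(),
            default_chat_model: "gpt-4o-mini".into(), // assumed default
        }
    }
}

fn main() {
    // Start from the predefined defaults, then override for an advanced use case.
    let mut cfg = ProviderConfigs::openai();
    cfg.default_chat_model = "gpt-4o".into();
    println!("{} -> {}", cfg.base_url, cfg.default_chat_model);
}
```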
Implementations
impl ProviderConfigs
pub fn groq() -> ProviderConfig
pub fn openai() -> ProviderConfig
pub fn deepseek() -> ProviderConfig
pub fn ollama() -> ProviderConfig
pub fn xai_grok() -> ProviderConfig
xAI / Grok configuration - OpenAI-compatible hosted offering
pub fn azure_openai() -> ProviderConfig
Azure OpenAI configuration - highly compatible, but often uses a resource-specific base URL
pub fn huggingface() -> ProviderConfig
Hugging Face Inference API - configured to reuse the generic adapter; may need per-model adjustments
pub fn together_ai() -> ProviderConfig
Together AI - OpenAI-compatible chat API
pub fn groq_as_generic() -> ProviderConfig
Groq configuration routed through the generic adapter, demonstrating OpenAI compatibility
pub fn qwen() -> ProviderConfig
Qwen / Tongyi Qianwen (Alibaba Cloud) - OpenAI-compatible mode. Uses DASHSCOPE_API_KEY for authentication and the optional DASHSCOPE_BASE_URL to override the base URL.
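The optional-override pattern used here (an env var for the key, plus an optional env var that replaces the built-in default base URL) can be sketched as follows; the helper name and the default URL are illustrative assumptions, not the crate's actual resolution logic:

```rust
use std::env;

// If the override env var (e.g. DASHSCOPE_BASE_URL) is set and non-empty,
// use it; otherwise fall back to the built-in default.
// (Illustrative sketch; the crate's actual logic may differ.)
fn resolve_base_url(override_var: &str, default_url: &str) -> String {
    env::var(override_var)
        .ok()
        .filter(|v| !v.is_empty())
        .unwrap_or_else(|| default_url.to_string())
}

fn main() {
    let url = resolve_base_url("DASHSCOPE_BASE_URL", "https://example.invalid/v1");
    println!("{url}");
}
```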
pub fn baidu_wenxin() -> ProviderConfig
Baidu Wenxin (ERNIE) - OpenAI-compatible mode via Qianfan/Console configuration. Environment variables: BAIDU_WENXIN_BASE_URL (optional), BAIDU_WENXIN_API_KEY, BAIDU_WENXIN_SECRET
pub fn tencent_hunyuan() -> ProviderConfig
Tencent Hunyuan - Tencent Cloud OpenAI-compatible endpoint. Environment variables: TENCENT_HUNYUAN_BASE_URL (optional), TENCENT_HUNYUAN_API_KEY
pub fn iflytek_spark() -> ProviderConfig
iFlytek Spark - OpenAI-compatible endpoint example. Environment variables: IFLYTEK_BASE_URL (optional), IFLYTEK_API_KEY
pub fn moonshot() -> ProviderConfig
Moonshot (Kimi) - OpenAI-compatible endpoint. Environment variables: MOONSHOT_BASE_URL (optional), MOONSHOT_API_KEY
pub fn anthropic() -> ProviderConfig
Anthropic Claude configuration - requires special handling
pub fn openrouter() -> ProviderConfig
OpenRouter configuration
OpenRouter is a unified gateway for multiple AI models with an OpenAI-compatible API.
Base URL: https://openrouter.ai/api/v1
Documentation: https://openrouter.ai/docs/api-reference/overview
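Because the API is OpenAI-compatible, a chat-completions request URL is just the conventional OpenAI path joined onto the base URL above. A minimal sketch of that URL construction (no networking; the helper name is an assumption):

```rust
// Join the conventional OpenAI chat-completions path onto a provider base URL,
// tolerating a trailing slash on the base URL.
fn chat_completions_url(base_url: &str) -> String {
    format!("{}/chat/completions", base_url.trim_end_matches('/'))
}

fn main() {
    let url = chat_completions_url("https://openrouter.ai/api/v1");
    println!("{url}"); // https://openrouter.ai/api/v1/chat/completions
}
```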
pub fn replicate() -> ProviderConfig
Replicate configuration
Replicate provides access to various AI models with an OpenAI-compatible API.
Base URL: https://api.replicate.com/v1
Documentation: https://replicate.com/docs/reference/http
pub fn zhipu_ai() -> ProviderConfig
Zhipu AI (智谱AI, GLM) configuration
Zhipu AI provides the GLM series models with an OpenAI-compatible API.
Base URL: https://open.bigmodel.cn/api/paas/v4
Documentation: https://docs.bigmodel.cn/cn/api/introduction
pub fn minimax() -> ProviderConfig
MiniMax configuration
MiniMax provides AI models with an OpenAI-compatible API.
Base URL: https://api.minimax.chat/v1
Documentation: https://www.minimax.io/platform/document/ChatCompletion