pub enum LlmProvider {
    Anthropic,
    OpenAI,
    GoogleGemini,
    GoogleVertex,
    AzureOpenAI,
    AWSBedrock,
    Ollama,
    XAI,
    Groq,
    Mistral,
    DeepSeek,
    Cohere,
    Perplexity,
    Cerebras,
    TogetherAI,
    FireworksAI,
    AlibabaQwen,
    OpenRouter,
    CloudflareAI,
    Opencode,
}
Supported LLM providers (OpenAI-compatible where applicable)
Variants
Anthropic
    Anthropic Claude models (native API)

OpenAI
    OpenAI GPT models

GoogleGemini
    Google Gemini via AI Studio (OpenAI-compatible)
    Base: https://generativelanguage.googleapis.com/v1beta/openai/

GoogleVertex
    Google Vertex AI (requires GCP auth)
    Base: https://aiplatform.googleapis.com/v1/projects/{project}/locations/{location}/endpoints/openapi

AzureOpenAI
    Azure OpenAI Service
    Base: https://{resource}.openai.azure.com/openai/deployments/{deployment}

AWSBedrock
    AWS Bedrock (OpenAI-compatible via Converse API)
    Base: https://bedrock-runtime.{region}.amazonaws.com

Ollama
    Ollama (local server; can route to Ollama Cloud models)
    Base: http://localhost:11434

XAI
    xAI Grok models
    Base: https://api.x.ai/v1

Groq
    Groq (ultra-fast inference)
    Base: https://api.groq.com/openai/v1

Mistral
    Mistral AI
    Base: https://api.mistral.ai/v1/

DeepSeek
    DeepSeek (research-focused)
    Base: https://api.deepseek.com/v1

Cohere
    Cohere (enterprise NLP)
    Base: https://api.cohere.ai/v1

Perplexity
    Perplexity (search-augmented)
    Base: https://api.perplexity.ai

Cerebras
    Cerebras (wafer-scale inference)
    Base: https://api.cerebras.ai/v1

TogetherAI
    Together AI (open model hosting)
    Base: https://api.together.xyz/v1

FireworksAI
    Fireworks AI (fast open model inference)
    Base: https://api.fireworks.ai/inference/v1

AlibabaQwen
    Alibaba Qwen / DashScope
    Base: https://dashscope-intl.aliyuncs.com/compatible-mode/v1

OpenRouter
    OpenRouter (300+ models, automatic fallbacks)
    Base: https://openrouter.ai/api/v1

CloudflareAI
    Cloudflare AI Gateway (unified endpoint)
    Base: https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/

Opencode
    Opencode AI
    Base: https://api.opencode.ai/v1
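The per-variant base URLs above suggest a straightforward match-based lookup. The following is a minimal sketch, not the crate's actual implementation: the `Provider` enum and `default_base_url` function are re-declared locally (for a subset of variants) so the example compiles on its own, using only URLs documented above.

```rust
// Hypothetical stand-in for a few LlmProvider variants, re-declared
// locally so this sketch compiles on its own.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum Provider {
    XAi,
    Groq,
    Ollama,
}

// Sketch of how a default_base_url-style lookup might work, using the
// base URLs documented for each variant above.
fn default_base_url(p: Provider) -> &'static str {
    match p {
        Provider::XAi => "https://api.x.ai/v1",
        Provider::Groq => "https://api.groq.com/openai/v1",
        Provider::Ollama => "http://localhost:11434",
    }
}

fn main() {
    for p in [Provider::XAi, Provider::Groq, Provider::Ollama] {
        println!("{:?} -> {}", p, default_base_url(p));
    }
}
```

Returning `&'static str` keeps the lookup allocation-free, which matches the signatures in the Implementations section below.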
Implementations
impl LlmProvider

pub fn all() -> &'static [LlmProvider]
    Get all available providers

pub fn default_base_url(&self) -> &'static str
    Get default base URL for provider

pub fn default_model(&self) -> &'static str
    Get default model for provider (updated Dec 2025; using standardized aliases)

pub fn is_anthropic_format(&self) -> bool
    Check if provider uses native Anthropic API format

pub fn is_openai_compatible(&self) -> bool
    Check if provider uses OpenAI-compatible format

pub fn requires_special_auth(&self) -> bool
    Check if provider requires special authentication

pub fn display_name(&self) -> &'static str
    Get provider display name
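The `all()` and `requires_special_auth()` helpers are commonly implemented as a static slice plus a `matches!` check. A sketch under assumptions: the enum is re-declared locally with a subset of variants, and only GoogleVertex's GCP-auth requirement is stated in the variant docs above; treating AwsBedrock as cloud-credentialed here is an assumption.

```rust
// Hypothetical stand-in enum; names re-declared locally for illustration.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum Provider {
    GoogleVertex,
    AwsBedrock,
    Groq,
    Mistral,
}

// all(): the usual pattern is a static slice holding every variant.
const ALL: &[Provider] = &[
    Provider::GoogleVertex,
    Provider::AwsBedrock,
    Provider::Groq,
    Provider::Mistral,
];

// requires_special_auth(): the variant docs note Vertex needs GCP auth;
// including Bedrock here (AWS credentials) is an assumption.
fn requires_special_auth(p: Provider) -> bool {
    matches!(p, Provider::GoogleVertex | Provider::AwsBedrock)
}

fn main() {
    let special: Vec<Provider> = ALL
        .iter()
        .copied()
        .filter(|&p| requires_special_auth(p))
        .collect();
    println!("{special:?}");
}
```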
Trait Implementations
impl Clone for LlmProvider
    fn clone(&self) -> LlmProvider
    fn clone_from(&mut self, source: &Self)

impl Debug for LlmProvider

impl Default for LlmProvider
    fn default() -> LlmProvider

impl<'de> Deserialize<'de> for LlmProvider
    fn deserialize<__D>(__deserializer: __D) -> Result<Self, __D::Error>
    where
        __D: Deserializer<'de>,

impl Display for LlmProvider

impl Hash for LlmProvider

impl PartialEq for LlmProvider

impl Serialize for LlmProvider

impl Copy for LlmProvider

impl Eq for LlmProvider

impl StructuralPartialEq for LlmProvider
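Because `LlmProvider` implements `Copy`, `Eq`, `Hash`, `Default`, and `Display`, values can be used directly as map keys and printed without extra plumbing. A std-only sketch with a locally re-declared stand-in; which variant is the `Default`, and the display strings, are assumptions here, not taken from the crate.

```rust
use std::collections::HashMap;
use std::fmt;

// Hypothetical stand-in; the choice of Default variant is an assumption.
#[derive(Clone, Copy, Debug, Default, PartialEq, Eq, Hash)]
enum Provider {
    #[default]
    Anthropic,
    OpenAi,
}

// Display typically delegates to a display_name-style mapping.
impl fmt::Display for Provider {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        let name = match self {
            Provider::Anthropic => "Anthropic",
            Provider::OpenAi => "OpenAI",
        };
        f.write_str(name)
    }
}

fn main() {
    // Copy + Eq + Hash means the enum works as a HashMap key without clones.
    let mut base_urls: HashMap<Provider, &str> = HashMap::new();
    base_urls.insert(Provider::OpenAi, "https://api.openai.com/v1");
    println!("default = {}", Provider::default());
    println!("{} -> {:?}", Provider::OpenAi, base_urls.get(&Provider::OpenAi));
}
```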
Auto Trait Implementations
impl Freeze for LlmProvider
impl RefUnwindSafe for LlmProvider
impl Send for LlmProvider
impl Sync for LlmProvider
impl Unpin for LlmProvider
impl UnwindSafe for LlmProvider
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
    fn borrow_mut(&mut self) -> &mut T

impl<T> CloneToUninit for T
where
    T: Clone,

impl<Q, K> Equivalent<K> for Q
    fn equivalent(&self, key: &K) -> bool
        Compare self to key and return true if they are equal.

impl<T> Instrument for T
    fn instrument(self, span: Span) -> Instrumented<Self>
    fn in_current_span(self) -> Instrumented<Self>

impl<T> IntoEither for T
    fn into_either(self, into_left: bool) -> Either<Self, Self>
        Converts self into a Left variant of Either<Self, Self> if into_left is true; otherwise converts self into a Right variant.
    fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
        Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true; otherwise converts self into a Right variant.