pub enum Provider {
OpenAI(OpenAI),
Claude(Claude),
}
Unified LLM provider enum.
The gateway constructs the appropriate variant based on ProviderKind
detected from the model name. The runtime is monomorphized on Provider.
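The variant selection described above can be sketched as a simple prefix check. This is a hedged illustration only: `ProviderKind` is named in the source, but the detection rules below are assumptions, not the crate's actual logic.

```rust
/// Stand-in for the crate's `ProviderKind` (assumed shape).
#[derive(Debug, PartialEq)]
enum ProviderKind {
    OpenAI,
    Claude,
}

/// Hypothetical model-name detection: anything that is not an Anthropic
/// model falls back to the OpenAI-compatible variant, which also covers
/// DeepSeek, Grok, Qwen, Kimi, and Ollama models.
fn detect(model: &str) -> ProviderKind {
    if model.starts_with("claude") {
        ProviderKind::Claude
    } else {
        ProviderKind::OpenAI
    }
}

fn main() {
    assert_eq!(detect("claude-sonnet-4"), ProviderKind::Claude);
    assert_eq!(detect("deepseek-chat"), ProviderKind::OpenAI);
}
```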
Variants
OpenAI(OpenAI)
OpenAI-compatible API (covers OpenAI, DeepSeek, Grok, Qwen, Kimi, Ollama).
Claude(Claude)
Anthropic Messages API.
Trait Implementations
impl Model for Provider

fn stream(
    &self,
    request: Request,
) -> impl Stream<Item = Result<StreamChunk>> + Send
Stream a chat completion response.
fn context_limit(&self, model: &str) -> usize

Resolve the context limit for a model name.
fn active_model(&self) -> CompactString
Get the active/default model name.
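The enum's `Model` implementation can delegate each call to the active variant with a `match`, which is what keeps the runtime monomorphized on `Provider` rather than boxed behind `dyn Model`. The sketch below is self-contained and hedged: the inner types, the limits, the default model names, and the use of `String` in place of `CompactString` are all stand-ins, and `stream` is omitted to avoid pulling in an async runtime.

```rust
// Stand-in provider backends (the real ones hold API clients).
struct OpenAI;
struct Claude;

// Reduced version of the `Model` trait: only the synchronous methods.
trait Model {
    fn context_limit(&self, model: &str) -> usize;
    fn active_model(&self) -> String; // crate uses CompactString
}

impl Model for OpenAI {
    fn context_limit(&self, _model: &str) -> usize { 128_000 } // assumed limit
    fn active_model(&self) -> String { "gpt-4o".into() }       // assumed default
}

impl Model for Claude {
    fn context_limit(&self, _model: &str) -> usize { 200_000 } // assumed limit
    fn active_model(&self) -> String { "claude-sonnet-4".into() } // assumed default
}

enum Provider {
    OpenAI(OpenAI),
    Claude(Claude),
}

// Delegation by match: every call is statically dispatched to the
// variant's concrete impl, with no trait-object indirection.
impl Model for Provider {
    fn context_limit(&self, model: &str) -> usize {
        match self {
            Provider::OpenAI(p) => p.context_limit(model),
            Provider::Claude(p) => p.context_limit(model),
        }
    }
    fn active_model(&self) -> String {
        match self {
            Provider::OpenAI(p) => p.active_model(),
            Provider::Claude(p) => p.active_model(),
        }
    }
}

fn main() {
    let p = Provider::Claude(Claude);
    assert_eq!(p.context_limit("claude-sonnet-4"), 200_000);
    assert_eq!(p.active_model(), "claude-sonnet-4");
}
```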
Auto Trait Implementations
impl Freeze for Provider
impl !RefUnwindSafe for Provider
impl Send for Provider
impl Sync for Provider
impl Unpin for Provider
impl UnsafeUnpin for Provider
impl !UnwindSafe for Provider
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.