#[non_exhaustive]
pub enum Provider {
    Ollama,
    OpenAi,
    Anthropic,
}
LLM provider. Selected via LlmConfigBuilder::provider or the
AGENT_LINE_PROVIDER env var when using LlmConfig::from_env.
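As a rough illustration of the env-var selection path, here is a minimal sketch. The enum is redefined locally, and the string-to-variant mapping (`"openai"`, `"anthropic"`, default `Ollama`) is an assumption; the crate's actual `LlmConfig::from_env` parsing may differ.

```rust
use std::env;

// Local stand-in for the crate's enum, for illustration only.
#[non_exhaustive]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Provider {
    Ollama,
    OpenAi,
    Anthropic,
}

impl Provider {
    // Hypothetical parsing of AGENT_LINE_PROVIDER; the real crate's
    // accepted values may differ.
    fn from_env() -> Provider {
        match env::var("AGENT_LINE_PROVIDER").as_deref() {
            Ok("openai") => Provider::OpenAi,
            Ok("anthropic") => Provider::Anthropic,
            // Ollama is the documented default, so fall back to it
            // when the variable is unset or unrecognized.
            _ => Provider::Ollama,
        }
    }
}

fn main() {
    env::remove_var("AGENT_LINE_PROVIDER");
    assert_eq!(Provider::from_env(), Provider::Ollama);

    env::set_var("AGENT_LINE_PROVIDER", "anthropic");
    assert_eq!(Provider::from_env(), Provider::Anthropic);
}
```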
Variants (Non-exhaustive)
This enum is marked as non-exhaustive
Non-exhaustive enums could have additional variants added in future. Therefore, when matching against variants of non-exhaustive enums, an extra wildcard arm must be added to account for any future variants.
Ollama
Ollama (default). Local inference, no API key needed.
OpenAi
OpenAI-compatible APIs (OpenRouter, etc.).
Anthropic
Anthropic API.
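Because the enum is non-exhaustive, downstream crates must include a wildcard arm when matching on it. A minimal sketch, using a local copy of the enum and hypothetical base URLs purely for illustration:

```rust
// Local stand-in for the crate's enum, for illustration only.
#[non_exhaustive]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Provider {
    Ollama,
    OpenAi,
    Anthropic,
}

fn base_url(p: Provider) -> &'static str {
    // In the defining crate this wildcard arm is technically
    // unreachable; in downstream crates it is mandatory because
    // future variants may be added.
    #[allow(unreachable_patterns)]
    match p {
        Provider::Ollama => "http://localhost:11434",
        Provider::OpenAi => "https://api.openai.com/v1",
        Provider::Anthropic => "https://api.anthropic.com",
        _ => "unknown",
    }
}

fn main() {
    assert_eq!(base_url(Provider::Ollama), "http://localhost:11434");
    println!("{}", base_url(Provider::Anthropic));
}
```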
Trait Implementations
impl Copy for Provider
impl Eq for Provider
impl StructuralPartialEq for Provider
Auto Trait Implementations
impl Freeze for Provider
impl RefUnwindSafe for Provider
impl Send for Provider
impl Sync for Provider
impl Unpin for Provider
impl UnsafeUnpin for Provider
impl UnwindSafe for Provider
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,

fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.