pub enum Provider {
LMStudio,
Ollama,
LlamaCpp,
VLLM,
}
Enum representing supported local LLM server providers.
Each provider has a default base URL where its API server typically runs. These are convenience shortcuts to avoid hardcoding URLs in application code.
§Provider Details
| Provider | Default URL | Port | Description |
|---|---|---|---|
| LMStudio | http://localhost:1234/v1 | 1234 | GUI-based local server |
| Ollama | http://localhost:11434/v1 | 11434 | CLI-focused server |
| LlamaCpp | http://localhost:8080/v1 | 8080 | C++ inference engine |
| VLLM | http://localhost:8000/v1 | 8000 | High-performance server |
All providers implement the OpenAI-compatible API standard, making them interchangeable from the SDK’s perspective.
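Because only the base URL differs between backends, switching providers is a one-line change. The sketch below mirrors the documented variants and default URLs in a standalone enum so it compiles without the `open_agent` crate; the real type lives in `open_agent` and its internals may differ.

```rust
// Standalone mirror of the documented Provider enum; in real code,
// use open_agent::Provider instead of redefining it.
#[derive(Debug, PartialEq)]
enum Provider {
    LMStudio,
    Ollama,
    LlamaCpp,
    VLLM,
}

impl Provider {
    // Default URLs as listed in the table above.
    fn default_url(&self) -> &'static str {
        match self {
            Provider::LMStudio => "http://localhost:1234/v1",
            Provider::Ollama => "http://localhost:11434/v1",
            Provider::LlamaCpp => "http://localhost:8080/v1",
            Provider::VLLM => "http://localhost:8000/v1",
        }
    }
}

fn main() {
    // All four backends speak the OpenAI-compatible API, so swapping
    // one for another only changes the base URL handed to the client.
    for p in [Provider::LMStudio, Provider::Ollama, Provider::LlamaCpp, Provider::VLLM] {
        println!("{:?} -> {}", p, p.default_url());
    }
}
```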
Variants§
LMStudio
LM Studio - Popular GUI-based local model server (default port 1234)
Ollama
Ollama - Command-line focused local model server (default port 11434)
LlamaCpp
llama.cpp - C++ inference engine with server mode (default port 8080)
VLLM
vLLM - High-performance inference server (default port 8000)
Implementations§
impl Provider
pub fn default_url(&self) -> &'static str
Get the default base URL for this provider.
Returns the standard localhost URL where each provider’s API server
typically runs. All URLs include the /v1 path suffix required by
the OpenAI-compatible API standard.
§Returns
A static string slice containing the full base URL including protocol, host, port, and API version path.
§Examples
```rust
use open_agent::Provider;

assert_eq!(Provider::Ollama.default_url(), "http://localhost:11434/v1");
assert_eq!(Provider::LMStudio.default_url(), "http://localhost:1234/v1");
```
Trait Implementations§
impl FromStr for Provider
fn from_str(s: &str) -> Result<Self, Self::Err>
Parse a provider name from a string.
This implementation is case-insensitive and supports multiple naming conventions (dashes, underscores, dots) for flexibility.
§Supported Formats
- LMStudio: “lmstudio”, “lm-studio”, “lm_studio” (case-insensitive)
- Ollama: “ollama” (case-insensitive)
- LlamaCpp: “llamacpp”, “llama-cpp”, “llama_cpp”, “llama.cpp” (case-insensitive)
- VLLM: “vllm” (case-insensitive)
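The alias handling above can be sketched as a single normalization step followed by a match. This is an illustrative reimplementation, not the crate's actual `from_str`; plain strings stand in for the `Provider` variants so the example compiles on its own.

```rust
// Sketch of the documented parsing rules: lowercase the input once,
// then match against every accepted spelling (dashes, underscores, dots).
// The real open_agent implementation returns Provider variants.
fn parse_provider(s: &str) -> Result<&'static str, String> {
    match s.to_lowercase().as_str() {
        "lmstudio" | "lm-studio" | "lm_studio" => Ok("LMStudio"),
        "ollama" => Ok("Ollama"),
        "llamacpp" | "llama-cpp" | "llama_cpp" | "llama.cpp" => Ok("LlamaCpp"),
        "vllm" => Ok("VLLM"),
        other => Err(format!("unrecognized provider: {other}")),
    }
}

fn main() {
    // Case and separator variations all normalize to the same provider.
    assert_eq!(parse_provider("LM_Studio"), Ok("LMStudio"));
    assert_eq!(parse_provider("Llama.CPP"), Ok("LlamaCpp"));
    assert!(parse_provider("gpt4all").is_err());
    println!("all alias forms parsed");
}
```

Lowercasing once up front keeps the match arms exhaustive over the documented spellings without scattering case checks through the code.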
§Errors
Returns a String error message if the provider name is not recognized.
§Examples
```rust
use open_agent::Provider;
use std::str::FromStr;

let provider = "ollama".parse::<Provider>().unwrap();
assert_eq!(provider, Provider::Ollama);

let provider = "LM-Studio".parse::<Provider>().unwrap();
assert_eq!(provider, Provider::LMStudio);

assert!("unknown".parse::<Provider>().is_err());
```
impl Copy for Provider
impl Eq for Provider
impl StructuralPartialEq for Provider
Auto Trait Implementations§
impl Freeze for Provider
impl RefUnwindSafe for Provider
impl Send for Provider
impl Sync for Provider
impl Unpin for Provider
impl UnwindSafe for Provider
Blanket Implementations§
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
impl<T> CloneToUninit for T
where
    T: Clone,
impl<Q, K> Equivalent<K> for Q
impl<Q, K> Equivalent<K> for Q
fn equivalent(&self, key: &K) -> bool
Compare self to key and return true if they are equal.