pub struct LMStudioConfig {
    pub base_url: String,
    pub default_model: String,
    pub max_context_tokens: usize,
    pub retry_policy: RetryPolicy,
}
Configuration for LM Studio local models.
LM Studio provides an OpenAI-compatible API for running local models. No API key is required since it runs locally.
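Because the API is OpenAI-compatible, requests go to the standard OpenAI paths under the configured base URL. A minimal sketch of joining the two (the helper name `chat_endpoint` is illustrative, not part of the crate):

```rust
// Illustrative helper (not part of multi_llm): build the OpenAI-compatible
// chat-completions URL from a configured base URL.
fn chat_endpoint(base_url: &str) -> String {
    format!("{}/v1/chat/completions", base_url.trim_end_matches('/'))
}

fn main() {
    // Trimming the trailing slash keeps the path well-formed either way.
    println!("{}", chat_endpoint("http://localhost:1234"));
    println!("{}", chat_endpoint("http://localhost:1234/"));
}
```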
§Example
use multi_llm::LMStudioConfig;

let config = LMStudioConfig {
    base_url: "http://localhost:1234".to_string(),
    default_model: "local-model".to_string(),
    max_context_tokens: 4096,
    ..Default::default()
};

§Environment Variables
LM_STUDIO_BASE_URL or OPENAI_BASE_URL: Server URL (default: http://localhost:1234)
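A caller-side sketch of how that fallback chain might be resolved. Two assumptions here, not confirmed by the docs above: that LM_STUDIO_BASE_URL takes precedence over OPENAI_BASE_URL, and the function name `resolve_base_url`, which is invented for illustration:

```rust
use std::env;

// Illustrative sketch (not crate code): resolve the server URL from the
// documented environment variables, assuming LM_STUDIO_BASE_URL wins,
// then OPENAI_BASE_URL, then the documented default.
fn resolve_base_url() -> String {
    env::var("LM_STUDIO_BASE_URL")
        .or_else(|_| env::var("OPENAI_BASE_URL"))
        .unwrap_or_else(|_| "http://localhost:1234".to_string())
}

fn main() {
    println!("base URL: {}", resolve_base_url());
}
```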
§Notes
- Start LM Studio server before making requests
- Context window depends on the loaded model
- The model name in the config is ignored; requests use whichever model is currently loaded in LM Studio
§Fields

base_url: String
Base URL for the LM Studio server (default: http://localhost:1234).

default_model: String
Default model name (LM Studio uses the loaded model regardless).

max_context_tokens: usize
Maximum context window size (depends on the loaded model).

retry_policy: RetryPolicy
Retry policy for transient failures.
§Trait Implementations

impl Clone for LMStudioConfig

fn clone(&self) -> LMStudioConfig
Returns a duplicate of the value.

fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.

impl Debug for LMStudioConfig

impl Default for LMStudioConfig

impl<'de> Deserialize<'de> for LMStudioConfig

fn deserialize<__D>(__deserializer: __D) -> Result<Self, __D::Error>
where
    __D: Deserializer<'de>,
Deserialize this value from the given Serde deserializer.
impl ProviderConfig for LMStudioConfig

fn provider_name(&self) -> &'static str
Get the provider identifier (e.g., "openai", "anthropic").

fn max_context_tokens(&self) -> usize
Get the maximum context window size in tokens.

fn validate(&self) -> LlmResult<()>
Validate that the configuration is complete and valid.

fn default_model(&self) -> &str
Get the default model name for this provider.

fn retry_policy(&self) -> &RetryPolicy
Get the retry policy for transient failures.
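To show how such trait methods might be consumed generically, here is a self-contained sketch with a local stand-in trait; `ProviderConfigLike`, `StubConfig`, and `describe` are invented for illustration and are not multi_llm types:

```rust
// Local stand-in mirroring two of the ProviderConfig methods above;
// the real trait lives in multi_llm and may differ.
trait ProviderConfigLike {
    fn provider_name(&self) -> &'static str;
    fn max_context_tokens(&self) -> usize;
}

struct StubConfig {
    max_context_tokens: usize,
}

impl ProviderConfigLike for StubConfig {
    fn provider_name(&self) -> &'static str {
        "lm_studio"
    }
    fn max_context_tokens(&self) -> usize {
        self.max_context_tokens
    }
}

// Generic caller: works with any config exposing these methods,
// so provider-specific structs stay interchangeable.
fn describe<C: ProviderConfigLike>(cfg: &C) -> String {
    format!("{} ({} tokens)", cfg.provider_name(), cfg.max_context_tokens())
}

fn main() {
    let cfg = StubConfig { max_context_tokens: 4096 };
    println!("{}", describe(&cfg)); // prints "lm_studio (4096 tokens)"
}
```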
§Auto Trait Implementations
impl Freeze for LMStudioConfig
impl RefUnwindSafe for LMStudioConfig
impl Send for LMStudioConfig
impl Sync for LMStudioConfig
impl Unpin for LMStudioConfig
impl UnwindSafe for LMStudioConfig
§Blanket Implementations

impl<T> BorrowMut<T> for T
where
    T: ?Sized,

fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.