Struct UnifiedLLMClient

pub struct UnifiedLLMClient { /* private fields */ }

Unified client for multi-provider LLM operations.

UnifiedLLMClient is the primary interface for using multi-llm. It wraps all supported providers behind a single LlmProvider interface, allowing you to switch providers without changing your application code.

§Quick Start

use multi_llm::{unwrap_response, UnifiedLLMClient, LLMConfig, UnifiedMessage, UnifiedLLMRequest, LlmProvider};

// Create client from environment variables
let client = UnifiedLLMClient::from_env()?;

// Build a request
let request = UnifiedLLMRequest::new(vec![
    UnifiedMessage::system("You are a helpful assistant."),
    UnifiedMessage::user("What's the capital of France?"),
]);

// Execute the request
let response = unwrap_response!(client.execute_llm(request, None, None).await?);
println!("Response: {}", response.content);

§From Configuration

use multi_llm::{UnifiedLLMClient, LLMConfig, OpenAIConfig, DefaultLLMParams};

let config = LLMConfig {
    provider: Box::new(OpenAIConfig {
        api_key: Some("sk-...".to_string()),
        default_model: "gpt-4-turbo-preview".to_string(),
        ..Default::default()
    }),
    default_params: DefaultLLMParams::default(),
};

let client = UnifiedLLMClient::from_config(config)?;

§Tool Calling

use multi_llm::{unwrap_response, UnifiedLLMClient, UnifiedMessage, UnifiedLLMRequest, RequestConfig, Tool, ToolChoice, LlmProvider};

let client = UnifiedLLMClient::from_env()?;

// Define a tool
let weather_tool = Tool {
    name: "get_weather".to_string(),
    description: "Get current weather".to_string(),
    parameters: serde_json::json!({
        "type": "object",
        "properties": {
            "city": {"type": "string"}
        },
        "required": ["city"]
    }),
};

let request = UnifiedLLMRequest::new(vec![
    UnifiedMessage::user("What's the weather in Paris?"),
]);

let config = RequestConfig {
    tools: vec![weather_tool],
    tool_choice: Some(ToolChoice::Auto),
    ..Default::default()
};

let response = unwrap_response!(client.execute_llm(request, None, Some(config)).await?);

// Check for tool calls
if !response.tool_calls.is_empty() {
    for call in &response.tool_calls {
        println!("Tool call: {} with {}", call.name, call.arguments);
        // Execute tool and continue conversation...
    }
}

§Supported Providers

| Provider  | Config Type     | API Key Required |
|-----------|-----------------|------------------|
| Anthropic | AnthropicConfig | Yes              |
| OpenAI    | OpenAIConfig    | Yes              |
| Ollama    | OllamaConfig    | No (local)       |
| LM Studio | LMStudioConfig  | No (local)       |
Implementations§

impl UnifiedLLMClient

pub fn create( provider_name: &str, model: String, config: LLMConfig, ) -> LlmResult<Self>

Factory method that creates a UnifiedLLMClient with all parameters specified explicitly. This is the primary constructor for production use.

§Errors

Returns LlmError::UnsupportedProvider if the provider name is not recognized. Supported providers are: “anthropic”, “openai”, “lmstudio”, “ollama”.

Returns LlmError::ConfigurationError if:

  • The provider configuration type doesn’t match the provider name
  • Required configuration fields are missing (e.g., API key for OpenAI/Anthropic)
  • Configuration validation fails (e.g., invalid base URL format)
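
Because create reports both failure modes through LlmError, callers can branch on the error kind. A sketch, reusing the documented OpenAIConfig shape; the payloads of the LlmError variants below are assumptions, not confirmed by this page:

```rust
use multi_llm::{UnifiedLLMClient, LLMConfig, OpenAIConfig, DefaultLLMParams, LlmError};

let config = LLMConfig {
    provider: Box::new(OpenAIConfig {
        api_key: Some("sk-...".to_string()),
        default_model: "gpt-4-turbo-preview".to_string(),
        ..Default::default()
    }),
    default_params: DefaultLLMParams::default(),
};

match UnifiedLLMClient::create("openai", "gpt-4-turbo-preview".to_string(), config) {
    Ok(client) => { /* ready to execute requests */ }
    // Variant payload shapes are assumptions; match on your crate version's shapes.
    Err(LlmError::UnsupportedProvider(name)) => eprintln!("unsupported provider: {name}"),
    Err(e) => eprintln!("configuration error: {e}"),
}
```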

pub fn from_env() -> LlmResult<Self>

Create a client using environment variables for configuration

§Errors

Returns LlmError::ConfigurationError if:

  • Required environment variables are missing
  • Environment variable values are invalid or malformed
  • Provider configuration validation fails
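
from_env reads provider settings from the process environment. The exact variable names are crate-specific and not documented on this page, so the name in the comment below is a conventional guess:

```rust
use multi_llm::UnifiedLLMClient;

// Assumes the provider's credentials were exported before launch, e.g.
//   export OPENAI_API_KEY=sk-...   (variable name is a conventional guess)
let client = match UnifiedLLMClient::from_env() {
    Ok(c) => c,
    // ConfigurationError covers missing or malformed variables.
    Err(e) => panic!("LLM configuration error: {e}"),
};
```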

pub fn from_config(config: LLMConfig) -> LlmResult<Self>

Create a client from an LLMConfig (backward compatibility)

§Errors

Returns LlmError::UnsupportedProvider if the provider name in the config is not recognized.

Returns LlmError::ConfigurationError if:

  • Provider configuration validation fails
  • Required provider-specific settings are missing

Trait Implementations§

impl LlmProvider for UnifiedLLMClient

Implements LlmProvider by delegating to the underlying provider; the individual providers already handle the events feature correctly.

fn execute_llm<'life0, 'async_trait>( &'life0 self, request: UnifiedLLMRequest, current_tool_round: Option<ToolCallingRound>, config: Option<RequestConfig>, ) -> Pin<Box<dyn Future<Output = Result<Response>> + Send + 'async_trait>>
where Self: 'async_trait, 'life0: 'async_trait,

Execute an LLM request and return the response. Read more

fn execute_structured_llm<'life0, 'async_trait>( &'life0 self, request: UnifiedLLMRequest, current_tool_round: Option<ToolCallingRound>, schema: Value, config: Option<RequestConfig>, ) -> Pin<Box<dyn Future<Output = Result<Response>> + Send + 'async_trait>>
where Self: 'async_trait, 'life0: 'async_trait,

Execute an LLM request with structured JSON output. Read more
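
execute_structured_llm takes the target JSON Schema as a serde_json Value. A sketch of a call, assuming (as in the Quick Start) that the structured output arrives in response.content:

```rust
use multi_llm::{unwrap_response, UnifiedLLMClient, UnifiedMessage, UnifiedLLMRequest, LlmProvider};
use serde_json::json;

let client = UnifiedLLMClient::from_env()?;

// JSON Schema describing the shape the model must produce.
let schema = json!({
    "type": "object",
    "properties": {
        "capital":    {"type": "string"},
        "population": {"type": "integer"}
    },
    "required": ["capital"]
});

let request = UnifiedLLMRequest::new(vec![
    UnifiedMessage::user("Describe France as JSON."),
]);

let response = unwrap_response!(
    client.execute_structured_llm(request, None, schema, None).await?
);

// Assumes the structured output is returned in `response.content`.
let value: serde_json::Value = serde_json::from_str(&response.content)?;
println!("capital = {}", value["capital"]);
```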

fn provider_name(&self) -> &'static str

Get the provider’s identifier. Read more

Auto Trait Implementations§

Blanket Implementations§

impl<T> Any for T
where T: 'static + ?Sized,

fn type_id(&self) -> TypeId

Gets the TypeId of self. Read more

impl<T> Borrow<T> for T
where T: ?Sized,

fn borrow(&self) -> &T

Immutably borrows from an owned value. Read more

impl<T> BorrowMut<T> for T
where T: ?Sized,

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value. Read more

impl<T> From<T> for T

fn from(t: T) -> T

Returns the argument unchanged.

impl<T> Instrument for T

fn instrument(self, span: Span) -> Instrumented<Self>

Instruments this type with the provided Span, returning an Instrumented wrapper. Read more

fn in_current_span(self) -> Instrumented<Self>

Instruments this type with the current Span, returning an Instrumented wrapper. Read more

impl<T, U> Into<U> for T
where U: From<T>,

fn into(self) -> U

Calls U::from(self).

That is, this conversion is whatever the implementation of From<T> for U chooses to do.

impl<T> PolicyExt for T
where T: ?Sized,

fn and<P, B, E>(self, other: P) -> And<T, P>
where T: Policy<B, E>, P: Policy<B, E>,

Create a new Policy that returns Action::Follow only if self and other return Action::Follow. Read more

fn or<P, B, E>(self, other: P) -> Or<T, P>
where T: Policy<B, E>, P: Policy<B, E>,

Create a new Policy that returns Action::Follow if either self or other returns Action::Follow. Read more

impl<T, U> TryFrom<U> for T
where U: Into<T>,

type Error = Infallible

The type returned in the event of a conversion error.

fn try_from(value: U) -> Result<T, <T as TryFrom<U>>::Error>

Performs the conversion.

impl<T, U> TryInto<U> for T
where U: TryFrom<T>,

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.

fn try_into(self) -> Result<U, <U as TryFrom<T>>::Error>

Performs the conversion.

impl<T> WithSubscriber for T

fn with_subscriber<S>(self, subscriber: S) -> WithDispatch<Self>
where S: Into<Dispatch>,

Attaches the provided Subscriber to this type, returning a WithDispatch wrapper. Read more

fn with_current_subscriber(self) -> WithDispatch<Self>

Attaches the current default Subscriber to this type, returning a WithDispatch wrapper. Read more