Crate erio_llm_client


§erio-llm-client

erio-llm-client is Erio’s provider abstraction for chat/completion style LLM calls. It includes request/response models, error handling, and an OpenAI-compatible provider implementation.

Use this crate to keep provider integration behind a trait while sharing a single request/response shape across agents and workflows.
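The shape of that trait-behind-an-abstraction pattern can be sketched with stand-in types. All names below are illustrative, not the crate’s API, and unlike the real LlmProvider trait the sketch is synchronous for brevity:

```rust
// Illustrative stand-in types; the crate's real request/response shapes
// are richer and its provider trait is async.
#[derive(Debug, Clone)]
struct SketchRequest {
    model: String,
    messages: Vec<String>,
}

#[derive(Debug)]
struct SketchResponse {
    content: String,
}

// One trait, many backends: agents depend on this trait, not on a vendor SDK.
trait SketchProvider {
    fn complete(&self, request: SketchRequest) -> Result<SketchResponse, String>;
}

// A mock backend, handy in tests; an OpenAI-compatible backend would
// implement the same trait over HTTP.
struct EchoProvider;

impl SketchProvider for EchoProvider {
    fn complete(&self, request: SketchRequest) -> Result<SketchResponse, String> {
        Ok(SketchResponse {
            content: format!("[{}] {}", request.model, request.messages.join(" ")),
        })
    }
}

// Workflow code only sees the trait object, so backends are swappable.
fn run_with(provider: &dyn SketchProvider) -> Result<String, String> {
    let request = SketchRequest {
        model: "test-model".to_string(),
        messages: vec!["hello".to_string()],
    };
    provider.complete(request).map(|r| r.content)
}
```

Keeping agents generic over the trait is what lets a test double like EchoProvider stand in for a real backend without touching workflow code.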

§Quickstart

use erio_core::Message;
use erio_llm_client::{CompletionRequest, LlmProvider, OpenAiProvider};

async fn run() -> Result<(), Box<dyn std::error::Error>> {
    let provider = OpenAiProvider::new("https://api.openai.com/v1", "YOUR_API_KEY");

    let request = CompletionRequest::new("gpt-4o-mini")
        .message(Message::system("You are concise."))
        .message(Message::user("Say hello in one sentence."));

    let _response = provider.complete(request).await?;
    Ok(())
}
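Calls like provider.complete above can fail with LlmError (rate limits, transport failures, and so on). A minimal retry wrapper can be sketched as a synchronous generic helper; the names here are illustrative, and in real usage the operation would wrap the async complete call with E being LlmError:

```rust
// Generic retry helper, illustrative only: retries the operation up to
// max_attempts times and returns the last error if all attempts fail.
fn retry_sketch<T, E>(
    max_attempts: u32,
    mut op: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut attempt = 0;
    loop {
        attempt += 1;
        match op() {
            Ok(value) => return Ok(value),
            Err(err) if attempt >= max_attempts => return Err(err),
            Err(_) => continue, // production code would back off before retrying
        }
    }
}
```

Whether a given LlmError is worth retrying (a rate limit versus a malformed request, say) depends on the variant, so production code would inspect the error before looping.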

§API tour

  • Provider types: LlmProvider, OpenAiProvider
  • Request types: CompletionRequest, ToolDefinition
  • Response types: CompletionResponse, StreamChunk, Usage
  • Error type: LlmError
  • Modules: openai, provider, request, response, error
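StreamChunk in the list above implies streamed completions arrive as incremental deltas. Assembling them into the final text can be sketched with a stand-in chunk type (the real response::StreamChunk presumably carries more fields, such as finish reasons or usage):

```rust
// Stand-in for response::StreamChunk; illustrative only.
struct ChunkSketch {
    delta: String,
}

// Concatenate the text deltas of a finished stream in arrival order.
fn assemble(chunks: impl IntoIterator<Item = ChunkSketch>) -> String {
    chunks.into_iter().map(|c| c.delta).collect()
}
```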

§Compatibility

  • MSRV: Rust 1.93
  • License: Apache-2.0

§Re-exports

pub use error::LlmError;
pub use openai::OpenAiProvider;
pub use provider::LlmProvider;
pub use request::CompletionRequest;
pub use request::ToolDefinition;
pub use response::CompletionResponse;
pub use response::StreamChunk;
pub use response::Usage;

§Modules

  • error: LLM-specific error types.
  • openai: OpenAI-compatible LLM provider.
  • provider: LLM provider trait definition.
  • request: Completion request types for LLM providers.
  • response: Completion response types for LLM providers.