Crate rsai

§rsai

Predictable development for unpredictable models. Let the compiler handle the chaos.

§⚠️ WARNING

This is a pre-release version with an unstable API. Breaking changes may occur between versions. Use with caution and pin to specific versions in production applications.
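Pinning to an exact version in `Cargo.toml` uses the `=` version requirement. A minimal sketch (the version number below is a placeholder, not the current release; check crates.io before copying):

```toml
[dependencies]
# "=" pins to exactly this version, so `cargo update` will not
# silently pull in a breaking pre-1.0 release.
rsai = "=0.1.0"
tokio = { version = "1", features = ["full"] }
```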

§Quick Start

use rsai::{llm, Message, ChatRole, ApiKey, Provider, TextResponse, completion_schema};

#[completion_schema]
struct Analysis {
    sentiment: String,
    confidence: f32,
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let analysis = llm::with(Provider::OpenAI)
        .api_key(ApiKey::Default)?
        .model("gpt-4o-mini")
        .messages(vec![Message {
            role: ChatRole::User,
            content: "Analyze: 'This library is amazing!'".to_string(),
        }])
        .complete::<Analysis>()
        .await?;

    println!("{} ({})", analysis.sentiment, analysis.confidence);

    let reply = llm::with(Provider::OpenAI)
        .api_key(ApiKey::Default)?
        .model("gpt-4o-mini")
        .messages(vec![
            Message {
                role: ChatRole::System,
                content: "You are friendly and concise.".to_string(),
            },
            Message {
                role: ChatRole::User,
                content: "Share a fun fact about Rust.".to_string(),
            },
        ])
        .complete::<TextResponse>()
        .await?;

    println!("{}", reply.text);
    Ok(())
}

Modules§

llm
Module containing the main entry point for building LLM requests

Macros§

toolset
Macro for creating a collection of tools from #[tool]-annotated functions.

Structs§

Ctx
Marker type for context/dependency injection in tools.
Format
GeminiClient
GeminiConfig
GenerationConfig
Configuration for text generation parameters
HttpClientConfig
Configuration for HTTP client resilience
InspectorConfig
Configuration for request/response inspection hooks.
LanguageModelUsage
LlmBuilder
A type-safe builder for constructing LLM requests using the builder pattern. The builder enforces correct construction order through phantom types.
Message
OpenAiClient
OpenAiConfig
OpenAI-specific configuration for the responses client
OpenRouterClient
OpenRouterConfig
OpenRouter-specific configuration for the responses client
ResponseMetadata
StructuredRequest
StructuredResponse
TextResponse
Tool
ToolCall
ToolCallResult
ToolCallingConfig
Configuration for tool calling behavior and limits
ToolCallingGuard
Guard for tracking tool call processing limits and preventing infinite loops
ToolConfig
Configuration for tool calling behavior
ToolRegistry
ToolSet
ToolSetBuilder
Builder for creating a ToolSet with context. Created by the toolset! macro when a context type is specified.

Enums§

ApiKey
Configuration for API key source
ChatRole
ConversationMessage
LlmError
Provider
ToolChoice

Traits§

CompletionTarget
LlmProvider
ToolFunction

Type Aliases§

BoxFuture
Inspector
Type alias for inspection callbacks that receive raw JSON payloads.
Result

Attribute Macros§

completion_schema
Attribute macro for types used as the T in the .complete::<T>() method.
tool
Attribute macro for marking functions as tools that can be called by LLMs.
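As a hypothetical sketch of how the tool attribute macro and the toolset! macro above might fit together (the exact attribute arguments, function signatures, and builder methods here are assumptions, not the crate's confirmed API; consult the item docs for the real signatures):

```rust
use rsai::{tool, toolset};

/// Hypothetical #[tool]-annotated function exposed to the model.
/// The doc comment and parameter names would typically be surfaced
/// to the LLM as the tool's description and schema.
#[tool]
async fn word_count(text: String) -> usize {
    text.split_whitespace().count()
}

// toolset! collects the annotated functions into a ToolSet
// that can then be attached to an LlmBuilder request.
let tools = toolset![word_count];
```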