§rsai
Predictable development for unpredictable models. Let the compiler handle the chaos.
§⚠️ WARNING
This is a pre-release version with an unstable API. Breaking changes may occur between versions. Use with caution and pin to specific versions in production applications.
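One way to follow that advice is to pin an exact version in Cargo.toml with the `=` operator, so `cargo update` cannot pull in a breaking release. The version number below is a placeholder; check crates.io for the current release:

```toml
[dependencies]
# "=" pins the exact version; a plain "0.1.0" would still allow
# semver-compatible 0.1.x updates.
rsai = "=0.1.0"
```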
§Quick Start
use rsai::{llm, Message, ChatRole, ApiKey, Provider, TextResponse, completion_schema};

#[completion_schema]
struct Analysis {
    sentiment: String,
    confidence: f32,
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Structured completion: the response is parsed into `Analysis`.
    let analysis = llm::with(Provider::OpenAI)
        .api_key(ApiKey::Default)?
        .model("gpt-4o-mini")
        .messages(vec![Message {
            role: ChatRole::User,
            content: "Analyze: 'This library is amazing!'".to_string(),
        }])
        .complete::<Analysis>()
        .await?;
    println!("{} ({})", analysis.sentiment, analysis.confidence);

    // Plain-text completion with a system prompt.
    let reply = llm::with(Provider::OpenAI)
        .api_key(ApiKey::Default)?
        .model("gpt-4o-mini")
        .messages(vec![
            Message {
                role: ChatRole::System,
                content: "You are friendly and concise.".to_string(),
            },
            Message {
                role: ChatRole::User,
                content: "Share a fun fact about Rust.".to_string(),
            },
        ])
        .complete::<TextResponse>()
        .await?;
    println!("{}", reply.text);
    Ok(())
}

Modules§
- llm - Module containing the main entry point for building LLM requests
Macros§
- toolset - Macro for creating a collection of tools from annotated [tool] functions.
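A sketch of how the pieces above might fit together. The `add` function, its signature, and the exact `toolset!` invocation syntax are assumptions for illustration, not confirmed API; only the `tool` attribute and `toolset` macro names come from this page:

```rust
use rsai::{tool, toolset};

/// Adds two integers.
/// (Hypothetical tool: the name, parameters, and doc comment are
/// illustrative, assuming #[tool] exposes the function to the model.)
#[tool]
fn add(a: i64, b: i64) -> i64 {
    a + b
}

fn main() {
    // Assumed invocation: collect the annotated functions into a ToolSet
    // that can be passed to the request builder.
    let _tools = toolset!(add);
}
```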
Structs§
- Ctx - Marker type for context/dependency injection in tools.
- Format
- GeminiClient
- GeminiConfig
- GenerationConfig - Configuration for text generation parameters
- HttpClientConfig - Configuration for HTTP client resilience
- InspectorConfig - Configuration for request/response inspection hooks.
- LanguageModelUsage
- LlmBuilder - A type-safe builder for constructing LLM requests using the builder pattern. The builder enforces correct construction order through phantom types.
- Message
- OpenAiClient
- OpenAiConfig - OpenAI-specific configuration for the responses client
- OpenRouterClient
- OpenRouterConfig - OpenRouter-specific configuration for the responses client
- ResponseMetadata
- StructuredRequest
- StructuredResponse
- TextResponse
- Tool
- ToolCall
- ToolCallResult
- ToolCallingConfig - Configuration for tool calling behavior and limits
- ToolCallingGuard - Guard for tracking tool call processing limits and preventing infinite loops
- ToolConfig - Configuration for tool calling behavior
- ToolRegistry
- ToolSet
- ToolSetBuilder - Builder for creating a ToolSet with context. Created by the toolset! macro when a context type is specified.
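The "phantom types" mentioned for LlmBuilder refer to the typestate pattern: the builder carries a zero-sized type parameter recording which required steps have run, so calling things out of order fails at compile time. A minimal self-contained sketch of the technique in plain Rust (the names `RequestBuilder`, `NoModel`, and `HasModel` are illustrative, not rsai's internals):

```rust
use std::marker::PhantomData;

// Typestate markers: the type parameter records builder progress.
struct NoModel;
struct HasModel;

struct RequestBuilder<State> {
    model: Option<String>,
    messages: Vec<String>,
    _state: PhantomData<State>,
}

impl RequestBuilder<NoModel> {
    fn new() -> Self {
        RequestBuilder { model: None, messages: Vec::new(), _state: PhantomData }
    }

    // Setting the model transitions the builder into the HasModel state.
    fn model(self, name: &str) -> RequestBuilder<HasModel> {
        RequestBuilder {
            model: Some(name.to_string()),
            messages: self.messages,
            _state: PhantomData,
        }
    }
}

impl RequestBuilder<HasModel> {
    fn message(mut self, text: &str) -> Self {
        self.messages.push(text.to_string());
        self
    }

    // `build` exists only in the HasModel state, so forgetting
    // `.model(...)` is a compile-time error, not a runtime one.
    fn build(self) -> String {
        format!("{}: {} message(s)", self.model.unwrap(), self.messages.len())
    }
}

fn main() {
    let request = RequestBuilder::new()
        .model("gpt-4o-mini")
        .message("hello")
        .build();
    println!("{}", request);
    // RequestBuilder::new().build() would not compile: `build` is
    // only defined for RequestBuilder<HasModel>.
}
```

The payoff is that misuse is unrepresentable: there is no runtime "missing model" error path to test, because the invalid call sequence has no type.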
Enums§
- ApiKey - Configuration for API key source
- ChatRole
- ConversationMessage
- LlmError
- Provider
- ToolChoice
Traits§
Type Aliases§
Attribute Macros§
- completion_schema - Attribute macro for types used with the rsai::llm()::complete::<T>() method.
- tool - Attribute macro for marking functions as tools that can be called by LLMs.