# OpenRouter Rust SDK

`openrouter-rs` is a type-safe, async Rust SDK for the OpenRouter API, providing easy access to 200+ AI models from providers like OpenAI, Anthropic, Google, and more.
## Key Features

- **Type Safety**: Leverages Rust's type system for compile-time error prevention
- **Async/Await**: Built on `tokio` for high-performance async operations
- **Builder Pattern**: Ergonomic client and request construction
- **Streaming Support**: Real-time response streaming with `futures`
- **Reasoning Tokens**: Advanced support for chain-of-thought reasoning
- **Model Presets**: Pre-configured model groups for different use cases
- **Full API Coverage**: Complete OpenRouter API endpoint support
## Quick Start

Add to your `Cargo.toml`:

```toml
[dependencies]
openrouter-rs = "0.4.5"
tokio = { version = "1", features = ["full"] }
```
### Basic Chat Completion

```rust
use openrouter_rs::{
    OpenRouterClient,
    api::chat::{ChatCompletionRequest, Message},
    types::Role,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create client with builder pattern
    let client = OpenRouterClient::builder()
        .api_key("your_api_key")
        .http_referer("https://yourapp.com")
        .x_title("My App")
        .build()?;

    // Build chat request
    let request = ChatCompletionRequest::builder()
        .model("anthropic/claude-sonnet-4")
        .messages(vec![
            Message::new(Role::System, "You are a helpful assistant"),
            Message::new(Role::User, "Explain Rust ownership in simple terms"),
        ])
        .temperature(0.7)
        .max_tokens(500)
        .build()?;

    // Send request and print the response
    let response = client.send_chat_completion(&request).await?;
    println!("Response: {}", response.choices[0].content().unwrap_or(""));

    Ok(())
}
```
### Streaming Responses

```rust
use futures_util::StreamExt;
use openrouter_rs::{OpenRouterClient, api::chat::*, types::Role};

let client = OpenRouterClient::builder()
    .api_key("your_api_key")
    .build()?;

let request = ChatCompletionRequest::builder()
    .model("google/gemini-2.5-flash")
    .messages(vec![Message::new(Role::User, "Write a haiku about Rust")])
    .build()?;

// Print each chunk as it arrives
let mut stream = client.stream_chat_completion(&request).await?;
while let Some(result) = stream.next().await {
    if let Ok(response) = result {
        if let Some(content) = response.choices[0].content() {
            print!("{}", content);
        }
    }
}
```
### Reasoning Tokens (Chain-of-Thought)

```rust
use openrouter_rs::{OpenRouterClient, api::chat::*, types::{Role, Effort}};

let client = OpenRouterClient::builder()
    .api_key("your_api_key")
    .build()?;

let request = ChatCompletionRequest::builder()
    .model("deepseek/deepseek-r1")
    .messages(vec![Message::new(Role::User, "What's bigger: 9.9 or 9.11?")])
    .reasoning_effort(Effort::High) // Enable high-effort reasoning
    .reasoning_max_tokens(1000)     // Limit reasoning tokens
    .build()?;

let response = client.send_chat_completion(&request).await?;
println!("Reasoning: {}", response.choices[0].reasoning().unwrap_or(""));
println!("Answer: {}", response.choices[0].content().unwrap_or(""));
```
## Core Modules

- `client` - Client configuration and HTTP operations
- `api` - OpenRouter API endpoints (chat, models, credits, etc.)
- `types` - Request/response types and enums
- `config` - Configuration management and model presets
- `error` - Error types and handling
## Model Presets

The SDK includes curated model presets for different use cases:

- `programming`: Code generation and software development
- `reasoning`: Advanced reasoning and problem-solving
- `free`: Free-tier models for experimentation

```rust
use openrouter_rs::config::OpenRouterConfig;

let config = OpenRouterConfig::default();
println!("Available models: {:?}", config.get_resolved_models());
```
## API Coverage

| Feature | Status | Module |
|---|---|---|
| Chat Completions | ✅ | `api::chat` |
| Text Completions | ✅ | `api::completion` |
| Model Information | ✅ | `api::models` |
| Streaming | ✅ | `api::chat` |
| Reasoning Tokens | ✅ | `api::chat` |
| API Key Management | ✅ | `api::api_keys` |
| Credit Management | ✅ | `api::credits` |
| Generation Data | ✅ | `api::generation` |
| Authentication | ✅ | `api::auth` |
## Examples

Check out the `examples/` directory for comprehensive usage examples:
- Basic chat completion
- Streaming responses
- Reasoning tokens
- Model management
- Error handling
- Advanced configurations
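As a quick taste of the error-handling pattern covered there, the sketch below reuses only the APIs shown earlier in this page, handling failures with a `match` instead of propagating them with `?` (the model name and messages are placeholders):

```rust
use openrouter_rs::{
    OpenRouterClient,
    api::chat::{ChatCompletionRequest, Message},
    types::Role,
};

#[tokio::main]
async fn main() {
    let client = OpenRouterClient::builder()
        .api_key("your_api_key")
        .build()
        .expect("client configuration should be valid");

    let request = ChatCompletionRequest::builder()
        .model("anthropic/claude-sonnet-4")
        .messages(vec![Message::new(Role::User, "Hello!")])
        .build()
        .expect("request configuration should be valid");

    // API failures (invalid key, rate limits, network errors) surface
    // through the returned Result rather than panicking.
    match client.send_chat_completion(&request).await {
        Ok(response) => println!("{}", response.choices[0].content().unwrap_or("")),
        Err(e) => eprintln!("Request failed: {e}"),
    }
}
```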
## Contributing
Contributions are welcome! Please see our GitHub repository for issues and pull requests.
## Re-exports

```rust
pub use api::chat::Message;
pub use api::models::Model;
pub use client::OpenRouterClient;
```
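Because these items are re-exported at the crate root, the most common types can be imported without spelling out their full module paths:

```rust
// Equivalent to importing via the full module paths listed above
use openrouter_rs::{Message, Model, OpenRouterClient};
```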
## Modules

- `api` - OpenRouter API Endpoints
- `client`
- `config` - Configuration Management
- `error` - Error Handling
- `types` - Core Types and Data Structures
- `utils`