# 🌧️ Rainy SDK v0.3.0
The official Rust SDK for the Rainy API by Enosis Labs - a unified interface for multiple AI providers including OpenAI, Anthropic, Google Gemini, and more.
## ✨ Features
- 🎯 Full OpenAI Compatibility: Drop-in compatibility with the OpenAI API request and response format.
- 🌐 Unified Multi-Provider API: Single interface for OpenAI, Google Gemini, Groq, Cerebras, and others.
- 🔐 Type-Safe Authentication: Secure API key management with validation.
- ⚡ Async/Await: Full async support with the Tokio runtime.
- 📊 Rich Metadata: Response times, provider info, token usage, and credit tracking.
- 🛡️ Enhanced Error Handling: Comprehensive error types that indicate whether a request is retryable.
- 🔄 Intelligent Retry: Exponential backoff with jitter for resilience.
- 🚦 Rate Limiting: Optional governor-based rate limiting.
- 🧠 Advanced Parameters: Support for `reasoning_effort`, `response_format`, `tools`, and `tool_choice`.
- 📚 Rich Documentation: Complete API documentation with practical examples.
## 📦 Installation
Add this to your Cargo.toml:
```toml
[dependencies]
rainy-sdk = "0.3.0"
tokio = { version = "1.0", features = ["full"] }
```
Or install it with cargo:
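```bash
cargo add rainy-sdk
cargo add tokio --features full
```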
## 🎯 OpenAI Compatibility
Rainy SDK v0.3.0 provides 100% OpenAI API compatibility while extending support to additional providers. Use Rainy SDK as a drop-in replacement for the official OpenAI SDK:
```rust
use rainy_sdk::{ChatCompletionRequest, ChatMessage, RainyClient};

// Works exactly like the OpenAI SDK.
// (Argument values below are illustrative; see docs.rs/rainy-sdk for the exact signatures.)
let client = RainyClient::with_api_key("your-api-key")?;

let request = ChatCompletionRequest::new(
        "openai/gpt-4o",
        vec![ChatMessage::user("Hello!")],
    )
    .with_temperature(0.7)
    .with_response_format(serde_json::json!({ "type": "json_object" }));

let response = client.chat_completion(request).await?;
```
### Supported Models (100% OpenAI Compatible)
| Provider | Models | OpenAI Compatibility |
|---|---|---|
| OpenAI | `openai/gpt-4o`, `openai/gpt-5` | ✅ Native |
| Google | `google/gemini-2.5-pro`, `google/gemini-2.5-flash`, `google/gemini-2.5-flash-lite` | ✅ Via compatibility layer |
| Groq | `groq/llama-3.1-8b-instant` | ✅ OpenAI-compatible API |
| Cerebras | `cerebras/llama3.1-8b` | ✅ OpenAI-compatible API |
### Advanced OpenAI Features
- Tool Calling: Function calling with `tools` and `tool_choice`
- Structured Output: JSON Schema enforcement with `response_format`
- Reasoning Control: `reasoning_effort` parameter for Gemini models
- Log Probabilities: `logprobs` and `top_logprobs` support
- Streaming: OpenAI-compatible delta format streaming
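A rough sketch of how these parameters compose on a request. Builder methods beyond `with_response_format` (which appears in the example above) are assumptions about the API surface, not confirmed names; check docs.rs/rainy-sdk for the exact methods and argument types:

```rust
use rainy_sdk::{ChatCompletionRequest, ChatMessage, RainyClient};
use serde_json::json;

// Illustrative only: `with_reasoning_effort` and the response shape are assumed,
// not confirmed by this README.
async fn advanced_request(client: &RainyClient) -> Result<(), rainy_sdk::Error> {
    let request = ChatCompletionRequest::new(
            "google/gemini-2.5-pro",
            vec![ChatMessage::user("Return a JSON summary of this repository.")],
        )
        .with_response_format(json!({ "type": "json_object" })) // structured output
        .with_reasoning_effort("high"); // reasoning control for Gemini models (assumed name)

    let response = client.chat_completion(request).await?;
    println!("{:?}", response);
    Ok(())
}
```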
### Optional Features
Enable additional features as needed:
```toml
[dependencies]
rainy-sdk = { version = "0.3.0", features = ["rate-limiting", "tracing"] }
```
Available features:
- `rate-limiting`: Built-in rate limiting with the `governor` crate.
- `tracing`: Request/response logging with the `tracing` crate.
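With the `tracing` feature enabled, the SDK emits its request/response logs through the standard `tracing` facade, so a subscriber has to be installed before requests are made. A minimal sketch using the `tracing-subscriber` crate (an extra dependency you add yourself; any subscriber works):

```rust
// Requires the `tracing` and `tracing-subscriber` crates in Cargo.toml.
fn init_logging() {
    tracing_subscriber::fmt()
        .with_max_level(tracing::Level::DEBUG) // surface the SDK's request/response events
        .init();
}
```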
## 🚀 Quick Start
```rust
use rainy_sdk::{ChatCompletionRequest, ChatMessage, RainyClient};
use rainy_sdk::Error;

// Minimal end-to-end example. Type and method names follow the patterns used
// throughout this README; see docs.rs/rainy-sdk for the exact signatures.
#[tokio::main]
async fn main() -> Result<(), Error> {
    let client = RainyClient::with_api_key("your-api-key")?;

    let request = ChatCompletionRequest::new(
        "openai/gpt-4o",
        vec![ChatMessage::user("Say hello in one sentence.")],
    );

    let response = client.chat_completion(request).await?;
    println!("{:?}", response);

    Ok(())
}
```
## 📖 API Documentation
### Authentication
The SDK uses API key authentication. It's recommended to load the key from an environment variable.
```rust
use rainy_sdk::RainyClient;
use std::env;

// Load the API key from the environment and create the client.
// (The variable name RAINY_API_KEY is illustrative.)
let api_key = env::var("RAINY_API_KEY").expect("RAINY_API_KEY must be set");
let client = RainyClient::with_api_key(api_key)?;
```
### Core Operations
#### Health Check
Verify the API status.
```rust
# use rainy_sdk::RainyClient;
# async fn example(client: RainyClient) -> Result<(), rainy_sdk::Error> {
// Method name assumed from the SDK's health endpoint; see docs.rs/rainy-sdk.
let health = client.health_check().await?;
println!("API status: {:?}", health);
# Ok(())
# }
```
#### Chat Completions
Create a standard chat completion.
```rust
# use rainy_sdk::{ChatCompletionRequest, ChatMessage, RainyClient};
# async fn example(client: RainyClient) -> Result<(), rainy_sdk::Error> {
let request = ChatCompletionRequest::new(
    "openai/gpt-4o",
    vec![ChatMessage::user("Explain async Rust in one paragraph.")],
);
let response = client.chat_completion(request).await?;
println!("{:?}", response);
# Ok(())
# }
```
#### Streaming Chat Completions
Receive the response as a stream of events.
```rust
# use rainy_sdk::{ChatCompletionRequest, ChatMessage, RainyClient};
# use futures::StreamExt;
# async fn example(client: RainyClient) -> Result<(), rainy_sdk::Error> {
// Method name is assumed; streaming uses the OpenAI-compatible delta format
// described above. See docs.rs/rainy-sdk for the exact API.
let request = ChatCompletionRequest::new(
    "openai/gpt-4o",
    vec![ChatMessage::user("Stream a short poem.")],
);
let mut stream = client.chat_completion_stream(request).await?;
while let Some(event) = stream.next().await {
    println!("{:?}", event?);
}
# Ok(())
# }
```
#### Usage Statistics
Get credit and usage statistics.
```rust
# use rainy_sdk::RainyClient;
# async fn example(client: RainyClient) -> Result<(), rainy_sdk::Error> {
// Method name assumed from the SDK's usage endpoint; see docs.rs/rainy-sdk.
let usage = client.get_usage().await?;
println!("Credits and usage: {:?}", usage);
# Ok(())
# }
```
#### API Key Management
Manage API keys programmatically.
```rust
# use rainy_sdk::RainyClient;
# async fn example(client: RainyClient) -> Result<(), rainy_sdk::Error> {
// Method name assumed from the SDK's keys endpoint; see docs.rs/rainy-sdk.
let keys = client.list_api_keys().await?;
println!("Active keys: {:?}", keys);
# Ok(())
# }
```
## 🧪 Examples
Explore the examples/ directory for comprehensive usage examples:
- Basic Usage (`examples/basic_usage.rs`): Complete walkthrough of all SDK features.
- Chat Completion (`examples/chat_completion.rs`): Advanced chat completion patterns.
- Error Handling (`examples/error_handling.rs`): Demonstrates how to handle different error types.
Run examples with:
```bash
# Set your API key (variable name illustrative)
export RAINY_API_KEY="your-api-key"

# Run basic usage example
cargo run --example basic_usage

# Run chat completion example
cargo run --example chat_completion
```
## 🛡️ Security Considerations
- API Key Management: This SDK uses the `secrecy` crate to handle the API key, ensuring it is securely stored in memory and zeroed out when dropped. However, it is still crucial to manage the `RainyClient`'s lifecycle carefully within your application to minimize exposure.
- Rate Limiting: The optional `rate-limiting` feature is a client-side safeguard intended to prevent accidental overuse and to act as a "good citizen" towards the API. It is not a security mechanism and can be bypassed by a malicious actor. For robust abuse prevention, implement server-side monitoring, usage quotas, and API key management through your Enosis Labs dashboard.
- TLS Configuration: The client is hardened to use modern, secure TLS settings (TLS 1.2+ via the `rustls` backend) and to allow only HTTPS connections, providing strong protection against network interception.
## 🏗️ Architecture
The SDK is built with a modular architecture:
```text
src/
├── client.rs        # Main API client with request handling
├── auth.rs          # Authentication and authorization logic
├── models.rs        # Data structures and serialization
├── error.rs         # Comprehensive error handling
├── retry.rs         # Retry logic with exponential backoff
├── endpoints/       # API endpoint implementations
│   ├── chat.rs      # Chat completion endpoints
│   ├── health.rs    # Health check and monitoring
│   ├── keys.rs      # API key operations
│   ├── usage.rs     # Usage statistics and billing
│   └── user.rs      # User account management
└── lib.rs           # Public API and module exports
```
## 🤝 Contributing
We welcome contributions! Please see our Contributing Guide for details on:
- Setting up your development environment
- Code style and standards
- Testing guidelines
- Submitting pull requests
## 📄 License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
## 📞 Contact & Support
- Website: enosislabs.com
- Email: hello@enosislabs.com
- GitHub: github.com/enosislabs
- Documentation: docs.rs/rainy-sdk
## ⚠️ Disclaimer
This SDK is developed by Enosis Labs and is not officially affiliated with any AI provider mentioned (OpenAI, Anthropic, Google, etc.). The Rainy API serves as an independent gateway service that provides unified access to multiple AI providers.