rainy-sdk 0.6.0

Official Rust SDK for Rainy API by Enosis Labs v0.5.3 - Cowork API key validation, Gemini 3 models with advanced thinking capabilities, and full OpenAI compatibility
# ๐ŸŒง๏ธ Rainy SDK v0.5.3

[![Crates.io](https://img.shields.io/crates/v/rainy-sdk.svg)](https://crates.io/crates/rainy-sdk)
[![Documentation](https://docs.rs/rainy-sdk/badge.svg)](https://docs.rs/rainy-sdk)
[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
[![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/enosislabs/rainy-sdk)
[![Rust Version](https://img.shields.io/badge/rust-1.92.0%2B-orange.svg)](https://www.rust-lang.org/)

The official Rust SDK for the **Rainy API by Enosis Labs**: a unified interface for multiple AI providers, including OpenAI, Google Gemini, Groq, Cerebras, and Enosis Labs' own Astronomer models. It features advanced thinking capabilities, multimodal support, thought signatures, and full OpenAI compatibility.

## ✨ Features

- **🎯 Full OpenAI Compatibility**: Drop-in replacement for the OpenAI SDK with enhanced features
- **🚀 Unified Multi-Provider API**: Single interface for OpenAI, Google Gemini, Groq, Cerebras, and Enosis Labs Astronomer models
- **🧠 Advanced Thinking Capabilities**: Gemini 3 and 2.5 series models with configurable reasoning levels and thought signatures
- **🔐 Type-Safe Authentication**: Secure API key management with the `secrecy` crate
- **⚡ Async/Await**: Full async support with the Tokio runtime
- **📊 Rich Metadata**: Response times, provider info, token usage, credit tracking, and thinking token counts
- **🛡️ Enhanced Error Handling**: Comprehensive error types with retryability and detailed diagnostics
- **🔄 Intelligent Retry**: Exponential backoff with jitter for resilience
- **📈 Rate Limiting**: Optional governor-based rate limiting
- **🔧 Advanced Parameters**: Support for `response_format`, `tools`, `tool_choice`, `reasoning_effort`, `logprobs`, and streaming
- **🌐 Web Search Integration**: Built-in Tavily-powered web search with content extraction
- **👥 Cowork Integration**: Tier-based feature gating with Free/GoPlus/Plus/Pro/ProPlus plans
- **🎨 Multimodal Support**: Image processing and multimodal capabilities (coming soon)
- **📚 Rich Documentation**: Complete API documentation with practical examples
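The intelligent-retry strategy listed above combines capped exponential backoff with jitter. A minimal, SDK-independent sketch of that schedule follows; the function and parameter names are illustrative, not `retry.rs`'s actual API, and the random draw is passed in as `sample` so the sketch stays deterministic:

```rust
use std::time::Duration;

/// Capped exponential backoff with "full jitter": each attempt sleeps a
/// uniform random fraction of min(cap, base * 2^attempt). `sample` in
/// 0.0..=1.0 stands in for the random draw.
fn backoff_delay(attempt: u32, base_ms: u64, cap_ms: u64, sample: f64) -> Duration {
    let ceiling_ms = base_ms
        .saturating_mul(2u64.saturating_pow(attempt))
        .min(cap_ms);
    Duration::from_millis((ceiling_ms as f64 * sample) as u64)
}

fn main() {
    // With a 500 ms base and a 30 s cap, the retry ceiling doubles per
    // attempt: 500, 1000, 2000, 4000, ... capped at 30000.
    for attempt in 0..8 {
        let d = backoff_delay(attempt, 500, 30_000, 1.0);
        println!("attempt {attempt}: up to {} ms", d.as_millis());
    }
}
```

Full jitter (sleeping anywhere in `[0, ceiling]`) spreads concurrent clients apart, which is why it is preferred over retrying at the exact ceiling.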

## 📦 Installation

Add this to your `Cargo.toml`:

```toml
[dependencies]
rainy-sdk = "0.5.3"
tokio = { version = "1.47", features = ["full"] }
```

Or install it with cargo:

```bash
cargo add rainy-sdk
```

### Requirements

- **Rust**: 1.92.0 or later
- **Platform Support**: macOS, Linux, Windows

### Optional Features

Enable additional features as needed:

```toml
[dependencies]
rainy-sdk = { version = "0.5.1", features = ["rate-limiting", "tracing", "cowork"] }
```

Available features:

- `rate-limiting`: Built-in rate limiting with the `governor` crate
- `tracing`: Request/response logging with the `tracing` crate
- `cowork`: Cowork integration for tier-based feature gating (enabled by default)

## 🎯 OpenAI Compatibility

Rainy SDK v0.5.3 provides **100% OpenAI API compatibility** while extending support to additional providers. Use Rainy SDK as a drop-in replacement for the official OpenAI SDK:

```rust
use rainy_sdk::{models, ChatCompletionRequest, ChatMessage, RainyClient};

// Works exactly like OpenAI SDK
let client = RainyClient::with_api_key("your-rainy-api-key")?;

let request = ChatCompletionRequest::new(
    models::model_constants::OPENAI_GPT_4O, // or GOOGLE_GEMINI_2_5_PRO
    vec![ChatMessage::user("Hello!")]
)
.with_temperature(0.7)
.with_response_format(models::ResponseFormat::JsonObject);

let (response, metadata) = client.chat_completion(request).await?;
```

### Supported Models (100% OpenAI Compatible)

| Provider | Models | Features |
|----------|--------|----------|
| **OpenAI** | `gpt-4o`, `gpt-5`, `gpt-5-pro`, `o3`, `o4-mini` | ✅ Native OpenAI API |
| **Google Gemini 3** | `gemini-3-pro-preview`, `gemini-3-flash-preview`, `gemini-3-pro-image-preview` | ✅ Thinking, Thought Signatures, Multimodal |
| **Google Gemini 2.5** | `gemini-2.5-pro`, `gemini-2.5-flash`, `gemini-2.5-flash-lite` | ✅ Thinking, Dynamic Reasoning |
| **Groq** | `llama-3.1-8b-instant`, `llama-3.3-70b-versatile` | ✅ OpenAI-compatible API |
| **Cerebras** | `llama3.1-8b` | ✅ OpenAI-compatible API |
| **Enosis Labs** | `astronomer-1`, `astronomer-1-max`, `astronomer-1.5`, `astronomer-2`, `astronomer-2-pro` | ✅ Native Rainy API |

### Advanced OpenAI Features

- **Tool Calling**: Function calling with `tools` and `tool_choice`
- **Structured Output**: JSON Schema enforcement with `response_format`
- **Reasoning Control**: `reasoning_effort` parameter for Gemini models
- **Log Probabilities**: `logprobs` and `top_logprobs` support
- **Streaming**: OpenAI-compatible delta format streaming with tool calls
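On the wire, these parameters follow the OpenAI request schema. As a sketch of how they fit together (the field names follow the OpenAI format; the `get_weather` function itself is invented for illustration), a request combining tool calling with reasoning control serializes roughly like this:

```json
{
  "model": "gemini-3-pro-preview",
  "messages": [{ "role": "user", "content": "What's the weather in Lima?" }],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city",
        "parameters": {
          "type": "object",
          "properties": { "city": { "type": "string" } },
          "required": ["city"]
        }
      }
    }
  ],
  "tool_choice": "auto",
  "reasoning_effort": "high"
}
```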

## 🧠 Advanced Thinking Capabilities

Rainy SDK supports advanced thinking capabilities for Google Gemini 3 and 2.5 series models, enabling deeper reasoning and thought preservation across conversations.

### Gemini 3 Thinking Features

```rust
use rainy_sdk::{models, ChatCompletionRequest, ChatMessage, RainyClient, ThinkingConfig};

let request = ChatCompletionRequest::new(
    models::model_constants::GOOGLE_GEMINI_3_PRO,
    vec![ChatMessage::user("Solve this complex optimization problem step by step.")]
)
.with_thinking_config(ThinkingConfig::gemini_3(
    models::ThinkingLevel::High, // High reasoning for complex tasks
    true // Include thought summaries in response
));

let (response, metadata) = client.chat_completion(request).await?;
println!("Response: {}", response.choices[0].message.content);
// Access thinking token usage
if let Some(thinking_tokens) = metadata.thoughts_token_count {
    println!("Thinking tokens used: {}", thinking_tokens);
}
```

### Thought Signatures

Preserve reasoning context across conversation turns with encrypted thought signatures:

```rust
use rainy_sdk::{models::*, ChatMessage, EnhancedChatMessage};

let mut conversation = vec![
    // Previous messages with thought signatures...
];

// New message with preserved reasoning context
let enhanced_message = EnhancedChatMessage::with_parts(
    MessageRole::User,
    vec![
        ContentPart::text("Now apply this reasoning to the next problem..."),
        // Include thought signature from previous response
        ContentPart::with_thought_signature("encrypted_signature_here".to_string())
    ]
);
```

### Gemini 2.5 Dynamic Thinking

```rust
let config = ThinkingConfig::gemini_2_5(
    -1, // Dynamic thinking budget
    true // Include thoughts
);

let request = ChatCompletionRequest::new(
    models::model_constants::GOOGLE_GEMINI_2_5_PRO,
    messages
)
.with_thinking_config(config);
```

## ๐ŸŒ Web Search Integration

Built-in web search powered by Tavily for real-time information retrieval:

```rust
use rainy_sdk::search::{SearchOptions, SearchResponse};

let search_options = SearchOptions {
    query: "latest developments in Rust programming".to_string(),
    max_results: Some(10),
    ..Default::default()
};

let search_results = client.search_web(search_options).await?;
for result in search_results.results {
    println!("{}: {}", result.title, result.url);
}

// Extract content from specific URLs
let extracted = client.extract_content(vec!["https://example.com/article".to_string()]).await?;
println!("Content: {}", extracted.content);
```

## 👥 Cowork Integration

Tier-based feature gating with Free/GoPlus/Plus/Pro/ProPlus plans:

```rust
use rainy_sdk::{CoworkStatus, CoworkClient};

let cowork_client = CoworkClient::new(client);
let status = cowork_client.get_cowork_status().await?;

println!("Plan: {:?}", status.plan);
println!("Remaining uses: {}", status.usage.remaining_uses);

// Check feature availability
if status.can_use_web_research() {
    // Enable web search features
}
if status.can_use_document_export() {
    // Enable document generation
}
```
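The gating idea can be sketched with a plain ordered enum. The plan names come from the tiers above, but the capability thresholds below are assumptions for illustration; the real mapping is decided server-side and surfaced through `CoworkStatus`:

```rust
/// Illustrative only: deriving `Ord` on the declaration order turns
/// tier checks into simple comparisons.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
enum Plan {
    Free,
    GoPlus,
    Plus,
    Pro,
    ProPlus,
}

impl Plan {
    fn can_use_web_research(self) -> bool {
        self >= Plan::Plus // assumed threshold, not the real one
    }
    fn can_use_document_export(self) -> bool {
        self >= Plan::Pro // assumed threshold, not the real one
    }
}

fn main() {
    for plan in [Plan::Free, Plan::Plus, Plan::ProPlus] {
        println!(
            "{plan:?}: web research = {}, document export = {}",
            plan.can_use_web_research(),
            plan.can_use_document_export()
        );
    }
}
```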

## 🚀 Quick Start

```rust
use rainy_sdk::{models, ChatCompletionRequest, ChatMessage, RainyClient};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // Initialize client with your API key from environment variables
    let api_key = std::env::var("RAINY_API_KEY").expect("RAINY_API_KEY not set");
    let client = RainyClient::with_api_key(api_key)?;

    // Simple chat completion
    let response = client
        .simple_chat(
            models::model_constants::GPT_4O,
            "Hello! Tell me a short story.",
        )
        .await?;
    println!("Simple response: {}", response);

    // Advanced usage with metadata
    let request = ChatCompletionRequest::new(
        models::model_constants::CLAUDE_SONNET_4,
        vec![ChatMessage::user("Explain quantum computing in one sentence")],
    )
    .with_temperature(0.7)
    .with_max_tokens(100);

    let (response, metadata) = client.chat_completion(request).await?;
    println!("\nAdvanced response: {}", response.choices[0].message.content);
    println!("Provider: {:?}", metadata.provider.unwrap_or_default());
    println!("Response time: {}ms", metadata.response_time.unwrap_or_default());

    Ok(())
}
```

## 📖 API Documentation

### Authentication

The SDK uses API key authentication. It's recommended to load the key from an environment variable.

```rust
use rainy_sdk::RainyClient;

// Load API key from environment and create client
let api_key = std::env::var("RAINY_API_KEY").expect("RAINY_API_KEY not set");
let client = RainyClient::with_api_key(api_key)?;
```

### Core Operations

#### Health Check

Verify the API status.

```rust,no_run
# use rainy_sdk::RainyClient;
# async fn example() -> Result<(), Box<dyn std::error::Error>> {
# let client = RainyClient::with_api_key("dummy")?;
let health = client.health_check().await?;
println!("API Status: {}", health.status);
# Ok(())
# }
```

#### Chat Completions

Create a standard chat completion.

```rust,no_run
# use rainy_sdk::{RainyClient, ChatCompletionRequest, ChatMessage, models};
# async fn example() -> Result<(), Box<dyn std::error::Error>> {
# let client = RainyClient::with_api_key("dummy")?;
let messages = vec![
    ChatMessage::system("You are a helpful assistant."),
    ChatMessage::user("Explain quantum computing in simple terms."),
];

let request = ChatCompletionRequest::new(models::model_constants::GPT_4O, messages)
    .with_max_tokens(500)
    .with_temperature(0.7);

let (response, metadata) = client.chat_completion(request).await?;
if let Some(choice) = response.choices.first() {
    println!("Response: {}", choice.message.content);
}
# Ok(())
# }
```

#### Streaming Chat Completions

Receive the response as a stream of events.

```rust,no_run
# use rainy_sdk::{RainyClient, ChatCompletionRequest, ChatMessage, models};
# use futures::StreamExt;
# async fn example() -> Result<(), Box<dyn std::error::Error>> {
# let client = RainyClient::with_api_key("dummy")?;
let request = ChatCompletionRequest::new(
    models::model_constants::LLAMA_3_1_8B_INSTANT,
    vec![ChatMessage::user("Write a haiku about Rust programming")],
)
.with_stream(true);

let mut stream = client.create_chat_completion_stream(request).await?;

while let Some(chunk) = stream.next().await {
    match chunk {
        Ok(response) => {
            if let Some(choice) = response.choices.first() {
                print!("{}", choice.message.content);
            }
        }
        Err(e) => eprintln!("\nError in stream: {}", e),
    }
}
# Ok(())
# }
```

#### Usage Statistics

Get credit and usage statistics.

```rust,no_run
# use rainy_sdk::RainyClient;
# async fn example() -> Result<(), Box<dyn std::error::Error>> {
# let client = RainyClient::with_api_key("dummy")?;
// Get credit stats
let credits = client.get_credit_stats(None).await?;
println!("Current credits: {}", credits.current_credits);

// Get usage stats for the last 7 days
let usage = client.get_usage_stats(Some(7)).await?;
println!("Total requests (last 7 days): {}", usage.total_requests);
# Ok(())
# }
```

#### API Key Management

Manage API keys programmatically.

```rust,no_run
# use rainy_sdk::RainyClient;
# async fn example() -> Result<(), Box<dyn std::error::Error>> {
# let client = RainyClient::with_api_key("dummy")?;
// List all API keys
let keys = client.list_api_keys().await?;
for key in keys {
    println!("Key ID: {} - Active: {}", key.id, key.is_active);
}

// Create a new API key
let new_key = client.create_api_key("My new key", Some(30)).await?;
println!("Created key: {}", new_key.key);

// Delete the API key
client.delete_api_key(&new_key.id.to_string()).await?;
# Ok(())
# }
```

## 🧪 Examples

Explore the `examples/` directory for comprehensive usage examples:

- **Basic Usage** (`examples/basic_usage.rs`): Complete walkthrough of all SDK features.
- **Chat Completion** (`examples/chat_completion.rs`): Advanced chat completion patterns.
- **Error Handling** (`examples/error_handling.rs`): Demonstrates how to handle different error types.

Run examples with:

```bash
# Set your API key
export RAINY_API_KEY="your-api-key-here"

# Run basic usage example
cargo run --example basic_usage

# Run chat completion example
cargo run --example chat_completion
```

## ๐Ÿ›ก๏ธ Security Considerations

- **API Key Management**: This SDK utilizes the `secrecy` crate to handle the API key, ensuring it is securely stored in memory and zeroed out upon being dropped. However, it is still crucial to manage the `RainyClient`'s lifecycle carefully within your application to minimize exposure.

- **Rate Limiting**: The optional `rate-limiting` feature is intended as a client-side safeguard to prevent accidental overuse and to act as a "good citizen" towards the API. It **is not a security mechanism** and can be bypassed by a malicious actor. For robust abuse prevention, you **must** implement server-side monitoring, usage quotas, and API key management through your Enosis Labs dashboard.

- **TLS Configuration**: The client is hardened to use modern, secure TLS settings (TLS 1.2+ via the `rustls` backend) and to only allow HTTPS connections, providing strong protection against network interception.

## ๐Ÿ—๏ธ Architecture

The SDK is built with a modular architecture:

```
src/
├── auth.rs            # Authentication and API key management
├── client.rs          # Main API client with request handling
├── cowork.rs          # Tier-based feature gating and capabilities
├── endpoints/         # API endpoint implementations (internal)
├── error.rs           # Comprehensive error handling
├── models.rs          # Data structures and type definitions
├── retry.rs           # Retry logic with exponential backoff
├── search.rs          # Web search and content extraction
└── lib.rs             # Public API and module exports
```

### Key Modules

- **`client.rs`**: Core `RainyClient` with async HTTP handling and response processing
- **`models.rs`**: Complete type system including `ChatCompletionRequest`, `ThinkingConfig`, `EnhancedChatMessage`
- **`auth.rs`**: Secure authentication with the `secrecy` crate for API key management
- **`cowork.rs`**: Integration with Enosis Labs' tier system (Free/GoPlus/Plus/Pro/ProPlus)
- **`search.rs`**: Tavily-powered web search with content extraction capabilities
- **`endpoints/`**: Internal API endpoint implementations (chat, health, keys, usage, user)
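The "retryability" idea that `error.rs` implements can be sketched as a classification over HTTP status codes: transient failures (429, 5xx) are worth retrying, while other client errors are not. The enum and names below are illustrative, not the SDK's actual error type:

```rust
/// Illustrative error classification — the real SDK's error enum differs;
/// this only shows the retryability rule.
#[derive(Debug, PartialEq)]
enum ApiError {
    RateLimited,  // 429: back off and retry
    Server(u16),  // 5xx: transient, retry
    Client(u16),  // other statuses: caller-side problem, do not retry
}

impl ApiError {
    fn from_status(status: u16) -> Self {
        match status {
            429 => ApiError::RateLimited,
            500..=599 => ApiError::Server(status),
            _ => ApiError::Client(status),
        }
    }

    /// Only transient failures should enter the retry loop.
    fn is_retryable(&self) -> bool {
        matches!(self, ApiError::RateLimited | ApiError::Server(_))
    }
}

fn main() {
    for s in [400, 429, 503] {
        let err = ApiError::from_status(s);
        println!("{s}: retryable = {}", err.is_retryable());
    }
}
```

Pairing a check like this with the exponential-backoff schedule keeps retries limited to requests that can actually succeed on a second attempt.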

## ๐Ÿค Contributing

We welcome contributions! Please see our [Contributing Guide](CONTRIBUTING.md) for details on:

- Setting up your development environment
- Code style and standards
- Testing guidelines
- Submitting pull requests

## 📄 License

This project is licensed under the Apache License 2.0 - see the [LICENSE](LICENSE) file for details.

## 📞 Contact & Support

- **Website**: [enosislabs.com](https://enosislabs.com)
- **Email**: <hello@enosislabs.com>
- **GitHub**: [github.com/enosislabs](https://github.com/enosislabs)
- **Documentation**: [docs.rs/rainy-sdk](https://docs.rs/rainy-sdk)

## โš ๏ธ Disclaimer

This SDK is developed by Enosis Labs and is not officially affiliated with any AI provider mentioned (OpenAI, Anthropic, Google, etc.). The Rainy API serves as an independent gateway service that provides unified access to multiple AI providers.

---

<p align="center">
  Made with โค๏ธ by <a href="https://enosislabs.com">Enosis Labs</a>
</p>