//! # Aquaregia
//!
//! A provider-agnostic Rust toolkit for building AI applications and tool-using agents.
//!
//! Aquaregia provides a unified API across OpenAI, Anthropic, Google, and OpenAI-compatible services,
//! with first-class support for reasoning-aware output, streaming events, multi-step tool execution,
//! and vision/image inputs.
//!
//! ## Features
//!
//! - **Unified Provider API**: One `LlmClient` binds to one provider configuration with support for
//!   OpenAI, Anthropic, Google, and OpenAI-compatible endpoints.
//! - **Streaming & Non-Streaming**: Both `generate` and `stream` APIs with consistent event handling.
//! - **Reasoning Support**: First-class reasoning content extraction and streaming events.
//! - **Tool-Using Agents**: Multi-step agent loops with configurable tool execution and error handling.
//! - **Multimodal Vision**: Send images to vision-capable models via URL, base64, or raw bytes.
//! - **Cancellation**: All requests and agent runs support cancellation via `CancellationToken`.
//! - **Telemetry**: Optional `tracing` spans for generate, stream, and agent operations.
//!
//! ## Quick Start
//!
//! ```rust,no_run
//! use aquaregia::{GenerateTextRequest, LlmClient};
//!
//! #[tokio::main]
//! async fn main() -> Result<(), Box<dyn std::error::Error>> {
//! let client = LlmClient::openai_compatible("https://api.deepseek.com")
//!     .api_key(std::env::var("DEEPSEEK_API_KEY")?)
//!     .build()?;
//!
//! let out = client
//!     .generate(GenerateTextRequest::from_user_prompt(
//!         "deepseek-chat",
//!         "Explain Rust ownership in 3 bullet points.",
//!     ))
//!     .await?;
//!
//! println!("{}", out.output_text);
//! Ok(())
//! }
//! ```
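//!
//! ## Streaming
//!
//! The `stream` API mirrors `generate` but yields a [`TextStream`] of events
//! instead of a single response. The sketch below is illustrative only: the
//! event type and its variants are not spelled out here, so the loop body is a
//! placeholder rather than the confirmed event-handling API.
//!
//! ```rust,ignore
//! use aquaregia::{GenerateTextRequest, LlmClient};
//! use futures::StreamExt;
//!
//! async fn demo() -> Result<(), Box<dyn std::error::Error>> {
//!     let client = LlmClient::openai_compatible("https://api.deepseek.com")
//!         .api_key(std::env::var("DEEPSEEK_API_KEY")?)
//!         .build()?;
//!
//!     let mut stream = client
//!         .stream(GenerateTextRequest::from_user_prompt(
//!             "deepseek-chat",
//!             "Explain Rust ownership in 3 bullet points.",
//!         ))
//!         .await?;
//!
//!     while let Some(event) = stream.next().await {
//!         // Handle text deltas, reasoning events, tool calls, and completion here.
//!     }
//!     Ok(())
//! }
//! ```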
//!
//! ## Crate Features
//!
//! | Feature | Description |
//! | ----------- | -------------------------------------------------------------------- |
//! | `openai` | OpenAI adapter (default) |
//! | `anthropic` | Anthropic adapter (default) |
//! | `telemetry` | `tracing` spans for `generate`, `stream`, agent steps, and tool calls |
//! | `axum` | Axum SSE bridge for converting streams into SSE responses |
//!
//! ## Architecture
//!
//! - [`LlmClient`]: Entry point for creating provider-bound clients.
//! - [`BoundClient`]: Reusable client for `generate`, `stream`, and agent loops.
//! - [`Agent`]: Multi-step tool-using agent with configurable hooks.
//! - [`ModelAdapter`]: Trait for provider-specific request/response handling.
//! - [`Tool`]: Executable tool definitions with JSON Schema validation.
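//!
//! ## Agent Sketch
//!
//! [`Agent`] drives a multi-step tool loop on top of a provider-bound client.
//! The builder and method names below (`Agent::builder`, `max_steps`, `run`,
//! `output_text`) are assumptions for illustration, not the confirmed API:
//!
//! ```rust,ignore
//! use aquaregia::{Agent, LlmClient};
//!
//! async fn demo() -> Result<(), Box<dyn std::error::Error>> {
//!     let client = LlmClient::openai_compatible("https://api.deepseek.com")
//!         .api_key(std::env::var("DEEPSEEK_API_KEY")?)
//!         .build()?;
//!
//!     // Hypothetical builder: register tools, cap the step count, then run
//!     // the loop until the model stops requesting tool calls.
//!     let agent = Agent::builder(client)
//!         .max_steps(8)
//!         .build();
//!
//!     let result = agent.run("What is the weather in Paris?").await?;
//!     println!("{}", result.output_text);
//!     Ok(())
//! }
//! ```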
/// Agent runtime and builder APIs.
/// Provider-bound client types and retry behavior.
/// Unified error types and HTTP-to-error mapping helpers.
/// Provider adapter traits and concrete provider implementations.
/// SSE frame parsing helpers used by streaming adapters.
/// Tool definition, execution, and registry types.
/// Shared request/response and event types.
/// Axum SSE bridge for converting [`TextStream`] into SSE responses.
///
/// This module provides integration with the Axum web framework, allowing
/// streaming responses to be converted into Server-Sent Events (SSE) for HTTP streaming.
/// Re-export of `schemars` for use in procedural macros.
///
/// This is an internal re-export used by the `#[tool]` macro to generate
/// JSON Schema implementations without requiring users to add `schemars`
/// as a direct dependency.
pub use schemars as __aquaregia_schemars;
/// Re-export of `serde` for use in procedural macros.
///
/// This is an internal re-export used by the `#[tool]` macro to generate
/// Deserialize implementations without requiring users to add `serde`
/// as a direct dependency.
pub use serde as __aquaregia_serde;
pub use ;
pub use tool;
pub use ;
pub use ;
pub use ModelAdapter;
pub use AnthropicAdapterSettings;
pub use GoogleAdapterSettings;
pub use OpenAiAdapterSettings;
pub use OpenAiCompatibleAdapterSettings;
pub use CancellationToken;
pub use ;
pub use ;
/// Creates a typed OpenAI model reference (`openai/<model>`).
///
/// This function provides a convenient way to create a [`ModelRef`] for OpenAI models
/// with compile-time provider type safety.
///
/// # Arguments
///
/// * `model` - The OpenAI model identifier (e.g., `"gpt-4o"`, `"gpt-4o-mini"`)
///
/// # Example
///
/// ```
/// use aquaregia::openai;
///
/// let model = openai("gpt-4o");
/// assert_eq!(model.id(), "openai/gpt-4o");
/// ```
/// Creates a typed Anthropic model reference (`anthropic/<model>`).
///
/// This function provides a convenient way to create a [`ModelRef`] for Anthropic models
/// with compile-time provider type safety.
///
/// # Arguments
///
/// * `model` - The Anthropic model identifier (e.g., `"claude-sonnet-4-5"`, `"claude-3-5-sonnet"`)
///
/// # Example
///
/// ```
/// use aquaregia::anthropic;
///
/// let model = anthropic("claude-sonnet-4-5");
/// assert_eq!(model.id(), "anthropic/claude-sonnet-4-5");
/// ```
/// Creates a typed Google model reference (`google/<model>`).
///
/// This function provides a convenient way to create a [`ModelRef`] for Google Generative AI models
/// with compile-time provider type safety.
///
/// # Arguments
///
/// * `model` - The Google model identifier (e.g., `"gemini-2.0-flash"`, `"gemini-1.5-pro"`)
///
/// # Example
///
/// ```
/// use aquaregia::google;
///
/// let model = google("gemini-2.0-flash");
/// assert_eq!(model.id(), "google/gemini-2.0-flash");
/// ```
/// Creates a typed OpenAI-compatible model reference (`openai-compatible/<model>`).
///
/// This function provides a convenient way to create a [`ModelRef`] for OpenAI-compatible
/// endpoints (e.g., DeepSeek, local LLM servers) with compile-time provider type safety.
///
/// # Arguments
///
/// * `model` - The model identifier for the compatible endpoint (e.g., `"deepseek-chat"`)
///
/// # Example
///
/// ```
/// use aquaregia::openai_compatible;
///
/// let model = openai_compatible("deepseek-chat");
/// assert_eq!(model.id(), "openai-compatible/deepseek-chat");
/// ```