# Rust GenAI
A Rust client library for interacting with Google's Generative AI (Gemini) API using the Interactions API.
## Features
- Simple, intuitive API for making requests to Google's Generative AI models
- Support for both single-shot and streaming interactions
- Stateful conversations with automatic context management via `previous_interaction_id`
- Function calling with both manual and automatic execution
- Automatic function discovery at compile time using procedural macros
- Structured output with JSON schema enforcement via `with_response_format()` (see the sketch after this list)
- Built-in tool support: Google Search grounding, URL context, and code execution
- Comprehensive error handling with detailed error types
- Async/await support with Tokio
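As a quick illustration of the structured-output feature, the sketch below asks the model for JSON that matches a schema. The parameter type of `with_response_format()` is not documented in this README, so the `serde_json` schema value, the prompt, and the response handling here are assumptions; client setup is covered in the Usage section below.

```rust
use serde_json::json;

// Hypothetical schema: ask the model to reply with a {"city", "country"} object.
let schema = json!({
    "type": "object",
    "properties": {
        "city":    { "type": "string" },
        "country": { "type": "string" }
    },
    "required": ["city", "country"]
});

let response = client.interaction()
    .with_model("gemini-3-flash-preview")
    .with_text("Where is the Eiffel Tower? Answer as JSON.")
    .with_response_format(schema) // argument assumed to be a JSON Schema value
    .create()
    .await?;

println!("{response:?}"); // the reply should now conform to the requested schema
```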
## External Documentation
For authoritative Gemini API documentation, consult these sources:
| Document | Description |
|---|---|
| Interactions API Reference | API specification and endpoint details |
| Interactions API Guide | Usage patterns and best practices |
| Function Calling Guide | Function declaration and execution |
| Thought Signatures | Reasoning and thought content |
## Installation

Add this to your `Cargo.toml`:

```toml
[dependencies]
genai-rs = "0.4.0"
genai-rs-macros = "0.4.0" # Only if using the procedural macros
tokio = { version = "1.0", features = ["full"] }
serde_json = "1.0"
futures = "0.3" # Only if using streaming responses
```
## Prerequisites
- Rust 1.88 or later (edition 2024)
- A Google AI API key with access to Gemini models (get one from Google AI Studio)
## Usage
See examples/ for complete, runnable examples covering all features.
### Simple Interaction

```rust
use genai_rs::Client;
use std::env;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Build the client from the GEMINI_API_KEY environment variable.
    // (`Client::new` and the response handling below are illustrative; see examples/ for the exact API.)
    let client = Client::new(env::var("GEMINI_API_KEY")?);

    let response = client.interaction()
        .with_model("gemini-3-flash-preview")
        .with_text("Explain Rust's ownership model in one sentence.")
        .create()
        .await?;

    println!("{response:?}");
    Ok(())
}
```
### Streaming Responses

```rust
use genai_rs::Client;
use futures::{pin_mut, StreamExt};

// Inside an async function, with `client` constructed as in the simple example:
let stream = client.interaction()
    .with_model("gemini-3-flash-preview")
    .with_text("Write a short poem about Rust.")
    .create_stream();

pin_mut!(stream);

while let Some(chunk) = stream.next().await {
    // Handle each streamed chunk as it arrives (printing is illustrative).
    println!("{chunk:?}");
}
```
See docs/STREAMING_API.md for stream types, resume capability, and patterns.
### Stateful Conversations

```rust
// Prompts and arguments below are illustrative; method names follow this README.

// First turn - set system instruction
let response1 = client.interaction()
    .with_model("gemini-3-flash-preview")
    .with_text("Hi, my name is Alice.")
    .with_system_instruction("You are a concise, friendly assistant.")
    .with_store_enabled()
    .create()
    .await?;

// Second turn - chain with previous
let response2 = client.interaction()
    .with_model("gemini-3-flash-preview")
    .with_previous_interaction(&response1.id) // field name assumed
    .with_text("What is my name?") // Model remembers: "Alice"
    .create()
    .await?;
```
Key inheritance rules:

- `systemInstruction`: Inherited (only send it on the first turn)
- `tools`: NOT inherited (must be resent on each user-message turn; see the sketch below)
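For example, a later turn that still relies on a function tool has to attach that tool again even though the conversation context is inherited. The sketch below assumes a `get_weather` function registered via `with_function()` (see Function Calling) and an `id` field on the previous response:

```rust
// Third turn: context is inherited through the previous interaction ID,
// but the tool must be attached again on every user-message turn.
let response3 = client.interaction()
    .with_model("gemini-3-flash-preview")
    .with_previous_interaction(&response2.id) // field name assumed
    .with_function(get_weather) // resend the tool; registration shape assumed
    .with_text("What's the weather where I live?")
    .create_with_auto_functions()
    .await?;
```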
See docs/MULTI_TURN_FUNCTION_CALLING.md for comprehensive patterns.
### Function Calling
Three approaches for client-side function calling:
| Approach | State | Best For |
|---|---|---|
| `#[tool]` macro | Stateless | Simple tools, quick prototyping |
| `ToolService` | Stateful | DB connections, API clients, shared config |
| Manual | Flexible | Custom execution logic, rate limiting |
#### Automatic with `#[tool]` Macro

```rust
use genai_rs_macros::tool; // import path assumed

/// Get the weather in a location
#[tool]
fn get_weather(location: String) -> String {
    // Illustrative body and signature; see the Function Calling Guide for what #[tool] accepts.
    format!("Sunny and 22°C in {location}")
}

let result = client.interaction()
    .with_model("gemini-3-flash-preview")
    .with_text("What's the weather in Tokyo?")
    .with_function(get_weather) // registration shape assumed
    .create_with_auto_functions()
    .await?;

println!("{result:?}");
```
#### Stateful with `ToolService`

For tools that need shared state (database pools, API clients, configuration):

```rust
use genai_rs::ToolService; // import path assumed

// `MyToolService` stands in for your own type implementing `ToolService`.
let service = MyToolService::new(db_pool);

let result = client.interaction()
    // ...model and prompt as in the examples above...
    .with_tool_service(service)
    .create_with_auto_functions()
    .await?;
```
See examples/tool_service.rs for a complete example.
### Built-in Tools

```rust
// Prompts are illustrative; the built-in tool toggles are shown without arguments.

// Google Search grounding
let response = client.interaction()
    .with_google_search()
    .with_text("What are the latest Rust release notes?")
    .create().await?;

// Code execution (Python sandbox)
let response = client.interaction()
    .with_code_execution()
    .with_text("Compute the sum of the first 100 primes.")
    .create().await?;

// URL context
let response = client.interaction()
    .with_url_context()
    .with_text("Summarize https://www.rust-lang.org in two sentences.")
    .create().await?;
```
### Thinking Mode

```rust
use genai_rs::ThinkingLevel; // import path assumed

let response = client.interaction()
    .with_thinking_level(ThinkingLevel::High) // level name illustrative
    .with_text("Plan a three-day trip to Kyoto.")
    .create().await?;

for thought in response.thoughts {
    println!("{thought:?}");
}
```
### Multimodal Input

```rust
// Add images, audio, video, or documents
let response = client.interaction()
    .with_model("gemini-3-flash-preview")
    .with_text("Describe this image.")
    .add_image_file("photo.jpg").await?
    .create().await?;
```

All media types follow the same pattern: `add_*_file()`, `add_*_data()`, `add_*_uri()`.
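As an illustration of that pattern for images, the sketch below uses all three variants. Only the method-name pattern comes from this README; the parameter lists (a path, raw bytes plus a MIME type, a URL string) and the `png_bytes` variable are assumptions.

```rust
let response = client.interaction()
    .with_model("gemini-3-flash-preview")
    .with_text("Compare these three images.")
    .add_image_file("local/diagram.png").await?    // read from a local file
    .add_image_data(png_bytes, "image/png").await? // raw bytes + MIME type (signature assumed)
    .add_image_uri("https://storage.googleapis.com/my-bucket/photo.jpg").await? // remote reference (signature assumed)
    .create()
    .await?;
```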
## Logging

Enable debug logging:

```bash
RUST_LOG=genai_rs=debug
```

For wire-level API debugging without configuring a logging backend:

```bash
LOUD_WIRE=1
```

See docs/LOGGING_STRATEGY.md for details on log levels and `LOUD_WIRE` output.
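If you would rather configure a logging backend in code, a typical setup with the `env_logger` crate looks like the sketch below. This assumes the crate emits standard `log` records; `env_logger` is an extra dependency, not one listed in the Installation section.

```rust
fn main() {
    // Honors RUST_LOG, e.g. RUST_LOG=genai_rs=debug
    env_logger::init();

    // ...construct the Client and make requests as in the examples above...
}
```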
## Project Structure

- `genai-rs` (root): Public API crate with `Client`, `InteractionBuilder`, the HTTP layer (`src/http/`), and type modules
- `genai-rs-macros/`: Procedural macro for `#[tool]`
## Forward Compatibility

This library follows the Evergreen spec philosophy: unknown API types deserialize into `Unknown` variants instead of failing. Always include wildcard arms in `match` statements:
```rust
match output {
    // ...handle the variants you care about...
    // A wildcard arm keeps the code compiling when new output types appear.
    _ => {}
}
```
## Error Handling

Two main error types:

- `GenaiError`: API/network errors (`Http`, `Parse`, `Json`, `Utf8`, `Api`, `Internal`, `InvalidInput`, `MalformedResponse`, `Timeout`, `ClientBuild`)
- `FunctionError`: Function calling errors (`ArgumentMismatch`, `ExecutionError`)
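A minimal sketch of propagating these errors with `?`; it assumes `GenaiError` is exported from the crate root and is the error type returned by `create()`:

```rust
use genai_rs::{Client, GenaiError};

async fn ask(client: &Client, prompt: &str) -> Result<(), GenaiError> {
    let response = client.interaction()
        .with_model("gemini-3-flash-preview")
        .with_text(prompt)
        .create()
        .await?; // any GenaiError propagates to the caller

    println!("{response:?}");
    Ok(())
}
```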
## Troubleshooting

- "API key not valid": Verify `GEMINI_API_KEY` is set correctly
- "Model not found": Use a valid model name (e.g., `gemini-3-flash-preview`)
- Functions not executing: Use `create_with_auto_functions()` for automatic execution
- Image URL blocked: Use Google Cloud Storage URLs or base64-encoded images
## Testing
See CLAUDE.md for test assertion strategies and development guidelines.
## License
MIT License - see LICENSE for details.
## Contributing
Contributions welcome! See CLAUDE.md for development guidelines.