# adk-gemini

Rust client library for Google's Gemini API — content generation, streaming, function calling, embeddings, image/speech generation, batch processing, caching, and Vertex AI.
## Overview
adk-gemini is a comprehensive Rust client for the Google Gemini API, maintained as part of the ADK-Rust project. It provides full coverage of the Gemini API surface:
- Content generation (text, images, audio)
- Real-time streaming responses
- Function calling and tool integration (including Google Search and URL Context)
- Thinking mode (Gemini 2.5 / Gemini 3)
- Text embeddings
- Image generation and editing
- Text-to-speech (single and multi-speaker)
- Batch processing
- Content caching
- File upload and management
- Structured JSON output
- Grounding with Google Search
- Vertex AI (Google Cloud) support with ADC, service accounts, and WIF
- Multimodal input (images, video, PDF, audio)
## Installation

```toml
[dependencies]
adk-gemini = "0.3.0"
```

Or through adk-model:

```toml
[dependencies]
adk-model = { version = "0.3.0", features = ["gemini"] }
```
## Quick Start

```rust
use adk_gemini::Gemini;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Gemini::new(std::env::var("GEMINI_API_KEY")?)?;

    let response = client
        .generate_content()
        .with_user_message("Explain Rust ownership in one sentence.")
        .execute()
        .await?;

    println!("{}", response.text());
    Ok(())
}
```
## Client Constructors

| Constructor | Description |
|---|---|
| `Gemini::new(api_key)` | Default model (`gemini-2.5-flash`) via `v1beta` |
| `Gemini::pro(api_key)` | Gemini 2.5 Pro |
| `Gemini::with_model(api_key, model)` | Specific model |
| `Gemini::with_v1(api_key)` | Stable `v1` API |
| `Gemini::with_model_v1(api_key, model)` | Specific model on `v1` |
| `Gemini::with_base_url(api_key, url)` | Custom endpoint |
| `Gemini::with_google_cloud(api_key, project, location)` | Vertex AI |
| `Gemini::with_google_cloud_adc(project, location)` | Vertex AI with ADC |
| `Gemini::with_service_account_json(json)` | Service account (auto-detects project) |
| `Gemini::with_google_cloud_wif_json(json, project, location, model)` | Workload Identity Federation |
## Examples

### Streaming

```rust
use adk_gemini::Gemini;
use futures::TryStreamExt;

let client = Gemini::new(std::env::var("GEMINI_API_KEY")?)?;

let mut stream = client
    .generate_content()
    .with_system_prompt("You are a helpful assistant.")
    .with_user_message("Tell me a story about a brave robot.")
    .execute_stream()
    .await?;

while let Some(chunk) = stream.try_next().await? {
    print!("{}", chunk.text());
}
```
### Function Calling

```rust
use adk_gemini::prelude::*;
use schemars::{schema_for, JsonSchema};
use serde::{Deserialize, Serialize};

// Tool parameters are described with a schemars-derived schema
// (the struct and field names here are illustrative).
#[derive(JsonSchema, Serialize, Deserialize)]
struct GetWeatherArgs {
    /// City and country, e.g. "Tokyo, Japan"
    location: String,
}

// The exact declaration-builder arguments were elided in the source;
// this shows the general shape.
let get_weather = FunctionDeclaration::new(
    "get_weather",
    "Get the current weather for a location",
    schema_for!(GetWeatherArgs),
);

let response = client
    .generate_content()
    .with_user_message("What's the weather like in Tokyo?")
    .with_function(get_weather)
    .with_function_calling_mode(FunctionCallingMode::Any)
    .execute()
    .await?;

if let Some(call) = response.function_calls().first() {
    println!("Function: {} Args: {}", call.name, call.args);
}
```
### Google Search Grounding

```rust
use adk_gemini::Tool;

let response = client
    .generate_content()
    .with_user_message("What are today's top technology headlines?")
    .with_tool(Tool::google_search())
    .execute()
    .await?;

println!("{}", response.text());

// Access grounding metadata
if let Some(metadata) = response
    .candidates
    .first()
    .and_then(|c| c.grounding_metadata.as_ref())
{
    println!("{:?}", metadata);
}
```
### URL Context

```rust
use adk_gemini::Tool;

let response = client
    .generate_content()
    .with_user_message("Summarize the page at https://www.rust-lang.org")
    .with_tool(Tool::url_context())
    .execute()
    .await?;
```
### Thinking Mode (Gemini 2.5 / Gemini 3 Pro)

```rust
let client = Gemini::pro(std::env::var("GEMINI_API_KEY")?)?;

let response = client
    .generate_content()
    .with_user_message("How many prime numbers are there below 100?")
    .with_thinking_budget(8192)
    .with_thoughts_included(true)
    .execute()
    .await?;

// Access the model's reasoning
for thought in response.thoughts() {
    println!("{}", thought);
}
```
### Structured JSON Output

```rust
use serde_json::{json, Value};

let schema = json!({
    "type": "object",
    "properties": {
        "name": { "type": "string" },
        "age": { "type": "integer" }
    },
    "required": ["name", "age"]
});

let response = client
    .generate_content()
    .with_user_message("Generate a fictional person as JSON.")
    .with_response_mime_type("application/json")
    .with_response_schema(schema)
    .execute()
    .await?;

let parsed: Value = serde_json::from_str(&response.text())?;
```
### Text Embeddings

```rust
use adk_gemini::{Gemini, TaskType};

// Use an embedding-capable model (the model name was elided in the source).
let client = Gemini::with_model(
    std::env::var("GEMINI_API_KEY")?,
    "models/text-embedding-004".to_string(),
)?;

let response = client
    .embed_content()
    .with_text("Rust is a systems programming language.")
    .with_task_type(TaskType::RetrievalDocument)
    .execute()
    .await?;

println!("Embedding length: {}", response.embedding.values.len());
```
### Image Generation

```rust
// Use an image-capable model (the model name was elided in the source).
let client = Gemini::with_model(
    std::env::var("GEMINI_API_KEY")?,
    "models/gemini-2.0-flash-preview-image-generation".to_string(),
)?;

let response = client
    .generate_content()
    .with_user_message("Generate an image of a sunset over mountains.")
    .execute()
    .await?;

// Response contains inline image data (base64-encoded)
for candidate in &response.candidates {
    // Inspect the candidate's content parts for inline image blobs here.
}
```
### Text-to-Speech

```rust
use adk_gemini::prelude::*;

// Use a TTS-capable model (the model name was elided in the source).
let client = Gemini::with_model(
    std::env::var("GEMINI_API_KEY")?,
    "models/gemini-2.5-flash-preview-tts".to_string(),
)?;

let response = client
    .generate_content()
    .with_user_message("Say cheerfully: Have a wonderful day!")
    // Speech options (audio output, voice, single vs. multi-speaker)
    // are set through the generation config.
    .with_generation_config(GenerationConfig::default())
    .execute()
    .await?;
```
### Content Caching

```rust
use std::time::Duration;

// `long_document` is a String holding the large context to cache.
let cache = client
    .create_cache()
    .with_display_name("my-cache")?
    .with_system_instruction("You are an expert on the provided document.")
    .with_user_message(long_document)
    .with_ttl(Duration::from_secs(3600))
    .execute()
    .await?;

// Reuse the cache across multiple queries
let response = client
    .generate_content()
    .with_cached_content(&cache)
    .with_user_message("Summarize the key points.")
    .execute()
    .await?;
```
### Batch Processing

```rust
let request1 = client
    .generate_content()
    .with_user_message("What is the capital of France?")
    .build();

let request2 = client
    .generate_content()
    .with_user_message("What is the capital of Japan?")
    .build();

let batch = client
    .batch_generate_content()
    .with_request(request1)
    .with_request(request2)
    .execute()
    .await?;
```
### Multi-Turn Conversation

```rust
let response1 = client
    .generate_content()
    .with_system_prompt("You are a helpful assistant.")
    .with_user_message("My name is Alice.")
    .execute()
    .await?;

// Replay the history, feeding the model's previous reply back in.
let response2 = client
    .generate_content()
    .with_system_prompt("You are a helpful assistant.")
    .with_user_message("My name is Alice.")
    .with_model_message(response1.text())
    .with_user_message("What is my name?")
    .execute()
    .await?;
```
### Vertex AI (Google Cloud)

```rust
// API key auth
let client = Gemini::with_google_cloud(api_key, "my-project", "us-central1")?;

// Application Default Credentials
let client = Gemini::with_google_cloud_adc("my-project", "us-central1")?;

// Service account
let sa_json = std::fs::read_to_string("/path/to/service-account.json")?;
let client = Gemini::with_google_cloud_service_account_json(sa_json)?;

// Workload Identity Federation
let wif_json = std::fs::read_to_string("/path/to/wif-credentials.json")?;
let client = Gemini::with_google_cloud_wif_json(wif_json, "my-project", "us-central1", "gemini-2.5-flash")?;
```
### Generation Config

```rust
use adk_gemini::GenerationConfig;

let response = client
    .generate_content()
    .with_user_message("Write a haiku about the sea.")
    // Only a subset of the available fields is shown here.
    .with_generation_config(GenerationConfig {
        temperature: Some(0.7),
        max_output_tokens: Some(256),
        ..Default::default()
    })
    .execute()
    .await?;
```
## API Modules

| Module | Description |
|---|---|
| `generation` | Content generation (text, images, audio) |
| `embedding` | Text embedding generation |
| `batch` | Batch processing for multiple requests |
| `files` | File upload and management |
| `cache` | Content caching for reusable contexts |
| `safety` | Content moderation and safety settings |
| `tools` | Function calling and tool integration |
| `models` | Core primitive types (`Content`, `Part`, `Role`, `Blob`) |
| `prelude` | Convenient re-exports of commonly used types |
## Environment Variables

```bash
# Gemini API
GEMINI_API_KEY=your-api-key
# or
GOOGLE_API_KEY=your-api-key

# Vertex AI (Google Cloud)
GOOGLE_CLOUD_PROJECT=my-project
GOOGLE_CLOUD_LOCATION=us-central1
GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
```
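The interplay between the two API-key variables can be sketched with a small stdlib-only helper. This is an illustration, not the crate's actual code: the function name and the GEMINI_API_KEY-before-GOOGLE_API_KEY precedence are assumptions.

```rust
use std::env;

// Hypothetical helper: prefer GEMINI_API_KEY, fall back to GOOGLE_API_KEY.
// The lookup function is injected so the precedence is easy to test.
fn resolve_api_key(get: impl Fn(&str) -> Option<String>) -> Option<String> {
    get("GEMINI_API_KEY").or_else(|| get("GOOGLE_API_KEY"))
}

fn main() {
    // Real usage: read the process environment.
    let key = resolve_api_key(|name| env::var(name).ok());
    println!("API key set: {}", key.is_some());

    // Precedence check with a stubbed environment that only
    // defines GOOGLE_API_KEY.
    let only_google = |name: &str| match name {
        "GOOGLE_API_KEY" => Some("demo-key".to_string()),
        _ => None,
    };
    assert_eq!(resolve_api_key(only_google), Some("demo-key".to_string()));
}
```

Injecting the lookup keeps the example deterministic regardless of what is set in the surrounding shell.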
## Running Examples
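Assuming the repository ships examples in the standard Cargo `examples/` directory, they can be run by name; the example name below is a placeholder:

```shell
# Provide credentials first
export GEMINI_API_KEY=your-api-key

# Run one example by name (placeholder; see the examples/ directory)
cargo run --example streaming
```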
## Related Crates

- adk-rust - Meta-crate with all components
- adk-model - Multi-provider LLM integrations (uses adk-gemini internally)
- adk-core - Core `Llm` trait
- adk-agent - Agent implementations
## License

MIT

Original work Copyright (c) 2024 @flachesis
Modifications Copyright (c) 2024 Zavora AI

## Part of ADK-Rust

This crate is part of the ADK-Rust framework for building AI agents in Rust.
## Attribution

This crate is a fork of the excellent gemini-rust library by @flachesis. We are deeply grateful for their work in creating and maintaining this high-quality Gemini API client.

### Upstream Project

- Repository: github.com/flachesis/gemini-rust
- Crates.io: crates.io/crates/gemini-rust
- Original Author: @flachesis
### Why a Fork?
The ADK-Rust project requires certain extensions for deep integration with the Agent Development Kit — exporting additional types (e.g., GroundingMetadata, GroundingChunk) for grounding support, future ADK-specific extensions for agent workflows, and workspace-level version management. We regularly sync with upstream to incorporate improvements and fixes.
### Our Commitment
- Staying aligned with the upstream gemini-rust project as much as possible
- Contributing back any general improvements that benefit the broader community
- Maintaining attribution and respecting the original MIT license
- Minimizing divergence — only adding ADK-specific extensions when necessary
### Acknowledgments
- @flachesis — Creator and maintainer of the original gemini-rust library
- @npatsakula — Major contributions to the upstream project