A Rust client library for Google’s Gemini AI models.
§Overview
This library provides a fully-featured client for interacting with Google’s Gemini AI models, supporting all major API features including:
- Text generation and chat conversations
- JSON-structured outputs
- Function calling
- Safety settings and content filtering
- System instructions
- Model configuration (temperature, tokens, etc.)
§Authentication
The client requires a Gemini API key, which can be provided in two ways:
- Environment variable: GEMINI_API_KEY
- Programmatically: Client::new(api_key)
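The two approaches can also be combined: read the key from the environment when it is set, and fall back to an explicit value otherwise. This is a sketch; the exact argument type accepted by Client::new is an assumption here, not a confirmed signature.

```rust
use std::env;

fn main() {
    // Prefer the GEMINI_API_KEY environment variable; fall back to an
    // explicit key. (Assumption: Client::new accepts a string-like key.)
    let api_key = env::var("GEMINI_API_KEY")
        .unwrap_or_else(|_| "your-api-key-here".to_string());

    let _client = gemini_rs::Client::new(api_key);
}
```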
§Basic Usage
#[tokio::main]
async fn main() -> gemini_rs::Result<()> {
    // Simple chat interaction
    let response = gemini_rs::chat("gemini-2.0-flash")
        .send_message("What is Rust's ownership model?")
        .await?;
    println!("{}", response);
    Ok(())
}
§Advanced Features
The library supports advanced Gemini features through the Client and Chat types:
- Model management (client().models())
- Custom generation settings (chat.config_mut())
- Safety settings (chat.safety_settings())
- System instructions (chat.system_instruction())
- Conversation history management (chat.history_mut())
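Taken together, these hooks can be combined in a single session. The sketch below uses only the methods listed above; the config field name (temperature) and the argument types are assumptions for illustration, not confirmed signatures.

```rust
#[tokio::main]
async fn main() -> gemini_rs::Result<()> {
    let mut chat = gemini_rs::chat("gemini-2.0-flash");

    // Steer the model with a system instruction (argument type assumed).
    chat.system_instruction("You are a concise Rust tutor.");

    // Tune generation settings in place via config_mut()
    // (the `temperature` field name is an assumption).
    chat.config_mut().temperature = Some(0.2);

    // Reset the conversation history before sending
    // (assumes history_mut() returns a Vec-like collection).
    chat.history_mut().clear();

    let response = chat.send_message("Explain lifetimes briefly.").await?;
    println!("{}", response);
    Ok(())
}
```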
Modules§
- types: Contains every type used in the library
Structs§
- Chat: Simplest way to use gemini-rs, and covers 80% of use cases
- Client: Covers the 20% of use cases that Chat doesn’t
- Route
- Stream: Stream Generate Content
Functions§
- chat: Creates a new chat session with the specified Gemini model.
- client: Creates a new Gemini client instance using the default configuration.