§Agent Development Kit (ADK) for Rust
A flexible and modular framework for developing and deploying AI agents in Rust. While optimized for Gemini and the Google ecosystem, ADK is model-agnostic, deployment-agnostic, and compatible with other frameworks.
§Quick Start
Create your first AI agent in minutes:
use adk_rust::prelude::*;
use adk_rust::Launcher;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("GOOGLE_API_KEY")?;
    let model = GeminiModel::new(&api_key, "gemini-2.5-flash")?;

    let agent = LlmAgentBuilder::new("assistant")
        .description("A helpful AI assistant")
        .instruction("You are a friendly assistant. Answer questions concisely.")
        .model(Arc::new(model))
        .build()?;

    // Run in interactive console mode
    Launcher::new(Arc::new(agent)).run().await?;
    Ok(())
}
§Installation
Add to your Cargo.toml:
[dependencies]
adk-rust = "0.1"
tokio = { version = "1.40", features = ["full"] }
dotenv = "0.15" # For loading .env files§Feature Presets
§Feature Presets
# Full (default) - Everything included
adk-rust = "0.1"

# Minimal - Agents + Gemini + Runner only
adk-rust = { version = "0.1", default-features = false, features = ["minimal"] }

# Custom - Pick exactly what you need
adk-rust = { version = "0.1", default-features = false, features = [
    "agents", "gemini", "tools", "sessions"
] }
§Agent Types
ADK-Rust provides several agent types for different use cases:
§LlmAgent - AI-Powered Reasoning
The core agent type that uses Large Language Models for intelligent reasoning:
use adk_rust::prelude::*;
use std::sync::Arc;

let api_key = std::env::var("GOOGLE_API_KEY").map_err(|e| AdkError::Config(e.to_string()))?;
let model = GeminiModel::new(&api_key, "gemini-2.5-flash")?;

let agent = LlmAgentBuilder::new("researcher")
    .description("Research assistant with web search")
    .instruction("Search for information and provide detailed summaries.")
    .model(Arc::new(model))
    .tool(Arc::new(GoogleSearchTool::new())) // Add tools
    .build()?;
§Workflow Agents - Deterministic Pipelines
For predictable, multi-step workflows:
use adk_rust::prelude::*;
use std::sync::Arc;

// Sequential: Execute agents in order
let pipeline = SequentialAgent::new(
    "content_pipeline",
    vec![researcher, writer, reviewer]
);

// Parallel: Execute agents concurrently
let parallel = ParallelAgent::new(
    "multi_analysis",
    vec![analyst1, analyst2]
);

// Loop: Iterate until condition met
let loop_agent = LoopAgent::new("iterative_refiner", vec![refiner])
    .with_max_iterations(5);
§Multi-Agent Systems
Build hierarchical agent systems with automatic delegation:
use adk_rust::prelude::*;
use std::sync::Arc;

let coordinator = LlmAgentBuilder::new("coordinator")
    .description("Development team coordinator")
    .instruction("Delegate coding tasks to specialists.")
    .model(model)
    .sub_agent(code_agent) // Delegate to sub-agents
    .sub_agent(test_agent)
    .build()?;
§Tools
Give your agents capabilities beyond conversation:
§Function Tools - Custom Operations
Convert any async function into a tool:
use adk_rust::prelude::*;
use adk_rust::serde_json::{json, Value};
use std::sync::Arc;

async fn get_weather(_ctx: Arc<dyn ToolContext>, args: Value) -> Result<Value> {
    let city = args["city"].as_str().unwrap_or("Unknown");
    // Your weather API call here
    Ok(json!({
        "temperature": 72.0,
        "conditions": "Sunny",
        "city": city
    }))
}

let weather_tool = FunctionTool::new(
    "get_weather",
    "Get current weather for a city",
    get_weather,
);
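The finished tool is attached to an agent exactly like the built-in search tool shown earlier; model here stands for any Arc-wrapped model, as in the previous examples:

// Give the agent access to the custom function tool.
let agent = LlmAgentBuilder::new("weather_assistant")
    .instruction("Answer weather questions using the get_weather tool.")
    .model(model)
    .tool(Arc::new(weather_tool))
    .build()?;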
§Built-in Tools
Ready-to-use tools included with ADK:
- GoogleSearchTool - Web search via Google
- ExitLoopTool - Control loop termination (see the sketch below)
- LoadArtifactsTool - Access stored artifacts
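ExitLoopTool, for instance, is intended for an agent running inside a LoopAgent so the model can end the iteration early. A minimal sketch; the new() constructor is an assumption made by analogy with GoogleSearchTool::new:

// Hedged sketch: let the model terminate the loop when it decides work is done.
let refiner = LlmAgentBuilder::new("refiner")
    .instruction("Refine the draft; call the exit-loop tool once no changes remain.")
    .model(model)
    .tool(Arc::new(ExitLoopTool::new())) // assumption: new() constructor
    .build()?;
// Pass `refiner` to LoopAgent::new(...) as in the workflow section above.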
§MCP Tools - External Integrations
Connect to Model Context Protocol servers using the rmcp crate:
use adk_rust::prelude::*;
use adk_rust::tool::McpToolset;
use rmcp::{ServiceExt, transport::TokioChildProcess};
use tokio::process::Command;

// Connect to an MCP server (e.g., filesystem, database)
let client = ().serve(TokioChildProcess::new(
    Command::new("npx")
        .arg("-y")
        .arg("@anthropic/mcp-server-filesystem")
        .arg("/path/to/dir")
)?).await?;

let mcp_tools = McpToolset::new(client);

// Add all MCP tools to your agent
let agent = builder.toolset(Arc::new(mcp_tools)).build()?;
§Sessions & State
Manage conversation context and working memory:
use adk_rust::prelude::*;
use adk_rust::session::{SessionService, CreateRequest};
use adk_rust::serde_json::json;
use std::collections::HashMap;

let session_service = InMemorySessionService::new();

// Create a session
let session = session_service.create(CreateRequest {
    app_name: "my_app".to_string(),
    user_id: "user_123".to_string(),
    session_id: None,
    state: HashMap::new(),
}).await?;

// Read state (State trait provides read access)
let state = session.state();
let config = state.get("app:config"); // Returns Option<Value>
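get returns Option&lt;Value&gt;, so missing keys surface as None rather than panics. The crate also exports KEY_PREFIX_APP, KEY_PREFIX_USER, and KEY_PREFIX_TEMP constants for scoping keys; the user:/temp: key names below are illustrative assumptions, not fixed APIs:

// Handle the missing-key case explicitly.
if let Some(config) = state.get("app:config") {
    println!("app config: {config}");
}

// Hypothetical keys, shown only to illustrate the prefix convention implied by
// KEY_PREFIX_APP / KEY_PREFIX_USER / KEY_PREFIX_TEMP.
let _user_theme = state.get("user:theme");
let _scratch = state.get("temp:draft");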
§Callbacks
Intercept and customize agent behavior:
use adk_rust::prelude::*;
use std::sync::Arc;

let agent = LlmAgentBuilder::new("monitored_agent")
    .model(model)
    // Modify or inspect model responses
    .after_model_callback(Box::new(|_ctx, response| {
        Box::pin(async move {
            println!("Model responded");
            Ok(Some(response)) // Return modified response or None to keep original
        })
    }))
    // Track tool usage
    .before_tool_callback(Box::new(|_ctx| {
        Box::pin(async move {
            println!("Tool about to be called");
            Ok(None) // Continue execution
        })
    }))
    .build()?;
§Artifacts
Store and retrieve binary data (images, files, etc.):
use adk_rust::prelude::*;
use adk_rust::artifact::{ArtifactService, SaveRequest, LoadRequest};

let artifact_service = InMemoryArtifactService::new();

// Save an artifact
let response = artifact_service.save(SaveRequest {
    app_name: "my_app".to_string(),
    user_id: "user_123".to_string(),
    session_id: "session_456".to_string(),
    file_name: "sales_chart.png".to_string(),
    part: Part::Text { text: "chart data".to_string() },
    version: None,
}).await?;

// Load an artifact
let loaded = artifact_service.load(LoadRequest {
    app_name: "my_app".to_string(),
    user_id: "user_123".to_string(),
    session_id: "session_456".to_string(),
    file_name: "sales_chart.png".to_string(),
    version: None,
}).await?;
§Deployment Options
§Console Mode (Interactive CLI)
use adk_rust::prelude::*;
use adk_rust::Launcher;
use std::sync::Arc;

// Interactive chat in terminal
Launcher::new(agent).run().await?;
§Server Mode (REST API)
# Run your agent as a web server
cargo run -- serve --port 8080
Provides endpoints:
- POST /chat - Send messages
- GET /sessions - List sessions
- GET /health - Health check
§Agent-to-Agent (A2A) Protocol
Expose your agent for inter-agent communication:
use adk_rust::server::{create_app_with_a2a, ServerConfig};
use adk_rust::AgentLoader;

// Create server with A2A protocol support
let config = ServerConfig::new(agent_loader, session_service);
let app = create_app_with_a2a(config, Some("http://localhost:8080"));

// Run the server (requires axum dependency)
// let listener = tokio::net::TcpListener::bind("0.0.0.0:8080").await?;
// axum::serve(listener, app).await?;
§Observability
Built-in OpenTelemetry support for production monitoring:
use adk_rust::telemetry::{init_telemetry, init_with_otlp};

// Basic telemetry with console logging
init_telemetry("my-agent-service")?;

// Or with OTLP export for distributed tracing
// init_with_otlp("my-agent-service", "http://localhost:4317")?;

// All agent operations now emit traces and metrics
§Architecture
ADK-Rust uses a layered architecture for modularity:
┌─────────────────────────────────────────────────────────────┐
│                      Application Layer                      │
│                 Launcher • REST Server • A2A                │
├─────────────────────────────────────────────────────────────┤
│                         Runner Layer                        │
│              Agent Execution • Event Streaming              │
├─────────────────────────────────────────────────────────────┤
│                         Agent Layer                         │
│    LlmAgent • CustomAgent • Sequential • Parallel • Loop    │
├─────────────────────────────────────────────────────────────┤
│                        Service Layer                        │
│        Models • Tools • Sessions • Artifacts • Memory       │
└─────────────────────────────────────────────────────────────┘
§Feature Flags
| Feature | Description | Default |
|---|---|---|
| agents | Agent implementations | ✅ |
| models | Model integrations | ✅ |
| gemini | Gemini model support | ✅ |
| tools | Tool system | ✅ |
| mcp | MCP integration | ✅ |
| sessions | Session management | ✅ |
| artifacts | Artifact storage | ✅ |
| memory | Semantic memory | ✅ |
| runner | Execution runtime | ✅ |
| server | HTTP server | ✅ |
| telemetry | OpenTelemetry | ✅ |
| cli | CLI launcher | ✅ |
§Examples
The examples directory contains working examples for every feature:
- Agents: LLM agent, workflow agents, multi-agent systems
- Tools: Function tools, Google Search, MCP integration
- Sessions: State management, conversation history
- Callbacks: Logging, guardrails, caching
- Deployment: Console, server, A2A protocol
§Related Crates
ADK-Rust is composed of modular crates that can be used independently:
- adk-core - Core traits and types
- adk-agent - Agent implementations
- adk-model - LLM integrations
- adk-tool - Tool system
- adk-session - Session management
- adk-artifact - Artifact storage
- adk-runner - Execution runtime
- adk-server - HTTP server
- adk-telemetry - Observability
Re-exports§
pub use anyhow;
pub use futures;
pub use serde;
pub use serde_json;
pub use tokio;
Modules§
- agent - Agent implementations (LLM, Custom, Workflow agents).
- agent_loader - Core traits and types.
- artifact - Artifact storage.
- callbacks - Core traits and types.
- context - Core traits and types.
- error - Core traits and types.
- event - Core traits and types.
- graph - Graph-based workflow engine (LangGraph-inspired).
- instruction_template - Core traits and types.
- memory - Memory system with semantic search.
- model - Model integrations (Gemini, etc.).
- prelude - Convenience prelude for common imports.
- runner - Agent execution runtime.
- server - HTTP server (REST + A2A).
- session - Session management.
- telemetry - Telemetry (OpenTelemetry integration).
- tool - Tool system and built-in tools.
- types - Core traits and types.
- ui - Dynamic UI generation for agents.
Structs§
- Content - Core traits and types.
- Event - Core traits and types.
- EventActions - Core traits and types.
- GenerateContentConfig - Core traits and types.
- Launcher - CLI launcher for running agents.
- LlmRequest - Core traits and types.
- LlmResponse - Core traits and types.
- MemoryEntry - Core traits and types.
- MultiAgentLoader - Core traits and types.
- RunConfig - Core traits and types.
- SingleAgentLoader - CLI launcher for running agents.
- UsageMetadata - Core traits and types.
Enums§
- AdkError - Core traits and types.
- BeforeModelResult - Core traits and types.
- FinishReason - Core traits and types.
- IncludeContents - Core traits and types.
- Part - Core traits and types.
- StreamingMode - Core traits and types.
Constants§
- KEY_PREFIX_APP - Core traits and types.
- KEY_PREFIX_TEMP - Core traits and types.
- KEY_PREFIX_USER - Core traits and types.
Traits§
- Agent - Core traits and types.
- AgentLoader - Core traits and types.
- Artifacts - Core traits and types.
- CallbackContext - Core traits and types.
- InvocationContext - Core traits and types.
- Llm - Core traits and types.
- Memory - Core traits and types.
- ReadonlyContext - Core traits and types.
- ReadonlyState - Core traits and types.
- Session - Core traits and types.
- State - Core traits and types.
- Tool - Core traits and types.
- ToolContext - Core traits and types.
- Toolset - Core traits and types.
Functions§
- inject_session_state - Core traits and types.
Type Aliases§
- AfterAgentCallback - Core traits and types.
- AfterModelCallback - Core traits and types.
- AfterToolCallback - Core traits and types.
- BeforeAgentCallback - Core traits and types.
- BeforeModelCallback - Core traits and types.
- BeforeToolCallback - Core traits and types.
- EventStream - Core traits and types.
- GlobalInstructionProvider - Core traits and types.
- InstructionProvider - Core traits and types.
- LlmResponseStream - Core traits and types.
- Result - Core traits and types.
- ToolPredicate - Core traits and types.