Crate adk_rust

§Agent Development Kit (ADK) for Rust

A flexible and modular framework for developing and deploying AI agents in Rust. While optimized for Gemini and the Google ecosystem, ADK is model-agnostic, deployment-agnostic, and compatible with other frameworks.

§Quick Start

Create your first AI agent in minutes:

use adk_rust::prelude::*;
use adk_rust::Launcher;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("GOOGLE_API_KEY")?;
    let model = GeminiModel::new(&api_key, "gemini-2.5-flash")?;

    let agent = LlmAgentBuilder::new("assistant")
        .description("A helpful AI assistant")
        .instruction("You are a friendly assistant. Answer questions concisely.")
        .model(Arc::new(model))
        .build()?;

    // Run in interactive console mode
    Launcher::new(Arc::new(agent)).run().await?;
    Ok(())
}

§Installation

Add to your Cargo.toml:

[dependencies]
adk-rust = "0.1"
tokio = { version = "1.40", features = ["full"] }
dotenv = "0.15"  # For loading .env files
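
With dotenv in place, the GOOGLE_API_KEY read in the Quick Start can come from a local .env file; a minimal sketch of the first lines of main:

// Load variables from a .env file into the process environment, if one exists.
dotenv::dotenv().ok();
let api_key = std::env::var("GOOGLE_API_KEY")?;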

§Feature Presets

# Full (default) - Everything included
adk-rust = "0.1"

# Minimal - Agents + Gemini + Runner only
adk-rust = { version = "0.1", default-features = false, features = ["minimal"] }

# Custom - Pick exactly what you need
adk-rust = { version = "0.1", default-features = false, features = [
    "agents", "gemini", "tools", "sessions"
] }

§Agent Types

ADK-Rust provides several agent types for different use cases:

§LlmAgent - AI-Powered Reasoning

The core agent type that uses Large Language Models for intelligent reasoning:

use adk_rust::prelude::*;
use std::sync::Arc;

let api_key = std::env::var("GOOGLE_API_KEY").map_err(|e| AdkError::Config(e.to_string()))?;
let model = GeminiModel::new(&api_key, "gemini-2.5-flash")?;

let agent = LlmAgentBuilder::new("researcher")
    .description("Research assistant with web search")
    .instruction("Search for information and provide detailed summaries.")
    .model(Arc::new(model))
    .tool(Arc::new(GoogleSearchTool::new()))  // Add tools
    .build()?;

§Workflow Agents - Deterministic Pipelines

For predictable, multi-step workflows:

use adk_rust::prelude::*;
use std::sync::Arc;

// Sequential: Execute agents in order
let pipeline = SequentialAgent::new(
    "content_pipeline",
    vec![researcher, writer, reviewer]
);

// Parallel: Execute agents concurrently
let parallel = ParallelAgent::new(
    "multi_analysis",
    vec![analyst1, analyst2]
);

// Loop: Iterate until condition met
let loop_agent = LoopAgent::new("iterative_refiner", vec![refiner])
    .with_max_iterations(5);

§Multi-Agent Systems

Build hierarchical agent systems with automatic delegation:

use adk_rust::prelude::*;
use std::sync::Arc;

let coordinator = LlmAgentBuilder::new("coordinator")
    .description("Development team coordinator")
    .instruction("Delegate coding tasks to specialists.")
    .model(model)
    .sub_agent(code_agent)   // Delegate to sub-agents
    .sub_agent(test_agent)
    .build()?;

§Tools

Give your agents capabilities beyond conversation:

§Function Tools - Custom Operations

Convert any async function into a tool:

use adk_rust::prelude::*;
use adk_rust::serde_json::{json, Value};
use std::sync::Arc;

async fn get_weather(_ctx: Arc<dyn ToolContext>, args: Value) -> Result<Value> {
    let city = args["city"].as_str().unwrap_or("Unknown");
    // Your weather API call here
    Ok(json!({
        "temperature": 72.0,
        "conditions": "Sunny",
        "city": city
    }))
}

let weather_tool = FunctionTool::new(
    "get_weather",
    "Get current weather for a city",
    get_weather,
);
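
The tool can then be attached to an agent builder like any other tool; a minimal sketch, assuming builder is an LlmAgentBuilder configured as in the earlier examples:

// Register the custom tool on the agent.
let agent = builder.tool(Arc::new(weather_tool)).build()?;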

§Built-in Tools

Ready-to-use tools ship with ADK, such as the GoogleSearchTool used in the research agent above.
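
A minimal sketch of attaching it to a standalone agent (the model is built as in the Quick Start; the agent name is illustrative):

use adk_rust::prelude::*;
use std::sync::Arc;

// Grounds responses with live Google Search results.
let agent = LlmAgentBuilder::new("search_assistant")
    .description("Assistant that can search the web")
    .instruction("Use Google Search when you need current information.")
    .model(Arc::new(model))  // `model` built as in the Quick Start
    .tool(Arc::new(GoogleSearchTool::new()))
    .build()?;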

§MCP Tools - External Integrations

Connect to Model Context Protocol servers using the rmcp crate:

use adk_rust::prelude::*;
use adk_rust::tool::McpToolset;
use rmcp::{ServiceExt, transport::TokioChildProcess};
use tokio::process::Command;

// Connect to an MCP server (e.g., filesystem, database)
let client = ().serve(TokioChildProcess::new(
    Command::new("npx")
        .arg("-y")
        .arg("@anthropic/mcp-server-filesystem")
        .arg("/path/to/dir")
)?).await?;

let mcp_tools = McpToolset::new(client);

// Add all MCP tools to your agent
let agent = builder.toolset(Arc::new(mcp_tools)).build()?;

§Sessions & State

Manage conversation context and working memory:

use adk_rust::prelude::*;
use adk_rust::session::{SessionService, CreateRequest};
use adk_rust::serde_json::json;
use std::collections::HashMap;

let session_service = InMemorySessionService::new();

// Create a session
let session = session_service.create(CreateRequest {
    app_name: "my_app".to_string(),
    user_id: "user_123".to_string(),
    session_id: None,
    state: HashMap::new(),
}).await?;

// Read state (State trait provides read access)
let state = session.state();
let config = state.get("app:config");  // Returns Option<Value>
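
The app:, user:, and temp: key prefixes correspond to the KEY_PREFIX_APP, KEY_PREFIX_USER, and KEY_PREFIX_TEMP constants listed below. A minimal sketch of reading the other scopes, assuming the usual ADK convention (app-wide, per-user, and temporary state) and illustrative key names:

// Scope is determined by the key prefix; each read returns Option<Value>.
let user_prefs = state.get("user:preferences");
let scratch = state.get("temp:draft");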

§Callbacks

Intercept and customize agent behavior:

use adk_rust::prelude::*;
use std::sync::Arc;

let agent = LlmAgentBuilder::new("monitored_agent")
    .model(model)
    // Modify or inspect model responses
    .after_model_callback(Box::new(|_ctx, response| {
        Box::pin(async move {
            println!("Model responded");
            Ok(Some(response)) // Return modified response or None to keep original
        })
    }))
    // Track tool usage
    .before_tool_callback(Box::new(|_ctx| {
        Box::pin(async move {
            println!("Tool about to be called");
            Ok(None) // Continue execution
        })
    }))
    .build()?;

§Artifacts

Store and retrieve binary data (images, files, etc.):

use adk_rust::prelude::*;
use adk_rust::artifact::{ArtifactService, SaveRequest, LoadRequest};

let artifact_service = InMemoryArtifactService::new();

// Save an artifact
let response = artifact_service.save(SaveRequest {
    app_name: "my_app".to_string(),
    user_id: "user_123".to_string(),
    session_id: "session_456".to_string(),
    file_name: "sales_chart.png".to_string(),
    part: Part::Text { text: "chart data".to_string() },
    version: None,
}).await?;

// Load an artifact
let loaded = artifact_service.load(LoadRequest {
    app_name: "my_app".to_string(),
    user_id: "user_123".to_string(),
    session_id: "session_456".to_string(),
    file_name: "sales_chart.png".to_string(),
    version: None,
}).await?;

§Deployment Options

§Console Mode (Interactive CLI)

use adk_rust::prelude::*;
use adk_rust::Launcher;
use std::sync::Arc;

// Interactive chat in terminal
Launcher::new(agent).run().await?;

§Server Mode (REST API)

# Run your agent as a web server
cargo run -- serve --port 8080

Provides endpoints:

  • POST /chat - Send messages
  • GET /sessions - List sessions
  • GET /health - Health check

§Agent-to-Agent (A2A) Protocol

Expose your agent for inter-agent communication:

use adk_rust::server::{create_app_with_a2a, ServerConfig};
use adk_rust::AgentLoader;

// Create server with A2A protocol support
let config = ServerConfig::new(agent_loader, session_service);
let app = create_app_with_a2a(config, Some("http://localhost:8080"));

// Run the server (requires axum dependency)
// let listener = tokio::net::TcpListener::bind("0.0.0.0:8080").await?;
// axum::serve(listener, app).await?;

§Observability

Built-in OpenTelemetry support for production monitoring:

use adk_rust::telemetry::{init_telemetry, init_with_otlp};

// Basic telemetry with console logging
init_telemetry("my-agent-service")?;

// Or with OTLP export for distributed tracing
// init_with_otlp("my-agent-service", "http://localhost:4317")?;

// All agent operations now emit traces and metrics

§Architecture

ADK-Rust uses a layered architecture for modularity:

┌─────────────────────────────────────────────────────────────┐
│                    Application Layer                        │
│              Launcher • REST Server • A2A                   │
├─────────────────────────────────────────────────────────────┤
│                      Runner Layer                           │
│           Agent Execution • Event Streaming                 │
├─────────────────────────────────────────────────────────────┤
│                      Agent Layer                            │
│    LlmAgent • CustomAgent • Sequential • Parallel • Loop    │
├─────────────────────────────────────────────────────────────┤
│                     Service Layer                           │
│      Models • Tools • Sessions • Artifacts • Memory         │
└─────────────────────────────────────────────────────────────┘

§Feature Flags

Feature      Description              Default
agents       Agent implementations    ✓
models       Model integrations       ✓
gemini       Gemini model support     ✓
tools        Tool system              ✓
mcp          MCP integration          ✓
sessions     Session management       ✓
artifacts    Artifact storage         ✓
memory       Semantic memory          ✓
runner       Execution runtime        ✓
server       HTTP server              ✓
telemetry    OpenTelemetry            ✓
cli          CLI launcher             ✓

All features are enabled by the default full preset (see Feature Presets above).

§Examples

The examples directory contains working examples for every feature:

  • Agents: LLM agent, workflow agents, multi-agent systems
  • Tools: Function tools, Google Search, MCP integration
  • Sessions: State management, conversation history
  • Callbacks: Logging, guardrails, caching
  • Deployment: Console, server, A2A protocol

ADK-Rust is composed of modular crates that can be used independently.

Re-exports§

pub use anyhow;
pub use futures;
pub use serde;
pub use serde_json;
pub use tokio;

Modules§

agent (feature: agents)
Agent implementations (LLM, Custom, Workflow agents).
agent_loader
Core traits and types.
artifact (feature: artifacts)
Artifact storage.
callbacks
Core traits and types.
context
Core traits and types.
error
Core traits and types.
event
Core traits and types.
graph (feature: graph)
Graph-based workflow engine (LangGraph-inspired).
instruction_template
Core traits and types.
memory (feature: memory)
Memory system with semantic search.
model (feature: models)
Model integrations (Gemini, etc.).
prelude
Convenience prelude for common imports.
runner (feature: runner)
Agent execution runtime.
server (feature: server)
HTTP server (REST + A2A).
session (feature: sessions)
Session management.
telemetry (feature: telemetry)
Telemetry (OpenTelemetry integration).
tool (feature: tools)
Tool system and built-in tools.
types
Core traits and types.
ui (feature: ui)
Dynamic UI generation for agents.

Structs§

Content
Core traits and types.
Event
Core traits and types.
EventActions
Core traits and types.
GenerateContentConfig
Core traits and types.
Launcher (feature: cli)
CLI launcher for running agents.
LlmRequest
Core traits and types.
LlmResponse
Core traits and types.
MemoryEntry
Core traits and types.
MultiAgentLoader
Core traits and types.
RunConfig
Core traits and types.
SingleAgentLoader (feature: cli)
CLI launcher for running agents.
UsageMetadata
Core traits and types.

Enums§

AdkError
Core traits and types.
BeforeModelResult
Core traits and types.
FinishReason
Core traits and types.
IncludeContents
Core traits and types.
Part
Core traits and types.
StreamingMode
Core traits and types.

Constants§

KEY_PREFIX_APP
Core traits and types.
KEY_PREFIX_TEMP
Core traits and types.
KEY_PREFIX_USER
Core traits and types.

Traits§

Agent
Core traits and types.
AgentLoader
Core traits and types.
Artifacts
Core traits and types.
CallbackContext
Core traits and types.
InvocationContext
Core traits and types.
Llm
Core traits and types.
Memory
Core traits and types.
ReadonlyContext
Core traits and types.
ReadonlyState
Core traits and types.
Session
Core traits and types.
State
Core traits and types.
Tool
Core traits and types.
ToolContext
Core traits and types.
Toolset
Core traits and types.

Functions§

inject_session_state
Core traits and types.

Type Aliases§

AfterAgentCallback
Core traits and types.
AfterModelCallback
Core traits and types.
AfterToolCallback
Core traits and types.
BeforeAgentCallback
Core traits and types.
BeforeModelCallback
Core traits and types.
BeforeToolCallback
Core traits and types.
EventStream
Core traits and types.
GlobalInstructionProvider
Core traits and types.
InstructionProvider
Core traits and types.
LlmResponseStream
Core traits and types.
Result
Core traits and types.
ToolPredicate
Core traits and types.

Attribute Macros§

async_trait