A lightweight, ergonomic framework for building AI agents in Rust.
Machi provides the building blocks for constructing AI agents that reason, use tools, and collaborate — powered by any LLM backend.
§Core Concepts
- `Agent` — A self-contained unit with its own LLM provider, instructions, tools, and optional sub-agents.
- `Runner` — A stateless execution engine driving the ReAct loop (think → act → observe → repeat).
- `Tool` / `DynTool` — Capabilities that agents can invoke (filesystem, shell, web search, or custom).
- `ChatProvider` — A trait abstracting over LLM backends (OpenAI, Ollama, or custom).
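The think → act → observe loop can be sketched generically. The `Tool` trait, `Step` enum, and `run` function below are illustrative stand-ins for how a ReAct runner works, not machi's actual API:

```rust
// Illustrative sketch of a ReAct-style loop (think → act → observe → repeat).
// All names here are hypothetical stand-ins, not machi's real types.

trait Tool {
    fn name(&self) -> &str;
    fn call(&self, input: &str) -> String;
}

/// A trivial tool that returns its input unchanged.
struct Echo;
impl Tool for Echo {
    fn name(&self) -> &str { "echo" }
    fn call(&self, input: &str) -> String { input.to_string() }
}

/// What the model decides at each step of the loop.
enum Step {
    Act { tool: String, input: String }, // invoke a tool
    Finish(String),                      // final answer; stop looping
}

/// Drive the loop: ask the model (`think`) what to do, run the chosen
/// tool, feed the observation back, and repeat until it finishes.
fn run(tools: &[Box<dyn Tool>], mut think: impl FnMut(&str) -> Step) -> String {
    let mut observation = String::from("start");
    loop {
        match think(&observation) {
            Step::Act { tool, input } => {
                observation = tools
                    .iter()
                    .find(|t| t.name() == tool)
                    .map(|t| t.call(&input))
                    .unwrap_or_else(|| format!("unknown tool: {tool}"));
            }
            Step::Finish(answer) => return answer,
        }
    }
}

fn main() {
    let tools: Vec<Box<dyn Tool>> = vec![Box::new(Echo)];
    // A scripted "model": act once with the echo tool, then finish.
    let mut used = false;
    let answer = run(&tools, |obs| {
        if !used {
            used = true;
            Step::Act { tool: "echo".into(), input: "hello".into() }
        } else {
            Step::Finish(format!("observed: {obs}"))
        }
    });
    println!("{answer}"); // prints "observed: hello"
}
```

In the real framework the closure's role is played by an LLM call, and the `Runner` stays stateless by threading the conversation through each iteration.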
§Feature Flags
| Feature | Description |
|---|---|
| `openai` | OpenAI API backend |
| `ollama` | Ollama local LLM backend |
| `derive` | `#[tool]` proc-macro for deriving tools |
| `toolkit` | Built-in filesystem, shell, and web search tools |
| `mcp` | Model Context Protocol server integration |
| `a2a` | Agent-to-Agent protocol support |
| `wallet` | EVM wallet for blockchain interactions |
| `memory-sqlite` | SQLite-backed session persistence |
| `schema` | Structured output via JSON Schema generation |
| `full` | All of the above (default) |
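Assuming these flags map one-to-one to Cargo features, a leaner build opts out of the `full` default and selects only what it needs (version number below is a placeholder):

```toml
# Hypothetical Cargo.toml snippet — pin a real version in practice.
[dependencies]
machi = { version = "*", default-features = false, features = ["openai", "derive", "toolkit"] }
```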
§Quick Start
```rust
use machi::agent::{Agent, RunConfig};
use machi::chat::ChatRequest;
use machi::message::Message;

// Build a chat request
let request = ChatRequest::new("gpt-4o")
    .system("You are a helpful assistant.")
    .user("Hello!")
    .temperature(0.7);

// Configure an agent
let agent = Agent::new("assistant")
    .instructions("You are a helpful assistant.")
    .model("gpt-4o");

// Construct messages manually
let msgs = vec![
    Message::system("You are helpful."),
    Message::user("What is Rust?"),
];
```

§Re-exports
- `pub use error::Error;`
- `pub use error::Result;`
- `pub use agent::AgentError;`
- `pub use llms::LlmError;`
- `pub use memory::MemoryError;`
- `pub use tool::ToolError;`
- `pub use wallet::WalletError;`
§Modules
- `a2a` — A2A (Agent-to-Agent) protocol integration for machi agents.
- `agent` — Agent module — core abstractions for building AI agents.
- `audio` — Audio processing types and provider traits.
- `callback` — Callback hooks for agent lifecycle events.
- `chat` — Chat types, traits, and utilities for LLM operations.
- `embedding` — Embedding provider trait and types.
- `error` — Unified error type for the machi framework.
- `guardrail` — Guardrail module — safety checks for agent inputs and outputs.
- `llms` — LLM backend implementations.
- `mcp` — MCP (Model Context Protocol) integration for machi agents.
- `memory` — Session-based memory for AI agents.
- `message` — Unified message types for LLM chat interactions.
- `prelude` — Prelude module for convenient imports.
- `stream` — Streaming response types for LLM operations.
- `tool` — Tool trait and utilities for defining agent tools.
- `tools` — Built-in tools for agents.
- `usage` — Token usage tracking for LLM operations.
- `wallet` — Wallet module for AI agent blockchain interactions.
§Attribute Macros
- `tool` — Transforms a function into a [`machi::tool::Tool`] implementation.