# agentix
A Rust framework for building LLM agents and multi-agent pipelines. Supports DeepSeek, OpenAI, Anthropic (Claude), and Google Gemini out of the box — plus any OpenAI-compatible endpoint.
Agents are actor-style: send a message, observe a stream of events. Multiple agents wire together into a Graph via typed channels.
## Quickstart

```toml
[dependencies]
agentix = "0.3"
tokio = { version = "1", features = ["full"] }
```

```rust
use agentix::{deepseek, Msg};

#[tokio::main]
async fn main() {
    // Build an agent and send a message; events arrive on the agent's EventBus.
    let agent = deepseek();
    let mut rx = agent.subscribe();
    agent.send("Hello!").await;
    while let Ok(msg) = rx.recv().await {
        if let Msg::Done = msg { break; }
        println!("{msg:?}");
    }
}
```
## Providers

Four built-in providers, all using the same builder API:

```rust
// DeepSeek (default model: deepseek-chat)
let agent = deepseek()
    .model("deepseek-chat");

// OpenAI (default model: gpt-4o)
let agent = openai();

// Anthropic / Claude (default model: claude-opus-4-5)
let agent = anthropic();

// Gemini (default model: gemini-2.0-flash)
let agent = gemini();

// Any OpenAI-compatible endpoint (type name and arguments illustrative)
use agentix::OpenAiCompatible;
let agent = OpenAiCompatible::new(/* base_url, api_key, model */);
```
## Builder chain

All configuration methods return `Self`, so the whole setup is one expression:

```rust
// Example values shown; tool and memory arguments are covered below.
let agent = deepseek()
    .model("deepseek-chat")
    .system_prompt("You are a concise assistant.")
    .temperature(0.7)
    .max_tokens(1024)
    .tool(my_tool)
    .memory(my_memory);
```
## Msg — the event type

Every event that flows through an [EventBus] is a `Msg`:

| Variant | When |
|---|---|
| `TurnStart` | Generation turn begins |
| `Done` | Turn (including all tool rounds) complete |
| `User(Vec<UserContent>)` | User message submitted — text and/or images |
| `Token(String)` | LLM output — one chunk in streaming, full text in assembled view |
| `Reasoning(String)` | Reasoning trace (e.g. DeepSeek-R1) — same streaming/assembled duality |
| `ToolCall { id, name, args }` | Complete tool invocation request |
| `ToolResult { call_id, name, result }` | Tool execution result |
| `Error(String)` | Error during generation |
| `Custom(Arc<dyn CustomMsg>)` | Application-defined payload |
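Consumers typically match on these variants in a loop. A minimal, framework-free sketch of that shape (the `Msg` here is a reduced stand-in with a few variants, not the crate's type):

```rust
// Reduced stand-in for agentix's Msg enum, just to show an event-handling loop.
#[derive(Debug)]
enum Msg {
    TurnStart,
    Token(String),
    Error(String),
    Done,
}

/// Collect assistant text from a finished stream of events, stopping at Done.
fn collect_text(events: Vec<Msg>) -> String {
    let mut out = String::new();
    for ev in events {
        match ev {
            Msg::Token(chunk) => out.push_str(&chunk),
            Msg::Done => break,
            _ => {} // ignore TurnStart / Error in this sketch
        }
    }
    out
}

fn main() {
    let events = vec![
        Msg::TurnStart,
        Msg::Token("Hello, ".to_string()),
        Msg::Token("world!".to_string()),
        Msg::Done,
    ];
    assert_eq!(collect_text(events), "Hello, world!");
}
```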
## Streaming vs assembled

Subscribe to an [EventBus] in two ways:

```rust
// Raw streaming — Token arrives as individual chunks
let mut rx = agent.subscribe(); // broadcast::Receiver<Msg>

// Assembled — Token chunks folded into one Token(full_text) before Done
let stream = agent.event_bus.subscribe_assembled(); // impl Stream<Item = Msg>
```

The assembled view looks identical to what a non-streaming provider emits — same variant names, just complete content.
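The folding rule can be sketched without the framework. `Event` below is a stand-in for `Msg`, and `assemble` mimics what the assembled view does conceptually, not the crate's implementation:

```rust
#[derive(Debug, PartialEq)]
enum Event {
    Token(String),
    Done,
}

/// Fold a streamed sequence so that consecutive Token chunks become a single
/// Token(full_text), emitted before the event that follows them (e.g. Done).
fn assemble(stream: Vec<Event>) -> Vec<Event> {
    let mut buffer = String::new();
    let mut out = Vec::new();
    for ev in stream {
        match ev {
            Event::Token(chunk) => buffer.push_str(&chunk),
            other => {
                if !buffer.is_empty() {
                    out.push(Event::Token(std::mem::take(&mut buffer)));
                }
                out.push(other);
            }
        }
    }
    out
}

fn main() {
    let streamed = vec![
        Event::Token("Hel".to_string()),
        Event::Token("lo".to_string()),
        Event::Done,
    ];
    assert_eq!(
        assemble(streamed),
        vec![Event::Token("Hello".to_string()), Event::Done]
    );
}
```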
## Defining tools

Annotate `impl Tool for YourStruct` with `#[tool]`. Each method becomes a callable tool:

```rust
use agentix::tool;

struct Calculator;

// Illustrative tool — method name, parameters, and body are examples.
#[tool]
impl Tool for Calculator {
    /// Add two numbers
    /// a: the first operand
    /// b: the second operand
    fn add(&self, a: f64, b: f64) -> f64 {
        a + b
    }
}

let agent = deepseek()
    .tool(Calculator);
```

- Doc comment → tool description
- `/// param: description` lines → argument descriptions
- Return type just needs to implement `serde::Serialize`
## Memory

Two built-in memory backends, or implement [Memory] yourself:

```rust
// Backend constructors omitted here — see the [Memory] docs for the type names.

// Keep all history (default)
let agent = deepseek().memory(/* full-history backend */);

// Keep only the last N turns
let agent = deepseek().memory(/* sliding-window backend */);
```
## EventBus — observability

Every agent publishes all events to its [EventBus]. Tap any bus without affecting the agent:

```rust
// Subscribe (get a Receiver)
let mut rx = agent.subscribe();

// Tap with an async callback (spawns a background task; closure shape illustrative)
agent.event_bus.tap(|msg| async move {
    println!("{msg:?}");
});

// Assembled stream — one Token per turn instead of many chunks
use futures::StreamExt;
let mut stream = agent.event_bus.subscribe_assembled();
while let Some(msg) = stream.next().await {
    println!("{msg:?}");
}
```
## Graph — multi-agent pipelines

Wire [Node]s together with [Graph]. Each agent is a Node (it has `input()` and `output()`).

`Graph::edge(&from, &to)` reads `from`'s assembled output and feeds it as a user message into `to`'s input:

```rust
use agentix::Graph;

// Simple two-agent chain (prompts are examples)
let summariser = deepseek().system_prompt("Summarise the input in one paragraph.");
let translator = deepseek().system_prompt("Translate the input to French.");

Graph::new()
    .edge(&summariser, &translator);

summariser.send("…long article…").await;
// translator automatically receives the summarised text
```
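What an edge does can be pictured with plain std channels standing in for the crate's typed channels — a conceptual sketch, not the Graph implementation:

```rust
use std::sync::mpsc;
use std::thread;

/// Forward every assembled output from one node into another node's input,
/// as Graph::edge does conceptually (strings stand in for user messages).
fn edge(from: mpsc::Receiver<String>, to: mpsc::Sender<String>) -> thread::JoinHandle<()> {
    thread::spawn(move || {
        for assembled in from {
            // each upstream output becomes a downstream user message
            if to.send(assembled).is_err() {
                break; // downstream hung up
            }
        }
    })
}

fn main() {
    let (sum_tx, sum_rx) = mpsc::channel(); // summariser's output
    let (tr_tx, tr_rx) = mpsc::channel();   // translator's input
    let handle = edge(sum_rx, tr_tx);

    sum_tx.send("A one-paragraph summary.".to_string()).unwrap();
    drop(sum_tx); // closing the upstream ends the edge task

    handle.join().unwrap();
    assert_eq!(tr_rx.recv().unwrap(), "A one-paragraph summary.");
}
```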
## PromptTemplate

A lightweight [Node] that renders a template before forwarding:

```rust
// Template string and var are examples chosen to match the output below.
let prompt = PromptTemplate::new("Translate the following to {lang}:\n{input}")
    .var("lang", "Japanese");
let agent = deepseek();

Graph::new().edge(&prompt, &agent);

// Send a raw user message into the template
prompt.input().send("Hello world").await.unwrap();
// agent receives: "Translate the following to Japanese:\nHello world"
```

Variables: `{input}` is replaced by the incoming `Msg::User` text; other `{key}` placeholders are pre-set with `.var(key, value)`.
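The substitution rule fits in a few lines — `render` below is a stand-in for illustration, not PromptTemplate's actual code:

```rust
use std::collections::HashMap;

/// Stand-in for PromptTemplate's substitution: {input} takes the incoming
/// user text; every other {key} comes from the .var() presets.
fn render(template: &str, input: &str, vars: &HashMap<&str, &str>) -> String {
    let mut out = template.replace("{input}", input);
    for (key, value) in vars {
        out = out.replace(&format!("{{{key}}}"), value);
    }
    out
}

fn main() {
    let mut vars = HashMap::new();
    vars.insert("lang", "Japanese");
    assert_eq!(
        render("Translate the following to {lang}:\n{input}", "Hello world", &vars),
        "Translate the following to Japanese:\nHello world"
    );
}
```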
## OutputParser

A lightweight [Node] that transforms assembled text before forwarding:

```rust
let agent = deepseek()
    .system_prompt("Answer with a single number only.");
// Parser closure is illustrative — here it just trims whitespace.
let parser = OutputParser::new(|text: String| text.trim().to_string());

Graph::new().edge(&agent, &parser);
// parser.output() emits Msg::User(vec!["7".into()]) (or whatever the model returned)
```
## Middleware

Middlewares run on every message crossing any edge. Return `None` to drop:

```rust
// Closure shapes are illustrative.
Graph::new()
    .middleware(|msg| { println!("edge: {msg:?}"); Some(msg) }) // observe and pass through
    .middleware(|msg| Some(msg))                                // or return None to drop
    .edge(&a, &b)
    .edge(&b, &c);
```
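The drop semantics compose like a chain of `Option`-returning functions — a framework-free sketch:

```rust
/// A middleware transforms a message or drops it by returning None.
type Middleware = fn(String) -> Option<String>;

/// Run a message through the chain; the first None short-circuits the edge.
fn run_chain(msg: String, chain: &[Middleware]) -> Option<String> {
    chain.iter().try_fold(msg, |m, mw| mw(m))
}

fn drop_empty(m: String) -> Option<String> {
    if m.is_empty() { None } else { Some(m) }
}

fn uppercase(m: String) -> Option<String> {
    Some(m.to_uppercase())
}

fn main() {
    let chain: [Middleware; 2] = [drop_empty, uppercase];
    assert_eq!(run_chain("hi".to_string(), &chain), Some("HI".to_string()));
    assert_eq!(run_chain(String::new(), &chain), None); // dropped at the first stage
}
```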
## Full pipeline

```rust
// Prompts and closures are example values.
let prompt = PromptTemplate::new("Rate this essay from 1 to 10:\n{input}");
let scorer = deepseek().system_prompt("Reply with a single number.");
let parser = OutputParser::new(|text: String| text.trim().to_string());
let logger = deepseek().system_prompt("Summarise the score you receive.");

Graph::new()
    .middleware(|msg| Some(msg))
    .edge(&prompt, &scorer)
    .edge(&scorer, &parser)
    .edge(&parser, &logger);

prompt.input().send("…essay text…").await.unwrap();
```
## Custom Node

Implement [Node] to plug any async processor into a graph:

```rust
use agentix::{Msg, Node};
use tokio::sync::mpsc;

// Skeleton only — see the [Node] docs for the exact trait signature.
struct MyNode {
    // hold mpsc endpoints for input() and output(), plus any state
}

impl Node for MyNode {
    // expose input() and output(); spawn a task that reads from input,
    // transforms each Msg, and writes to output
}
```
## Multimodal (vision)

Send images alongside text using `send_parts`:

```rust
use agentix::UserContent;

// Constructor names below are illustrative.

// URL image
agent.send_parts(vec![
    UserContent::text("What is in this image?"),
    UserContent::image_url("https://example.com/cat.png"),
]).await;

// Base64 image
let bytes = std::fs::read("photo.jpg").unwrap();
agent.send_parts(vec![
    UserContent::text("Describe this photo"),
    UserContent::image(bytes),
]).await;
```

For plain text, `agent.send("…")` still works unchanged.
## MCP tools

Use external processes as tools via the Model Context Protocol:

```toml
[dependencies]
agentix = { version = "0.3", features = ["mcp"] }
```

```rust
use agentix::McpTool;

// Constructor arguments are illustrative — point McpTool at an MCP server.
let agent = deepseek()
    .tool(McpTool::new(/* server command or endpoint */));
```
## Exposing tools as an MCP server

```toml
[dependencies]
agentix = { version = "0.3", features = ["mcp-server"] }
```

### Stdio (Claude Desktop / MCP Studio)

```rust
// Server type and serve_stdio are illustrative sketches of the mcp-server feature.
use agentix::McpServer;

struct Calculator;

#[tokio::main]
async fn main() {
    McpServer::new()
        .tool(Calculator)
        .serve_stdio()
        .await
        .unwrap();
}
```

### HTTP (Streamable HTTP transport)

```rust
McpServer::new()
    .tool(Calculator)
    .serve_http("127.0.0.1:8080") // address illustrative
    .await?;
```

### Custom Axum routing

```rust
use axum::Router;

let service = McpServer::new()
    .tool(Calculator)
    .into_http_service();
let router = Router::new().nest_service("/mcp", service);

let listener = tokio::net::TcpListener::bind("127.0.0.1:8080").await?;
axum::serve(listener, router).await?;
```
## Contributing

PRs welcome. Keep changes focused; update public API docs when behaviour changes.
## License

MIT OR Apache-2.0