# agentix
A Rust framework for building LLM agents. Supports DeepSeek, OpenAI, Anthropic (Claude), and Google Gemini out of the box — plus any OpenAI-compatible endpoint. Define tools in plain Rust, plug them into an agent, and consume a stream of events as the model thinks, calls tools, and responds.
## Quickstart
Set your API key and add the dependency:
```toml
# Cargo.toml
[dependencies]
agentix = "0.1"
futures = "0.3"
tokio = { version = "1", features = ["full"] }
serde = { version = "1", features = ["derive"] }
```
```rust
use agentix::{tool, AgentEvent, DeepSeekAgent, Tool};
use futures::StreamExt;

struct Math;

#[tool]
impl Tool for Math {
    /// Add two numbers.
    /// a: first number
    /// b: second number
    async fn add(&self, a: f64, b: f64) -> f64 {
        a + b
    }
}

#[tokio::main]
async fn main() {
    // Expects your DeepSeek API key in the environment.
    let mut stream = DeepSeekAgent::new()
        .with_tool(Math)
        .chat("What is 21 + 21?");

    while let Some(event) = stream.next().await {
        if let AgentEvent::Token(token) = event {
            print!("{token}");
        }
    }
}
```
The agent runs the full loop for you: it calls the model, dispatches any tool calls, feeds the results back, and keeps going until the model stops requesting tools.
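The control flow of that loop can be sketched in miniature (illustrative pseudologic, not agentix internals; `run_loop` and `call_model` are stand-ins):

```rust
// Rough shape of the agent loop: call the model, run any requested tools,
// feed results back, repeat until no tools are requested. Illustrative only.
fn run_loop(mut call_model: impl FnMut(&[String]) -> (String, Vec<String>)) -> String {
    let mut history = vec!["user: What is 2 + 2?".to_string()];
    loop {
        let (reply, tool_calls) = call_model(&history);
        if tool_calls.is_empty() {
            return reply; // model stopped requesting tools
        }
        for name in tool_calls {
            // dispatch the tool and append its result to history
            history.push(format!("tool:{name} -> result"));
        }
    }
}

fn main() {
    let mut turn = 0;
    let answer = run_loop(|_history| {
        turn += 1;
        if turn == 1 {
            (String::new(), vec!["add".to_string()]) // first turn: model asks for a tool
        } else {
            ("4".to_string(), Vec::new()) // second turn: final answer
        }
    });
    assert_eq!(answer, "4");
    println!("{answer}");
}
```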
## Defining tools

Annotate an `impl Tool for YourStruct` block with `#[tool]`. Each method becomes a callable tool:

- Doc comment on each method → tool description
- `/// param: description` lines → argument descriptions
- Return type just needs to be `serde::Serialize` — the macro handles the JSON schema
```rust
use agentix::{tool, Tool};

struct Math;

#[tool]
impl Tool for Math {
    /// Multiply two numbers.
    /// a: first number
    /// b: second number
    async fn multiply(&self, a: f64, b: f64) -> f64 {
        a * b
    }
}

// or just add it to an async fn
#[tool]
/// Divide two numbers.
/// a: first number
/// b: second number
async fn divide(a: f64, b: f64) -> f64 {
    a / b
}
```
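For the divide tool, the schema the macro derives plausibly looks like this (exact shape is the macro's concern; this is illustrative):

```json
{
  "name": "divide",
  "description": "Divide two numbers.",
  "parameters": {
    "type": "object",
    "properties": {
      "a": { "type": "number", "description": "first number" },
      "b": { "type": "number", "description": "second number" }
    },
    "required": ["a", "b"]
  }
}
```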
One struct can have multiple methods — they register as separate tools. Stack as many tools as you need with .with_tool(...).
## Streaming
Call .streaming() to get token-by-token output instead of waiting for the full response:
```rust
let mut stream = DeepSeekAgent::new()
    .streaming()
    .with_tool(Math)
    .chat("What is 7 times 6?");

while let Some(event) = stream.next().await {
    match event {
        AgentEvent::Token(token) => print!("{token}"),
        _ => {}
    }
}
```
## AgentEvent reference
| Variant | When | Notes |
|---|---|---|
| `Token(String)` | Model is speaking | Streaming: one fragment per chunk. Non-streaming: whole reply at once. |
| `ReasoningToken(String)` | Model is thinking | Only from reasoning models (e.g. `deepseek-reasoner`). |
| `ToolCall(ToolCallChunk)` | Tool call in progress | `chunk.id`, `chunk.name`, `chunk.delta`. Streaming: multiple per call. Non-streaming: one per call. |
| `ToolResult(ToolCallResult)` | Tool finished | `result.name`, `result.args`, `result.result`. |
## Using a different model or provider
Four providers are built in, each with its own typed agent and correct wire format:
```rust
use agentix::{AnthropicAgent, DeepSeekAgent, GeminiAgent, OpenAiAgent};

// Model names and URLs below are illustrative.

// DeepSeek (default base URL: https://api.deepseek.com)
let agent = DeepSeekAgent::new(); // deepseek-chat
let agent = DeepSeekAgent::new().with_model("deepseek-reasoner");

// DeepSeek via a custom endpoint (e.g. OpenRouter)
let agent = DeepSeekAgent::custom("https://openrouter.ai/api/v1", "deepseek/deepseek-chat");

// OpenAI — official API
let agent = OpenAiAgent::official("gpt-4o");

// OpenAI — any compatible endpoint
let agent = OpenAiAgent::new("http://localhost:8000/v1", "local-model");

// Anthropic (Claude) — official API
let agent = AnthropicAgent::official("claude-sonnet-4-0");

// Anthropic — custom endpoint
let agent = AnthropicAgent::new("https://my-proxy.example/v1", "claude-sonnet-4-0");

// Gemini — official API
let agent = GeminiAgent::official("gemini-2.5-flash");

// Gemini — custom endpoint
let agent = GeminiAgent::new("https://my-proxy.example", "gemini-2.5-flash");
```
All four agent types share the same builder API (.streaming(), .with_tool(), .with_system_prompt(), etc.) and produce the same AgentEvent stream.
## Custom top-level request fields (`extra_body`)
The extra_body mechanism merges arbitrary top-level JSON fields into the HTTP request body. Useful for provider-specific or experimental options not modelled by the typed request structure.
Fields are flattened into the top-level JSON, so they appear as peers to messages, model, etc. Avoid colliding with those reserved keys.
```rust
use agentix::DeepSeekAgent;
use serde_json::json;

// Merge a map of fields (field names and values here are illustrative)
let agent = DeepSeekAgent::new()
    .extra_body(json!({
        "temperature": 0.6,
        "top_p": 0.9,
    }));

// Or set a single field
let agent = DeepSeekAgent::new()
    .extra_field("temperature", json!(0.6));
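With a single `temperature` field merged in (values illustrative), the resulting request body would look like this — the extra field sits alongside `model` and `messages`:

```json
{
  "model": "deepseek-chat",
  "messages": [{ "role": "user", "content": "Hi" }],
  "temperature": 0.6
}
```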
## Injecting messages mid-run
Call agent.interrupt_sender() to get a channel sender that injects user messages into the running agent loop — useful when the user types something while tools are executing.
```rust
let agent = DeepSeekAgent::new()
    .streaming()
    .with_tool(Math);

// Grab the sender before consuming the agent into a stream.
let tx = agent.interrupt_sender();

// In another task, send an interrupt at any time.
tokio::spawn(async move {
    tx.send("Actually, answer in French instead.".to_string()).unwrap();
});

let mut stream = agent.chat("Run the full analysis");
```
Behaviour:
- Between turns: queued interrupts are drained before the next API call.
- During tool execution: the running tool future is cancelled, a placeholder error result is recorded, and the injected message is appended to history before the next API turn.
- The sender is a `tokio::sync::mpsc::UnboundedSender<String>` — cheap to clone, non-blocking.
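The between-turns drain can be sketched with a plain std channel (illustrative only; agentix uses tokio's unbounded sender, and `drain_queued` is a stand-in name):

```rust
use std::sync::mpsc;

// Drain everything already queued without blocking, roughly what the agent
// does between turns before the next API call. Illustrative only.
fn drain_queued(rx: &mpsc::Receiver<String>) -> Vec<String> {
    let mut queued = Vec::new();
    while let Ok(msg) = rx.try_recv() {
        queued.push(msg); // take only messages that are already waiting
    }
    queued
}

fn main() {
    let (tx, rx) = mpsc::channel();
    tx.send("use metric units".to_string()).unwrap();
    tx.send("and keep it brief".to_string()).unwrap();

    let drained = drain_queued(&rx);
    assert_eq!(drained.len(), 2);
    println!("{drained:?}");
}
```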
## MCP tools
MCP (Model Context Protocol) lets you use external processes as tools — Node scripts, Python services, anything that speaks MCP over stdio:
```toml
[dependencies]
agentix = { version = "0.1", features = ["mcp"] }
```

```rust
use agentix::{DeepSeekAgent, McpTool};

// Type and constructor names for the MCP client are illustrative.
let agent = DeepSeekAgent::new()
    .with_tool(McpTool::stdio("node", &["my-mcp-server.js"]).await?);
```
## Exposing tools as an MCP server
The mcp-server feature lets you turn any ToolBundle into a standalone MCP server so other LLM clients (Claude Desktop, MCP Studio, etc.) can call your Rust tools.
```toml
[dependencies]
agentix = { version = "0.1", features = ["mcp-server"] }
tokio = { version = "1", features = ["full"] }
```
### Stdio mode (Claude Desktop / MCP Studio)
```rust
use agentix::{McpServer, ToolBundle};

struct Math; // with a #[tool] impl as above

#[tokio::main]
async fn main() {
    // Server type and method names are illustrative.
    let tools = ToolBundle::new().with(Math);
    McpServer::new(tools).serve_stdio().await.unwrap();
}
```
Register it in claude_desktop_config.json:
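An entry along these lines works (server name and binary path are placeholders for your own):

```json
{
  "mcpServers": {
    "my-rust-tools": {
      "command": "/path/to/your/server-binary"
    }
  }
}
```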
### HTTP mode (Streamable HTTP transport)
```rust
use agentix::{McpServer, ToolBundle};

#[tokio::main]
async fn main() {
    // Method name and bind address are illustrative.
    let tools = ToolBundle::new().with(Math);
    McpServer::new(tools).serve_http("127.0.0.1:8080").await.unwrap();
}
```
### Custom routing
For custom Axum routing, use into_http_service() to get a Tower-compatible service:
```rust
use agentix::{McpServer, StreamableHttpServerConfig};
use axum::{routing::get, Router};

let service = McpServer::new(tools)
    .into_http_service(StreamableHttpServerConfig::default());

let router = Router::new()
    .nest_service("/mcp", service)
    .route("/health", get(|| async { "ok" }));

let listener = tokio::net::TcpListener::bind("127.0.0.1:8080").await?;
axum::serve(listener, router).await?;
```
## Tool Bundle
ToolBundle groups multiple Tool implementations and builds a name→index map for O(1) dispatch.
```rust
use agentix::{DeepSeekAgent, ToolBundle};

// Weather, Search, and AnotherTool are stand-ins for your own #[tool] types.
let tools = ToolBundle::new()
    .with(Math)
    .with(Weather)
    .with(Search);

let agent = DeepSeekAgent::new()
    .with_tool(tools)
    .with_tool(AnotherTool);
```
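The name→index dispatch can be sketched like this (simplified, not the real internals; `Bundle` is a stand-in type):

```rust
use std::collections::HashMap;

// Simplified model of ToolBundle's name → index map: one HashMap lookup
// resolves a tool name to its slot, so dispatch is O(1). Illustrative only.
struct Bundle {
    index: HashMap<String, usize>,
}

impl Bundle {
    fn new(names: &[&str]) -> Self {
        let index = names
            .iter()
            .enumerate()
            .map(|(i, name)| (name.to_string(), i))
            .collect();
        Bundle { index }
    }

    fn dispatch(&self, name: &str) -> Option<usize> {
        self.index.get(name).copied()
    }
}

fn main() {
    let bundle = Bundle::new(&["add", "divide", "search"]);
    assert_eq!(bundle.dispatch("divide"), Some(1));
    assert_eq!(bundle.dispatch("missing"), None);
    println!("{:?}", bundle.dispatch("divide"));
}
```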
## Contributing
PRs welcome. Keep changes focused; update public API docs when behaviour changes.
## License
MIT OR Apache-2.0