agentix — Multi-provider LLM agent framework for Rust.
Supports DeepSeek, OpenAI, Anthropic, and Gemini out of the box.
Agents are actor-style handles; multiple agents wire together into a
Graph via typed Msg channels.
§Quickstart
use agentix::Msg;

#[tokio::main]
async fn main() {
    let agent = agentix::deepseek(std::env::var("DEEPSEEK_API_KEY").unwrap())
        .system_prompt("You are helpful.")
        .max_tokens(1024);

    let mut rx = agent.subscribe();
    agent.send("Hello!").await;

    while let Ok(msg) = rx.recv().await {
        match msg {
            Msg::Token(t) => print!("{t}"),
            Msg::Done => break,
            _ => {}
        }
    }
}

§Multi-agent pipeline
use agentix::{Graph, Node, PromptTemplate, OutputParser, Msg};

#[tokio::main]
async fn main() {
    let prompt = PromptTemplate::new("Score this review 0-10:\n{input}");
    let scorer = agentix::deepseek(std::env::var("KEY").unwrap())
        .system_prompt("Respond with only JSON: {\"score\": N}");
    let parser = OutputParser::new(|s| {
        serde_json::from_str::<serde_json::Value>(&s)
            .ok()
            .and_then(|v| v["score"].as_i64().map(|n| n.to_string()))
            .unwrap_or_else(|| "0".into())
    });

    let _handle = Graph::new()
        .middleware(|msg| { eprintln!("[edge] {msg:?}"); Some(msg) })
        .edge(&prompt, &scorer)
        .edge(&scorer, &parser)
        .into_handle();

    prompt.input()
        .send(Msg::User(vec!["Great product, fast shipping!".into()]))
        .await
        .unwrap();
}

§Assembled vs streaming events
Every EventBus can be consumed two ways:
use agentix::Msg;
use futures::StreamExt; // for `.next()` on the assembled stream

let agent = agentix::deepseek(std::env::var("KEY").unwrap());

// Raw — one Msg::Token per streaming chunk
let mut rx = agent.subscribe();

// Assembled — many Token chunks folded into one Token(full_text)
let mut stream = Box::pin(agent.event_bus().subscribe_assembled());
while let Some(msg) = stream.next().await {
    match msg {
        Msg::Token(full) => println!("{full}"),
        Msg::Done => break,
        _ => {}
    }
}

Re-exports§
pub use bus::EventBus;
pub use client::LlmClient;
pub use config::AgentConfig;
pub use error::ApiError;
pub use memory::InMemory;
pub use memory::Memory;
pub use memory::SlidingWindow;
pub use msg::CustomMsg;
pub use msg::Msg;
pub use provider::AnthropicProvider;
pub use provider::DeepSeekProvider;
pub use provider::GeminiProvider;
pub use provider::OpenAIProvider;
pub use provider::Provider;
pub use request::ImageContent;
pub use request::ImageData;
pub use request::Message;
pub use request::Request;
pub use request::ResponseFormat;
pub use request::ToolChoice;
pub use request::UserContent;
pub use node::Graph;
pub use node::GraphHandle;
pub use node::MiddlewareFn;
pub use node::Node;
pub use node::OutputParser;
pub use node::PromptTemplate;
pub use tool_trait::Tool;
pub use tool_trait::ToolBundle;
pub use schemars;
pub use serde;
pub use serde_json;
pub use async_trait;
Modules§
- bus
- client
- config
- context
- error
- markers
- memory
- msg
- node
- provider
- raw
- Raw API data structures
- request
- Unified request layer.
- tool_trait
- types
- Shared types used across the agent, raw provider, and request layers.
Structs§
- Agent
- A clonable, actor-style agent handle.
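The actor-handle pattern behind Agent — a cheaply clonable handle whose clones all feed one worker — can be sketched with plain std channels. This is a simplified illustration only, not agentix's implementation; `Handle`, `spawn_worker`, and the echo worker are hypothetical names for the sketch:

```rust
use std::sync::mpsc;
use std::thread;

// Cloning the handle clones only the channel sender;
// every clone talks to the same worker thread.
#[derive(Clone)]
struct Handle {
    tx: mpsc::Sender<String>,
}

impl Handle {
    fn send(&self, msg: &str) {
        self.tx.send(msg.to_string()).unwrap();
    }
}

// Spawn a worker that owns its state and drains the inbox,
// returning the handle plus a receiver for its output events.
fn spawn_worker() -> (Handle, mpsc::Receiver<String>) {
    let (tx, rx) = mpsc::channel::<String>();
    let (out_tx, out_rx) = mpsc::channel::<String>();
    thread::spawn(move || {
        for msg in rx {
            out_tx.send(format!("echo: {msg}")).unwrap();
        }
    });
    (Handle { tx }, out_rx)
}

fn main() {
    let (handle, events) = spawn_worker();
    let h2 = handle.clone(); // cheap clone, same worker
    handle.send("hello");
    h2.send("world");
    drop(handle);
    drop(h2); // dropping all senders ends the worker loop
    for ev in events {
        println!("{ev}"); // prints "echo: hello" then "echo: world"
    }
}
```

In agentix the same shape appears asynchronously: `send` enqueues to the agent's task, and `subscribe` taps the outgoing event stream.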
Functions§
- anthropic
Create an agent backed by the Anthropic Messages API. Default model: claude-opus-4-5.
- deepseek
Create an agent backed by the DeepSeek API. Default model: deepseek-chat.
- gemini
Create an agent backed by the Google Gemini API. Default model: gemini-2.0-flash.
- openai
Create an agent backed by the OpenAI API (or any compatible endpoint). Default model: gpt-4o.