Crate agentix


agentix — Multi-provider LLM agent framework for Rust.

Supports DeepSeek, OpenAI, Anthropic, and Gemini out of the box. Built on a pure stream-based architecture where Nodes are stream transformers.

Quickstart

use agentix::AgentEvent;
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut agent = agentix::deepseek(std::env::var("DEEPSEEK_API_KEY")?)
        .system_prompt("You are helpful.");

    let mut stream = agent.chat("Hello!").await?;

    while let Some(event) = stream.next().await {
        if let AgentEvent::Token(t) = event {
            print!("{t}");
        }
    }
    Ok(())
}

Agent API

All interaction goes through Agent. The runtime starts lazily on first use.

use agentix::{AgentEvent, AgentInput};
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut agent = agentix::deepseek("sk-...")
        .system_prompt("You are helpful.")
        .max_tokens(1024);

    // chat() — lazy stream, ends at Done
    let mut stream = agent.chat("What is 2 + 2?").await?;
    while let Some(ev) = stream.next().await {
        if let AgentEvent::Token(t) = ev { print!("{t}"); }
    }

    // send() — fire-and-forget, accepts &str, String, or AgentInput
    agent.send("Follow up question").await?;
    agent.send(AgentInput::Abort).await?;          // abort the current turn

    // subscribe() — continuous stream, never stops at Done
    let mut rx = agent.subscribe();
    while let Some(ev) = rx.next().await { /* ... */ }

    // sender() — share the input channel with spawned tasks
    let tx = agent.sender();
    tokio::spawn(async move {
        tx.send(AgentInput::Abort).await.ok();
    });

    // add_tool() — add tools even after the first interaction
    agent.add_tool(agentix::tool_trait::ToolBundle::new()).await;

    // usage() — accumulated token counts across all turns
    println!("{:?}", agent.usage());
    Ok(())
}

Re-exports

pub use agent::Agent;
pub use agent::AgentNode;
pub use client::LlmClient;
pub use config::AgentConfig;
pub use error::ApiError;
pub use memory::InMemory;
pub use memory::Memory;
pub use memory::SlidingWindow;
pub use memory::TokenSlidingWindow;
pub use memory::LlmSummarizer;
pub use msg::CustomEvent;
pub use msg::LlmEvent;
pub use msg::AgentEvent;
pub use msg::AgentInput;
pub use provider::AnthropicProvider;
pub use provider::DeepSeekProvider;
pub use provider::GeminiProvider;
pub use provider::OpenAIProvider;
pub use provider::Provider;
pub use request::ImageContent;
pub use request::ImageData;
pub use request::Message;
pub use request::Request;
pub use request::ResponseFormat;
pub use request::ToolChoice;
pub use request::UserContent;
pub use request::ToolCall;
pub use context::SharedContext;
pub use node::Node;
pub use node::TapNode;
pub use node::PromptNode;
pub use types::UsageStats;
pub use tool_trait::Tool;
pub use tool_trait::ToolBundle;
pub use tool_trait::ToolOutput;
pub use schemars;
pub use serde;
pub use serde_json;
pub use async_trait;
pub use futures;

Modules

agent
client
config
context
error
markers
memory
msg
node
provider
raw
Raw API data structures
request
Unified request layer.
tool_trait
types
Shared types used across the agent, raw provider, and request layers.

Functions

anthropic
Create an agent backed by the Anthropic Messages API. Default model: claude-opus-4-5.
deepseek
Create an agent backed by the DeepSeek API. Default model: deepseek-chat.
gemini
Create an agent backed by the Google Gemini API. Default model: gemini-2.0-flash.
openai
Create an agent backed by the OpenAI API (or any compatible endpoint). Default model: gpt-4o.
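Since each constructor above creates an agent with the builder API shown in the Agent API section, switching providers is a one-line change. A minimal sketch (the environment-variable names are illustrative, not prescribed by the crate):

```rust
use agentix::AgentEvent;
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Pick a provider; each constructor accepts an API key and
    // supports the same builder methods (system_prompt, max_tokens, ...).
    let mut agent = agentix::anthropic(std::env::var("ANTHROPIC_API_KEY")?)
        .system_prompt("You are helpful.");
    // Swap in another provider by changing only this line, e.g.:
    // let mut agent = agentix::gemini(std::env::var("GEMINI_API_KEY")?);
    // let mut agent = agentix::openai(std::env::var("OPENAI_API_KEY")?);

    let mut stream = agent.chat("Hi!").await?;
    while let Some(ev) = stream.next().await {
        if let AgentEvent::Token(t) = ev {
            print!("{t}");
        }
    }
    Ok(())
}
```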

Attribute Macros

tool
Annotate an impl Tool for X block (or a single fn) to generate the full Tool trait implementation.
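This page does not document the macro's full syntax, so the following is only a hedged sketch of the `impl Tool for X` form it describes. The `Weather` struct, the method name, and its signature are hypothetical, not part of the documented API:

```rust
use agentix::{tool, Tool};

struct Weather;

// Hypothetical sketch: the exact method shape and any attribute
// arguments the macro accepts are not documented on this page.
#[tool]
impl Tool for Weather {
    /// Returns the current temperature for a city.
    async fn current_temperature(&self, city: String) -> String {
        format!("22°C in {city}")
    }
}
```

Per the description above, the macro can also annotate a single fn instead of a whole impl block.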