# agentix

Multi-provider LLM client for Rust — streaming, non-streaming, tool calls, and MCP support.

DeepSeek · OpenAI · Anthropic · Gemini — one unified API.
## Installation

```toml
[dependencies]
agentix = "0.7"

# Optional: Model Context Protocol (MCP) tool support
# agentix = { version = "0.7", features = ["mcp"] }
```
## Quick Start

A minimal streaming loop. The `Request::new(Provider::…, key)` constructor and import paths are reconstructed from the sections below — check the crate docs for the exact signatures:

```rust
use agentix::{LlmEvent, Provider, Request};
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let http = reqwest::Client::new();

    let mut stream = Request::new(Provider::DeepSeek, "YOUR_API_KEY")
        .user("Say hello!")
        .stream(&http)
        .await?;

    while let Some(event) = stream.next().await {
        if let LlmEvent::Token(text) = event {
            print!("{text}");
        }
    }
    Ok(())
}
```
## Providers

Four built-in providers, all using the same API (the `Request::new(Provider::…, key)` form below is an assumption reconstructed from the changelog's `Provider` enum):

```rust
use agentix::{Provider, Request};

// DeepSeek (default model: deepseek-chat)
let req = Request::new(Provider::DeepSeek, "api-key");

// OpenAI (default model: gpt-4o)
let req = Request::new(Provider::OpenAI, "api-key");

// Anthropic / Claude (default model: claude-sonnet-4-20250514)
let req = Request::new(Provider::Anthropic, "api-key");

// Gemini (default model: gemini-2.0-flash)
let req = Request::new(Provider::Gemini, "api-key");

// Any OpenAI-compatible endpoint (e.g. OpenRouter)
let req = Request::new(Provider::OpenAI, "api-key")
    .base_url("https://openrouter.ai/api/v1")
    .model("some/model-slug");
```
Request API
Request is a self-contained value type — it carries provider, credentials, model,
messages, tools, and tuning. Call stream() or complete() with a shared reqwest::Client.
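Because the request is a plain value, you can clone it per task while sharing one HTTP client. This standalone sketch models that pattern with local stand-in types (it does not depend on the crate; `fan_out` and the `String` "client" are illustrative):

```rust
use std::sync::Arc;
use std::thread;

// Stand-in for agentix::Request: a cheap, self-contained Clone value.
#[derive(Clone)]
struct Request {
    model: String,
}

// Fan a base request out to n workers; the "HTTP client" (a stand-in
// String here, in place of reqwest::Client) is shared via Arc rather
// than cloned per request.
fn fan_out(base: &Request, http: &Arc<String>, n: usize) -> Vec<String> {
    let handles: Vec<_> = (0..n)
        .map(|i| {
            let req = base.clone();      // each worker owns its own Request
            let http = Arc::clone(http); // the client is shared, not duplicated
            thread::spawn(move || format!("worker {i}: {} via {}", req.model, http))
        })
        .collect();
    handles.into_iter().map(|h| h.join().unwrap()).collect()
}

fn main() {
    let http = Arc::new(String::from("shared-client"));
    let base = Request { model: String::from("deepseek-chat") };
    for line in fan_out(&base, &http, 3) {
        println!("{line}");
    }
}
```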
### `stream()` — streaming completion

Prompt strings below are placeholders:

```rust
let http = reqwest::Client::new();

let mut stream = Request::new(Provider::DeepSeek, "api-key")
    .system_prompt("You are a concise assistant.")
    .user("Explain ownership in Rust.")
    .stream(&http)
    .await?;

while let Some(event) = stream.next().await {
    match event {
        LlmEvent::Token(text) => print!("{text}"),
        LlmEvent::Done => break,
        _ => {}
    }
}
```
### `complete()` — non-streaming completion

```rust
let resp = Request::new(Provider::DeepSeek, "api-key")
    .user("What is the capital of France?")
    .complete(&http)
    .await?;

println!("{}", resp.content);
println!("{:?}", resp.reasoning);
println!("{:?}", resp.tool_calls);
println!("{:?}", resp.usage);
```
### Builder methods

Argument values here are illustrative:

```rust
let req = Request::new(Provider::DeepSeek, "api-key")
    .model("deepseek-reasoner")
    .base_url("https://api.deepseek.com")
    .system_prompt("You are a helpful assistant.")
    .max_tokens(4096)
    .temperature(0.7)
    .retries(3, 500)       // max retries, initial delay ms
    .user("Hello!")        // convenience for adding a user message
    .message(msg)          // add any Message variant
    .messages(history)     // set full history
    .tools(tool_defs);     // set tool definitions
```
### `LlmEvent` (what you receive from `stream()`)

- `Token(String)` — incremental response text
- `Reasoning(String)` — thinking/reasoning trace (e.g. DeepSeek-R1)
- `ToolCallChunk(ToolCallChunk)` — partial tool call for real-time UI
- `ToolCall(ToolCall)` — completed tool call
- `Usage(UsageStats)` — token usage for the turn
- `Done` — stream ended
- `Error(String)` — provider error
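A typical way to consume these events is a single `match` in the stream loop. This runnable sketch mirrors the variants with a simplified local enum (the real `LlmEvent` lives in the crate and carries richer payloads such as `ToolCall` and `UsageStats`):

```rust
// Local mirror of the crate's LlmEvent, simplified so the sketch runs standalone.
enum LlmEvent {
    Token(String),
    Reasoning(String),
    ToolCall(String), // simplified: the crate carries a ToolCall struct
    Usage(u32),       // simplified: the crate carries UsageStats
    Done,
    Error(String),
}

/// Fold a sequence of events into the final response text.
fn collect_text(events: Vec<LlmEvent>) -> Result<String, String> {
    let mut text = String::new();
    for event in events {
        match event {
            LlmEvent::Token(t) => text.push_str(&t),
            LlmEvent::Reasoning(_) => {} // e.g. render in a side panel
            LlmEvent::ToolCall(_) => {}  // dispatch to your tools
            LlmEvent::Usage(_) => {}     // record token accounting
            LlmEvent::Done => break,
            LlmEvent::Error(e) => return Err(e),
        }
    }
    Ok(text)
}

fn main() {
    let events = vec![
        LlmEvent::Reasoning("thinking...".into()),
        LlmEvent::Token("Hello".into()),
        LlmEvent::Token(", world".into()),
        LlmEvent::Usage(12),
        LlmEvent::Done,
    ];
    println!("{}", collect_text(events).unwrap());
}
```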
## Defining Tools

Annotate your struct's `impl` block with `#[tool]`. Each method becomes a callable tool. A sketch — the struct and method here are illustrative, not part of the crate:

```rust
use agentix::tool;

struct Weather;

#[tool]
impl Weather {
    /// Get the current weather for a city.
    /// param city: name of the city
    async fn current(&self, city: String) -> Result<String, String> {
        // Hypothetical lookup — replace with a real data source.
        Ok(format!("18 °C and clear in {city}"))
    }
}
```

- Doc comment → tool description
- `/// param: description` lines → argument descriptions
- `Result::Err` automatically propagates as `{"error": "..."}` to the LLM
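As a rough illustration of the last point — the exact JSON the crate emits is an assumption, and `error_payload` is a hypothetical helper — an `Err` value could be shaped like this before being sent back to the model:

```rust
// Illustrative only: render a tool's Err value the way the README
// describes it reaching the model, as {"error": "..."}.
fn error_payload(msg: &str) -> String {
    format!(
        "{{\"error\": \"{}\"}}",
        msg.replace('\\', "\\\\").replace('"', "\\\"")
    )
}

fn main() {
    println!("{}", error_payload("file not found"));
}
```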
### Streaming tools

Add `#[streaming]` to yield `ToolOutput::Progress` / `ToolOutput::Result` incrementally. A shape sketch — how a streaming method actually emits output (the `OutputSink` parameter here) is an assumption:

```rust
use agentix::{tool, ToolOutput};

struct Indexer;

#[tool]
impl Indexer {
    /// Re-index the document store.
    #[streaming]
    async fn reindex(&self, out: OutputSink) { // OutputSink is hypothetical
        out.send(ToolOutput::Progress("scanning files".into())).await;
        out.send(ToolOutput::Result("re-indexed 1,204 documents".into())).await;
    }
}
```

Normal and streaming methods can be freely mixed in the same `#[tool]` block.
## MCP Tools

Use external processes as tools via the Model Context Protocol (the `stdio` arguments below are illustrative):

```rust
use agentix::McpTool;
use std::time::Duration;

let tool = McpTool::stdio("npx", &["-y", "@modelcontextprotocol/server-filesystem", "."])
    .await?
    .with_timeout(Duration::from_secs(30));

// Add to a ToolBundle alongside regular tools
let mut bundle = ToolBundle::new();
bundle.push(tool);
```
## Reliability

- Automatic retries — exponential backoff for 429 / 5xx responses
- Usage tracking — per-request token accounting across all providers
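The retry schedule can be sketched as doubling delays (the doubling rule, the cap, and `backoff_schedule` itself are assumptions for illustration, not the crate's exact policy):

```rust
/// Compute an exponential backoff schedule: initial_ms, doubling per
/// attempt, capped at max_ms. Illustrative of the retry behavior above,
/// not the crate's implementation.
fn backoff_schedule(retries: u32, initial_ms: u64, max_ms: u64) -> Vec<u64> {
    (0..retries)
        .map(|attempt| (initial_ms << attempt).min(max_ms))
        .collect()
}

fn main() {
    // e.g. .retries(4, 500): waits of 500 ms, 1 s, 2 s, 4 s between attempts
    println!("{:?}", backoff_schedule(4, 500, 30_000));
}
```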
## Changelog

### 0.8.0

- Replaced `LlmClient` with `Request` — self-contained value type with builder pattern
- Replaced `Provider` trait with `Provider` enum — `DeepSeek`, `OpenAI`, `Anthropic`, `Gemini`
- Removed shared mutable state — `Request` is `Clone`, `Send`, `Sync`; caller passes `&reqwest::Client`
- Removed `AgentConfig` from public API — all config lives in `Request` fields

### 0.7.0

- Removed `Agent` struct — `LlmClient` is now the sole entry point; callers own the loop
- Removed `Memory` trait — `InMemory`, `SlidingWindow`, `TokenSlidingWindow`, `LlmSummarizer` removed
- Removed `AgentEvent` / `AgentInput` — only `LlmEvent` remains
- New `LlmClient::complete()` — native non-streaming API for all four providers
- New `CompleteResponse` — content, reasoning, tool_calls, usage in one struct

### 0.6.0

- Non-streaming `complete()` method on `Provider` trait
- `post_json` helper for non-streaming HTTP POST with retry
- `CompleteResponse` type

### 0.5.0

- `Agent` API with `chat()`, `send()`, `subscribe()`, `add_tool()`, `abort()`, `usage()`
- Concurrent tool execution via `FuturesUnordered`
- `SlidingWindow` fix for orphaned tool messages
- Default HTTP timeouts (10 s connect, 120 s response)

### 0.4.x

- Initial multi-turn API
- DeepSeek, OpenAI, Anthropic, Gemini providers
- `#[tool]` and `#[streaming]` macros
- Memory backends, MCP tool support
## License

MIT OR Apache-2.0