# ds-api
A Rust SDK for building LLM agents on top of DeepSeek (and any OpenAI-compatible API). Define tools in plain Rust, plug them into an agent, and consume a stream of events as the model thinks, calls tools, and responds.
## Quickstart
Set your API key and add the dependency:
```toml
# Cargo.toml
[dependencies]
ds-api = "0.6.0"
futures = "0.3"
tokio = { version = "1", features = ["full"] }
serde = { version = "1", features = ["derive"] }
```
```rust
use ds_api::{Agent, AgentEvent};
use futures::StreamExt;
use serde::Serialize;

// A struct whose #[tool] methods the agent can call (see "Defining tools").
struct MyTools;

#[tokio::main]
async fn main() {
    let mut stream = Agent::new(/* ... */)
        .add_tool(MyTools)
        .chat("What's the weather in Paris?");

    while let Some(event) = stream.next().await {
        // Handle AgentEvent variants: tokens, tool calls, tool results.
    }
}
```
The agent runs the full loop for you: it calls the model, dispatches any tool calls, feeds the results back, and keeps going until the model stops requesting tools.
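That loop is simple enough to sketch without the SDK. Below is a toy, self-contained version (every name in it is illustrative, not part of ds-api): the model requests one tool call, the loop dispatches it and feeds the result back, and the model then answers.

```rust
// Toy agent loop: call model -> dispatch tool -> feed result back -> repeat
// until the model stops requesting tools. All names here are illustrative.
enum ModelReply {
    ToolCall { name: String, args: String },
    Answer(String),
}

// Stub model: asks for a tool once, then answers from the tool result.
fn call_model(history: &[String]) -> ModelReply {
    if history.iter().any(|m| m.starts_with("tool:")) {
        ModelReply::Answer("It's 21°C in Paris.".into())
    } else {
        ModelReply::ToolCall { name: "current_weather".into(), args: "Paris".into() }
    }
}

// Stub tool dispatcher.
fn run_tool(name: &str, args: &str) -> String {
    format!("{name}({args}) -> 21°C")
}

fn agent_loop(prompt: &str) -> String {
    let mut history = vec![format!("user: {prompt}")];
    loop {
        match call_model(&history) {
            ModelReply::ToolCall { name, args } => {
                let result = run_tool(&name, &args);
                history.push(format!("tool: {result}"));
            }
            ModelReply::Answer(text) => return text,
        }
    }
}

fn main() {
    println!("{}", agent_loop("What's the weather in Paris?"));
}
```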
## Defining tools
Annotate an `impl Tool for YourStruct` block with `#[tool]`. Each method becomes a callable tool:
- Doc comment on the impl block → tool description
- `/// param: description` lines in each method's doc comment → argument descriptions
- Return type just needs to be `serde::Serialize` — the macro handles the JSON schema
```rust
use ds_api::{tool, Tool};
use serde::Serialize;

#[derive(Serialize)]
struct Forecast {
    temp_c: f32,
}

struct Weather;

/// Weather lookups.
#[tool]
impl Tool for Weather {
    /// Get the current temperature for a city.
    /// city: name of the city to look up
    async fn current_weather(&self, city: String) -> Forecast {
        // Real lookup elided.
        Forecast { temp_c: 21.0 }
    }
}
```
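The `param: description` convention is easy to picture mechanically. Here is a toy parser for such doc lines (purely illustrative of the convention; the real `#[tool]` macro's parsing may differ):

```rust
// Toy parser for the `/// param: description` convention: splits each doc
// line on the first ": " into an (argument, description) pair.
// Illustrates the convention only, not the macro's actual implementation.
fn parse_param_docs(doc: &str) -> Vec<(String, String)> {
    doc.lines()
        .filter_map(|line| {
            let (name, desc) = line.trim().split_once(": ")?;
            Some((name.to_string(), desc.to_string()))
        })
        .collect()
}

fn main() {
    let doc = "city: name of the city to look up\nunits: metric or imperial";
    for (name, desc) in parse_param_docs(doc) {
        println!("{name} -> {desc}");
    }
}
```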
One struct can have multiple methods — they register as separate tools. Stack as many tools as you need with `.add_tool(...)`.
## Streaming
Call `.with_streaming()` to get token-by-token output instead of waiting for the full response:
```rust
let mut stream = Agent::new(/* ... */)
    .with_streaming()
    .add_tool(Weather)
    .chat("What's the weather in Paris?");

while let Some(event) = stream.next().await {
    match event {
        AgentEvent::Token(token) => print!("{token}"),
        _ => {}
    }
}
```
## `AgentEvent` reference
| Variant | When | Notes |
|---|---|---|
| `Token(String)` | Model is speaking | Streaming: one fragment per chunk. Non-streaming: whole reply at once. |
| `ReasoningToken(String)` | Model is thinking | Only from reasoning models (e.g. `deepseek-reasoner`). |
| `ToolCall(ToolCallChunk)` | Tool call in progress | `chunk.id`, `chunk.name`, `chunk.delta`. Streaming: multiple per call. Non-streaming: one per call. |
| `ToolResult(ToolCallResult)` | Tool finished | `result.name`, `result.args`, `result.result`. |
## Using a different model or provider
Any OpenAI-compatible endpoint works:
```rust
// OpenRouter
let agent = Agent::custom(/* base URL, API key, model */);

// deepseek-reasoner (think before responding)
let agent = Agent::new(/* ... */)
    .with_model("deepseek-reasoner");
```
## Injecting messages mid-run
You can send a message into a running agent loop — useful when the user types something while the agent is still executing tools:
```rust
let (agent, tx) = Agent::new(/* ... */)
    .with_streaming()
    .add_tool(Weather)
    .with_interrupt_channel();

// From any task, at any time:
tx.send(/* message */).unwrap();

// The agent picks it up after the current tool-execution round finishes.
let mut stream = agent.chat("Plan my trip to Paris.");
```
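The mechanism here is ordinary channel draining between tool rounds. A minimal std-only sketch of that pattern (this is not ds-api's actual interrupt channel, just the general idea):

```rust
use std::sync::mpsc;

// Between tool-execution rounds, drain any messages that arrived mid-run
// and append them to the conversation history.
fn drain_pending(rx: &mpsc::Receiver<String>, history: &mut Vec<String>) {
    while let Ok(msg) = rx.try_recv() {
        history.push(msg);
    }
}

fn main() {
    let (tx, rx) = mpsc::channel::<String>();
    let mut history = vec!["user: book a flight".to_string()];

    // A message arrives while the agent is still executing tools:
    tx.send("user: actually, make it a train".to_string()).unwrap();

    // ... one round of tool calls finishes, then the loop picks it up:
    drain_pending(&rx, &mut history);
    println!("{history:?}");
}
```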
## MCP tools
MCP (Model Context Protocol) lets you use external processes as tools — Node scripts, Python services, anything that speaks MCP over stdio:
```rust
// Requires the `mcp` feature
let agent = Agent::new(/* ... */)
    .add_tool(/* an MCP server connected over stdio */);
```
## familiar
This repo includes `familiar`, a full Discord + web chat app built on ds-api. It shows persistent conversation history, multi-tool agents, streaming UI, and MCP integration in a real app. See `familiar/` for details.
## Contributing
PRs welcome. Keep changes focused; update public API docs when behaviour changes.
## License
MIT OR Apache-2.0