Rust LangGraph
Graph-native LLM workflows in Rust — inspired by LangGraph, built by the community
Not affiliated with LangChain. This is an independent Rust library with a similar programming model.
Table of contents
- Quick reference (implementers & AI agents)
- What this crate is
- Who should use it
- Installation
- Copy-paste Cargo.toml recipes
- Environment variables
- Five-minute tutorial
- Core concepts
- Graph API reference
- LLMs and agents
- Feature flags (detailed)
- Prelude and conditional exports
- Common mistakes & compile errors
- Verification commands
- Project layout
- Examples
- Documentation
- Comparison with Python LangGraph
- License & acknowledgments
Quick reference (implementers & AI agents)
Use this block as a single source of truth before writing or generating code.
| Fact | Correct value |
|---|---|
| Cargo package name (in `[dependencies]`) | `rust-langgraph` (hyphen) |
| Rust crate path (in `use`) | `rust_langgraph` (underscore) |
| Wrong | langgraph:: — that is not this crate’s name |
| Async runtime | Tokio required (#[tokio::main] or equivalent) |
| Edition | Rust 2021 |
| Default Cargo features | memory-checkpoint (enables MemorySaver) |
| LLM modules | Not in the build unless you add the matching feature |
Rule: If you use rust_langgraph::llm::ollama::..., your Cargo.toml must include features = ["ollama"] (same for openai, openrouter, anthropic). If you use create_react_agent / Tool, you must enable prebuilt and at least one LLM feature for a real model.
Human + agent doc: AGENTS.md — patterns, signatures, and pitfalls in compact form.
What this crate is
Rust LangGraph (crate name: rust-langgraph, Rust import: rust_langgraph) helps you build stateful workflows as a directed graph:
- Nodes are async functions (or types implementing `Node`) that read and return state.
- Edges connect nodes: fixed edges, or conditional edges that choose the next node from state.
- Execution follows a Pregel-style loop: run nodes, merge state, optionally checkpoint, repeat until done.
Use it for multi-step LLM apps, tool-calling agents, branching pipelines, and anything that fits “steps + shared state + optional loops.”
Who should use it
| You want… | Use… |
|---|---|
| A small graph without LLMs | StateGraph + custom State |
| Chat + tools (ReAct-style) | prebuilt + an LLM feature: create_react_agent, Tool, ToolNode |
| Local models | ollama → llm::ollama::OllamaAdapter |
| OpenAI API | openai → llm::openai::OpenAIAdapter |
| OpenRouter (many providers, one API) | openrouter → llm::openrouter::OpenRouterAdapter |
| Anthropic API | anthropic → llm::anthropic::AnthropicAdapter |
| Persistence between runs | MemorySaver (default feature) or sqlite / postgres |
Installation
Minimal Cargo.toml (graph core only — checkpoints in memory):

```toml
[dependencies]
rust-langgraph = "0.1"
tokio = { version = "1", features = ["full"] }
serde = { version = "1", features = ["derive"] }
futures = "0.3" # for StreamExt when using CompiledGraph::stream
```

Import:

```rust
use rust_langgraph::prelude::*;
```
Enable optional features as needed (see Copy-paste recipes and Feature flags).
Requirements:
- Rust 2021
- Tokio — the library is async-first
Copy-paste Cargo.toml recipes
Replace version pins if your workspace pins differently.
Graph + in-memory checkpoints only (default)

```toml
[dependencies]
rust-langgraph = "0.1"
tokio = { version = "1", features = ["full"] }
serde = { version = "1", features = ["derive"] }
futures = "0.3"
```

+ Ollama (local HTTP API)

```toml
rust-langgraph = { version = "0.1", features = ["ollama"] }
```

+ OpenAI (`OPENAI_API_KEY` for `OpenAIAdapter::new`)

```toml
rust-langgraph = { version = "0.1", features = ["openai"] }
```

+ OpenRouter (quickstart)

```toml
rust-langgraph = { version = "0.1", features = ["openrouter"] }
```

+ Anthropic (pass the key via `AnthropicAdapter::with_api_key` — no standard env var in the adapter)

```toml
rust-langgraph = { version = "0.1", features = ["anthropic"] }
```

ReAct agent (tools + graph) + Ollama

```toml
rust-langgraph = { version = "0.1", features = ["ollama", "prebuilt"] }
```

ReAct + OpenRouter

```toml
rust-langgraph = { version = "0.1", features = ["openrouter", "prebuilt"] }
```

All optional LLM adapters (for examples or experimentation)

```toml
rust-langgraph = { version = "0.1", features = [
    "ollama", "openai", "openrouter", "anthropic", "prebuilt"
] }
```

SQLite checkpoints

```toml
rust-langgraph = { version = "0.1", features = ["sqlite"] }
```

PostgreSQL checkpoints

```toml
rust-langgraph = { version = "0.1", features = ["postgres"] }
```
Environment variables
| Variable | Used by | Notes |
|---|---|---|
| `OPENAI_API_KEY` | `OpenAIAdapter::new(...)` | `with_api_key` bypasses env |
| `OPENROUTER_API_KEY` | `OpenRouterAdapter::new(...)` | `with_api_key` bypasses env |
| (none by default) | `AnthropicAdapter` | Use `AnthropicAdapter::with_api_key("sk-ant-...")` |
| (none by default) | `OllamaAdapter` | Default base `http://localhost:11434`; override with `with_base_url` |
Set secrets in the environment or inject keys explicitly in code — do not commit API keys.
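For example, in a Unix-like shell (the key values below are placeholders, not real keys):

```shell
# Provider keys for the current shell session only (placeholder values)
export OPENAI_API_KEY="sk-placeholder"
export OPENROUTER_API_KEY="sk-or-placeholder"
```

On Windows PowerShell the equivalent is `$env:OPENAI_API_KEY = "..."`.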
Five-minute tutorial
1. Define state

State must implement `State`: `Clone`, `Serialize`/`Deserialize`, `Debug`, and `merge` (how updates from multiple nodes combine).

```rust
use rust_langgraph::prelude::*;
use serde::{Deserialize, Serialize};

#[derive(Clone, Debug, Default, Serialize, Deserialize)]
struct Counter {
    n: u64,
}

// `merge` defines how concurrent updates combine; the exact trait
// signature is in the rustdoc — this take-the-newer-value reducer
// is illustrative.
impl State for Counter {
    fn merge(&mut self, other: Self) {
        self.n = other.n;
    }
}
```

2. Build a graph

```rust
let mut graph = StateGraph::<Counter>::new();
graph.add_node("inc", |mut state: Counter, _config: &Config| async move {
    state.n += 1;
    Ok(state)
});
graph.set_entry_point("inc");
graph.set_finish_point("inc");
let mut app = graph.compile(None)?; // None = no checkpointer
let out = app.invoke(Counter::default(), Config::default()).await?;
// out.n == 1
```

3. Conditional routing (optional)

Use `pregel::BranchResult`: `single("node_id")`, `end()`, or more advanced variants.

```rust
use rust_langgraph::pregel::BranchResult;

// The branch closure shape is illustrative; it inspects state and
// picks the next node or ends the run.
graph.add_conditional_edges("inc", |state: &Counter| {
    if state.n >= 3 { BranchResult::end() } else { BranchResult::single("inc") }
});
```
Core concepts
Mental model
- You declare nodes by name and pass a closure or a type implementing `Node<S>`.
- You connect nodes with `add_edge` or `add_conditional_edges`.
- You set `set_entry_point` (where execution starts) and usually `set_finish_point` (terminal nodes).
- `compile` produces a `CompiledGraph` you call `invoke` or `stream` on.
State
- `State` — your domain data; `merge` defines reducer semantics when multiple writes occur.
- `MessagesState` — built-in chat history for LLM flows (`messages: Vec<Message>`).
- `Message`, `ToolCall` — roles `user`, `assistant`, `system`, `tool`; tool calls and tool results.
Graph types
| Type | Role |
|---|---|
| `StateGraph<S>` | Builder: `add_node`, `add_edge`, `add_conditional_edges`, `compile` |
| `CompiledGraph<S>` | Runnable: `invoke`, `stream`, checkpoint helpers when configured |
Checkpointing
Pass a BaseCheckpointSaver (e.g. MemorySaver with feature memory-checkpoint) into compile(Some(checkpointer)). Use Config::with_thread_id so each conversation/thread has isolated checkpoints.
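A minimal sketch of that wiring, assuming the constructor shapes implied above (`MemorySaver::new()` and the `Config::with_thread_id` call are assumptions — check the rustdoc for exact signatures):

```rust
use rust_langgraph::prelude::*;

// Compile with an in-memory checkpointer (default `memory-checkpoint` feature).
let mut app = graph.compile(Some(MemorySaver::new()))?;

// Each thread id gets isolated checkpoints, so separate
// conversations never see each other's saved state.
let config = Config::with_thread_id("thread-42");
let out = app.invoke(MessagesState::default(), config).await?;
```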
Streaming
Use `CompiledGraph::stream` with `StreamMode` and handle `StreamEvent` variants. Add `futures` to your app for `StreamExt`:

```rust
use futures::StreamExt;
use rust_langgraph::prelude::*;

// let mut app: CompiledGraph<MyState> = ...;
// Argument order is illustrative — see the rustdoc for `stream`.
let mut stream = app
    .stream(initial_state, Config::default(), StreamMode::Values)
    .await?;
while let Some(event) = stream.next().await {
    // match on StreamEvent variants here
}
```
For token-level LLM streams, call ChatModel::stream on OllamaAdapter / OpenAIAdapter / OpenRouterAdapter / AnthropicAdapter (with the matching feature enabled).
Graph API reference
| Method | Purpose |
|---|---|
| `StateGraph::new()` | Empty graph |
| `add_node(name, node)` | Register a node (`impl Node<S>` or closure) |
| `add_edge(from, to)` | Always go from → to |
| `add_conditional_edges(from, branch)` | `branch` returns `BranchResult` (next node(s) or end) |
| `set_entry_point(name)` | First node(s) to run |
| `set_finish_point(name)` | Mark terminal nodes |
| `compile(checkpointer)` | Build `CompiledGraph` |
Node closure shape:

```rust
|state: S, config: &Config| async move { /* ... */ }
```

Use an explicit `config: &Config` parameter (not `_`) if the compiler complains about lifetimes in complex graphs.
LLMs and agents
Enable features: ollama, openai, openrouter, anthropic, and often prebuilt for agents.
Direct chat (no graph)
Local (Ollama):

```rust
use rust_langgraph::llm::ollama::OllamaAdapter;
use rust_langgraph::llm::ChatModel;
use rust_langgraph::prelude::Message;

// Model name and constructor arguments are illustrative;
// check the rustdoc for the exact signatures.
let model = OllamaAdapter::new("llama3.2");
let reply = model.invoke(vec![Message::user("Hello!")]).await?;
```

OpenRouter — set `OPENROUTER_API_KEY` and use a router model id (e.g. `openai/gpt-4o-mini`):

```rust
use rust_langgraph::llm::openrouter::OpenRouterAdapter;
use rust_langgraph::llm::ChatModel;
use rust_langgraph::prelude::Message;

// Key and model id are placeholders; `with_api_key` bypasses the env var.
let model = OpenRouterAdapter::with_api_key("sk-or-...", "openai/gpt-4o-mini");
let reply = model.invoke(vec![Message::user("Hello!")]).await?;
```
ReAct agent (graph with agent ↔ tools loop)
- Define `Tool` instances with `Tool::new(...).with_schema(json_schema)`.
- Bind the same tools to the model (e.g. `OllamaAdapter::with_tools(vec![t.to_tool_info(), ...])` or `OpenAIAdapter::bind_tools` / `OpenRouterAdapter::bind_tools`).
- Call `create_react_agent(model, tools)` to get a `CompiledGraph<MessagesState>` (requires `prebuilt`).
- `invoke(MessagesState { messages: vec![Message::user("...")] }, Config::default())`.
See examples/06_react_agent_ollama.rs for a full runnable flow.
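Put together, the steps above can be sketched as follows. Tool name, closure shape, schema, and constructor arguments are assumptions — see the example file for the real flow:

```rust
use rust_langgraph::llm::ollama::OllamaAdapter;
use rust_langgraph::prelude::*;

// 1. A tool; name, closure shape, and schema are illustrative.
let add = Tool::new("add", |args: serde_json::Value| async move {
    // Execute the tool; return shape per the `Tool` docs.
    Ok(format!("{}", args["a"].as_f64().unwrap_or(0.0) + args["b"].as_f64().unwrap_or(0.0)))
})
.with_schema(serde_json::json!({
    "type": "object",
    "properties": { "a": { "type": "number" }, "b": { "type": "number" } }
}));

// 2. Bind the same tool metadata to the model (feature `ollama`).
let model = OllamaAdapter::new("llama3.2").with_tools(vec![add.to_tool_info()]);

// 3. Build the agent ↔ tools loop (feature `prebuilt`).
let mut agent = create_react_agent(model, vec![add])?;

// 4. Run it on a chat state.
let out = agent
    .invoke(
        MessagesState { messages: vec![Message::user("What is 2 + 3?")] },
        Config::default(),
    )
    .await?;
```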
Validation
prebuilt::validate_chat_history checks that every assistant tool_calls entry has a matching tool message (aligned with common LangGraph-style rules).
Feature flags (detailed)
```toml
rust-langgraph = { version = "0.1", features = ["ollama", "prebuilt", "openai", "openrouter"] }
```
| Feature | Enables | Pulls in (transitively) |
|---|---|---|
| `memory-checkpoint` | Default. In-memory `MemorySaver` | (no extra crates beyond core) |
| `sqlite` | SQLite checkpoint backend | sqlx + SQLite |
| `postgres` | PostgreSQL checkpoint backend | sqlx + Postgres |
| `openai` | `llm::openai::OpenAIAdapter` | reqwest, async-openai |
| `openrouter` | `llm::openrouter::OpenRouterAdapter` | reqwest, async-openai |
| `anthropic` | `llm::anthropic::AnthropicAdapter` | reqwest |
| `ollama` | `llm::ollama::OllamaAdapter` | reqwest |
| `prebuilt` | `create_react_agent`, `Tool`, `ToolNode`, `validate_chat_history` | (no extra deps) |
Import ↔ feature gate:
| You import | Required feature |
|---|---|
| `rust_langgraph::llm::ollama::*` | `ollama` |
| `rust_langgraph::llm::openai::*` | `openai` |
| `rust_langgraph::llm::openrouter::*` | `openrouter` |
| `rust_langgraph::llm::anthropic::*` | `anthropic` |
| `rust_langgraph::prelude::ChatModel` | one of `ollama`, `openai`, `openrouter`, `anthropic` |
| `rust_langgraph::prelude::{create_react_agent, Tool, ToolNode}` | `prebuilt` |
| `rust_langgraph::prelude::MemorySaver` | `memory-checkpoint` (default) |
Prelude and conditional exports
```rust
use rust_langgraph::prelude::*;
```
Always available (with default features): Config, Error, Result, State, MessagesState, Message, add_messages, Node, StateGraph, CompiledGraph, Checkpoint, BaseCheckpointSaver, StreamMode, StreamEvent, Send, Command, and MemorySaver if memory-checkpoint is on.
If prebuilt: create_react_agent, Tool, ToolNode.
If any LLM feature (openai | openrouter | anthropic | ollama): ChatModel in the prelude.
Otherwise import traits explicitly, and note that even an explicit `use rust_langgraph::llm::ChatModel` only compiles when an LLM feature is enabled.
Common mistakes & compile errors
| Symptom | Cause | Fix |
|---|---|---|
| `could not find llm::ollama` | Feature off | Add `features = ["ollama"]` (or the adapter you need) |
| `ChatModel` not found in prelude | No LLM feature | Enable `ollama`, `openai`, `openrouter`, or `anthropic` |
| `create_react_agent` not found | Feature off | Add `features = ["prebuilt"]` |
| Wrong crate in `use` | Confusion with Python | Use `rust_langgraph`, not `langgraph` |
| Lifetime errors in conditional edges | Capturing `&state` into `async move` | Clone needed fields before the async block (see AGENTS.md) |
| `invoke` borrow errors | Missing `mut` | `let mut app = graph.compile(...)?` |
| Example fails to link | Wrong features | Use the `--features` from the examples table |
Verification commands

From the crate root (rust-langgraph/), e.g.:

```shell
cargo build
cargo test
```

Integration tests (need a real Ollama server; marked `#[ignore]`):

```shell
cargo test -- --ignored
```

Run a single example:

```shell
cargo run --example ollama_chat --features ollama
```
Project layout
```text
src/
  lib.rs               # Crate root, prelude
  graph/               # StateGraph, CompiledGraph
  pregel/              # Execution engine, Branch, BranchResult
  state.rs             # State, Message, MessagesState
  nodes.rs             # Node trait
  checkpoint/          # Checkpoint types & saver trait
  checkpoint_backends/
  llm/                 # ChatModel, Ollama / OpenAI / OpenRouter / Anthropic (feature-gated)
  prebuilt/            # ReAct agent, tools (feature-gated)
examples/              # Runnable examples (see table below)
tests/                 # Integration tests (e.g. Ollama, --ignored)
AGENTS.md              # Short agent/contributor cheat sheet
```
API reference: docs.rs/rust-langgraph or cargo doc --open.
Examples
| Example | Command | Features |
|---|---|---|
| Minimal graph | `cargo run --example simple_graph` | default |
| Branching | `cargo run --example conditional_edges` | default |
| Checkpoints | `cargo run --example checkpointing` | default |
| Streaming | `cargo run --example streaming` | default |
| Ollama chat | `cargo run --example ollama_chat` | `ollama` |
| ReAct + Ollama | `cargo run --example react_agent_ollama` | `ollama`, `prebuilt` |
| OpenRouter chat | `cargo run --example openrouter_chat` | `openrouter` |
| Custom state | `cargo run --example custom_state` | default |
```powershell
# OpenRouter (Windows PowerShell) — key value is a placeholder
$env:OPENROUTER_API_KEY = "sk-or-..."
cargo run --example openrouter_chat --features openrouter
```

```shell
# OpenRouter (Unix) — key value is a placeholder
export OPENROUTER_API_KEY="sk-or-..."
cargo run --example openrouter_chat --features openrouter
```
Documentation
- README (this file) — install, env vars, features, recipes, troubleshooting.
- AGENTS.md — condensed rules for contributors and AI coding agents (naming, signatures, pitfalls).
- Rustdoc — `cargo doc -p rust-langgraph --no-deps --open`.
When publishing a fork, update the repository URL in Cargo.toml to your Git host.
Comparison with Python LangGraph
This crate targets similar ideas (state graph, checkpoints, agents) but is a separate implementation. APIs and wire formats are aligned where practical; behavior may differ in edge cases. For the official Python stack, use LangChain’s LangGraph.
| Area | Rust LangGraph | Python LangGraph |
|---|---|---|
| Language | Rust | Python |
| Package | `rust-langgraph` / `rust_langgraph` | `langgraph` |
| Official? | Community | LangChain |
Contributing
Issues and PRs are welcome. Please keep changes focused and match existing style.
License
MIT — see LICENSE.
Acknowledgments
- Inspired by LangGraph (LangChain).
- Execution model influenced by Google’s Pregel.