# Synaptic

A Rust agent framework with LangChain-compatible architecture and Rust-native async interfaces.
## Features

- LLM Adapters — OpenAI, Anthropic, Gemini, Ollama with streaming, retry, rate limiting
- LCEL Composition — `Runnable` trait with pipe operator (`|`), streaming, bind, parallel, branch, fallbacks
- Graph Orchestration — LangGraph-style `StateGraph` with conditional edges, checkpointing, human-in-the-loop, streaming
- ReAct Agent — `create_react_agent(model, tools)` with automatic tool dispatch
- Tool System — `Tool` trait, `ToolRegistry`, `SerialToolExecutor`, `tool_choice` control
- Memory — Buffer, Window, Summary, SummaryBuffer, TokenBuffer strategies
- Prompt Templates — Chat templates, few-shot prompting, placeholder interpolation
- Output Parsers — String, JSON, Structured, List, Enum — all composable as `Runnable`
- RAG Pipeline — Document loaders, text splitters, embeddings, vector stores, 7 retriever types
- Caching — In-memory, semantic (embedding similarity), `CachedChatModel` wrapper
- Evaluation — ExactMatch, JsonValidity, Regex, EmbeddingDistance, LLMJudge evaluators
- Structured Output — `StructuredOutputChatModel<T>` with JSON schema enforcement
- Observability — `TracingCallback` (structured spans), `CompositeCallback` (multi-handler)
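The tool system listed above can be sketched in miniature. The following is a simplified, synchronous stand-in, not the crate's actual API (the real `Tool` trait is async via `#[async_trait]`, per the design principles below); it only illustrates name-based dispatch through a registry:

```rust
use std::collections::HashMap;

// Hypothetical simplified tool system: a tool has a name and a call method,
// and a registry dispatches invocations by name.
trait Tool {
    fn name(&self) -> &str;
    fn call(&self, input: &str) -> String;
}

// A trivial tool that echoes its input back.
struct Echo;
impl Tool for Echo {
    fn name(&self) -> &str { "echo" }
    fn call(&self, input: &str) -> String { input.to_string() }
}

struct ToolRegistry {
    tools: HashMap<String, Box<dyn Tool>>,
}

impl ToolRegistry {
    fn new() -> Self {
        Self { tools: HashMap::new() }
    }
    fn register(&mut self, tool: Box<dyn Tool>) {
        self.tools.insert(tool.name().to_string(), tool);
    }
    // Returns None for an unknown tool name instead of panicking.
    fn dispatch(&self, name: &str, input: &str) -> Option<String> {
        self.tools.get(name).map(|t| t.call(input))
    }
}

fn main() {
    let mut registry = ToolRegistry::new();
    registry.register(Box::new(Echo));
    assert_eq!(registry.dispatch("echo", "ping"), Some("ping".to_string()));
    assert_eq!(registry.dispatch("missing", "x"), None);
}
```

An agent loop built on such a registry can route each model-requested tool call to `dispatch` and feed the result back as a message.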
## Installation

Add `synaptic` to your `Cargo.toml`:

```toml
[dependencies]
synaptic = "0.1"
```

By default, all features are enabled. You can select specific features:

```toml
[dependencies]
synaptic = { version = "0.1", default-features = false, features = ["models", "runnables", "graph"] }
```

Available features: `models`, `runnables`, `prompts`, `parsers`, `tools`, `memory`, `callbacks`, `graph`, `retrieval`, `loaders`, `splitters`, `embeddings`, `vectorstores`, `cache`, `eval`.
## Quick Start

```rust
// Import paths assume the `synaptic` facade's re-exports.
use synaptic::graph::{create_react_agent, MessageState};
use synaptic::runnables::Runnable;

// LCEL pipe composition
let chain = step1.boxed() | step2.boxed() | step3.boxed();
let result = chain.invoke(input).await?;

// ReAct agent
let graph = create_react_agent(model, tools)?;
let state = MessageState::default(); // seed with initial messages as needed
let result = graph.invoke(state).await?;
```
## Workspace Layout

17 library crates in `crates/`, 13 examples in `examples/`:
| Crate | Description |
|---|---|
| `synaptic-core` | Shared traits and types: `ChatModel`, `Message`, `ToolChoice`, `SynapticError` |
| `synaptic-models` | LLM provider adapters + retry/rate-limit/structured-output wrappers |
| `synaptic-runnables` | LCEL: `Runnable`, `BoxRunnable`, pipe, Lambda, Parallel, Branch, Assign, Pick, Fallbacks |
| `synaptic-prompts` | `ChatPromptTemplate`, `FewShotChatMessagePromptTemplate` |
| `synaptic-parsers` | Str, Json, Structured, List, Enum output parsers |
| `synaptic-tools` | `ToolRegistry`, `SerialToolExecutor` |
| `synaptic-memory` | Buffer, Window, Summary, SummaryBuffer, TokenBuffer, `RunnableWithMessageHistory` |
| `synaptic-callbacks` | `RecordingCallback`, `TracingCallback`, `CompositeCallback` |
| `synaptic-retrieval` | BM25, MultiQuery, Ensemble, Compression, SelfQuery, ParentDocument retrievers |
| `synaptic-loaders` | Text, JSON, CSV, Directory document loaders |
| `synaptic-splitters` | Character, Recursive, Markdown, Token text splitters |
| `synaptic-embeddings` | `Embeddings` trait, Fake, OpenAI, Ollama providers |
| `synaptic-vectorstores` | `VectorStore` trait, InMemory (cosine), `VectorStoreRetriever` |
| `synaptic-graph` | `StateGraph`, `CompiledGraph` (with stream), `ToolNode`, `create_react_agent` |
| `synaptic-cache` | InMemory, Semantic caches, `CachedChatModel` |
| `synaptic-eval` | `Evaluator` trait, 5 evaluators, Dataset, batch `evaluate()` |
| `synaptic` | Unified facade re-exporting all crates |
## Examples

All examples use `ScriptedChatModel` and `FakeEmbeddings` — no API keys required.
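The scripted-model pattern behind key-free examples can be sketched as follows. This is a simplified stand-in, not the crate's actual `ScriptedChatModel` API (which is async and implements `ChatModel`); it shows the core idea of returning canned replies in order so tests are deterministic:

```rust
use std::collections::VecDeque;

// Hypothetical simplified scripted model: each call pops the next canned
// reply, so a chain or agent can be exercised without any network access.
struct ScriptedModel {
    replies: VecDeque<String>,
}

impl ScriptedModel {
    fn new(replies: impl IntoIterator<Item = &'static str>) -> Self {
        Self {
            replies: replies.into_iter().map(String::from).collect(),
        }
    }
    // The prompt is ignored; replies come back strictly in script order.
    fn chat(&mut self, _prompt: &str) -> String {
        self.replies
            .pop_front()
            .unwrap_or_else(|| "<script exhausted>".to_string())
    }
}

fn main() {
    let mut model = ScriptedModel::new(["Hello!", "Goodbye!"]);
    assert_eq!(model.chat("hi"), "Hello!");
    assert_eq!(model.chat("bye"), "Goodbye!");
    assert_eq!(model.chat("again"), "<script exhausted>");
}
```

Because the replies are fixed, assertions on downstream behavior (parser output, tool dispatch, graph routing) stay stable across runs.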
## Documentation

- Book: [dnw3.github.io/synaptic](https://dnw3.github.io/synaptic) — tutorials, how-to guides, concepts
- API Reference: [docs.rs/synaptic](https://docs.rs/synaptic) — full Rustdoc API documentation
## Design Principles

- Core abstractions first, feature crates expanded incrementally
- LangChain concept compatibility with Rust-idiomatic APIs
- All traits are async via `#[async_trait]`; the runtime is tokio
- Type-erased composition via `BoxRunnable` with the `|` pipe operator
- `Arc<RwLock<_>>` for shared registries, session-keyed memory isolation
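The type-erased `|` composition principle can be modeled with a short, self-contained sketch. This is not the crate's implementation (the real `Runnable` is async and supports streaming); it is a synchronous toy showing how overloading `BitOr` on a boxed function type yields pipe syntax:

```rust
use std::ops::BitOr;

// Hypothetical simplified runnable: a boxed String -> String function.
struct BoxRunnable(Box<dyn Fn(String) -> String>);

impl BoxRunnable {
    fn new(f: impl Fn(String) -> String + 'static) -> Self {
        BoxRunnable(Box::new(f))
    }
    fn invoke(&self, input: String) -> String {
        (self.0)(input)
    }
}

// `a | b` builds a new runnable that feeds a's output into b.
impl BitOr for BoxRunnable {
    type Output = BoxRunnable;
    fn bitor(self, rhs: BoxRunnable) -> BoxRunnable {
        BoxRunnable::new(move |input| rhs.invoke(self.invoke(input)))
    }
}

fn main() {
    let upper = BoxRunnable::new(|s: String| s.to_uppercase());
    let exclaim = BoxRunnable::new(|s: String| format!("{s}!"));
    let chain = upper | exclaim;
    assert_eq!(chain.invoke("hello".to_string()), "HELLO!");
}
```

Because every step is erased to the same boxed type, arbitrarily long chains compose without the nested generics that direct function composition would accumulate.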
## Contributing

See CONTRIBUTING.md for guidelines, or the full guide.
## License

MIT — see LICENSE for details.