# Synaptic
A Rust agent framework with LangChain-compatible architecture and Rust-native async interfaces.
## Features
- LLM Providers — OpenAI, Anthropic, Gemini, Ollama, AWS Bedrock, Groq, Mistral AI, DeepSeek, and any OpenAI-compatible API (xAI, Together, Fireworks, OpenRouter…)
- LCEL Composition — `Runnable` trait with pipe operator (`|`), streaming, bind, parallel, branch, assign/pick, fallbacks
- Graph Orchestration — LangGraph-style `StateGraph` with conditional edges, persistent checkpointing (Redis + PostgreSQL), human-in-the-loop, streaming
- ReAct Agent — `create_react_agent(model, tools)` with automatic tool dispatch
- Tool System — `Tool` trait, `ToolRegistry`, `SerialToolExecutor`, `ParallelToolExecutor`, built-in tools (Tavily, DuckDuckGo, Wikipedia, SQL Toolkit)
- Memory — Buffer, Window, Summary, SummaryBuffer, TokenBuffer strategies with `RunnableWithMessageHistory`
- Prompt Templates — Chat templates, few-shot prompting, placeholder interpolation
- Output Parsers — String, JSON, `Structured<T>`, List, Boolean, Enum, XML — all composable as `Runnable`
- RAG Pipeline — Document loaders (Text, JSON, CSV, Markdown, Directory, Web, PDF), text splitters, embeddings (OpenAI, Ollama, Cohere, HuggingFace), vector stores (InMemory, Qdrant, pgvector, Pinecone, Chroma, MongoDB, Elasticsearch, Weaviate), 7 retriever types
- Caching — In-memory (TTL), semantic (embedding similarity), Redis, SQLite, `CachedChatModel` wrapper
- Evaluation — ExactMatch, JsonValidity, Regex, EmbeddingDistance, LLMJudge evaluators
- Structured Output — `StructuredOutputChatModel<T>` with JSON schema enforcement
- Observability — `TracingCallback` (structured spans), `CompositeCallback`, `StdOutCallback`
- MCP — `MultiServerMcpClient` with Stdio/SSE/HTTP transport adapters
- Macros — `#[tool]`, `#[chain]`, `#[entrypoint]`, `#[task]`, `#[traceable]` proc-macros
- Deep Agent — Filesystem-aware deep research agent harness (`create_deep_agent`)
- Middleware — `AgentMiddleware` trait: Retry, PII redaction, Prompt Caching, Summarization
## Installation

```toml
[dependencies]
synaptic = { version = "0.2", features = ["agent"] }
```
### Feature flags

| Feature | Contents |
|---|---|
| `default` | runnables + prompts + parsers + tools + callbacks |
| `openai` | OpenAI ChatModel + Embeddings |
| `anthropic` | Anthropic Claude ChatModel |
| `gemini` | Google Gemini ChatModel |
| `ollama` | Ollama ChatModel + Embeddings |
| `bedrock` | AWS Bedrock ChatModel |
| `groq` | Groq ChatModel (ultra-fast LPU inference) |
| `mistral` | Mistral AI ChatModel + Embeddings |
| `deepseek` | DeepSeek ChatModel (R1 reasoning + V3) |
| `models` | All chat model providers |
| `qdrant` | Qdrant vector store |
| `pgvector` | PostgreSQL + pgvector store + graph checkpointer |
| `redis` | Redis store + LLM cache + graph checkpointer |
| `weaviate` | Weaviate vector store |
| `pinecone` | Pinecone vector store |
| `chroma` | Chroma vector store |
| `mongodb` | MongoDB Atlas vector search |
| `elasticsearch` | Elasticsearch vector store |
| `sqlite` | SQLite LLM cache |
| `huggingface` | HuggingFace Inference API embeddings |
| `cohere` | Cohere reranker + embeddings |
| `tavily` | Tavily search tool |
| `sqltoolkit` | SQL database toolkit (ListTables, DescribeTable, ExecuteQuery) |
| `pdf` | PDF document loader |
| `graph` | LangGraph-style StateGraph |
| `memory` | Conversation memory strategies |
| `retrieval` | Retriever types (BM25, Ensemble, etc.) |
| `cache` | LLM response caching |
| `eval` | Evaluators |
| `mcp` | MCP server client |
| `macros` | Proc-macros |
| `deep` | Deep Agent harness |
| `agent` | default + openai + graph + memory + middleware + store |
| `rag` | default + openai + embeddings + retrieval + loaders + splitters + vectorstores |
| `full` | Everything |
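Feature bundles compose with individual flags. As a sketch, a manifest that pulls in the `agent` and `rag` bundles plus Qdrant and Tavily support (this particular combination is chosen for illustration) would look like:

```toml
[dependencies]
# Illustrative: any mix of the flags from the table above is declared the same way.
synaptic = { version = "0.2", features = ["agent", "rag", "qdrant", "tavily"] }
```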
## Quick Start

```rust
// Assumed wildcard import for brevity: the unified `synaptic` facade crate
// re-exports the feature-gated items used below.
use synaptic::*;

// LCEL pipe composition (`step1`..`step3`, `input`, `model`, and `tools`
// are constructed elsewhere)
let chain = step1.boxed() | step2.boxed() | step3.boxed();
let result = chain.invoke(input).await?;

// ReAct agent
let graph = create_react_agent(model, tools)?;
let state = MessageState { /* initial messages */ };
let result = graph.invoke(state).await?;
```
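The calls above are async and fallible, so they need a runtime and an error type. A minimal scaffold, assuming tokio and boxed errors (construction of the model, tools, input, and step runnables is omitted), looks like:

```rust
// Hypothetical harness around the Quick Start snippet: provides the async
// context for `.await` and an error type that `?` can convert into.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // build `model`, `tools`, `input`, and the step runnables here,
    // then run the chain / agent exactly as shown above
    Ok(())
}
```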
## Workspace Layout

45+ library crates in `crates/`, examples in `examples/`:
### Core

| Crate | Description |
|---|---|
| `synaptic-core` | Shared traits and types: ChatModel, Message, ToolChoice, SynapticError |
| `synaptic-models` | ProviderBackend + HttpBackend + FakeBackend, wrappers (Retry, RateLimit, StructuredOutput) |
| `synaptic-runnables` | LCEL: Runnable, BoxRunnable, pipe, Lambda, Parallel, Branch, Assign, Pick, Fallbacks |
| `synaptic-prompts` | ChatPromptTemplate, FewShotChatMessagePromptTemplate |
| `synaptic-parsers` | Str, JSON, Structured, List, Boolean, Enum, XML output parsers |
| `synaptic-tools` | ToolRegistry, SerialToolExecutor, ParallelToolExecutor, DuckDuckGo, Wikipedia |
| `synaptic-memory` | Buffer, Window, Summary, SummaryBuffer, TokenBuffer, RunnableWithMessageHistory |
| `synaptic-callbacks` | RecordingCallback, TracingCallback, CompositeCallback |
| `synaptic-graph` | StateGraph, CompiledGraph, ToolNode, create_react_agent, MemorySaver |
| `synaptic-retrieval` | BM25, MultiQuery, Ensemble, Compression, SelfQuery, ParentDocument retrievers |
| `synaptic-loaders` | Text, JSON, CSV, Markdown, Directory, Web, PDF document loaders |
| `synaptic-splitters` | Character, Recursive, Markdown, Token, HTML text splitters |
| `synaptic-embeddings` | Embeddings trait, FakeEmbeddings, CacheBackedEmbeddings |
| `synaptic-vectorstores` | VectorStore trait, InMemoryVectorStore, VectorStoreRetriever |
| `synaptic-cache` | InMemory + Semantic LLM caches, CachedChatModel |
| `synaptic-eval` | Evaluator trait, 5 evaluators, Dataset, batch evaluate() |
| `synaptic-store` | InMemoryStore with semantic search |
| `synaptic-middleware` | AgentMiddleware trait, PII, Retry, Prompt Caching, Summarization |
| `synaptic-mcp` | MultiServerMcpClient, Stdio/SSE/HTTP transports |
| `synaptic-macros` | Proc-macros: #[tool], #[chain], #[entrypoint], #[task], #[traceable] |
| `synaptic-deep` | Deep Agent harness with filesystem tools + create_deep_agent() |
| `synaptic` | Unified facade with feature-gated re-exports |
### Chat Model Providers

| Crate | Provider |
|---|---|
| `synaptic-openai` | OpenAI (GPT-4o, o1, o3…) + Azure OpenAI + 9 OpenAI-compatible APIs |
| `synaptic-anthropic` | Anthropic (Claude 4.6, Claude Haiku…) |
| `synaptic-gemini` | Google Gemini (1.5 Pro, 2.0 Flash…) |
| `synaptic-ollama` | Ollama (local models) |
| `synaptic-bedrock` | AWS Bedrock (Titan, Claude, Llama via Bedrock) |
| `synaptic-groq` | Groq (Llama 3.3 70B, Mixtral 8x7B — fastest inference via LPU) |
| `synaptic-mistral` | Mistral AI (Mistral Large, Codestral, Mistral NeMo) |
| `synaptic-deepseek` | DeepSeek (V3 chat + R1 reasoning — ultra-low cost) |
### Embeddings

| Crate | Provider |
|---|---|
| `synaptic-openai` | OpenAI text-embedding-3-small/large |
| `synaptic-ollama` | Ollama local embedding models |
| `synaptic-cohere` | Cohere embed-english-v3.0, embed-multilingual-v3.0 |
| `synaptic-huggingface` | HuggingFace Inference API (BAAI/bge, sentence-transformers…) |
### Vector Stores

| Crate | Backend |
|---|---|
| `synaptic-vectorstores` | In-memory (cosine similarity) |
| `synaptic-qdrant` | Qdrant |
| `synaptic-pgvector` | PostgreSQL + pgvector |
| `synaptic-pinecone` | Pinecone |
| `synaptic-chroma` | Chroma |
| `synaptic-mongodb` | MongoDB Atlas Vector Search |
| `synaptic-elasticsearch` | Elasticsearch |
| `synaptic-weaviate` | Weaviate |
### Store, Cache & Graph Persistence

| Crate | Backend |
|---|---|
| `synaptic-redis` | Redis Store + LLM Cache + Graph Checkpointer |
| `synaptic-pgvector` | PostgreSQL + pgvector (also Graph Checkpointer) |
| `synaptic-sqlite` | SQLite LLM Cache |
### Tools

| Crate | Tools |
|---|---|
| `synaptic-tavily` | Tavily AI search (API key required) |
| `synaptic-tools` | DuckDuckGo search, Wikipedia (no API key required) |
| `synaptic-sqltoolkit` | ListTables, DescribeTable, ExecuteQuery (read-only SQL) |
## Examples

All examples use `ScriptedChatModel` and `FakeEmbeddings` — no API keys required.
## Documentation
- Book: dnw3.github.io/synaptic — tutorials, how-to guides, concepts, integration reference
- API Reference: docs.rs/synaptic — full Rustdoc API documentation
## Design Principles

- Core abstractions first, feature crates expanded incrementally
- LangChain concept compatibility with Rust-idiomatic APIs
- All traits are async via `#[async_trait]`, runtime is tokio
- Type-erased composition via `BoxRunnable` with `|` pipe operator
- `Arc<RwLock<_>>` for shared registries, session-keyed memory isolation
- MSRV: 1.88
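To make the async-trait principle concrete (an illustration of the pattern, not Synaptic's actual trait definitions), a trait declared with `#[async_trait]` and shared behind an `Arc` looks like this:

```rust
use std::sync::Arc;
use async_trait::async_trait;

// Illustration of the #[async_trait] + Arc pattern named above;
// `Greeter` and `Hello` are made-up types, not part of the framework.
#[async_trait]
trait Greeter: Send + Sync {
    async fn greet(&self, name: &str) -> String;
}

struct Hello;

#[async_trait]
impl Greeter for Hello {
    async fn greet(&self, name: &str) -> String {
        format!("hello, {name}")
    }
}

#[tokio::main]
async fn main() {
    // A shared, type-erased handle of the kind a registry might hold.
    let greeter: Arc<dyn Greeter> = Arc::new(Hello);
    println!("{}", greeter.greet("world").await);
}
```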
## Contributing
See CONTRIBUTING.md for guidelines, or the full guide.
## License
MIT — see LICENSE for details.