# neuron

Rust library for production AI agents. Add one dependency, enable the features you need, and compose providers, tools, context strategies, and a runtime into agents that work.
## Why neuron?
Most AI agent libraries are Python-first, framework-shaped, and opinionated. neuron is none of those.
- Rust-native — no Python interop, no runtime overhead
- Composable — use one crate or all of them, no buy-in required
- Model-agnostic — Anthropic, OpenAI, Ollama, or bring your own
- Context-aware — sliding window, compaction, and token counting built in
- MCP-native — first-class Model Context Protocol support
- No magic — it's a while loop with tools attached, not a framework
## High-Level Features
- Multi-provider LLM support — Anthropic Claude, OpenAI GPT, Ollama local models, or implement the `Provider` trait for your own
- Composable tool middleware — axum-style middleware pipeline for tool calls: logging, auth, rate limiting, retries
- Context compaction — sliding window, tool result clearing, LLM summarization, and composite strategies to keep conversations within token limits
- Model Context Protocol — full MCP client and server, stdio and HTTP transports, automatic tool bridging
- Input/output guardrails — safety checks that run before input reaches the LLM or before output reaches the user, with tripwire semantics
- Sessions and sub-agents — persist conversations, spawn isolated sub-agents with filtered tool sets and depth guards
- Durable execution — wrap side effects for crash recovery via Temporal, Restate, or Inngest
- Streaming — real-time token streaming with hook integration across all providers
## Installation
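A typical dependency entry enables only the features you need (see the feature flags table below). The version number here is illustrative; check crates.io for the current release:

```toml
[dependencies]
# Root crate; pick the provider and runtime features you actually use.
neuron = { version = "0.1", features = ["anthropic", "runtime"] }
```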
## Quick Start

A minimal sketch: exact constructor and builder signatures are in the crate docs.

```rust
use neuron::prelude::*;
use neuron::anthropic::Anthropic;

#[tokio::main]
async fn main() {
    let provider = Anthropic::new(
        std::env::var("ANTHROPIC_API_KEY").expect("set ANTHROPIC_API_KEY"),
    );
    // Compose the provider, a ToolRegistry, and a context strategy into an
    // AgentLoop, then run a prompt against it.
}
```
## Feature Flags

| Feature | Enables | Default |
|---|---|---|
| `anthropic` | `neuron::anthropic` (Anthropic Claude) | yes |
| `openai` | `neuron::openai` (OpenAI GPT) | no |
| `ollama` | `neuron::ollama` (Ollama local) | no |
| `mcp` | `neuron::mcp` (Model Context Protocol) | no |
| `runtime` | `neuron::runtime` (sessions, guardrails) | no |
| `full` | All of the above | no |
## Module Map

| Module | Underlying Crate | Contents |
|---|---|---|
| `neuron::types` | `neuron-types` | Messages, traits, errors, streaming |
| `neuron::tool` | `neuron-tool` | `ToolRegistry`, middleware pipeline |
| `neuron::context` | `neuron-context` | Token counting, compaction strategies |
| `neuron::r#loop` | `neuron-loop` | `AgentLoop`, `LoopConfig`, `AgentResult` |
| `neuron::anthropic` | `neuron-provider-anthropic` | Anthropic client (feature-gated) |
| `neuron::openai` | `neuron-provider-openai` | OpenAI client (feature-gated) |
| `neuron::ollama` | `neuron-provider-ollama` | Ollama client (feature-gated) |
| `neuron::mcp` | `neuron-mcp` | `McpClient`, `McpToolBridge` (feature-gated) |
| `neuron::runtime` | `neuron-runtime` | Sessions, guardrails (feature-gated) |
Note: `loop` is a Rust keyword, so the loop module is accessed as `neuron::r#loop`. In practice, import types directly from the prelude or from `neuron_loop`.
## Ecosystem

Each block is also available as a standalone crate:

| Crate | Description |
|---|---|
| `neuron-types` | Core traits — `Provider`, `Tool`, `ContextStrategy` |
| `neuron-tool` | Tool registry with composable middleware |
| `neuron-tool-macros` | `#[neuron_tool]` derive macro |
| `neuron-context` | Token counting and compaction strategies |
| `neuron-loop` | Agentic while loop |
| `neuron-provider-anthropic` | Anthropic Claude (Messages API, streaming, prompt caching) |
| `neuron-provider-openai` | OpenAI GPT (Chat Completions API, streaming) |
| `neuron-provider-ollama` | Ollama (local NDJSON streaming) |
| `neuron-mcp` | MCP client + server |
| `neuron-runtime` | Sessions, sub-agents, guardrails, durability |
## Comparison

How neuron compares to the two most established Rust alternatives:

| Capability | neuron | Rig | genai |
|---|---|---|---|
| Crate independence | One crate per provider | All providers in `rig-core` | Single crate |
| LLM providers | 3 | 18+ | 16 |
| Tool middleware | Composable chain | None | None |
| Context compaction | 4 strategies, token-aware | None | None |
| MCP (full spec) | Client + server + bridge | Client (`rmcp`) | None |
| Durable execution | `DurableContext` trait | None | None |
| Guardrails / sandbox | `InputGuardrail`, `OutputGuardrail`, `PermissionPolicy`, `Sandbox` | None | None |
| Sessions | `SessionStorage` trait + impls | None | None |
| Vector stores / RAG | None | 13 integrations | None |
| Embeddings | None | `EmbeddingModel` trait | Yes |
Where others lead today: Rig ships 18+ providers, 13 vector store integrations, and 80+ examples. genai covers 16 providers in one ergonomic crate. neuron ships 3 providers and zero vector stores — the architecture is ahead, the ecosystem is behind.
## Prelude Contents

The `neuron::prelude` module re-exports the most commonly used types:

- `CompletionRequest`, `CompletionResponse`, `Message`, `Role`, `ContentBlock`, `ContentItem`, `SystemPrompt`, `TokenUsage`, `StopReason` — conversation primitives
- `Provider` — the LLM provider trait
- `Tool`, `ToolDyn`, `ToolDefinition`, `ToolContext`, `ToolOutput`, `ToolError` — tool system types
- `ToolRegistry` — tool registration and dispatch
- `SlidingWindowStrategy` — context compaction
- `AgentLoop`, `AgentLoopBuilder`, `AgentResult`, `LoopConfig` — the agentic loop
## Learning Path

Run examples in this order to learn neuron incrementally:

1. `neuron-provider-anthropic/examples/basic.rs` — single completion
2. `neuron-provider-anthropic/examples/streaming.rs` — real-time token streaming
3. `neuron-tool/examples/custom_tool.rs` — define and register tools
4. `neuron-tool/examples/middleware.rs` — composable tool middleware
5. `neuron-loop/examples/agent_loop.rs` — multi-turn agent with tools (no API key)
6. `neuron-loop/examples/multi_turn.rs` — conversation accumulation (no API key)
7. `neuron-context/examples/compaction.rs` — token counting and compaction
8. `neuron/examples/full_agent.rs` — end-to-end production agent
9. `neuron/examples/structured_output.rs` — JSON Schema output
10. `neuron/examples/multi_provider.rs` — swap providers at runtime
11. `neuron-runtime/examples/guardrails.rs` — input/output safety checks
12. `neuron-runtime/examples/sub_agents.rs` — spawn isolated sub-agents
13. `neuron-runtime/examples/sessions.rs` — conversation persistence
14. `neuron-mcp/examples/mcp_client.rs` — MCP server integration
## Part of neuron

This is the root crate of neuron. For maximum independence, depend on individual block crates (`neuron-types`, `neuron-provider-anthropic`, etc.) directly.
## License
Licensed under either of Apache License, Version 2.0 or MIT License at your option.