Synapse Core
A high-performance neuro-symbolic semantic engine designed for agentic AI.
Features • Installation • Usage • API Reference • Architecture
Overview
Synapse Core provides the foundational semantic memory layer for AI agents. It delivers the structured precision of Knowledge Graphs, built on Oxigraph with full RDF/SPARQL standards support, allowing agents to reason about data, maintain long-term context, and query knowledge using industry-standard semantic web technologies.
It is designed to work seamlessly with OpenClaw and other agentic frameworks via the Model Context Protocol (MCP) or as a standalone gRPC service.
Features
- RDF Triple Store: Built on Oxigraph for standards-compliant RDF storage and querying
- SPARQL Support: Full SPARQL 1.1 query language support for complex graph queries
- Multi-Namespace Architecture: Isolated knowledge bases for different contexts (work, personal, projects)
- Dual Protocol Support:
  - gRPC API for high-performance programmatic access
  - MCP Server for seamless LLM agent integration
- OWL Reasoning: Built-in support for OWL 2 RL reasoning via the `reasonable` crate
- Hybrid Search: Combines vector similarity with graph traversal (using a local HNSW index)
- HuggingFace API Integration: High-performance embeddings without local GPU/CPU heavy lifting
- High Performance: Written in Rust with async I/O and efficient HNSW indexing
- Persistent Storage: Automatic persistence with namespace-specific storage paths
- Granular Security: Token-based authorization for Read, Write, Delete, and Reason operations.
- Robust MCP: Strict JSON Schema validation for all Model Context Protocol tool calls.
Installation
As a Rust Library
Add to your Cargo.toml:
```toml
[dependencies]
# crate name assumed from the project name; check crates.io for the published name
synapse-core = "0.2.0"
```
As a Binary
Install the CLI tool:
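The exact command was lost in formatting; assuming the crate is published under the project's name, installation would look like:

```shell
# Install the CLI binary from crates.io (crate name assumed)
cargo install synapse-core
```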
For OpenClaw
One-click install as an MCP server:
Usage
1. Standalone gRPC Server
Run Synapse as a high-performance gRPC server:
```shell
# Start the server (default: localhost:50051); binary name assumed
synapse-core

# With a custom storage path
GRAPH_STORAGE_PATH=/path/to/data synapse-core
```
The gRPC server exposes nine RPC methods for semantic operations (see API Reference).
2. Model Context Protocol (MCP) Server
Run in MCP mode for integration with LLM agents:
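The invocation was lost in formatting; given the `--mcp` flag mentioned in the API reference, it is presumably:

```shell
# Run as an MCP server over stdio (binary name assumed)
synapse-core --mcp
```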
This exposes 3 MCP tools via JSON-RPC over stdio:
- query_graph - Retrieve all triples from a namespace
- ingest_triple - Add a new triple to the knowledge graph
- query_sparql - Execute SPARQL queries
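For illustration, a `tools/call` request for ingest_triple over stdio might look like the following (argument names are assumptions; see the tool's JSON Schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ingest_triple",
    "arguments": {
      "subject": "http://example.org/alice",
      "predicate": "http://xmlns.com/foaf/0.1/knows",
      "object": "http://example.org/bob",
      "namespace": "default"
    }
  }
}
```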
3. Rust Library Integration
Embed the engine directly into your application:
```rust
use synapse_core::MySemanticEngine; // crate path assumed
use synapse_core::proto::*;         // generated gRPC types; module path assumed
use tonic::Request;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let engine = MySemanticEngine::new().await?; // constructor assumed
    // ... issue requests via the methods shown below
    Ok(())
}
```
4. Hybrid Search
Retrieve entities matching both semantic similarity (vector) and structural relationship (graph):
```rust
use synapse_core::proto::HybridSearchRequest; // path assumed

// Field names are assumptions; see the proto definition
let request = Request::new(HybridSearchRequest { query_text: "semantic memory".into(), namespace: "default".into(), ..Default::default() });
let response = engine.hybrid_search(request).await?;
```
5. Automated Reasoning
Apply OWL-RL or RDFS reasoning to derive implicit knowledge:
```rust
use synapse_core::proto::ReasoningRequest; // path assumed

// Field names are assumptions; see the proto definition
let request = Request::new(ReasoningRequest { namespace: "default".into(), ..Default::default() });
let response = engine.apply_reasoning(request).await?;
println!("{:?}", response.into_inner());
```
6. SPARQL Queries
Query your knowledge graph using SPARQL:
```rust
use synapse_core::proto::SparqlRequest; // path assumed

let sparql_query = r#"
    SELECT ?subject ?predicate ?object
    WHERE { ?subject ?predicate ?object . }
    LIMIT 10
"#;

// Field names are assumptions; see the proto definition
let request = Request::new(SparqlRequest { query: sparql_query.into(), namespace: "default".into(), ..Default::default() });
let response = engine.query_sparql(request).await?;
println!("{:?}", response.into_inner());
```
7. Multi-Namespace Usage
Isolate different knowledge domains:
```rust
// Request shapes are assumptions; see the proto definition

// Work-related knowledge
engine.ingest_triples(Request::new(IngestRequest { namespace: "work".into(), ..Default::default() })).await?;

// Personal knowledge
engine.ingest_triples(Request::new(IngestRequest { namespace: "personal".into(), ..Default::default() })).await?;

// Query a specific namespace
let work_data = engine.get_all_triples(Request::new(EmptyRequest { namespace: "work".into(), ..Default::default() })).await?;
```
API Reference
gRPC API
The SemanticEngine service provides the following RPC methods:
| Method | Request | Response | Description |
|---|---|---|---|
| IngestTriples | IngestRequest | IngestResponse | Add RDF triples to the graph |
| GetNeighbors | NodeRequest | NeighborResponse | Graph traversal (supports edge & type filters) |
| Search | SearchRequest | SearchResponse | Legacy vector search |
| ResolveId | ResolveRequest | ResolveResponse | Resolve URI string to internal node ID |
| GetAllTriples | EmptyRequest | TriplesResponse | Retrieve all triples from a namespace |
| QuerySparql | SparqlRequest | SparqlResponse | Execute SPARQL 1.1 queries |
| DeleteNamespaceData | EmptyRequest | DeleteResponse | Delete all data in a namespace |
| HybridSearch | HybridSearchRequest | SearchResponse | AI Search (Vector + Graph) |
| ApplyReasoning | ReasoningRequest | ReasoningResponse | Trigger deductive inference |
Proto Definition: See semantic_engine.proto
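Reconstructed from the table above, the service definition presumably resembles the following sketch (message definitions omitted; the actual proto file may differ):

```protobuf
service SemanticEngine {
  rpc IngestTriples(IngestRequest) returns (IngestResponse);
  rpc GetNeighbors(NodeRequest) returns (NeighborResponse);
  rpc Search(SearchRequest) returns (SearchResponse);
  rpc ResolveId(ResolveRequest) returns (ResolveResponse);
  rpc GetAllTriples(EmptyRequest) returns (TriplesResponse);
  rpc QuerySparql(SparqlRequest) returns (SparqlResponse);
  rpc DeleteNamespaceData(EmptyRequest) returns (DeleteResponse);
  rpc HybridSearch(HybridSearchRequest) returns (SearchResponse);
  rpc ApplyReasoning(ReasoningRequest) returns (ReasoningResponse);
}
```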
MCP Tools
When running in --mcp mode, the engine exposes a rich set of tools via tools/list and tools/call.
All tool inputs are strictly validated against their JSON Schema definitions.
query_graph
Retrieve all triples from a namespace.
Input Schema:
ingest_triple
Add a new RDF triple to the knowledge graph.
Input Schema:
query_sparql
Execute a SPARQL query on the knowledge graph.
Input Schema:
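The schema bodies were lost above; the following shapes are plausible reconstructions inferred from the tool descriptions (all property names are assumptions):

```json
{
  "query_graph": {
    "type": "object",
    "properties": { "namespace": { "type": "string" } },
    "required": ["namespace"]
  },
  "ingest_triple": {
    "type": "object",
    "properties": {
      "subject": { "type": "string" },
      "predicate": { "type": "string" },
      "object": { "type": "string" },
      "namespace": { "type": "string" }
    },
    "required": ["subject", "predicate", "object"]
  },
  "query_sparql": {
    "type": "object",
    "properties": {
      "query": { "type": "string" },
      "namespace": { "type": "string" }
    },
    "required": ["query"]
  }
}
```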
Security & Authorization
Synapse implements a token-based authorization system. When using gRPC, tokens are extracted from the Authorization: Bearer <token> header.
Permissions are defined via the SYNAPSE_AUTH_TOKENS environment variable (JSON format).
Supported permissions:
- read: Query data (GetNeighbors, Search, QuerySparql, etc.)
- write: Ingest data (IngestTriples, IngestFile)
- delete: Delete data (DeleteNamespaceData)
- reason: Trigger reasoning (ApplyReasoning)
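The JSON layout of SYNAPSE_AUTH_TOKENS is not shown above; one plausible shape maps each token to its granted permissions (structure assumed):

```shell
# Structure assumed: token -> list of granted permissions
export SYNAPSE_AUTH_TOKENS='{
  "agent-token-123": ["read", "write"],
  "admin-token-456": ["read", "write", "delete", "reason"]
}'
```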
Architecture
Storage Layer
- Oxigraph: RDF triple store with SPARQL 1.1 support
- Namespace Isolation: Each namespace gets its own persistent storage directory
- URI Mapping: Automatic conversion between URIs and internal node IDs for gRPC compatibility
Reasoning Engine
- Reasonable: OWL RL reasoning for automatic inference
- Deductive Capabilities: Derive new facts from existing triples using ontological rules
Dual-Mode Operation
```
┌─────────────────────────────────────┐
│         Synapse Core Engine         │
├─────────────────────────────────────┤
│                                     │
│  ┌──────────────┐  ┌─────────────┐  │
│  │  gRPC Server │  │  MCP Server │  │
│  │ (Port 50051) │  │   (stdio)   │  │
│  └──────┬───────┘  └──────┬──────┘  │
│         │                 │         │
│         └────────┬────────┘         │
│                  │                  │
│         ┌────────▼────────┐         │
│         │ MySemanticEngine│         │
│         └────────┬────────┘         │
│                  │                  │
│         ┌────────▼────────┐         │
│         │  SynapseStore   │         │
│         │ (per namespace) │         │
│         └────────┬────────┘         │
│                  │                  │
│         ┌────────▼────────┐         │
│         │  Oxigraph RDF   │         │
│         │  Triple Store   │         │
│         └─────────────────┘         │
└─────────────────────────────────────┘
```
Namespace Management
Each namespace is completely isolated with its own:
- Storage directory ({GRAPH_STORAGE_PATH}/{namespace})
- Oxigraph store instance
- URI-to-ID mapping tables
This enables multi-tenant scenarios and context separation.
Configuration
Environment Variables
| Variable | Default | Description |
|---|---|---|
| GRAPH_STORAGE_PATH | data/graphs | Root directory for namespace storage |
| HUGGINGFACE_API_TOKEN | (optional) | Token for the Inference API (higher rate limits) |
Storage Structure
```
data/graphs/
├── default/     # Default namespace
├── work/        # Work namespace
└── personal/    # Personal namespace
```
Contributing
Contributions are welcome! Please check the repository for guidelines.
License
This project is licensed under the MIT License.
Built with ❤️ using Rust, Oxigraph, and Tonic