synapse-core 0.5.0

A neuro-symbolic semantic engine for OpenClaw, combining graph databases with vector operations.
synapse-core-0.5.0 has been yanked.

Synapse Core 🧠


A high-performance neuro-symbolic semantic engine designed for agentic AI.

Features • Installation • Usage • API Reference • Architecture


📖 Overview

Synapse Core provides the foundational semantic memory layer for AI agents. It combines the structured precision of RDF knowledge graphs (built on Oxigraph) with vector-based semantic search, allowing agents to reason about data, maintain long-term context, and query knowledge using industry-standard semantic web technologies such as RDF and SPARQL.

It is designed to work seamlessly with OpenClaw and other agentic frameworks via the Model Context Protocol (MCP) or as a standalone gRPC service.

🚀 Features

  • RDF Triple Store: Built on Oxigraph for standards-compliant RDF storage and querying
  • SPARQL Support: Full SPARQL 1.1 query language support for complex graph queries
  • Multi-Namespace Architecture: Isolated knowledge bases for different contexts (work, personal, projects)
  • Dual Protocol Support:
    • gRPC API for high-performance programmatic access
    • MCP Server for seamless LLM agent integration
  • OWL Reasoning: Built-in support for OWL 2 RL reasoning via the reasonable crate
  • Hybrid Search: Combines vector similarity with graph traversal (using local HNSW index)
  • HuggingFace API Integration: High-performance embeddings without local GPU/CPU heavy lifting
  • High Performance: Written in Rust with async I/O and efficient HNSW indexing
  • Persistent Storage: Automatic persistence with namespace-specific storage paths

📦 Installation

As a Rust Library

Add to your Cargo.toml:

[dependencies]
synapse-core = "0.5.0"

As a Binary

Install the CLI tool:

cargo install synapse-core

For OpenClaw

One-command install as an MCP server:

npx skills install pmaojo/synapse-engine

πŸ› οΈ Usage

1. Standalone gRPC Server

Run Synapse as a high-performance gRPC server:

# Start the server (default: localhost:50051)
synapse

# With custom storage path
GRAPH_STORAGE_PATH=/path/to/data synapse

The gRPC server exposes nine RPC methods for semantic operations (see API Reference).

2. Model Context Protocol (MCP) Server

Run in MCP mode for integration with LLM agents:

synapse --mcp

This exposes 3 MCP tools via JSON-RPC over stdio:

  • query_graph - Retrieve all triples from a namespace
  • ingest_triple - Add a new triple to the knowledge graph
  • query_sparql - Execute SPARQL queries
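For example, an ingest_triple invocation over stdio looks like the following JSON-RPC 2.0 request. The envelope assumes the standard MCP tools/call convention; the id and argument values are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ingest_triple",
    "arguments": {
      "subject": "Alice",
      "predicate": "knows",
      "object": "Bob",
      "namespace": "social"
    }
  }
}
```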

3. Rust Library Integration

Embed the engine directly into your application:

use synapse_core::server::MySemanticEngine;
use synapse_core::server::semantic_engine::*;
use tonic::Request;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Initialize the engine
    let engine = MySemanticEngine::new("data/my_graph");

    // Ingest triples
    let triple = Triple {
        subject: "Alice".to_string(),
        predicate: "knows".to_string(),
        object: "Bob".to_string(),
        provenance: None,
    };

    let request = IngestRequest {
        triples: vec![triple],
        namespace: "social".to_string(),
    };

    let response = engine.ingest_triples(Request::new(request)).await?;
    println!("Added {} triples", response.into_inner().nodes_added);

    Ok(())
}

4. Hybrid Search

Retrieve entities that match both semantic similarity (vector) and structural relationships (graph):

use synapse_core::server::proto::{HybridSearchRequest, SearchMode};

let request = HybridSearchRequest {
    query: "What are the latest findings on neuro-symbolic AI?".to_string(),
    namespace: "research".to_string(),
    vector_k: 10,       // Top-K vectors
    graph_depth: 2,    // Expand graph 2 levels deep from results
    mode: SearchMode::Hybrid as i32,
    limit: 5,
};

let response = engine.hybrid_search(Request::new(request)).await?;

5. Automated Reasoning

Apply OWL-RL or RDFS reasoning to derive implicit knowledge:

use synapse_core::server::proto::{ReasoningRequest, ReasoningStrategy};

let request = ReasoningRequest {
    namespace: "ontology".to_string(),
    strategy: ReasoningStrategy::Owlrl as i32,
    materialize: true, // Save inferred triples to storage
};

let response = engine.apply_reasoning(Request::new(request)).await?;
println!("Inferred {} new facts", response.into_inner().triples_inferred);

6. SPARQL Queries

Query your knowledge graph using SPARQL:

use synapse_core::server::semantic_engine::SparqlRequest;

let sparql_query = r#"
    SELECT ?subject ?predicate ?object
    WHERE {
        ?subject ?predicate ?object .
    }
    LIMIT 10
"#;

let request = SparqlRequest {
    query: sparql_query.to_string(),
    namespace: "default".to_string(),
};

let response = engine.query_sparql(Request::new(request)).await?;
println!("Results: {}", response.into_inner().results_json);

7. Multi-Namespace Usage

Isolate different knowledge domains:

// Work-related knowledge
engine.ingest_triples(Request::new(IngestRequest {
    triples: work_triples,
    namespace: "work".to_string(),
})).await?;

// Personal knowledge
engine.ingest_triples(Request::new(IngestRequest {
    triples: personal_triples,
    namespace: "personal".to_string(),
})).await?;

// Query specific namespace
let work_data = engine.get_all_triples(Request::new(EmptyRequest {
    namespace: "work".to_string(),
})).await?;

📚 API Reference

gRPC API

The SemanticEngine service provides the following RPC methods:

Method               Request              Response           Description
IngestTriples        IngestRequest        IngestResponse     Add RDF triples to the graph
GetNeighbors         NodeRequest          NeighborResponse   Graph traversal (get connected nodes)
Search               SearchRequest        SearchResponse     Legacy vector search
ResolveId            ResolveRequest       ResolveResponse    Resolve URI string to internal node ID
GetAllTriples        EmptyRequest         TriplesResponse    Retrieve all triples from a namespace
QuerySparql          SparqlRequest        SparqlResponse     Execute SPARQL 1.1 queries
DeleteNamespaceData  EmptyRequest         DeleteResponse     Delete all data in a namespace
HybridSearch         HybridSearchRequest  SearchResponse     AI Search (Vector + Graph)
ApplyReasoning       ReasoningRequest     ReasoningResponse  Trigger deductive inference

Proto Definition: See semantic_engine.proto

MCP Tools

When running in --mcp mode, the following tools are exposed:

query_graph

Retrieve all triples from a namespace.

Input Schema:

{
  "namespace": "string (default: robin_os)"
}

ingest_triple

Add a new RDF triple to the knowledge graph.

Input Schema:

{
  "subject": "string (required)",
  "predicate": "string (required)",
  "object": "string (required)",
  "namespace": "string (default: robin_os)"
}

query_sparql

Execute a SPARQL query on the knowledge graph.

Input Schema:

{
  "query": "string (required)",
  "namespace": "string (default: robin_os)"
}

πŸ—οΈ Architecture

Storage Layer

  • Oxigraph: RDF triple store with SPARQL 1.1 support
  • Namespace Isolation: Each namespace gets its own persistent storage directory
  • URI Mapping: Automatic conversion between URIs and internal node IDs for gRPC compatibility

Reasoning Engine

  • Reasonable: OWL RL reasoning for automatic inference
  • Deductive Capabilities: Derive new facts from existing triples using ontological rules

Dual-Mode Operation

┌─────────────────────────────────────┐
│        Synapse Core Engine          │
├─────────────────────────────────────┤
│                                     │
│  ┌──────────────┐  ┌─────────────┐  │
│  │  gRPC Server │  │  MCP Server │  │
│  │  (Port 50051)│  │  (stdio)    │  │
│  └──────┬───────┘  └──────┬──────┘  │
│         │                 │         │
│         └────────┬────────┘         │
│                  │                  │
│         ┌────────▼────────┐         │
│         │ MySemanticEngine│         │
│         └────────┬────────┘         │
│                  │                  │
│         ┌────────▼────────┐         │
│         │  SynapseStore   │         │
│         │  (per namespace)│         │
│         └────────┬────────┘         │
│                  │                  │
│         ┌────────▼────────┐         │
│         │   Oxigraph RDF  │         │
│         │   Triple Store  │         │
│         └─────────────────┘         │
└─────────────────────────────────────┘

Namespace Management

Each namespace is completely isolated with its own:

  • Storage directory ({GRAPH_STORAGE_PATH}/{namespace})
  • Oxigraph store instance
  • URI-to-ID mapping tables

This enables multi-tenant scenarios and context separation.
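The per-namespace storage path follows the {GRAPH_STORAGE_PATH}/{namespace} pattern described above. A minimal sketch, where the helper namespace_dir is hypothetical and shown only to illustrate the layout:

```rust
use std::path::PathBuf;

/// Hypothetical helper illustrating the {GRAPH_STORAGE_PATH}/{namespace}
/// layout; not part of the synapse-core API.
fn namespace_dir(root: Option<&str>, namespace: &str) -> PathBuf {
    // Falls back to the documented default root when GRAPH_STORAGE_PATH is unset.
    PathBuf::from(root.unwrap_or("data/graphs")).join(namespace)
}

fn main() {
    // With no override, namespaces land under the default root.
    assert_eq!(namespace_dir(None, "work"), PathBuf::from("data/graphs/work"));
    // A custom root redirects every namespace beneath it.
    assert_eq!(
        namespace_dir(Some("/srv/kg"), "personal"),
        PathBuf::from("/srv/kg/personal")
    );
    println!("{}", namespace_dir(None, "default").display());
}
```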

βš™οΈ Configuration

Environment Variables

Variable               Default      Description
GRAPH_STORAGE_PATH     data/graphs  Root directory for namespace storage
HUGGINGFACE_API_TOKEN  (optional)   Token for the HuggingFace Inference API (higher rate limits)

Storage Structure

data/graphs/
├── default/          # Default namespace
├── work/             # Work namespace
└── personal/         # Personal namespace

🤝 Contributing

Contributions are welcome! Please check the repository for guidelines.

📄 License

This project is licensed under the MIT License.


Built with ❤️ using Rust, Oxigraph, and Tonic