# ruvector-graph-transformer
A graph neural network where every operation is mathematically proven correct before it runs.
Most graph neural networks let you modify data freely — add nodes, change weights, update edges — with no safety guarantees. If a bug corrupts your graph, you find out later (or never). This crate takes a different approach: every mutation to graph state requires a formal proof that the operation is valid. No proof, no access. Think of it like a lock on every piece of data that can only be opened with the right mathematical key.
On top of that safety layer, 8 specialized modules bring cutting-edge graph intelligence: attention that scales to millions of nodes without checking every pair, physics simulations that conserve energy by construction, neurons that only fire when they should, training that automatically rolls back bad gradient steps, and geometry that works in curved spaces instead of assuming everything is flat.
The result is a graph transformer you can trust: if it produces an answer, that answer was computed correctly.
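The gating idea can be sketched in a few lines of plain Rust. This is a toy illustration with made-up names, not this crate's actual API: a wrapper refuses mutation unless the caller presents a witness that was issued for exactly this operation.

```rust
// Toy proof-gated mutation: the value behind `Gate` can only be changed
// by presenting a matching `ProofToken`. In the real crate the token is
// a formal proof; here it is just an opaque id issued by a checker.
struct ProofToken(u64);

struct Gate<T> {
    value: T,
    expected: u64, // the proof id this gate will accept
}

impl<T> Gate<T> {
    fn new(value: T, expected: u64) -> Self {
        Gate { value, expected }
    }

    // Mutation succeeds only with a matching proof token; otherwise the
    // closure never runs and the value is untouched.
    fn mutate_with_proof(
        &mut self,
        proof: &ProofToken,
        f: impl FnOnce(&mut T),
    ) -> Result<(), &'static str> {
        if proof.0 == self.expected {
            f(&mut self.value);
            Ok(())
        } else {
            Err("no proof, no access")
        }
    }
}
```

A wrong token leaves the value untouched and returns an error; only the issued token unlocks the mutation.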
| | Standard GNN | ruvector-graph-transformer |
|---|---|---|
| Mutation safety | Unchecked | Proof-gated: no mutation without formal witness |
| Attention complexity | O(n^2) | O(n log n) sublinear via LSH/PPR/spectral |
| Training guarantees | Hope for the best | Verified: certificates, delta-apply rollback, fail-closed |
| Geometry | Euclidean only | Product manifolds S^n × H^m × R^k |
| Causality | No enforcement | Temporal masking + Granger causality extraction |
| Incentive alignment | Not considered | Nash equilibrium + Shapley attribution |
| Platforms | Python only | Rust + WASM + Node.js (NAPI-RS) |
## Modules

8 feature-gated modules, each backed by an Architecture Decision Record:

| Module | Feature Flag | ADR | What It Does |
|---|---|---|---|
| Proof-Gated Mutation | always on | ADR-047 | `ProofGate<T>`, `MutationLedger`, `ProofScope`, `EpochBoundary` |
| Sublinear Attention | `sublinear` | ADR-048 | LSH-bucket, PPR-sampled, spectral sparsification |
| Physics-Informed | `physics` | ADR-051 | Hamiltonian dynamics, gauge-equivariant MP, Lagrangian attention, conservative PDE |
| Biological | `biological` | ADR-052 | Spiking attention, Hebbian/STDP learning, dendritic branching, inhibition strategies |
| Self-Organizing | `self-organizing` | — | Morphogenetic fields, developmental programs, graph coarsening |
| Verified Training | `verified-training` | ADR-049 | Training certificates, delta-apply rollback, `LossStabilityBound`, `EnergyGate` |
| Manifold | `manifold` | ADR-055 | Product manifolds, Riemannian Adam, geodesic MP, Lie group equivariance |
| Temporal-Causal | `temporal` | ADR-053 | Causal masking, retrocausal attention, continuous-time ODE, Granger causality |
| Economic | `economic` | ADR-054 | Nash equilibrium attention, Shapley attribution, incentive-aligned MPNN |
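To give a flavor of the Economic module's Shapley attribution, here is a generic exact Shapley computation by subset enumeration. This is a conceptual sketch of the attribution idea, not the crate's API: `v` maps a coalition bitmask to its value, and player `i`'s Shapley value averages `i`'s marginal contribution over all orderings.

```rust
// Exact Shapley values for an n-player coalition game, n small enough
// to enumerate all 2^n subsets. `v(mask)` is the value of the coalition
// whose members are the set bits of `mask`.
fn shapley(n: usize, v: &dyn Fn(u32) -> f64) -> Vec<f64> {
    // fact[k] = k!
    let fact: Vec<f64> = (0..=n)
        .scan(1.0, |f, k| {
            if k > 0 {
                *f *= k as f64;
            }
            Some(*f)
        })
        .collect();
    let mut phi = vec![0.0; n];
    for i in 0..n {
        for s in 0u32..(1 << n) {
            if s & (1 << i) != 0 {
                continue; // only coalitions that don't yet contain i
            }
            let size = s.count_ones() as usize;
            // weight = |S|! (n - |S| - 1)! / n!
            let weight = fact[size] * fact[n - size - 1] / fact[n];
            phi[i] += weight * (v(s | (1 << i)) - v(s));
        }
    }
    phi
}
```

For an additive game the Shapley value of each player is exactly its standalone contribution, which makes a handy sanity check.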
## Quick Start

```toml
[dependencies]
ruvector-graph-transformer = "2.0"

# Or with all modules:
ruvector-graph-transformer = { version = "2.0", features = ["full"] }
```
### Proof-Gated Mutation

Every mutation to graph state passes through a proof gate. A sketch of the flow (type paths and argument values are illustrative; see the crate docs for exact signatures):

```rust
use ruvector_graph_transformer::GraphTransformer;
use ruvector_verified::ProofEnvironment;

// Create a proof environment and graph transformer
let mut env = ProofEnvironment::new();
let gt = GraphTransformer::with_defaults();

// Gate a value behind a proof (a 64-dim feature vector here)
let gate = gt.create_gate(vec![0.0f32; 64]);

// Mutation requires proof — no proof, no access
let proof_id = env.prove_dim_eq(64, 64).unwrap();
let mutated = gate.mutate_with_proof(proof_id, |v| v.fill(1.0)).unwrap();
```
### Sublinear Attention

A sketch (configuration details are illustrative):

```rust
use ruvector_graph_transformer::{SublinearConfig, SublinearGraphAttention};

let config = SublinearConfig::default();
let attn = SublinearGraphAttention::new(config);

// O(n log n) instead of O(n^2)
let features = vec![vec![0.0f32; 64]; 100_000];
let outputs = attn.lsh_attention(&features).unwrap();
```
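The LSH idea behind bucketed attention can be illustrated independently of the crate: hash each node's features by the sign pattern of a few random projections, then restrict attention to same-bucket pairs. This is a conceptual sketch, not the crate's implementation; all names here are made up.

```rust
use std::collections::HashMap;

// Sign-hash a feature vector against a set of hyperplanes: bit i of the
// hash is set iff the dot product with plane i is non-negative.
fn sign_hash(x: &[f32], planes: &[Vec<f32>]) -> u64 {
    planes.iter().enumerate().fold(0u64, |h, (i, p)| {
        let dot: f32 = x.iter().zip(p).map(|(a, b)| a * b).sum();
        if dot >= 0.0 { h | (1 << i) } else { h }
    })
}

// Group node indices by hash bucket. Attention is then computed only
// within each bucket, so the number of score computations drops from
// n^2 to the sum of squared bucket sizes.
fn bucketize(features: &[Vec<f32>], planes: &[Vec<f32>]) -> HashMap<u64, Vec<usize>> {
    let mut buckets: HashMap<u64, Vec<usize>> = HashMap::new();
    for (i, f) in features.iter().enumerate() {
        buckets.entry(sign_hash(f, planes)).or_default().push(i);
    }
    buckets
}
```

Nearby vectors tend to share sign patterns, so most high-attention pairs survive the bucketing while far-apart pairs are never scored.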
### Verified Training

A sketch (type and method names are illustrative where the original was ambiguous):

```rust
use ruvector_graph_transformer::{VerifiedTrainer, VerifiedTrainingConfig};

let config = VerifiedTrainingConfig::default();
let mut trainer = VerifiedTrainer::new(config); // placeholder trainer name

// Delta-apply: gradients go to a scratch buffer, committed only if invariants pass
let result = trainer.step(&batch).unwrap();
assert!(result.certificate.is_some()); // BLAKE3-hashed training certificate
```
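The delta-apply mechanism can be sketched generically: stage the gradient update in scratch memory and commit it only when a stability invariant holds, otherwise discard the scratch copy. Toy code, not the crate's trainer:

```rust
// One delta-apply step: the update is computed into a scratch copy of
// the parameters. It is committed only if the loss does not increase
// (a stand-in for the crate's LossStabilityBound invariant); otherwise
// the step is rolled back by simply dropping the scratch buffer.
fn delta_apply_step(
    params: &mut Vec<f32>,
    grad: &[f32],
    lr: f32,
    loss: impl Fn(&[f32]) -> f32,
) -> bool {
    let before = loss(params);
    let scratch: Vec<f32> = params
        .iter()
        .zip(grad)
        .map(|(p, g)| p - lr * g)
        .collect();
    if loss(&scratch) <= before {
        *params = scratch; // invariant holds: commit
        true
    } else {
        false // invariant violated: roll back (scratch discarded)
    }
}
```

An overshooting step (learning rate too large) is rejected and leaves the parameters untouched, while a well-sized step commits and reduces the loss.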
### Physics-Informed Layers

A sketch (arguments and the destructured return are illustrative):

```rust
use ruvector_graph_transformer::{HamiltonianGraphNet, PhysicsConfig};

let config = PhysicsConfig::default();
let mut hgn = HamiltonianGraphNet::new(config);

// Symplectic leapfrog preserves energy
let (positions, momenta) = hgn.step(&graph, dt);
assert!(hgn.energy_conserved()); // formal conservation proof
```
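The energy-preservation claim rests on symplectic integration. A minimal leapfrog integrator for a 1-D harmonic oscillator (H = p²/2 + q²/2) shows the bounded-energy behavior; this is generic numerical code, unrelated to the crate's API:

```rust
// Kick-drift-kick leapfrog for H = p^2/2 + q^2/2, where the force is
// -dH/dq = -q. Unlike forward Euler, whose energy grows without bound,
// leapfrog keeps the energy error bounded for all time.
fn leapfrog(q: &mut f64, p: &mut f64, dt: f64, steps: usize) {
    for _ in 0..steps {
        *p -= 0.5 * dt * *q; // half kick
        *q += dt * *p;       // drift
        *p -= 0.5 * dt * *q; // half kick
    }
}

fn energy(q: f64, p: f64) -> f64 {
    0.5 * (q * q + p * p)
}
```

After ten thousand steps the energy is still within a tiny tolerance of its initial value, which is the property the crate's conservation proofs certify at the layer level.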
### Manifold Operations

A sketch (type paths and arguments are illustrative):

```rust
use ruvector_graph_transformer::{ManifoldAttention, ManifoldConfig};

let config = ManifoldConfig::default();
let attn = ManifoldAttention::new(config);

// Attention in S^64 x H^32 x R^32
let outputs = attn.forward(&features).unwrap();
```
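Product-manifold geometry combines per-factor distances: the squared product distance is the sum of squared distances in each factor. A small sketch for S² × Rᵏ (illustrative, not the crate's types):

```rust
// Great-circle distance between two points on the unit sphere S^2,
// via the angle between them (dot product clamped for safety).
fn sphere_dist(a: &[f64; 3], b: &[f64; 3]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    dot.clamp(-1.0, 1.0).acos()
}

// Distance on the product manifold S^2 x R^k: geodesic distance on the
// sphere factor, Euclidean distance on the flat factor, combined as
// sqrt(d_S^2 + d_E^2).
fn product_dist(sa: &[f64; 3], ea: &[f64], sb: &[f64; 3], eb: &[f64]) -> f64 {
    let e2: f64 = ea.iter().zip(eb).map(|(x, y)| (x - y).powi(2)).sum();
    (sphere_dist(sa, sb).powi(2) + e2).sqrt()
}
```

The same recipe extends to hyperbolic factors, where the per-factor distance is the hyperbolic geodesic distance instead.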
## Feature Flags

```toml
[features]
default = ["sublinear", "verified-training"]
full = ["sublinear", "physics", "biological", "self-organizing",
        "verified-training", "manifold", "temporal", "economic"]
```
| Flag | Default | Adds |
|---|---|---|
| `sublinear` | yes | LSH, PPR, spectral attention |
| `verified-training` | yes | Training certificates, delta-apply rollback |
| `physics` | no | Hamiltonian, gauge, Lagrangian, PDE layers |
| `biological` | no | Spiking, Hebbian, STDP, dendritic layers |
| `self-organizing` | no | Morphogenetic fields, developmental programs |
| `manifold` | no | Product manifolds, Riemannian Adam, Lie groups |
| `temporal` | no | Causal masking, Granger causality, ODE |
| `economic` | no | Nash equilibrium, Shapley, incentive-aligned MPNN |
## Architecture

```text
ruvector-graph-transformer
├── proof_gated.rs         ← ProofGate<T>, MutationLedger, attestation chains
├── sublinear_attention.rs ← O(n log n) attention via LSH/PPR/spectral
├── physics.rs             ← Energy-conserving Hamiltonian/Lagrangian dynamics
├── biological.rs          ← Spiking networks, Hebbian plasticity, STDP
├── self_organizing.rs     ← Morphogenetic fields, reaction-diffusion growth
├── verified_training.rs   ← Certified training with delta-apply rollback
├── manifold.rs            ← Product manifold S^n × H^m × R^k geometry
├── temporal.rs            ← Causal masking, Granger causality, ODE integration
├── economic.rs            ← Nash equilibrium, Shapley values, mechanism design
├── config.rs              ← Per-module configuration with sensible defaults
├── error.rs               ← Unified error composing 4 sub-crate errors
└── lib.rs                 ← Unified entry point with feature-gated re-exports
```
## Dependencies

```text
ruvector-graph-transformer
├── ruvector-verified  ← formal proofs, attestations, gated routing
├── ruvector-gnn       ← base GNN message passing
├── ruvector-attention ← scaled dot-product attention
├── ruvector-mincut    ← graph structure operations
├── ruvector-solver    ← sparse linear systems
└── ruvector-coherence ← coherence measurement
```
## Bindings

| Platform | Package | Install |
|---|---|---|
| WASM | `ruvector-graph-transformer-wasm` | `wasm-pack build` |
| Node.js | `ruvector-graph-transformer-node` | `npm install @ruvector/graph-transformer` |
## Tests

```sh
# Default features (sublinear + verified-training)
cargo test

# All modules
cargo test --features full

# Individual module
cargo test --features physics
```

163 unit tests + 23 integration tests = 186 total, all passing.
## ADR Documentation
| ADR | Title |
|---|---|
| ADR-046 | Unified Graph Transformer Architecture |
| ADR-047 | Proof-Gated Mutation Protocol |
| ADR-048 | Sublinear Graph Attention |
| ADR-049 | Verified Training Pipeline |
| ADR-050 | WASM + Node.js Bindings |
| ADR-051 | Physics-Informed Graph Layers |
| ADR-052 | Biological Graph Layers |
| ADR-053 | Temporal-Causal Graph Layers |
| ADR-054 | Economic Graph Layers |
| ADR-055 | Manifold Graph Layers |
## License
MIT