# @ruvector/graph-transformer

Node.js bindings for the RuVector Graph Transformer: proof-gated graph attention, verified training, and 8 specialized graph layers via NAPI-RS.

Use graph transformers from JavaScript and TypeScript with native Rust performance. Every graph operation (adding nodes, computing attention, training weights) produces a formal proof receipt attesting that it was performed correctly. The heavy computation runs in compiled Rust via NAPI-RS, so you get sub-millisecond proof verification without leaving the Node.js ecosystem.
## Install
Prebuilt binaries are provided for:
| Platform | Architecture | Package |
|---|---|---|
| Linux | x64 (glibc) | @ruvector/graph-transformer-linux-x64-gnu |
| Linux | x64 (musl) | @ruvector/graph-transformer-linux-x64-musl |
| Linux | ARM64 (glibc) | @ruvector/graph-transformer-linux-arm64-gnu |
| macOS | x64 (Intel) | @ruvector/graph-transformer-darwin-x64 |
| macOS | ARM64 (Apple Silicon) | @ruvector/graph-transformer-darwin-arm64 |
| Windows | x64 | @ruvector/graph-transformer-win32-x64-msvc |
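The matching platform package is selected automatically at install time (the standard NAPI-RS `optionalDependencies` pattern), so installing the main package is enough:

```shell
npm install @ruvector/graph-transformer
```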
## Quick Start

```js
// NOTE: method names below are reconstructed from the API descriptions in
// this README; see the package's bundled index.d.ts for the exact signatures.
const gt = require('@ruvector/graph-transformer');

console.log(gt.version); // "2.0.4"

// Proof-gated mutation
const gate = gt.createProofGate(128);
console.log(gate.dim); // 128

// Prove dimension equality
const proof = gt.proveDimEq(128, 128);
console.log(proof.isValid); // true

// Create attestation (82-byte proof receipt)
const attestation = gt.createAttestation(proof);
console.log(attestation.length); // 82
```
## API Reference

The snippets below are illustrative; the authoritative method names and signatures ship with the package as TypeScript definitions.

### Proof-Gated Operations

```js
// Create a proof gate for a dimension
const gate = gt.createProofGate(128);

// Prove two dimensions are equal
const proof = gt.proveDimEq(128, 128);

// Create an 82-byte attestation for embedding in RVF witness chains
const bytes = gt.createAttestation(proof);

// Verify an attestation from bytes
const valid = gt.verifyAttestation(bytes);

// Compose a pipeline of type-checked stages
const composed = gt.composeStages([stageA, stageB]);
```
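The shape of type-checked composition can be sketched in plain JavaScript (this is a conceptual sketch, not the package's implementation, which gates the check behind proof receipts): each stage declares its input and output dimensions, and composition is rejected unless adjacent dimensions line up.

```javascript
// Type-checked stage composition: adjacent stages must agree on dimensions.
function composeStages(stages) {
  for (let i = 1; i < stages.length; i++) {
    if (stages[i - 1].outDim !== stages[i].inDim) {
      throw new Error(
        `stage ${i - 1} outputs dim ${stages[i - 1].outDim}, ` +
        `stage ${i} expects dim ${stages[i].inDim}`
      );
    }
  }
  // Return the composed function: run each stage in order.
  return x => stages.reduce((v, s) => s.run(v), x);
}

const double = { inDim: 1, outDim: 1, run: v => v.map(x => 2 * x) };
const pipeline = composeStages([double, double]);
const out = pipeline([3]); // [12]
```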
### Sublinear Attention

```js
// O(n log n) graph attention via PPR sparsification
const result = gt.sublinearAttention(features, edges);
console.log(result.output);

// Raw PPR scores
const scores = gt.pprScores(edges, seedNode);
```
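The PPR sparsification itself runs in native Rust. For intuition, here is a plain-JavaScript sketch of the underlying idea, forward-push approximate personalized PageRank; it is illustrative only and independent of the package's API:

```javascript
// Forward-push approximate personalized PageRank. Only nodes whose residual
// exceeds epsilon are ever touched, which is what makes PPR-based attention
// sparsification sublinear in practice: most nodes never enter the map.
function approximatePpr(adj, seed, alpha = 0.15, epsilon = 1e-4) {
  const p = new Map();              // PPR estimate per node
  const r = new Map([[seed, 1]]);   // residual probability mass
  const queue = [seed];
  while (queue.length > 0) {
    const u = queue.pop();
    const res = r.get(u) || 0;
    const deg = adj[u].length;
    if (res / deg < epsilon) continue; // below threshold: skip
    p.set(u, (p.get(u) || 0) + alpha * res);
    r.set(u, 0);
    const push = ((1 - alpha) * res) / deg;
    for (const v of adj[u]) {
      const rv = (r.get(v) || 0) + push;
      r.set(v, rv);
      if (rv / adj[v].length >= epsilon) queue.push(v);
    }
  }
  return p; // sparse result: mass concentrates near the seed
}

// Path graph 0-1-2-3, seeded at node 0.
const adj = [[1], [0, 2], [1, 3], [2]];
const ppr = approximatePpr(adj, 0);
```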
### Physics-Informed Layers

```js
// Symplectic leapfrog step (energy-conserving)
const state = gt.leapfrogStep(positions, momenta, dt);
console.log(state.energy);

// With graph interactions
const state2 = gt.leapfrogGraphStep(positions, momenta, edges, dt);
console.log(state2.energyConserved); // true
```
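For intuition about what "energy-conserving" buys you, here is a minimal plain-JavaScript leapfrog (velocity Verlet) integrator for a unit-mass harmonic oscillator; it is a sketch of the integration scheme, not the package's implementation:

```javascript
// One leapfrog step: half-kick, drift, half-kick. Symplectic integrators
// keep the energy error bounded instead of letting it drift over time.
function leapfrog(q, p, dt, force) {
  const pHalf = p + 0.5 * dt * force(q);
  const qNext = q + dt * pHalf;
  const pNext = pHalf + 0.5 * dt * force(qNext);
  return [qNext, pNext];
}

const force = q => -q;                        // F = -kq with k = 1
const energy = (q, p) => 0.5 * p * p + 0.5 * q * q;

let q = 1, p = 0;
const e0 = energy(q, p);                      // 0.5
for (let i = 0; i < 1000; i++) [q, p] = leapfrog(q, p, 0.01, force);
const drift = Math.abs(energy(q, p) - e0);    // stays tiny after 1000 steps
```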
### Biological Layers

```js
// Spiking neural attention (event-driven)
const output = gt.spikingAttention(spikes, weights);

// Hebbian weight update (Hebb's rule)
const weights2 = gt.hebbianUpdate(pre, post, learningRate);

// Full spiking step over feature matrix
const result = gt.spikingStep(features);
```
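Hebb's rule is simple enough to sketch directly. This plain-JavaScript version (illustrative, not the package's implementation) shows the outer-product update Δw = η · pre · post:

```javascript
// Hebb's rule: "neurons that fire together wire together".
// Each weight grows in proportion to the product of pre- and post-synaptic
// activity, scaled by the learning rate eta.
function hebbianUpdate(weights, pre, post, eta = 0.1) {
  return weights.map((row, i) =>
    row.map((w, j) => w + eta * pre[i] * post[j])
  );
}

const w0 = [[0, 0], [0, 0]];
const w1 = hebbianUpdate(w0, [1, 0], [1, 1]);
// Only the row for the active presynaptic neuron changes.
```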
### Verified Training

```js
// Single verified SGD step with proof receipt
const result = gt.verifiedSgdStep(weights, gradients, learningRate);
console.log(result.receipt);

// Full training step with features and targets
const step = gt.verifiedTrainStep(features, targets);
console.log(step.loss);
```
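The numeric core of a verified step is ordinary SGD; the sketch below (plain JavaScript, with no proof receipt) shows only the update rule the receipt would attest to:

```javascript
// One SGD step: w <- w - lr * grad.
function sgdStep(weights, grads, lr) {
  return weights.map((w, i) => w - lr * grads[i]);
}

// Minimize L(w) = (w - 3)^2 from w = 0; the gradient is 2(w - 3).
let w = [0];
for (let i = 0; i < 200; i++) w = sgdStep(w, [2 * (w[0] - 3)], 0.1);
// w[0] converges to the minimizer 3.
```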
### Manifold Operations

```js
// Product manifold distance (mixed curvatures)
const d = gt.productManifoldDistance(x, y, curvatures);

// Product manifold attention
const result = gt.productManifoldAttention(features, curvatures);
```
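A product-manifold distance combines per-factor distances into one metric. Here is a minimal sketch with one Euclidean factor and one Poincaré-ball (curvature −1) factor; it illustrates the math, not the package's implementation:

```javascript
// Euclidean factor distance.
const euclidean = (x, y) => Math.hypot(...x.map((xi, i) => xi - y[i]));

// Poincare-ball distance (curvature -1):
// d(x, y) = arcosh(1 + 2||x - y||^2 / ((1 - ||x||^2)(1 - ||y||^2)))
function poincare(x, y) {
  const sq = v => v.reduce((s, vi) => s + vi * vi, 0);
  const diff = x.map((xi, i) => xi - y[i]);
  return Math.acosh(1 + (2 * sq(diff)) / ((1 - sq(x)) * (1 - sq(y))));
}

// Product distance: l2 combination of the per-factor distances.
const productDistance = (x, y) =>
  Math.hypot(euclidean(x.e, y.e), poincare(x.h, y.h));

const a = { e: [0, 0], h: [0, 0] };
const b = { e: [3, 4], h: [0.5, 0] };
const dist = productDistance(a, b);
```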
### Temporal-Causal Attention

```js
// Causal attention (no future information leakage)
const scores = gt.causalAttentionScores(series);

// Causal attention over graph
const output = gt.causalGraphAttention(features, edges, timestamps);

// Granger causality extraction
const dag = gt.grangerCausality(series, maxLag);
console.log(dag); // [{ source, target, f_statistic, is_causal }]
```
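The no-leakage property comes from masking attention scores at future positions before the softmax; a minimal plain-JavaScript sketch of the masking step (illustrative, independent of the package):

```javascript
// Causal softmax: position i may only attend to positions j <= i.
// Future scores are set to -Infinity, so exp() maps them to exactly 0.
function causalSoftmax(scores) {
  return scores.map((row, i) => {
    const masked = row.map((s, j) => (j <= i ? s : -Infinity));
    const m = Math.max(...masked);                // for numerical stability
    const exps = masked.map(s => Math.exp(s - m));
    const z = exps.reduce((a, b) => a + b, 0);
    return exps.map(e => e / z);
  });
}

// Uniform scores: each row spreads weight only over past-and-present.
const attn = causalSoftmax([
  [0, 0, 0],
  [0, 0, 0],
  [0, 0, 0],
]);
```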
### Economic / Game-Theoretic

```js
// Nash equilibrium attention
const result = gt.nashAttention(payoffs);
console.log(result.equilibrium);
```
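For intuition, the mixed-strategy equilibrium of a 2×2 zero-sum game has a closed form (valid when the game has no pure-strategy saddle point). Nash-equilibrium attention uses the same fixed-point idea: weights at which no player gains by deviating unilaterally. This sketch is independent of the package:

```javascript
// Mixed equilibrium of a 2x2 zero-sum game given the row player's payoffs
// [[a, b], [c, d]]: each player randomizes so the opponent is indifferent.
function nash2x2(M) {
  const [[a, b], [c, d]] = M;
  const denom = a - b - c + d;
  const p = (d - c) / denom; // row player's probability of playing row 0
  const q = (d - b) / denom; // column player's probability of column 0
  return { p, q };
}

// Matching pennies: the only equilibrium is to randomize 50/50.
const { p, q } = nash2x2([[1, -1], [-1, 1]]);
```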
### Stats & Control

```js
// Aggregate statistics
const stats = gt.getStats();
console.log(stats);

// Reset all internal state
gt.reset();
```
## Building from Source

```bash
# Install NAPI-RS CLI
npm install -g @napi-rs/cli

# Build native module
napi build --platform --release

# Run tests
npm test
```
## Related Packages

| Package | Description |
|---|---|
| ruvector-graph-transformer | Core Rust crate |
| ruvector-graph-transformer-wasm | WASM bindings for browsers |
| @ruvector/gnn | Base GNN operations |
| @ruvector/attention | 46 attention mechanisms |
## License
MIT