RuVector DAG - Directed Acyclic Graph structures for query plan optimization
This crate provides efficient DAG data structures and algorithms for representing and manipulating query execution plans, with neural learning capabilities layered on top.
§Features
- DAG Data Structures: Efficient directed acyclic graph representation for query plans
- 7 Attention Mechanisms: Topological, Causal Cone, Critical Path, MinCut Gated, and more
- SONA Learning: Self-Optimizing Neural Architecture with MicroLoRA adaptation (non-WASM only)
- MinCut Optimization: Subpolynomial O(n^0.12) bottleneck detection
- Self-Healing: Autonomous anomaly detection and repair (non-WASM only)
- QuDAG Integration: Quantum-resistant distributed pattern learning (non-WASM only)
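The topological structure that the attention mechanisms operate over can be sketched with Kahn's algorithm. The following is a self-contained illustration using only the standard library; the `Dag` type here is a stand-in, not the crate's `QueryDag`.

```rust
use std::collections::VecDeque;

/// Minimal DAG with adjacency lists (illustrative stand-in for `QueryDag`).
struct Dag {
    adj: Vec<Vec<usize>>, // adj[u] = children of u
}

impl Dag {
    fn new(n: usize) -> Self {
        Dag { adj: vec![Vec::new(); n] }
    }

    fn add_edge(&mut self, u: usize, v: usize) {
        self.adj[u].push(v);
    }

    /// Kahn's algorithm: returns the nodes in a topological order.
    fn topo_order(&self) -> Vec<usize> {
        let n = self.adj.len();
        let mut indeg = vec![0usize; n];
        for u in 0..n {
            for &v in &self.adj[u] {
                indeg[v] += 1;
            }
        }
        // Start from all roots (in-degree zero), peel layer by layer.
        let mut queue: VecDeque<usize> = (0..n).filter(|&u| indeg[u] == 0).collect();
        let mut order = Vec::with_capacity(n);
        while let Some(u) = queue.pop_front() {
            order.push(u);
            for &v in &self.adj[u] {
                indeg[v] -= 1;
                if indeg[v] == 0 {
                    queue.push_back(v);
                }
            }
        }
        order
    }
}

fn main() {
    // scan(0) -> filter(1) -> join(2), with a second input scan(3) -> join(2)
    let mut dag = Dag::new(4);
    dag.add_edge(0, 1);
    dag.add_edge(1, 2);
    dag.add_edge(3, 2);
    let order = dag.topo_order();
    // The join must appear after all of its inputs.
    let pos = |u: usize| order.iter().position(|&x| x == u).unwrap();
    assert!(pos(0) < pos(1) && pos(1) < pos(2) && pos(3) < pos(2));
    println!("topological order: {:?}", order);
}
```

A topological order is exactly what lets attention scores be propagated from plan leaves toward the root in a single pass.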
§Quick Start
```rust
use ruvector_dag::{QueryDag, OperatorNode, OperatorType};
use ruvector_dag::attention::{TopologicalAttention, DagAttention};

// Build a query DAG
let mut dag = QueryDag::new();
let scan = dag.add_node(OperatorNode::seq_scan(0, "users"));
let filter = dag.add_node(OperatorNode::filter(1, "age > 18"));
dag.add_edge(scan, filter).unwrap();

// Compute attention scores
let attention = TopologicalAttention::new(Default::default());
let scores = attention.forward(&dag).unwrap();
```

§Modules
- dag - Core DAG data structures and algorithms
- attention - Neural attention mechanisms for node importance
- sona - Self-Optimizing Neural Architecture with adaptive learning (requires `full` feature)
- mincut - Subpolynomial bottleneck detection and optimization
- healing - Self-healing system with anomaly detection (requires `full` feature)
- qudag - QuDAG network integration for distributed learning (requires `full` feature)
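To build intuition for what the `mincut` module's bottleneck detection looks for, here is a deliberately naive, self-contained sketch: an edge is a single-point-of-failure bottleneck if removing it disconnects the plan's source from its sink. The crate's `DagMinCutEngine` uses a far more sophisticated subpolynomial algorithm; this version only illustrates the concept.

```rust
/// Is `dst` reachable from `src` if we pretend the edge `skip` is removed?
fn reachable(adj: &[Vec<usize>], src: usize, dst: usize, skip: (usize, usize)) -> bool {
    let mut stack = vec![src];
    let mut seen = vec![false; adj.len()];
    seen[src] = true;
    while let Some(u) = stack.pop() {
        if u == dst {
            return true;
        }
        for &v in &adj[u] {
            if (u, v) == skip || seen[v] {
                continue;
            }
            seen[v] = true;
            stack.push(v);
        }
    }
    false
}

/// Edges whose removal disconnects src from dst (naive O(V * E^2) scan).
fn bottleneck_edges(adj: &[Vec<usize>], src: usize, dst: usize) -> Vec<(usize, usize)> {
    let mut out = Vec::new();
    for u in 0..adj.len() {
        for &v in &adj[u] {
            if !reachable(adj, src, dst, (u, v)) {
                out.push((u, v));
            }
        }
    }
    out
}

fn main() {
    // Diamond plan with a mandatory final edge:
    // 0 -> 1 -> 3 -> 4 and 0 -> 2 -> 3
    let adj = vec![vec![1, 2], vec![3], vec![3], vec![4], vec![]];
    let b = bottleneck_edges(&adj, 0, 4);
    assert_eq!(b, vec![(3, 4)]); // only 3 -> 4 is unavoidable
    println!("bottlenecks: {:?}", b);
}
```

Once a bottleneck edge is found, a redundancy suggestion (as the `RedundancySuggestion` re-export hints) would add an alternative path around it.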
Re-exports§
pub use dag::BfsIterator;
pub use dag::DagDeserializer;
pub use dag::DagError;
pub use dag::DagSerializer;
pub use dag::DfsIterator;
pub use dag::OperatorNode;
pub use dag::OperatorType;
pub use dag::QueryDag;
pub use dag::TopologicalIterator;
pub use mincut::Bottleneck;
pub use mincut::BottleneckAnalysis;
pub use mincut::DagMinCutEngine;
pub use mincut::FlowEdge;
pub use mincut::LocalKCut;
pub use mincut::MinCutConfig;
pub use mincut::MinCutResult;
pub use mincut::RedundancyStrategy;
pub use mincut::RedundancySuggestion;
pub use attention::AttentionConfig;
pub use attention::AttentionError;
pub use attention::AttentionScores;
pub use attention::CausalConeAttention;
pub use attention::CausalConeConfig;
pub use attention::CriticalPathAttention;
pub use attention::CriticalPathConfig;
pub use attention::DagAttention;
pub use attention::FlowCapacity;
pub use attention::MinCutConfig as AttentionMinCutConfig;
pub use attention::MinCutGatedAttention;
pub use attention::TopologicalAttention;
pub use attention::TopologicalConfig;
pub use qudag::QuDagClient;
pub use qudag::crypto::check_crypto_security;
pub use qudag::crypto::is_production_ready;
pub use qudag::crypto::security_status;
pub use qudag::crypto::SecurityStatus;
pub use healing::Anomaly;
pub use healing::AnomalyConfig;
pub use healing::AnomalyDetector;
pub use healing::AnomalyType;
pub use healing::DriftMetric;
pub use healing::DriftTrend;
pub use healing::HealingCycleResult;
pub use healing::HealingOrchestrator;
pub use healing::HealthStatus;
pub use healing::IndexCheckResult;
pub use healing::IndexHealth;
pub use healing::IndexHealthChecker;
pub use healing::IndexThresholds;
pub use healing::IndexType;
pub use healing::LearningDriftDetector;
pub use healing::RepairResult;
pub use healing::RepairStrategy;
pub use sona::DagPattern;
pub use sona::DagReasoningBank;
pub use sona::DagSonaEngine;
pub use sona::DagTrajectory;
pub use sona::DagTrajectoryBuffer;
pub use sona::EwcConfig;
pub use sona::EwcPlusPlus;
pub use sona::MicroLoRA;
pub use sona::MicroLoRAConfig;
pub use sona::ReasoningBankConfig;
Modules§
- attention - DAG Attention Mechanisms
- dag - Core DAG data structures and algorithms
- healing - Self-Healing System for Neural DAG Learning
- mincut - MinCut Optimization: subpolynomial bottleneck detection
- qudag - QuDAG Integration: quantum-resistant distributed pattern learning
- sona - SONA: Self-Optimizing Neural Architecture for DAG Learning
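One simple way a topological attention score can be derived, sketched here as a toy illustration: weight each node by its depth in the DAG and normalize with a softmax. The depth-based weighting is a hypothetical choice for this example; the crate's `TopologicalAttention` computes its scores differently.

```rust
use std::collections::VecDeque;

/// Longest-path depth of each node, computed in topological order.
fn depths(adj: &[Vec<usize>]) -> Vec<usize> {
    let n = adj.len();
    let mut indeg = vec![0; n];
    for u in 0..n {
        for &v in &adj[u] {
            indeg[v] += 1;
        }
    }
    let mut depth = vec![0usize; n];
    let mut queue: VecDeque<usize> = (0..n).filter(|&u| indeg[u] == 0).collect();
    while let Some(u) = queue.pop_front() {
        for &v in &adj[u] {
            depth[v] = depth[v].max(depth[u] + 1);
            indeg[v] -= 1;
            if indeg[v] == 0 {
                queue.push_back(v);
            }
        }
    }
    depth
}

/// Numerically stable softmax over raw scores.
fn softmax(xs: &[f64]) -> Vec<f64> {
    let m = xs.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = xs.iter().map(|x| (x - m).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn main() {
    // scan -> filter -> sink
    let adj = vec![vec![1], vec![2], vec![]];
    let d: Vec<f64> = depths(&adj).iter().map(|&x| x as f64).collect();
    let scores = softmax(&d);
    assert!((scores.iter().sum::<f64>() - 1.0).abs() < 1e-9);
    assert!(scores[2] > scores[0]); // deeper nodes get more weight
    println!("scores: {:?}", scores);
}
```

The resulting distribution sums to one, mirroring the shape of an `AttentionScores` result: one normalized importance value per node.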