§RuVector GNN
Graph Neural Network capabilities for RuVector, providing tensor operations, GNN layers, compression, and differentiable search.
§Forgetting Mitigation (Issue #17)
This crate includes comprehensive forgetting mitigation for continual learning:
- Adam Optimizer: Full implementation with momentum and bias correction
- Replay Buffer: Experience replay with reservoir sampling for uniform coverage
- EWC (Elastic Weight Consolidation): Prevents catastrophic forgetting
- Learning Rate Scheduling: Multiple strategies including warmup and plateau detection
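To make the first two bullets concrete, here is a minimal sketch of a single Adam update step with momentum and bias correction. This is an illustration of the algorithm only, not the crate's actual `Optimizer` internals; the function name and signature are assumptions.

```rust
// Illustrative Adam update for one scalar parameter (not the crate's API).
// `m` and `v` are the running first and second moment estimates; `t` is the
// 1-based step count used for bias correction.
fn adam_step(
    param: &mut f64,
    grad: f64,
    m: &mut f64,
    v: &mut f64,
    t: u64,
    lr: f64,
    beta1: f64,
    beta2: f64,
    eps: f64,
) {
    // Update biased moment estimates (the momentum terms).
    *m = beta1 * *m + (1.0 - beta1) * grad;
    *v = beta2 * *v + (1.0 - beta2) * grad * grad;
    // Bias correction compensates for the zero-initialised moments.
    let m_hat = *m / (1.0 - beta1.powi(t as i32));
    let v_hat = *v / (1.0 - beta2.powi(t as i32));
    // Scaled update: step size adapts to the gradient's running variance.
    *param -= lr * m_hat / (v_hat.sqrt() + eps);
}

fn main() {
    let (mut p, mut m, mut v) = (1.0_f64, 0.0, 0.0);
    // With the defaults used below, the very first step moves by roughly `lr`.
    adam_step(&mut p, 1.0, &mut m, &mut v, 1, 0.001, 0.9, 0.999, 1e-8);
    println!("param after one step: {p}");
}
```

Note how bias correction matters most early on: at `t = 1` the raw moments are scaled down by `1 - beta`, and without the correction the first steps would be far smaller than `lr` suggests.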
§Usage Example
```rust
use ruvector_gnn::{
    training::{Optimizer, OptimizerType},
    replay::ReplayBuffer,
    ewc::ElasticWeightConsolidation,
    scheduler::{LearningRateScheduler, SchedulerType},
};

// Create Adam optimizer
let mut optimizer = Optimizer::new(OptimizerType::Adam {
    learning_rate: 0.001,
    beta1: 0.9,
    beta2: 0.999,
    epsilon: 1e-8,
});

// Create replay buffer for experience replay
let mut replay = ReplayBuffer::new(10000);

// Create EWC for preventing catastrophic forgetting
let mut ewc = ElasticWeightConsolidation::new(0.4);

// Create learning rate scheduler with cosine annealing
let mut scheduler = LearningRateScheduler::new(
    SchedulerType::CosineAnnealing { t_max: 100, eta_min: 1e-6 },
    0.001,
);
```

§Re-exports
- `pub use compress::CompressedTensor;`
- `pub use compress::CompressionLevel;`
- `pub use compress::TensorCompress;`
- `pub use error::GnnError;`
- `pub use error::Result;`
- `pub use ewc::ElasticWeightConsolidation;`
- `pub use layer::RuvectorLayer;`
- `pub use query::QueryMode;`
- `pub use query::QueryResult;`
- `pub use query::RuvectorQuery;`
- `pub use query::SubGraph;`
- `pub use replay::DistributionStats;`
- `pub use replay::ReplayBuffer;`
- `pub use replay::ReplayEntry;`
- `pub use scheduler::LearningRateScheduler;`
- `pub use scheduler::SchedulerType;`
- `pub use search::cosine_similarity;`
- `pub use search::differentiable_search;`
- `pub use search::hierarchical_forward;`
- `pub use training::info_nce_loss;`
- `pub use training::local_contrastive_loss;`
- `pub use training::sgd_step;`
- `pub use training::Loss;`
- `pub use training::LossType;`
- `pub use training::OnlineConfig;`
- `pub use training::Optimizer;`
- `pub use training::OptimizerType;`
- `pub use training::TrainConfig;`
- `pub use mmap::AtomicBitmap;`
- `pub use mmap::MmapGradientAccumulator;`
- `pub use mmap::MmapManager;`
§Modules
- compress
- Tensor compression with adaptive level selection
- error
- Error types for the GNN module.
- ewc
- Elastic Weight Consolidation (EWC) for mitigating catastrophic forgetting
- layer
- GNN Layer Implementation for HNSW Topology
- mmap
- Memory-mapped embedding management for large-scale GNN training.
- query
- Query API for RuVector GNN
- replay
- Experience Replay Buffer for GNN Training
- scheduler
- Learning rate scheduling for Graph Neural Networks
- search
- Differentiable search and similarity operations (cosine similarity, hierarchical forward pass)
- tensor
- Tensor operations for GNN computations.
- training
- Training utilities for GNN models.
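The `replay` module above is described as using reservoir sampling for uniform coverage of the training stream. The following is a minimal, self-contained sketch of that idea, not the crate's `ReplayBuffer`; the `Reservoir` type and its methods are hypothetical names for illustration.

```rust
// Illustrative reservoir sampling for an experience-replay buffer.
// After n items have been offered, each one is retained with probability
// capacity / n, giving a uniform sample of the whole stream.
struct Reservoir<T> {
    capacity: usize,
    items: Vec<T>,
    seen: u64, // total items offered so far
}

impl<T> Reservoir<T> {
    fn new(capacity: usize) -> Self {
        Self { capacity, items: Vec::with_capacity(capacity), seen: 0 }
    }

    /// Offer one item; `rng` must return a uniform draw in `0..bound`.
    fn offer(&mut self, item: T, rng: &mut impl FnMut(u64) -> u64) {
        self.seen += 1;
        if self.items.len() < self.capacity {
            // Fill phase: keep everything until the buffer is full.
            self.items.push(item);
        } else {
            // Replacement phase: keep the new item with probability
            // capacity / seen by overwriting a uniformly chosen slot.
            let j = rng(self.seen);
            if (j as usize) < self.capacity {
                self.items[j as usize] = item;
            }
        }
    }
}

fn main() {
    // A deterministic LCG stands in for a real RNG in this sketch.
    let mut state = 7_u64;
    let mut rng = |bound: u64| {
        state = state
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        state % bound
    };
    let mut buf = Reservoir::new(4);
    for i in 0..100u32 {
        buf.offer(i, &mut rng);
    }
    // The buffer stays at its capacity no matter how long the stream runs.
    println!("kept {} of {} seen", buf.items.len(), buf.seen);
}
```

The appeal for continual learning is that memory stays bounded (`capacity` slots) while old experiences remain represented in proportion to their share of the stream, which is what makes replay useful against forgetting.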