Temporal-Compare 🕒
A high-performance Rust framework for benchmarking temporal prediction algorithms inspired by OpenAI's Time-R1 architecture.
🎯 What is Temporal-Compare?
Imagine trying to predict the next word you'll type, the next stock price movement, or the next frame in a video. These are temporal prediction tasks - predicting future states from historical sequences. Temporal-Compare provides a testing ground to compare different approaches to this fundamental problem.
This crate implements a clean, extensible framework for comparing:
- Baseline predictors (naive last-value)
- Neural networks (custom MLP implementation)
- External backends (ruv-fann integration ready)
🏗️ Architecture
┌─────────────────────────────────────────────────────────┐
│ Input Time Series │
│ [t-31, t-30, ..., t-1, t] │
└────────────────┬────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────┐
│ Feature Engineering │
│ • Window: 32 timesteps │
│ • Regime indicators │
│ • Temporal features (time-of-day) │
└────────────────┬────────────────────────────────────────┘
│
┌────────┴────────┬──────────────┐
▼ ▼ ▼
┌──────────────┐ ┌──────────────┐ ┌──────────────┐
│ Baseline │ │ MLP │ │ RuvFANN │
│ Predictor │ │ Network │ │ Backend │
│ │ │ │ │ │
│ Last value │ │ 32→64→1/3 │ │ (Feature) │
└──────┬───────┘ └──────┬───────┘ └──────┬───────┘
│ │ │
└─────────────────┴──────────────────┘
│
▼
┌─────────────────────┐
│ Outputs │
│ • Regression (MSE) │
│ • Classification │
│ (3-class: ↓/→/↑) │
└─────────────────────┘
✨ Features
- 🚀 Blazing Fast: Pure Rust implementation with zero-copy operations
- 🧠 Multiple Backends: Swappable prediction engines via trait abstraction
- 📊 Synthetic Data: Configurable time series with regime shifts and noise
- 🎯 Dual Tasks: Both regression (next value) and classification (trend direction)
- ⚡ SIMD-Ready: Optimized matrix operations via ndarray
- 🔧 CLI Interface: Full control via command-line arguments
- 📈 Built-in Metrics: MSE for regression, accuracy for classification
🛠️ Technical Details
Data Generation
The synthetic time series follows an autoregressive process with complexity:
x(t) = 0.8 * x(t-1) + drift(regime) + N(0, 0.3) + impulse(t)
where:
- regime ∈ {0, 1} switches with P=0.02
- drift = 0.02 if regime=0, else -0.015
- impulse = +0.9 every 37 timesteps
Neural Network Architecture
- Input Layer: 32 temporal features + 2 engineered features
- Hidden Layer: 64 neurons with ReLU activation
- Output Layer: 1 neuron (regression) or 3 neurons (classification)
- Training: Simplified SGD with numerical gradients
- Initialization: Xavier/He weight initialization
Performance Characteristics
| Backend | MSE (Test) | Accuracy | Speed (samples/sec) |
|---|---|---|---|
| Baseline | 0.112 | 64.7% | ~1,000,000 |
| MLP | 0.128 | 37.0% | ~50,000 |
| RuvFANN | TBD | TBD | TBD |
💡 Use Cases
- Algorithm Research: Test new temporal prediction methods
- Benchmark Suite: Compare performance across different approaches
- Educational Tool: Learn about time series prediction
- Integration Testing: Validate external ML libraries (ruv-fann)
- Hyperparameter Tuning: Find optimal settings for your domain
- Production Prototyping: Quick proof-of-concept for temporal models
📦 Installation
# Clone the repository
# Build with standard features
# Or with ruv-fann integration
🚀 Usage
Basic Regression
# Baseline predictor
# MLP with custom settings
# With ruv-fann backend (requires feature flag)
Classification Task
# 3-class trend prediction (down/neutral/up)
# Compare against baseline
Advanced Options
# Custom window size and seed
# Full parameter control
Benchmarking Script
# Run complete comparison
for; do
done
📊 Benchmark Results
Regression Performance (MSE)
Dataset Size: 5000 samples
Window Size: 32 timesteps
Baseline: 0.112 ± 0.015
MLP (10ep): 0.143 ± 0.021
MLP (20ep): 0.128 ± 0.018
MLP (50ep): 0.125 ± 0.017
Classification Accuracy
3-Class Prediction (↓/→/↑)
Baseline: 64.7% ± 3.2%
MLP (10ep): 37.0% ± 4.1%
MLP (20ep): 42.3% ± 3.8%
Note: MLP performance limited by simplified training algorithm
🔬 Research Directions
- Implement Full Backpropagation: Replace numerical gradients
- Add LSTM/GRU: Temporal-specific architectures
- Attention Mechanisms: Transformer-based predictions
- Online Learning: Adaptive weight updates
- Ensemble Methods: Combine multiple predictors
- Feature Engineering: Fourier transforms, wavelets
- Uncertainty Quantification: Prediction intervals
🤝 Contributing
Contributions welcome! Areas of interest:
- Full backpropagation implementation
- Additional backend integrations
- More sophisticated data generators
- Visualization tools
- Performance optimizations
- Documentation improvements
📚 References
- Time-R1 Architecture - Temporal reasoning systems
- ruv-fann - Rust FANN neural network library
- ndarray - N-dimensional arrays for Rust
👏 Credits
Primary Developer
@ruvnet - Architecture, implementation, and optimization Pioneering work in temporal consciousness mathematics and sublinear algorithms
Acknowledgments
- OpenAI - Inspiration from Time-R1 temporal architectures
- Rust Community - Outstanding ecosystem and tools
- ndarray Contributors - Efficient numerical computing
- Claude/Anthropic - AI-assisted development and testing
Special Thanks
- The Sublinear Solver Project team for theoretical foundations
- Strange Loops framework for consciousness emergence insights
- Temporal Attractor Studio for visualization concepts
📄 License
MIT License - See LICENSE file for details
🔗 Links
- Repository: github.com/ruvnet/sublinear-time-solver
- Issues: GitHub Issues
- Documentation: docs.rs/temporal-compare
- Crates.io: crates.io/crates/temporal-compare