Advanced Training Methods for Time Series
This module has been refactored into focused sub-modules for better maintainability and organization. All original functionality is preserved through re-exports.
§Refactored Module Structure
The advanced training functionality has been organized into focused sub-modules:
- Configuration: Common data structures and configurations
- Meta-Learning: MAML and other meta-learning algorithms
- Neural ODEs: Continuous-time neural networks with ODE solvers
- Variational Methods: VAEs for probabilistic time series modeling
- Transformers: Attention-based sequence modeling
- Hyperparameter Optimization: Automated parameter tuning
- Few-Shot Learning: Prototypical Networks and REPTILE
- Memory-Augmented Networks: External memory architectures
- Meta-Optimization: Learned optimizers for adaptive updates
§Usage
All original APIs are preserved. You can continue to use this module exactly as before:
```rust
use scirs2_series::advanced_training::{MAML, TimeSeriesVAE, TimeSeriesTransformer};

// Create a MAML instance
let mut maml = MAML::<f64>::new(4, 8, 2, 0.01, 0.1, 5);

// Create a VAE for time series
let vae = TimeSeriesVAE::<f64>::new(10, 3, 5, 16, 16);

// Create a transformer for forecasting
let transformer = TimeSeriesTransformer::<f64>::new(12, 6, 64, 8, 4, 256);
```
§Advanced Usage
For more specific functionality, you can also import directly from sub-modules:
```rust
use scirs2_series::advanced_training::few_shot::{PrototypicalNetworks, REPTILE};
use scirs2_series::advanced_training::hyperparameter_optimization::{
    HyperparameterOptimizer, OptimizationMethod,
};
```
Modules§
- config - Configuration types for advanced training methods
- few_shot - Few-Shot Learning Algorithms
- hyperparameter_optimization - Hyperparameter Optimization Framework
- memory_augmented - Memory-Augmented Neural Networks
- meta_learning - Meta-learning algorithms for few-shot time series forecasting
- neural_ode - Neural Ordinary Differential Equations for continuous-time modeling
- optimization - Meta-Optimization Algorithms
- transformers - Transformer-based Time Series Forecasting
- variational - Variational Autoencoder for Time Series
Structs§
- FewShotEpisode - Few-shot learning episode data structure
- HyperparameterOptimizer - Hyperparameter Optimization Framework
- HyperparameterSet - Set of hyperparameters
- MAML - Model-Agnostic Meta-Learning (MAML) for few-shot time series forecasting
- MANN - Memory-Augmented Neural Network (MANN)
- MetaOptimizer - Meta-Optimizer using LSTM to generate parameter updates
- NeuralODE - Neural Ordinary Differential Equation (NODE) implementation
- ODESolverConfig - Configuration for ODE solver
- OptimizationProblem - Optimization problem for meta-optimizer training
- OptimizationResults - Optimization results
- OptimizationStep - Single optimization step
- PrototypicalNetworks - Prototypical Networks for Few-Shot Learning
- REPTILE - REPTILE Algorithm for Meta-Learning
- SearchSpace - Search space for hyperparameters
- TaskData - Task data structure for meta-learning
- TimeSeriesTransformer - Transformer model for time series forecasting with multi-head attention
- TimeSeriesVAE - Variational Autoencoder for Time Series with Uncertainty Quantification
- VAEOutput - VAE output structure
Enums§
- IntegrationMethod - Integration methods for ODE solving
- OptimizationMethod - Hyperparameter optimization methods