Module neural_forecasting

Neural Forecasting Models for Time Series

This module provides implementations of neural network-based time series forecasting models, including LSTM, GRU, Transformer, N-BEATS, Mamba/State Space Model, Temporal Fusion Transformer, and Mixture of Experts architectures. The implementations focus on the core algorithmic components and are designed to be extended with full neural network frameworks.
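
As a rough illustration of that design, the sketch below wires a trivial baseline behind a small forecasting trait. The trait, its method, and the NaiveLast type are hypothetical stand-ins, not part of this module's API; they only show how an algorithmic core can sit behind a common interface that a framework-backed model could also implement.

/// Illustrative only: a hypothetical trait a forecasting backbone might implement.
pub trait Forecaster {
    /// Produce `horizon` future values given a historical window.
    fn forecast(&self, history: &[f64], horizon: usize) -> Vec<f64>;
}

/// Trivial baseline that repeats the last observed value; a neural model
/// would implement the same trait with a learned forward pass.
pub struct NaiveLast;

impl Forecaster for NaiveLast {
    fn forecast(&self, history: &[f64], horizon: usize) -> Vec<f64> {
        let last = history.last().copied().unwrap_or(0.0);
        vec![last; horizon]
    }
}

fn main() {
    let history = [1.0, 2.0, 3.0, 2.5];
    println!("{:?}", NaiveLast.forecast(&history, 3)); // [2.5, 2.5, 2.5]
}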

Advanced Architectures§

  • LSTM Networks: Long Short-Term Memory networks for sequence modeling (a minimal cell-step sketch follows this list)
  • Transformer Models: Self-attention based architectures
  • N-BEATS: Neural basis expansion analysis for time series forecasting
  • Mamba/State Space Models: Linear complexity for long sequences with selective state spaces
  • Flash Attention: Memory-efficient attention computation for transformers
  • Temporal Fusion Transformers: Specialized architecture for time series forecasting
  • Mixture of Experts: Conditional computation for model scaling
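
To make the first item concrete, here is a minimal sketch of a single LSTM cell step over one-dimensional inputs, using the standard gate equations. The LstmCell1d type and its hand-picked weights are illustrative assumptions, not the types exported by the lstm submodule.

// Sigmoid activation used by the input, forget, and output gates.
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// One LSTM step for a 1-dimensional input and hidden state.
/// `w_*` weigh the input, `u_*` weigh the previous hidden state, `b_*` are biases.
struct LstmCell1d {
    w_i: f64, u_i: f64, b_i: f64, // input gate
    w_f: f64, u_f: f64, b_f: f64, // forget gate
    w_o: f64, u_o: f64, b_o: f64, // output gate
    w_g: f64, u_g: f64, b_g: f64, // candidate cell state
}

impl LstmCell1d {
    /// Returns the new (hidden, cell) state after consuming input `x`.
    fn step(&self, x: f64, h: f64, c: f64) -> (f64, f64) {
        let i = sigmoid(self.w_i * x + self.u_i * h + self.b_i);
        let f = sigmoid(self.w_f * x + self.u_f * h + self.b_f);
        let o = sigmoid(self.w_o * x + self.u_o * h + self.b_o);
        let g = (self.w_g * x + self.u_g * h + self.b_g).tanh();
        let c_new = f * c + i * g;
        let h_new = o * c_new.tanh();
        (h_new, c_new)
    }
}

fn main() {
    let cell = LstmCell1d {
        w_i: 0.5, u_i: 0.1, b_i: 0.0,
        w_f: 0.5, u_f: 0.1, b_f: 1.0, // forget bias set high, a common initialisation
        w_o: 0.5, u_o: 0.1, b_o: 0.0,
        w_g: 0.5, u_g: 0.1, b_g: 0.0,
    };
    let (mut h, mut c) = (0.0, 0.0);
    for &x in &[1.0, 0.5, -0.2] {
        let (nh, nc) = cell.step(x, h, c);
        h = nh;
        c = nc;
    }
    println!("h = {h:.4}, c = {c:.4}");
}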

Re-exports§

pub use self::attention::*;
pub use self::config::*;
pub use self::lstm::*;
pub use self::mamba::*;
pub use self::mixture_of_experts::*;
pub use self::nbeats::*;
pub use self::temporal_fusion::*;
pub use self::transformer::*;

Modules§

attention
Advanced Attention Mechanisms
config
Configuration and Common Types for Neural Forecasting
lstm
LSTM Network Components for Time Series Forecasting
mamba
Mamba/State Space Models for Time Series
mixture_of_experts
Mixture of Experts for Conditional Computation
nbeats
N-BEATS Neural Basis Expansion Analysis for Time Series
temporal_fusion
Temporal Fusion Transformer Components
transformer
Transformer Networks for Time Series Forecasting
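
For a feel of what the mamba submodule covers, the sketch below runs the linear-time recurrence at the heart of selective state space models, with input-dependent write and read gates standing in for learned projections. The selective_scan function is a conceptual illustration under those assumptions, not this crate's API.

// Conceptual sketch of a selective state space recurrence:
// h_t = a * h_{t-1} + b_t * x_t, read out as y_t = c_t * h_t,
// where b_t and c_t depend on the input (the "selective" part).

/// Runs a single-state selective scan over a sequence in O(sequence length).
fn selective_scan(xs: &[f64], a: f64) -> Vec<f64> {
    let mut h = 0.0;
    xs.iter()
        .map(|&x| {
            // Input-dependent gates; real models learn these projections.
            let b = 1.0 / (1.0 + (-x).exp()); // how much of the input to write
            let c = x.tanh();                 // how to read the state out
            h = a * h + b * x;
            c * h
        })
        .collect()
}

fn main() {
    let xs = [0.5, -1.0, 2.0, 0.0];
    println!("{:?}", selective_scan(&xs, 0.9));
}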