axonml-optim - Optimization Algorithms
§File
crates/axonml-optim/src/lib.rs
§Author
Andrew Jewell Sr - AutomataNexus
§Updated
March 8, 2026
§Disclaimer
Use at your own risk. This software is provided "as is", without warranty of any kind, express or implied. The author and AutomataNexus shall not be held liable for any damages arising from the use of this software.
Re-exports§
pub use adam::Adam;
pub use adam::AdamW;
pub use grad_scaler::GradScaler;
pub use grad_scaler::GradScalerState;
pub use health::AlertKind;
pub use health::AlertSeverity;
pub use health::HealthReport;
pub use health::LossTrend;
pub use health::MonitorConfig;
pub use health::TrainingAlert;
pub use health::TrainingMonitor;
pub use lamb::LAMB;
pub use lr_scheduler::CosineAnnealingLR;
pub use lr_scheduler::ExponentialLR;
pub use lr_scheduler::LRScheduler;
pub use lr_scheduler::MultiStepLR;
pub use lr_scheduler::OneCycleLR;
pub use lr_scheduler::ReduceLROnPlateau;
pub use lr_scheduler::StepLR;
pub use lr_scheduler::WarmupLR;
pub use optimizer::Optimizer;
pub use rmsprop::RMSprop;
pub use sgd::SGD;
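This page does not show the crate's method signatures, so as a crate-agnostic illustration of the algorithm behind `adam::Adam`, here is the standard Adam update rule (Adaptive Moment Estimation) applied to a plain `f32` slice. The names `AdamState` and `adam_step` are hypothetical and are not part of axonml-optim.

```rust
/// Minimal, dependency-free sketch of one Adam update step.
/// `AdamState` and `adam_step` are illustrative names, not axonml-optim APIs.
struct AdamState {
    m: Vec<f32>, // first-moment (mean) estimate per parameter
    v: Vec<f32>, // second-moment (uncentered variance) estimate
    t: u32,      // timestep, used for bias correction
}

fn adam_step(
    params: &mut [f32],
    grads: &[f32],
    state: &mut AdamState,
    lr: f32,
    beta1: f32,
    beta2: f32,
    eps: f32,
) {
    state.t += 1;
    let bc1 = 1.0 - beta1.powi(state.t as i32); // bias correction for m
    let bc2 = 1.0 - beta2.powi(state.t as i32); // bias correction for v
    for i in 0..params.len() {
        state.m[i] = beta1 * state.m[i] + (1.0 - beta1) * grads[i];
        state.v[i] = beta2 * state.v[i] + (1.0 - beta2) * grads[i] * grads[i];
        let m_hat = state.m[i] / bc1; // bias-corrected moments
        let v_hat = state.v[i] / bc2;
        params[i] -= lr * m_hat / (v_hat.sqrt() + eps);
    }
}

fn main() {
    let mut params = vec![1.0_f32];
    let mut state = AdamState { m: vec![0.0], v: vec![0.0], t: 0 };
    // With a constant gradient, the first step moves the parameter by ~lr.
    adam_step(&mut params, &[1.0], &mut state, 0.001, 0.9, 0.999, 1e-8);
    println!("{:.6}", params[0]); // ~0.999000
}
```

AdamW (also re-exported above) differs from this sketch only in applying weight decay directly to `params[i]` rather than folding it into the gradient.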
Modules§
- adam - Adam Optimizer - Adaptive Moment Estimation
- grad_scaler - Gradient Scaler for Mixed Precision Training
- health - Training Health Monitor - Real-time Training Diagnostics
- lamb - LAMB Optimizer - Layer-wise Adaptive Moments
- lr_scheduler - Learning Rate Schedulers
- optimizer - Optimizer Trait - Core Optimizer Interface
- prelude - Common imports for optimization.
- rmsprop - RMSprop Optimizer
- sgd - SGD Optimizer - Stochastic Gradient Descent
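Each scheduler in `lr_scheduler` maps a step count to a learning rate. As a sketch of the schedule `CosineAnnealingLR` is named after (the standard cosine annealing formula; the function below is an illustration, not the crate's API):

```rust
use std::f64::consts::PI;

/// Standard cosine annealing: decay the learning rate from `lr_max`
/// to `lr_min` over `t_max` steps. Illustrative helper, not an
/// axonml-optim API.
fn cosine_annealing_lr(step: usize, t_max: usize, lr_max: f64, lr_min: f64) -> f64 {
    let progress = step.min(t_max) as f64 / t_max as f64;
    lr_min + 0.5 * (lr_max - lr_min) * (1.0 + (PI * progress).cos())
}

fn main() {
    // Starts at lr_max, hits the midpoint halfway through, ends at lr_min.
    println!("{:.4}", cosine_annealing_lr(0, 100, 0.1, 0.0));   // 0.1000
    println!("{:.4}", cosine_annealing_lr(50, 100, 0.1, 0.0));  // 0.0500
    println!("{:.4}", cosine_annealing_lr(100, 100, 0.1, 0.0)); // 0.0000
}
```

The other re-exported schedulers follow the same step-to-rate pattern with different curves (exponential decay, fixed milestones, warmup, plateau detection), typically composed behind the shared `LRScheduler` interface.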