Neural network building blocks for AxonML:

- `Module` trait (`forward`, `parameters`, train/eval, `zero_grad`) and `Parameter` (named, gradient-tracked weight)
- `Sequential` container and 40+ layer types in `layers` (Linear, Conv1d/2d, ConvTranspose2d, MaxPool/AvgPool/AdaptiveAvgPool, BatchNorm1d/2d, LayerNorm, GroupNorm, InstanceNorm2d, RMSNorm, Dropout/Dropout2d, RNN/LSTM/GRU plus cell variants, MultiHead/Cross/DifferentialAttention, Embedding, TernaryLinear, Transformer encoder/decoder, Seq2SeqTransformer, ResidualBlock, MoE, GCN/GAT, FFT/STFT)
- activations (ReLU, Sigmoid, Tanh, GELU, SiLU, ELU, LeakyReLU, Mish, Softmax, LogSoftmax)
- losses (MSE, CrossEntropy, BCE, BCEWithLogits, L1, SmoothL1, NLL, CTC, Focal, Triplet, ArcFace)
- initialization (Xavier/Glorot, Kaiming/He, orthogonal, sparse)
- differentiable structured sparsity (SparseLinear, GroupSparsity, LotteryTicket)
- functional helpers
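To make the `Module` contract concrete, here is a minimal, self-contained sketch of the pattern described above (forward pass, parameter access, train/eval switching, gradient reset). The trait and `Linear` shown here are illustrative stand-ins using plain `Vec<f32>` tensors; the real `axonml-nn` signatures differ.

```rust
// Hypothetical sketch of the Module pattern; NOT the crate's actual API.
trait Module {
    /// Run the layer's forward pass on a flat input vector.
    fn forward(&self, input: &[f32]) -> Vec<f32>;
    /// Expose learnable weights so an optimizer can update them.
    fn parameters(&mut self) -> Vec<&mut Vec<f32>>;
    /// Mode switches and gradient reset are no-ops by default.
    fn train(&mut self) {}
    fn eval(&mut self) {}
    fn zero_grad(&mut self) {}
}

struct Linear {
    weight: Vec<f32>, // flattened (out_features x in_features), row-major
    bias: Vec<f32>,
    in_features: usize,
    out_features: usize,
}

impl Module for Linear {
    fn forward(&self, input: &[f32]) -> Vec<f32> {
        (0..self.out_features)
            .map(|o| {
                let row = &self.weight[o * self.in_features..(o + 1) * self.in_features];
                row.iter().zip(input).map(|(w, x)| w * x).sum::<f32>() + self.bias[o]
            })
            .collect()
    }
    fn parameters(&mut self) -> Vec<&mut Vec<f32>> {
        vec![&mut self.weight, &mut self.bias]
    }
}
```

The design mirrors the usual layer abstraction: every layer is a `Module`, so containers and optimizers only need the trait, not the concrete type.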
§File
crates/axonml-nn/src/lib.rs
§Author
Andrew Jewell Sr. — AutomataNexus LLC (ORCID: 0009-0005-2158-7060)
§Updated
April 14, 2026 11:15 PM EST
§Disclaimer
Use at your own risk. This software is provided “as is”, without warranty of any kind, express or implied. The author and AutomataNexus shall not be held liable for any damages arising from the use of this software.
Re-exports§
pub use module::Module;
pub use module::ModuleList;
pub use parameter::Parameter;
pub use sequential::Sequential;
pub use layers::AdaptiveAvgPool2d;
pub use layers::AvgPool1d;
pub use layers::AvgPool2d;
pub use layers::BatchNorm1d;
pub use layers::BatchNorm2d;
pub use layers::Conv1d;
pub use layers::Conv2d;
pub use layers::ConvTranspose2d;
pub use layers::CrossAttention;
pub use layers::DifferentialAttention;
pub use layers::Dropout;
pub use layers::Embedding;
pub use layers::Expert;
pub use layers::FFT1d;
pub use layers::GATConv;
pub use layers::GCNConv;
pub use layers::GRU;
pub use layers::GRUCell;
pub use layers::GroupNorm;
pub use layers::GroupSparsity;
pub use layers::InstanceNorm2d;
pub use layers::LSTM;
pub use layers::LSTMCell;
pub use layers::LayerNorm;
pub use layers::Linear;
pub use layers::LotteryTicket;
pub use layers::MaxPool1d;
pub use layers::MaxPool2d;
pub use layers::MoELayer;
pub use layers::MoERouter;
pub use layers::MultiHeadAttention;
pub use layers::PackedTernaryWeights;
pub use layers::RNN;
pub use layers::RNNCell;
pub use layers::ResidualBlock;
pub use layers::STFT;
pub use layers::Seq2SeqTransformer;
pub use layers::SparseLinear;
pub use layers::TernaryLinear;
pub use layers::TransformerDecoder;
pub use layers::TransformerDecoderLayer;
pub use layers::TransformerEncoder;
pub use layers::TransformerEncoderLayer;
pub use activation::ELU;
pub use activation::Flatten;
pub use activation::GELU;
pub use activation::Identity;
pub use activation::LeakyReLU;
pub use activation::LogSoftmax;
pub use activation::ReLU;
pub use activation::SiLU;
pub use activation::Sigmoid;
pub use activation::Softmax;
pub use activation::Tanh;
pub use loss::BCELoss;
pub use loss::BCEWithLogitsLoss;
pub use loss::CrossEntropyLoss;
pub use loss::L1Loss;
pub use loss::MSELoss;
pub use loss::NLLLoss;
pub use loss::Reduction;
pub use loss::SmoothL1Loss;
pub use init::InitMode;
pub use init::constant;
pub use init::diag;
pub use init::eye;
pub use init::glorot_normal;
pub use init::glorot_uniform;
pub use init::he_normal;
pub use init::he_uniform;
pub use init::kaiming_normal;
pub use init::kaiming_uniform;
pub use init::normal;
pub use init::ones;
pub use init::orthogonal;
pub use init::randn;
pub use init::sparse;
pub use init::uniform;
pub use init::uniform_range;
pub use init::xavier_normal;
pub use init::xavier_uniform;
pub use init::zeros;
Modules§
- activation — Activation function modules implementing the `Module` trait.
- functional — Functional API: stateless free functions for common nn operations.
- init — Parameter initialization strategies.
- layers — Neural network layer modules (15 submodules re-exported here).
- loss — Loss functions for training neural networks.
- module — `Module` trait, the core interface for all neural network layers.
- parameter — `Parameter`, a named, gradient-tracked learnable weight.
- prelude — Common imports for neural network development.
- sequential — `Sequential`, an ordered container that chains `Module` forward passes.
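The `sequential` module's chaining behavior can be sketched with a toy container that folds an input through an ordered list of layer functions. The struct and methods below are illustrative assumptions, not the crate's real `Sequential` API.

```rust
// Hypothetical sketch of an ordered forward-pass chain; NOT the crate's API.
// Each "layer" is modeled as a boxed closure from Vec<f32> to Vec<f32>.
struct Sequential {
    layers: Vec<Box<dyn Fn(Vec<f32>) -> Vec<f32>>>,
}

impl Sequential {
    fn new() -> Self {
        Self { layers: Vec::new() }
    }
    /// Append a layer; builder style so chains read top-to-bottom.
    fn add(mut self, f: impl Fn(Vec<f32>) -> Vec<f32> + 'static) -> Self {
        self.layers.push(Box::new(f));
        self
    }
    /// Thread the input through every layer in insertion order.
    fn forward(&self, input: Vec<f32>) -> Vec<f32> {
        self.layers.iter().fold(input, |x, layer| layer(x))
    }
}
```

Usage would look like `Sequential::new().add(scale).add(relu)`, where `forward` applies `scale` first and `relu` second, matching the insertion order.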