GhostFlow Neural Network Layers
High-level building blocks for neural networks.
Re-exports
pub use module::Module;
pub use linear::Linear;
pub use conv::Conv1d;
pub use conv::Conv2d;
pub use norm::BatchNorm1d;
pub use norm::BatchNorm2d;
pub use norm::LayerNorm;
pub use dropout::Dropout;
pub use attention::MultiHeadAttention;
pub use attention::scaled_dot_product_attention;
pub use transformer::TransformerEncoder;
pub use transformer::TransformerEncoderLayer;
pub use transformer::TransformerDecoderLayer;
pub use transformer::FeedForward;
pub use transformer::PositionalEncoding;
pub use transformer::RotaryEmbedding;
pub use embedding::Embedding;
pub use activation::*;
pub use loss::*;
pub use pooling::*;
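The central abstraction among these re-exports is the `Module` trait, which each layer type implements. As a rough illustration of that pattern, here is a minimal plain-Rust sketch of a `Module` trait with a `forward` method and a `Linear` layer computing `y = Wx + b`; the trait shape, field names, and signatures here are stand-ins, not GhostFlow's actual definitions.

```rust
// Illustrative sketch only: a layer trait and a fully connected
// layer, using plain Vec-based tensors instead of the crate's types.

trait Module {
    fn forward(&self, input: &[f32]) -> Vec<f32>;
}

struct Linear {
    weight: Vec<Vec<f32>>, // (out_features, in_features), row-major
    bias: Vec<f32>,        // (out_features)
}

impl Module for Linear {
    fn forward(&self, input: &[f32]) -> Vec<f32> {
        // y[i] = dot(weight[i], input) + bias[i]
        self.weight
            .iter()
            .zip(&self.bias)
            .map(|(row, b)| row.iter().zip(input).map(|(w, x)| w * x).sum::<f32>() + b)
            .collect()
    }
}

fn main() {
    // 2-in, 2-out identity weights with a bias of 1 on the first output.
    let layer = Linear {
        weight: vec![vec![1.0, 0.0], vec![0.0, 1.0]],
        bias: vec![1.0, 0.0],
    };
    let y = layer.forward(&[3.0, 4.0]);
    println!("{:?}", y); // [4.0, 4.0]
}
```

A trait of this shape is what lets higher-level containers (e.g. the transformer stacks re-exported above) compose heterogeneous layers behind one `forward` interface.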
Modules
- activation: Activation function modules
- attention: Attention mechanisms
- conv: Convolutional layers
- dropout: Dropout regularization
- embedding: Embedding layers
- init: Weight initialization strategies
- linear: Linear (fully connected) layer
- loss: Loss functions
- module: Base Module trait for neural network layers
- norm: Normalization layers
- pooling: Pooling layers
- prelude: Prelude for convenient imports
- transformer: Transformer architecture components
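The attention module re-exports a free function `scaled_dot_product_attention`, the core operation behind `MultiHeadAttention`: `softmax(QKᵀ/√d)·V`. As a concrete illustration of that computation (not the crate's actual signature, which likely operates on GhostFlow tensors), here is a self-contained sketch over row-major `Vec<Vec<f32>>` matrices:

```rust
// Sketch of scaled dot-product attention: out = softmax(Q K^T / sqrt(d)) V.
// Purely illustrative shapes and names, not the GhostFlow API.

/// a: (n, d), b: (m, d)  ->  a * b^T: (n, m)
fn matmul_t(a: &[Vec<f32>], b: &[Vec<f32>]) -> Vec<Vec<f32>> {
    a.iter()
        .map(|row| {
            b.iter()
                .map(|brow| row.iter().zip(brow).map(|(x, y)| x * y).sum())
                .collect()
        })
        .collect()
}

/// Numerically stable softmax over one row of scores.
fn softmax(row: &[f32]) -> Vec<f32> {
    let max = row.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = row.iter().map(|x| (x - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|x| x / sum).collect()
}

/// q: (n_q, d), k: (n_k, d), v: (n_k, d_v)  ->  (n_q, d_v)
fn scaled_dot_product_attention(
    q: &[Vec<f32>],
    k: &[Vec<f32>],
    v: &[Vec<f32>],
) -> Vec<Vec<f32>> {
    let d = q[0].len() as f32;
    let scores = matmul_t(q, k); // raw similarities, (n_q, n_k)
    scores
        .iter()
        .map(|row| {
            // Scale by 1/sqrt(d), normalize to attention weights...
            let scaled: Vec<f32> = row.iter().map(|s| s / d.sqrt()).collect();
            let w = softmax(&scaled);
            // ...then take the weighted sum of the value rows.
            (0..v[0].len())
                .map(|j| w.iter().zip(v).map(|(wi, vrow)| wi * vrow[j]).sum())
                .collect()
        })
        .collect()
}

fn main() {
    // One query aligned with the first key: most weight goes to v[0].
    let q = vec![vec![1.0, 0.0]];
    let k = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    let v = vec![vec![10.0, 0.0], vec![0.0, 10.0]];
    let out = scaled_dot_product_attention(&q, &k, &v);
    println!("{:?}", out);
}
```

Because the softmax weights sum to 1, each output row is a convex combination of the value rows; in the example above the output leans toward `v[0]` since the query matches the first key.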