Neural network module.
Modules
- Attention module
- Cache module
- Convolution module
- Gated Recurrent Unit module.
- Loss module (see the usage sketch after this list)
- Long Short-Term Memory module.
- Pooling module
- Transformer module
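As a usage sketch for these submodules, here is how a loss from the loss module might be applied. This assumes the NdArray backend and the MseLoss/Reduction API of recent burn releases, so treat the exact paths and signatures as assumptions:

```rust
use burn::backend::NdArray;
use burn::nn::loss::{MseLoss, Reduction};
use burn::tensor::Tensor;

type B = NdArray;

fn main() {
    let device = Default::default();
    // Dummy predictions and targets with matching shapes.
    let predictions = Tensor::<B, 2>::ones([4, 3], &device);
    let targets = Tensor::<B, 2>::zeros([4, 3], &device);

    // Mean-reduced mean squared error over all elements.
    let loss = MseLoss::new().forward(predictions, targets, Reduction::Mean);
    println!("loss: {}", loss.into_scalar());
}
```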
Structs
- Applies Batch Normalization over a tensor as described in the paper Batch Normalization.
- Configuration to create a BatchNorm layer.
- The record type for the module.
- The record item type for the module.
- Set at random some elements of the input tensor to zero during training.
- Configuration to create a Dropout layer.
- Lookup table to store a fixed number of vectors.
- Configuration to create an Embedding layer.
- The record type for the module.
- The record item type for the module.
- A GateController represents a gate in an LSTM cell. An LSTM cell generally contains three gates: an input gate, a forget gate, and an output gate.
- The record type for the module.
- The record item type for the module.
- Applies the Gaussian Error Linear Units function element-wise.
- Applies Group Normalization over a mini-batch of inputs.
- Configuration to create a GroupNorm layer.
- The record type for the module.
- The record item type for the module.
- Applies Instance Normalization over a tensor as described in the paper Instance Normalization.
- Configuration to create an InstanceNorm layer.
- The record type for the module.
- The record item type for the module.
- Applies Layer Normalization over an input tensor as described in the paper Layer Normalization.
- Configuration to create a LayerNorm layer.
- The record type for the module.
- The record item type for the module.
- Leaky ReLU layer.
- Configuration to create a Leaky ReLU layer.
- The record type for the module.
- The record item type for the module.
- Applies a linear transformation to the input tensor: y = xW + b (see the construction sketch after this list).
- Configuration to create a Linear layer.
- The record type for the module.
- The record item type for the module.
- The Lstm module. This implementation is for a unidirectional, stateless LSTM.
- The configuration for an Lstm module.
- The record type for the module.
- The record item type for the module.
- Parametric ReLU layer.
- Configuration to create a Parametric ReLU layer.
- The record type for the module.
- The record item type for the module.
- Positional encoding layer for transformer models.
- Configuration to create a PositionalEncoding layer.
- The record type for the module.
- The record item type for the module.
- Applies the rectified linear unit function element-wise: y = max(0, x).
- Applies RMS Normalization over an input tensor along the last dimension.
- Configuration to create an RMS Norm layer.
- The record type for the module.
- The record item type for the module.
- A module that applies rotary positional encoding to a tensor. Rotary Position Embedding (RoPE) is a type of position embedding that encodes absolute positional information with a rotation matrix and naturally incorporates explicit relative position dependency in the self-attention formulation.
- Configuration to create a RotaryEncoding layer.
- The record type for the module.
- The record item type for the module.
- Applies the SwiGLU or Swish Gated Linear Unit to the input tensor. The SwiGLU activation function is defined as:
SwiGLU(x) = Swish(W_inner * x + b_inner) * (W_outer * x + b_outer)
- Configuration to create a SwiGlu activation layer.
- The record type for the module.
- The record item type for the module.
- Four-dimensional unfolding.
- Configuration to create an unfold 4D layer.
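Most structs above come in pairs: a layer and its *Config, with the config's init method producing the layer (the record and record item types are generated for serialization). A minimal construction sketch, assuming the NdArray backend; method names follow recent burn releases and may vary between versions:

```rust
use burn::backend::NdArray;
use burn::nn::{DropoutConfig, LinearConfig, Relu};
use burn::tensor::Tensor;

type B = NdArray;

fn main() {
    let device = Default::default();

    // Parameterized layers are built from their config structs.
    let linear = LinearConfig::new(16, 8).init::<B>(&device);
    // Dropout holds no parameters, so its init takes no device.
    let dropout = DropoutConfig::new(0.5).init();
    let relu = Relu::new(); // stateless activation, no config needed

    let x = Tensor::<B, 2>::zeros([4, 16], &device);
    let y = dropout.forward(relu.forward(linear.forward(x)));
    assert_eq!(y.dims(), [4, 8]);
}
```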
Enums
- Enum specifying the values with which a tensor should be initialized (see the sketch after this list).
- Padding configuration for 1D operators.
- Padding configuration for 2D operators.
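These enums are consumed by the layer configs above and by the conv module. A hedged sketch; the setter and variant names (with_initializer, Initializer::XavierUniform, PaddingConfig2d::Same) match recent burn releases and should be treated as assumptions otherwise:

```rust
use burn::nn::conv::Conv2dConfig;
use burn::nn::{Initializer, LinearConfig, PaddingConfig2d};

fn configs() -> (LinearConfig, Conv2dConfig) {
    // Swap the default weight initializer for Xavier/Glorot uniform.
    let linear = LinearConfig::new(16, 8)
        .with_initializer(Initializer::XavierUniform { gain: 1.0 });

    // "Same" padding preserves the spatial size for this 3x3 convolution.
    let conv = Conv2dConfig::new([3, 16], [3, 3])
        .with_padding(PaddingConfig2d::Same);

    (linear, conv)
}
```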
Functions
- Returns sinusoids for positional embedding, as introduced in Attention Is All You Need (a reference implementation follows).
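For reference, the sinusoidal table from the paper can be written in a few lines of plain Rust. This is an illustrative stand-alone version of the formula, not the crate's own function:

```rust
/// PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
/// PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
fn sinusoids(length: usize, d_model: usize) -> Vec<Vec<f32>> {
    (0..length)
        .map(|pos| {
            (0..d_model)
                .map(|i| {
                    let exponent = (2 * (i / 2)) as f32 / d_model as f32;
                    let angle = pos as f32 / 10_000f32.powf(exponent);
                    if i % 2 == 0 { angle.sin() } else { angle.cos() }
                })
                .collect()
        })
        .collect()
}

fn main() {
    let table = sinusoids(4, 8);
    // Position 0 alternates sin(0) = 0 and cos(0) = 1 across its row.
    assert_eq!(table[0][0], 0.0);
    assert_eq!(table[0][1], 1.0);
}
```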