Neural network module.
Modules§
- attention - Attention module
- cache - Cache module
- conv - Convolution module
- gru - Gated Recurrent Unit module
- interpolate - Interpolate module
- loss - Loss module
- lstm - Long Short-Term Memory module
- pool - Pooling module
- transformer - Transformer module
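The submodules above group the larger building blocks. A minimal import sketch, assuming this index documents the `burn` crate's `burn::nn` module (paths and item names are assumptions and vary by version and enabled features):

```rust
// Hypothetical import sketch; not authoritative for any particular version.
#![allow(unused_imports)]
use burn::nn::{Linear, LinearConfig};    // top-level structs listed below
use burn::nn::conv::Conv2dConfig;        // from the `conv` submodule
use burn::nn::loss::MseLoss;             // from the `loss` submodule
use burn::nn::pool::MaxPool2dConfig;     // from the `pool` submodule

fn main() {}
```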
Structs§
- BatchNorm - Applies Batch Normalization over a tensor as described in the paper Batch Normalization.
- BatchNormConfig - Configuration to create a BatchNorm layer using the init function.
- BatchNormRecord - The record type for the module.
- BatchNormRecordItem - The record item type for the module.
- BiLstm - The BiLstm module. This implementation is for a bidirectional LSTM.
- BiLstmConfig - Configuration to create a BiLstm module using the init function.
- BiLstmRecord - The record type for the module.
- BiLstmRecordItem - The record item type for the module.
- Dropout - Randomly sets some elements of the input tensor to zero during training.
- DropoutConfig - Configuration to create a Dropout layer using the init function.
- Embedding - Lookup table that stores a fixed number of vectors.
- EmbeddingConfig - Configuration to create an Embedding layer using the init function.
- EmbeddingRecord - The record type for the module.
- EmbeddingRecordItem - The record item type for the module.
- GateController - A GateController represents a gate in an LSTM cell. An LSTM cell generally contains three gates: an input gate, a forget gate, and an output gate. Additionally, the cell gate is used to compute the cell state.
- GateControllerRecord - The record type for the module.
- GateControllerRecordItem - The record item type for the module.
- Gelu - Applies the Gaussian Error Linear Units function element-wise. See also gelu.
- GroupNorm - Applies Group Normalization over a mini-batch of inputs as described in the paper Group Normalization.
- GroupNormConfig - Configuration to create a GroupNorm layer using the init function.
- GroupNormRecord - The record type for the module.
- GroupNormRecordItem - The record item type for the module.
- HardSigmoid - Hard Sigmoid layer.
- HardSigmoidConfig - Configuration to create a HardSigmoid layer using the init function.
- InstanceNorm - Applies Instance Normalization over a tensor as described in the paper Instance Normalization.
- InstanceNormConfig - Configuration to create an InstanceNorm layer using the init function.
- InstanceNormRecord - The record type for the module.
- InstanceNormRecordItem - The record item type for the module.
- LayerNorm - Applies Layer Normalization over an input tensor as described in the paper Layer Normalization.
- LayerNormConfig - Configuration to create a LayerNorm layer using the init function.
- LayerNormRecord - The record type for the module.
- LayerNormRecordItem - The record item type for the module.
- LeakyRelu - Leaky ReLU layer.
- LeakyReluConfig - Configuration to create a LeakyRelu layer using the init function.
- Linear - Applies a linear transformation to the input tensor (see the usage sketch after this list).
- LinearConfig - Configuration to create a Linear layer using the init function.
- LinearRecord - The record type for the module.
- LinearRecordItem - The record item type for the module.
- Lstm - The Lstm module. This implementation is for a unidirectional, stateless LSTM.
- LstmConfig - Configuration to create an Lstm module using the init function.
- LstmRecord - The record type for the module.
- LstmRecordItem - The record item type for the module.
- LstmState - An LstmState is used to store the cell state and hidden state in an LSTM.
- PRelu - Parametric ReLU layer.
- PReluConfig - Configuration to create a PRelu layer using the init function.
- PReluRecord - The record type for the module.
- PReluRecordItem - The record item type for the module.
- PositionalEncoding - Positional encoding layer for transformer models.
- PositionalEncodingConfig - Configuration to create a PositionalEncoding layer using the init function.
- PositionalEncodingRecord - The record type for the module.
- PositionalEncodingRecordItem - The record item type for the module.
- Relu - Applies the rectified linear unit function element-wise. See also relu.
- RmsNorm - Applies RMS Normalization over an input tensor along the last dimension.
- RmsNormConfig - Configuration to create an RmsNorm layer using the init function.
- RmsNormRecord - The record type for the module.
- RmsNormRecordItem - The record item type for the module.
- RotaryEncoding - A module that applies rotary positional encoding to a tensor. Rotary Position Embedding (RoPE) is a type of position embedding that encodes absolute positional information with a rotation matrix and naturally incorporates explicit relative position dependency in the self-attention formulation.
- RotaryEncodingConfig - Configuration to create a RotaryEncoding layer using the init function.
- RotaryEncodingRecord - The record type for the module.
- RotaryEncodingRecordItem - The record item type for the module.
- Sigmoid - Applies the sigmoid function element-wise. See also sigmoid.
- SwiGlu - Applies the SwiGLU, or Swish Gated Linear Unit, to the input tensor. The SwiGLU activation function is defined as: SwiGLU(x) = Swish(W_inner * x + b_inner) * (W_outer * x + b_outer)
- SwiGluConfig - Configuration to create a SwiGlu activation layer using the init function.
- SwiGluRecord - The record type for the module.
- SwiGluRecordItem - The record item type for the module.
- Tanh - Applies the tanh activation function element-wise. See also tanh.
- Unfold4d - Four-dimensional unfolding.
- Unfold4dConfig - Configuration to create an Unfold4d layer using the init function.
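Every layer in this list follows the same pattern: a `Foo` struct is built from a `FooConfig` via its init function, and the generated `FooRecord`/`FooRecordItem` types carry its parameters during saving and loading. A minimal usage sketch, assuming the `burn` crate with its `NdArray` backend and the `Config::new(...).init(&device)` convention implied above (exact signatures vary by version):

```rust
use burn::backend::NdArray;
use burn::nn::{Dropout, DropoutConfig, Linear, LinearConfig};
use burn::tensor::Tensor;

type B = NdArray;

fn main() {
    let device = Default::default();

    // Build layers from their configs via `init`. Dropout holds no
    // parameters, so (in this assumed API) its `init` takes no device.
    let linear: Linear<B> = LinearConfig::new(16, 4).init(&device);
    let dropout: Dropout = DropoutConfig::new(0.3).init();

    // Apply the layers to a [batch, features] tensor.
    let x: Tensor<B, 2> = Tensor::ones([8, 16], &device);
    let y = dropout.forward(linear.forward(x));
    println!("{:?}", y.dims()); // [8, 4]
}
```

The `Record`/`RecordItem` pairs are not constructed by hand; they are the serialization types produced when a module's state is saved or restored.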
Enums§
- Initializer - Enum specifying the values with which a tensor should be initialized (see the sketch after this list).
- PaddingConfig1d - Padding configuration for 1D operators.
- PaddingConfig2d - Padding configuration for 2D operators.
- PaddingConfig3d - Padding configuration for 3D operators.
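Both enums are typically consumed through layer configs rather than used directly. A hedged sketch, assuming `burn`-style variant names such as `Initializer::Normal` and `PaddingConfig2d::Same` and the `with_*` builder methods generated for configs (all of which should be checked against your version's docs):

```rust
use burn::backend::NdArray;
use burn::nn::conv::{Conv2d, Conv2dConfig};
use burn::nn::{Initializer, PaddingConfig2d};

type B = NdArray;

fn main() {
    let device = Default::default();

    // Pick the weight initialization and padding strategy via the enums.
    let conv: Conv2d<B> = Conv2dConfig::new([3, 16], [3, 3]) // [in, out] channels, kernel size
        .with_padding(PaddingConfig2d::Same)
        .with_initializer(Initializer::Normal { mean: 0.0, std: 0.02 })
        .init(&device);
    let _ = conv;
}
```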
Functions§
- generate_sinusoids - Returns sinusoids for the positional embedding introduced in Attention Is All You Need (see the sketch below).
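For reference, the sinusoids from Attention Is All You Need interleave sines and cosines of geometrically spaced frequencies. The following is a from-scratch illustration of that formula only; the hypothetical `sinusoids` helper is not the crate's `generate_sinusoids` and does not mirror its signature:

```rust
/// PE(pos, 2k)   = sin(pos / 10000^(2k / d_model))
/// PE(pos, 2k+1) = cos(pos / 10000^(2k / d_model))
fn sinusoids(length: usize, d_model: usize) -> Vec<Vec<f32>> {
    (0..length)
        .map(|pos| {
            (0..d_model)
                .map(|i| {
                    let k = (i / 2) as f32; // frequency index shared by each sin/cos pair
                    let angle = pos as f32 / 10000f32.powf(2.0 * k / d_model as f32);
                    if i % 2 == 0 { angle.sin() } else { angle.cos() }
                })
                .collect()
        })
        .collect()
}

fn main() {
    let pe = sinusoids(4, 8);
    println!("{:?}", pe[1]); // the 8-dimensional embedding for position 1
}
```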