Module nn


Neural network module.

Modules§

attention
Attention module.
cache
Cache module.
conv
Convolution module.
gru
Gated Recurrent Unit module.
interpolate
Interpolate module.
loss
Loss module.
lstm
Long Short-Term Memory module.
pool
Pooling module.
transformer
Transformer module.

Structs§

BatchNorm
Applies Batch Normalization over a tensor as described in the paper Batch Normalization.
BatchNormConfig
Configuration to create a BatchNorm layer using the init function.
BatchNormRecord
The record type for the module.
BatchNormRecordItem
The record item type for the module.
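
A minimal sketch of building and applying a BatchNorm layer, assuming the recent Burn API where a config's init(&device) constructs the layer; the const generic (2 here) is the number of spatial dimensions, and all sizes are illustrative:

```rust
use burn::nn::{BatchNorm, BatchNormConfig};
use burn::tensor::{backend::Backend, Tensor};

// Normalize a batch of 64-channel feature maps: [batch, channels, height, width].
fn demo<B: Backend>(device: &B::Device) {
    let norm: BatchNorm<B, 2> = BatchNormConfig::new(64).init(device);
    let input = Tensor::<B, 4>::zeros([8, 64, 32, 32], device);
    let _output = norm.forward(input); // same shape as the input
}
```
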
BiLstm
The BiLstm module. This implementation is for Bidirectional LSTM.
BiLstmConfig
Configuration to create a BiLstm module using the init function.
BiLstmRecord
The record type for the module.
BiLstmRecordItem
The record item type for the module.
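
A hedged sketch of running a BiLstm over a batch of sequences; the forward signature (an optional initial state, an (output, state) return) follows recent Burn releases and may differ in older ones:

```rust
use burn::nn::{BiLstm, BiLstmConfig};
use burn::tensor::{backend::Backend, Tensor};

// Input [batch, seq_len, d_input]; output [batch, seq_len, 2 * d_hidden],
// since the forward and backward directions are concatenated.
fn demo<B: Backend>(device: &B::Device) {
    let bilstm: BiLstm<B> = BiLstmConfig::new(32, 64, true).init(device);
    let input = Tensor::<B, 3>::zeros([8, 20, 32], device);
    let (_output, _state) = bilstm.forward(input, None);
}
```
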
Dropout
Randomly sets some elements of the input tensor to zero during training.
DropoutConfig
Configuration to create a Dropout layer using the init function.
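
A minimal Dropout sketch; since the layer holds no tensors, init() takes no device (an assumption based on recent Burn releases), and the probability is illustrative:

```rust
use burn::nn::{Dropout, DropoutConfig};
use burn::tensor::{backend::Backend, Tensor};

fn demo<B: Backend>(device: &B::Device) {
    let dropout: Dropout = DropoutConfig::new(0.5).init();
    let input = Tensor::<B, 2>::ones([4, 16], device);
    // Elements are zeroed (and the rest rescaled) only when the backend
    // tracks gradients, i.e. during training; inference is a no-op.
    let _output = dropout.forward(input);
}
```
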
Embedding
Lookup table that stores a fixed number of vectors.
EmbeddingConfig
Configuration to create an Embedding layer using the init function.
EmbeddingRecord
The record type for the module.
EmbeddingRecordItem
The record item type for the module.
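
A minimal Embedding sketch, assuming init(&device) and integer token indices; vocabulary and vector sizes are illustrative:

```rust
use burn::nn::{Embedding, EmbeddingConfig};
use burn::tensor::{backend::Backend, Int, Tensor};

// 1000-entry vocabulary, 64-dimensional vectors.
fn demo<B: Backend>(device: &B::Device) {
    let embedding: Embedding<B> = EmbeddingConfig::new(1000, 64).init(device);
    let tokens = Tensor::<B, 2, Int>::zeros([8, 20], device); // [batch, seq_len]
    let _vectors = embedding.forward(tokens);                 // [8, 20, 64]
}
```
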
GateController
A GateController represents a gate in an LSTM cell. An LSTM cell generally contains three gates: an input gate, a forget gate, and an output gate. Additionally, a cell gate is used to compute the cell state.
GateControllerRecord
The record type for the module.
GateControllerRecordItem
The record item type for the module.
Gelu
Applies the Gaussian Error Linear Units function element-wise. See also gelu.
GroupNorm
Applies Group Normalization over a mini-batch of inputs as described in the paper Group Normalization.
GroupNormConfig
Configuration to create a GroupNorm layer using the init function.
GroupNormRecord
The record type for the module.
GroupNormRecordItem
The record item type for the module.
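
A minimal GroupNorm sketch, assuming the constructor takes (num_groups, num_channels); sizes are illustrative:

```rust
use burn::nn::{GroupNorm, GroupNormConfig};
use burn::tensor::{backend::Backend, Tensor};

// 8 groups over 64 channels; the input layout is [batch, channels, ...].
fn demo<B: Backend>(device: &B::Device) {
    let norm: GroupNorm<B> = GroupNormConfig::new(8, 64).init(device);
    let input = Tensor::<B, 4>::zeros([4, 64, 16, 16], device);
    let _output = norm.forward(input);
}
```
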
HardSigmoid
Hard Sigmoid layer.
HardSigmoidConfig
Configuration to create a Hard Sigmoid layer using the init function.
InstanceNorm
Applies Instance Normalization over a tensor as described in the paper Instance Normalization.
InstanceNormConfig
Configuration to create an InstanceNorm layer using the init function.
InstanceNormRecord
The record type for the module.
InstanceNormRecordItem
The record item type for the module.
LayerNorm
Applies Layer Normalization over an input tensor as described in the paper Layer Normalization.
LayerNormConfig
Configuration to create a LayerNorm layer using the init function.
LayerNormRecord
The record type for the module.
LayerNormRecordItem
The record item type for the module.
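
A minimal LayerNorm sketch; the layer normalizes over the last dimension (d_model, 64 here for illustration):

```rust
use burn::nn::{LayerNorm, LayerNormConfig};
use burn::tensor::{backend::Backend, Tensor};

fn demo<B: Backend>(device: &B::Device) {
    let norm: LayerNorm<B> = LayerNormConfig::new(64).init(device);
    let input = Tensor::<B, 3>::zeros([8, 20, 64], device); // [batch, seq, d_model]
    let _output = norm.forward(input);
}
```
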
LeakyRelu
Leaky ReLU layer.
LeakyReluConfig
Configuration to create a Leaky ReLU layer using the init function.
Linear
Applies a linear transformation to the input tensor: O = IW + b.
LinearConfig
Configuration to create a Linear layer using the init function.
LinearRecord
The record type for the module.
LinearRecordItem
The record item type for the module.
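
A minimal Linear sketch, assuming the recent Burn API where init(&device) builds the layer; the 16 -> 32 sizes are illustrative:

```rust
use burn::nn::{Linear, LinearConfig};
use burn::tensor::{backend::Backend, Tensor};

// Projects the last dimension of the input from 16 to 32.
fn demo<B: Backend>(device: &B::Device) {
    let linear: Linear<B> = LinearConfig::new(16, 32).init(device);
    let input = Tensor::<B, 2>::zeros([4, 16], device); // [batch, d_input]
    let _output = linear.forward(input);                // [4, 32]
}
```
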
Lstm
The Lstm module. This implementation is a unidirectional, stateless LSTM.
LstmConfig
Configuration to create an Lstm module using the init function.
LstmRecord
The record type for the module.
LstmRecordItem
The record item type for the module.
LstmState
An LstmState stores the cell state and hidden state of an LSTM.
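
A hedged Lstm sketch; the (output, LstmState) return and the optional initial state follow recent Burn releases:

```rust
use burn::nn::{Lstm, LstmConfig};
use burn::tensor::{backend::Backend, Tensor};

// Input [batch, seq_len, d_input]; pass None to start from a zeroed state,
// or thread the returned LstmState into the next call.
fn demo<B: Backend>(device: &B::Device) {
    let lstm: Lstm<B> = LstmConfig::new(32, 64, true).init(device);
    let input = Tensor::<B, 3>::zeros([8, 20, 32], device);
    let (_output, _state) = lstm.forward(input, None);
}
```
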
PRelu
Parametric ReLU layer.
PReluConfig
Configuration to create a Parametric ReLU layer using the init function.
PReluRecord
The record type for the module.
PReluRecordItem
The record item type for the module.
PositionalEncoding
Positional encoding layer for transformer models.
PositionalEncodingConfig
Configuration to create a PositionalEncoding layer using the init function.
PositionalEncodingRecord
The record type for the module.
PositionalEncodingRecordItem
The record item type for the module.
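
A hedged sketch, assuming PositionalEncodingConfig::new(d_model) and a forward that adds the sinusoidal table to a [batch, seq_len, d_model] input:

```rust
use burn::nn::{PositionalEncoding, PositionalEncodingConfig};
use burn::tensor::{backend::Backend, Tensor};

fn demo<B: Backend>(device: &B::Device) {
    let pe: PositionalEncoding<B> = PositionalEncodingConfig::new(64).init(device);
    let input = Tensor::<B, 3>::zeros([8, 20, 64], device);
    let _output = pe.forward(input); // input plus the sinusoidal encoding
}
```
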
Relu
Applies the rectified linear unit function element-wise. See also relu.
RmsNorm
Applies RMS Normalization over an input tensor along the last dimension.
RmsNormConfig
Configuration to create an RMS Norm layer using the init function.
RmsNormRecord
The record type for the module.
RmsNormRecordItem
The record item type for the module.
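
A minimal RmsNorm sketch; like LayerNorm it acts on the last dimension, but without mean subtraction:

```rust
use burn::nn::{RmsNorm, RmsNormConfig};
use burn::tensor::{backend::Backend, Tensor};

fn demo<B: Backend>(device: &B::Device) {
    let norm: RmsNorm<B> = RmsNormConfig::new(64).init(device);
    let input = Tensor::<B, 3>::zeros([8, 20, 64], device);
    let _output = norm.forward(input);
}
```
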
RotaryEncoding
A module that applies rotary positional encoding to a tensor. Rotary Position Embedding (RoPE) is a type of position embedding that encodes absolute positional information with a rotation matrix and naturally incorporates explicit relative position dependency in the self-attention formulation.
RotaryEncodingConfig
Configuration to create a RotaryEncoding layer using the init function.
RotaryEncodingRecord
The record type for the module.
RotaryEncodingRecordItem
The record item type for the module.
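
A hedged RotaryEncoding sketch; the constructor argument order (max_sequence_length, d_head) and the expected input layout are assumptions that may differ between Burn releases:

```rust
use burn::nn::{RotaryEncoding, RotaryEncodingConfig};
use burn::tensor::{backend::Backend, Tensor};

// Assumed layout: [batch, heads, seq_len, d_head].
fn demo<B: Backend>(device: &B::Device) {
    let rope: RotaryEncoding<B> = RotaryEncodingConfig::new(512, 64).init(device);
    let queries = Tensor::<B, 4>::zeros([8, 4, 20, 64], device);
    let _rotated = rope.forward(queries);
}
```
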
Sigmoid
Applies the sigmoid function element-wise. See also sigmoid.
SwiGlu
Applies the SwiGLU or Swish Gated Linear Unit to the input tensor. The SwiGLU activation function is defined as: SwiGLU(x) = Swish(W_inner * x + b_inner) * (W_outer * x + b_outer)
SwiGluConfig
Configuration to create a SwiGlu activation layer using the init function.
SwiGluRecord
The record type for the module.
SwiGluRecordItem
The record item type for the module.
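
A minimal SwiGlu sketch, assuming the constructor takes (d_input, d_output); the layer combines two parallel projections as Swish(inner) * outer:

```rust
use burn::nn::{SwiGlu, SwiGluConfig};
use burn::tensor::{backend::Backend, Tensor};

fn demo<B: Backend>(device: &B::Device) {
    let swiglu: SwiGlu<B> = SwiGluConfig::new(64, 128).init(device);
    let input = Tensor::<B, 3>::zeros([8, 20, 64], device);
    let _output = swiglu.forward(input); // last dimension becomes 128
}
```
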
Tanh
Applies the tanh activation function element-wise. See also tanh.
Unfold4d
Four-dimensional unfolding.
Unfold4dConfig
Configuration to create an unfold 4d layer using the init function.

Enums§

Initializer
Enum specifying the values with which a tensor should be initialized.
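
A minimal sketch of overriding a layer's default initialization; the with_initializer builder and the Normal variant's field names are assumptions based on recent Burn releases:

```rust
use burn::nn::{Initializer, LinearConfig};

// Draw the Linear weights from N(0, 0.02) instead of the default.
fn config() -> LinearConfig {
    LinearConfig::new(16, 32)
        .with_initializer(Initializer::Normal { mean: 0.0, std: 0.02 })
}
```
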
PaddingConfig1d
Padding configuration for 1D operators.
PaddingConfig2d
Padding configuration for 2D operators.
PaddingConfig3d
Padding configuration for 3D operators.
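
A minimal sketch of the 2D variant applied to a convolution config; Same keeps the spatial size, while Explicit pads by fixed per-dimension amounts:

```rust
use burn::nn::conv::Conv2dConfig;
use burn::nn::PaddingConfig2d;

// 3 input channels, 16 output channels, 3x3 kernel, "same" padding.
fn config() -> Conv2dConfig {
    Conv2dConfig::new([3, 16], [3, 3]).with_padding(PaddingConfig2d::Same)
}
```
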

Functions§

generate_sinusoids
Returns sinusoids for positional embedding introduced in Attention is all you need.
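
A hedged sketch; the exact signature (argument order and whether a device parameter is required) varies across Burn releases, and all sizes are illustrative:

```rust
use burn::nn::generate_sinusoids;
use burn::tensor::{backend::Backend, Tensor};

// Assumed arguments: (length, d_model, max_timescale, device).
fn demo<B: Backend>(device: &B::Device) {
    let _table: Tensor<B, 2> = generate_sinusoids::<B>(512, 64, 10_000, device);
}
```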