
Module nn

Neural network module.

Modules

activation
Activation layers.
loss
Loss module.
modules
Neural network module implementations.

Structs

BatchNorm
Applies Batch Normalization over a tensor.
BatchNormConfig
BatchNorm Configuration.
BatchNormRecord
The record type for the module.
BatchNormRecordItem
The record item type for the module.
BiGru
The BiGru module. This implementation is for a bidirectional GRU.
BiGruConfig
Configuration to create a BiGru module using the init function.
BiGruRecord
The record type for the module.
BiGruRecordItem
The record item type for the module.
BiLstm
The BiLstm module. This implementation is for a bidirectional LSTM.
BiLstmConfig
Configuration to create a BiLstm module using the init function.
BiLstmRecord
The record type for the module.
BiLstmRecordItem
The record item type for the module.
BiRnn
The BiRnn module. This implementation is for a bidirectional RNN. Should be created with BiRnnConfig.
BiRnnConfig
Configuration to create a BiRnn module using the init function.
BiRnnRecord
The record type for the module.
BiRnnRecordItem
The record item type for the module.
Celu
CELU (Continuously Differentiable Exponential Linear Unit) layer.
CeluConfig
Configuration to create a Celu layer using the init function.
Dropout
Randomly sets some elements of the input tensor to zero during training.
DropoutConfig
Configuration to create a Dropout layer using the init function.
Elu
ELU (Exponential Linear Unit) layer.
EluConfig
Configuration to create an Elu layer using the init function.
Embedding
Lookup table to store a fixed number of vectors.
EmbeddingConfig
Configuration to create an Embedding layer using the init function.
EmbeddingRecord
The record type for the module.
EmbeddingRecordItem
The record item type for the module.
GLU
Applies the gated linear unit function.
GateController
A GateController represents a gate in an LSTM cell. An LSTM cell generally contains three gates: an input gate, a forget gate, and an output gate. Additionally, the cell gate is used to compute the cell state.
GateControllerRecord
The record type for the module.
GateControllerRecordItem
The record item type for the module.
GaussianNoise
Adds pseudorandom Gaussian noise to an arbitrarily shaped tensor.
GaussianNoiseConfig
Configuration to create a GaussianNoise layer using the init function.
Gelu
Applies the Gaussian Error Linear Units function element-wise.
GroupNorm
Applies Group Normalization over a mini-batch of inputs as described in the paper Group Normalization.
GroupNormConfig
Configuration to create a GroupNorm layer using the init function.
GroupNormRecord
The record type for the module.
GroupNormRecordItem
The record item type for the module.
Gru
The Gru (Gated Recurrent Unit) module. This implementation is for a unidirectional, stateless GRU.
GruConfig
Configuration to create a Gru module using the init function.
GruRecord
The record type for the module.
GruRecordItem
The record item type for the module.
HardShrink
Hard Shrink layer.
HardShrinkConfig
Configuration to create a HardShrink layer using the init function.
HardSigmoid
Hard Sigmoid layer.
HardSigmoidConfig
Configuration to create a Hard Sigmoid layer using the init function.
InstanceNorm
Applies Instance Normalization over a tensor as described in the paper Instance Normalization.
InstanceNormConfig
Configuration to create an InstanceNorm layer using the init function.
InstanceNormRecord
The record type for the module.
InstanceNormRecordItem
The record item type for the module.
LayerNorm
Applies Layer Normalization over an input tensor as described in the paper Layer Normalization.
LayerNormConfig
Configuration to create a LayerNorm layer using the init function.
LayerNormRecord
The record type for the module.
LayerNormRecordItem
The record item type for the module.
LeakyRelu
Leaky ReLU layer.
LeakyReluConfig
Configuration to create a LeakyRelu layer using the init function.
Linear
Applies a linear transformation to the input tensor.
LinearConfig
Configuration to create a Linear layer using the init function.
LinearRecord
The record type for the module.
LinearRecordItem
The record item type for the module.
LocalResponseNorm
Applies Local Response Normalization as described in ImageNet Classification with Deep Convolutional Neural Networks.
LocalResponseNormConfig
Configuration to create a LocalResponseNorm layer using the init function.
Lstm
The Lstm module. This implementation is for a unidirectional, stateless LSTM.
LstmConfig
Configuration to create an Lstm module using the init function.
LstmRecord
The record type for the module.
LstmRecordItem
The record item type for the module.
LstmState
An LstmState stores the cell state and hidden state of an LSTM.
PRelu
Parametric ReLU layer.
PReluConfig
Configuration to create a PRelu layer using the init function.
PReluRecord
The record type for the module.
PReluRecordItem
The record item type for the module.
PositionalEncoding
Positional encoding layer for transformer models.
PositionalEncodingConfig
Configuration to create a PositionalEncoding layer using the init function.
PositionalEncodingRecord
The record type for the module.
PositionalEncodingRecordItem
The record item type for the module.
Relu
Applies the rectified linear unit function element-wise. See also relu.
RmsNorm
Applies RMS Normalization over an input tensor along the last dimension.
RmsNormConfig
Configuration to create an RmsNorm layer using the init function.
RmsNormRecord
The record type for the module.
RmsNormRecordItem
The record item type for the module.
Rnn
The Rnn module. This implementation is for a unidirectional, stateless RNN. Should be created with RnnConfig.
RnnConfig
Configuration to create an Rnn module using the init function.
RnnRecord
The record type for the module.
RnnRecordItem
The record item type for the module.
RnnState
An RnnState stores the hidden state of an RNN.
RotaryEncoding
A module that applies rotary positional encoding to a tensor. Rotary Position Embedding (RoPE) is a type of position embedding that encodes absolute positional information with a rotation matrix and naturally incorporates explicit relative position dependency into the self-attention formulation.
RotaryEncodingConfig
Configuration to create a RotaryEncoding layer using the init function.
RotaryEncodingRecord
The record type for the module.
RotaryEncodingRecordItem
The record item type for the module.
Selu
Applies the Scaled Exponential Linear Unit function element-wise. See also selu.
Shrink
Shrink layer.
ShrinkConfig
Configuration to create a Shrink layer using the init function.
Sigmoid
Applies the sigmoid function element-wise. See also sigmoid.
SoftShrink
Soft Shrink layer.
SoftShrinkConfig
Configuration to create a SoftShrink layer using the init function.
Softplus
Softplus layer.
SoftplusConfig
Configuration to create a Softplus layer using the init function.
Softsign
Applies the softsign function element-wise. See also softsign.
SwiGlu
Applies the SwiGLU (Swish Gated Linear Unit) activation to the input tensor. The SwiGLU activation function is defined as: SwiGLU(x) = Swish(W_inner * x + b_inner) * (W_outer * x + b_outer).
SwiGluConfig
Configuration to create a SwiGlu activation layer using the init function.
SwiGluRecord
The record type for the module.
SwiGluRecordItem
The record item type for the module.
Tanh
Applies the tanh activation function element-wise. See also tanh.
ThresholdedRelu
Thresholded ReLU layer.
ThresholdedReluConfig
Configuration to create a ThresholdedRelu layer using the init function.
Unfold4d
Four-dimensional unfolding.
Unfold4dConfig
Configuration to create an Unfold4d layer using the init function.
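
The SwiGLU formula quoted for the SwiGlu entry above can be checked numerically without the tensor API. The sketch below is a plain-Rust, Vec-based illustration of SwiGLU(x) = Swish(W_inner * x + b_inner) * (W_outer * x + b_outer), with Swish(x) = x * sigmoid(x); it is not the SwiGlu module's actual implementation.

```rust
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// Swish (SiLU): x * sigmoid(x).
fn swish(x: f64) -> f64 {
    x * sigmoid(x)
}

/// Dense affine map: y = W * x + b, with W stored row-major.
fn affine(w: &[Vec<f64>], b: &[f64], x: &[f64]) -> Vec<f64> {
    w.iter()
        .zip(b)
        .map(|(row, bias)| row.iter().zip(x).map(|(wi, xi)| wi * xi).sum::<f64>() + bias)
        .collect()
}

/// SwiGLU(x) = Swish(W_inner * x + b_inner) * (W_outer * x + b_outer),
/// where `*` between the two halves is element-wise multiplication.
fn swiglu(
    w_inner: &[Vec<f64>], b_inner: &[f64],
    w_outer: &[Vec<f64>], b_outer: &[f64],
    x: &[f64],
) -> Vec<f64> {
    let gate = affine(w_inner, b_inner, x);
    let value = affine(w_outer, b_outer, x);
    gate.iter().zip(&value).map(|(g, v)| swish(*g) * v).collect()
}

fn main() {
    // Identity weights and zero biases make the result easy to check:
    // out[i] = swish(x[i]) * x[i].
    let id = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    let zero = vec![0.0, 0.0];
    let out = swiglu(&id, &zero, &id, &zero, &[0.0, 1.0]);
    println!("{:?}", out); // [0.0, sigmoid(1) = 0.7310585786...]
}
```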

Enums

Initializer
Enum specifying the values with which a tensor should be initialized.
LinearLayout
The layout in which the linear parameters are stored.
PaddingConfig1d
Padding configuration for 1D operators.
PaddingConfig2d
Padding configuration for 2D operators.
PaddingConfig3d
Padding configuration for 3D operators.
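
To illustrate what a padding configuration decides, the sketch below computes the 1D convolution output length under three common padding modes. The variant names (`Valid`, `Same`, `Explicit`) are illustrative assumptions, not necessarily the exact variants of PaddingConfig1d.

```rust
// Hypothetical padding modes for a 1D convolution; the enum shape is an
// assumption made for this sketch, not the crate's actual PaddingConfig1d.
enum Padding1d {
    /// No padding: the kernel must fit entirely inside the input.
    Valid,
    /// Enough zero padding so that output length == ceil(input / stride).
    Same,
    /// An explicit amount of zero padding on each side.
    Explicit(usize),
}

/// Output length of a 1D convolution: (padded - kernel) / stride + 1.
fn output_len(input: usize, kernel: usize, stride: usize, pad: &Padding1d) -> usize {
    let padded = match pad {
        Padding1d::Valid => input,
        Padding1d::Same => {
            // Total padding needed so that out = ceil(input / stride).
            let out = (input + stride - 1) / stride;
            input + ((out - 1) * stride + kernel).saturating_sub(input)
        }
        Padding1d::Explicit(p) => input + 2 * p,
    };
    (padded - kernel) / stride + 1
}

fn main() {
    // Kernel 3, stride 1 over a length-10 input:
    println!("{}", output_len(10, 3, 1, &Padding1d::Valid)); // 8
    println!("{}", output_len(10, 3, 1, &Padding1d::Same)); // 10
    println!("{}", output_len(10, 3, 1, &Padding1d::Explicit(2))); // 12
}
```

The 2D and 3D configurations generalize the same arithmetic to one padding amount per spatial dimension.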

Functions

generate_sinusoids
Returns sinusoids for the positional embedding introduced in Attention Is All You Need.
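
The sinusoids referred to above follow the standard formula from the paper: even columns hold sin(pos / 10000^(2i/d_model)) and odd columns the matching cosine. The standalone sketch below reproduces that formula with plain Vec values; the actual generate_sinusoids signature in this module may differ.

```rust
/// Returns a `length x d_model` table of sinusoidal positional embeddings:
/// column 2i holds sin(pos / 10000^(2i/d_model)), column 2i+1 the cosine
/// at the same angle.
fn sinusoids(length: usize, d_model: usize) -> Vec<Vec<f64>> {
    (0..length)
        .map(|pos| {
            (0..d_model)
                .map(|j| {
                    let i = j / 2; // pair index shared by the sin/cos columns
                    let angle =
                        pos as f64 / 10000f64.powf(2.0 * i as f64 / d_model as f64);
                    if j % 2 == 0 { angle.sin() } else { angle.cos() }
                })
                .collect()
        })
        .collect()
}

fn main() {
    let pe = sinusoids(4, 6);
    // Position 0 is always [0, 1, 0, 1, ...]: sin(0) = 0 and cos(0) = 1.
    println!("{:?}", pe[0]); // [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
    // Position 1, dimension 0 is sin(1 / 10000^0) = sin(1).
    println!("{}", pe[1][0]);
}
```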