Crate burn_nn

Burn neural network module.

Re-exports§

pub use modules::*;

Modules§

activation
Activation layers.
loss
Loss module.
modules
Neural network module implementations.

Structs§

GLU
Applies the gated linear unit function.
Gelu
Applies the Gaussian Error Linear Units function element-wise. See also gelu.
HardSigmoid
Hard Sigmoid layer.
HardSigmoidConfig
Configuration to create a Hard Sigmoid layer using the init function.
LeakyRelu
Leaky ReLU layer.
LeakyReluConfig
Configuration to create a Leaky ReLU layer using the init function.
PRelu
Parametric ReLU layer.
PReluConfig
Configuration to create a Parametric ReLU layer using the init function.
PReluRecord
The record type for the module.
PReluRecordItem
The record item type for the module.
Relu
Applies the rectified linear unit function element-wise. See also relu.
Sigmoid
Applies the sigmoid function element-wise. See also sigmoid.
Softplus
Softplus layer.
SoftplusConfig
Configuration to create a Softplus layer using the init function.
SwiGlu
Applies the SwiGLU (Swish Gated Linear Unit) activation to the input tensor, defined as SwiGLU(x) = Swish(W_inner * x + b_inner) * (W_outer * x + b_outer).
SwiGluConfig
Configuration to create a SwiGlu activation layer using the init function (see the usage sketch after this list).
SwiGluRecord
The record type for the module.
SwiGluRecordItem
The record item type for the module.
Tanh
Applies the tanh activation function element-wise. See also tanh.
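
Most structs above come in pairs: a *Config struct holds the hyperparameters and its init function builds the layer, while parameter-free layers such as Relu are constructed directly. Below is a minimal sketch of both patterns, assuming the NdArray backend and the usual burn::nn paths; exact init signatures (in particular whether they take a device) vary by module and Burn version, so treat this as illustrative rather than authoritative.

use burn::backend::NdArray;
use burn::nn::{Relu, SwiGluConfig};
use burn::tensor::Tensor;

type B = NdArray;

fn main() {
    let device = Default::default();

    // Parameter-free activation: constructed directly, no config needed.
    let relu = Relu::new();
    let x: Tensor<B, 2> = Tensor::zeros([2, 8], &device);
    let _hidden = relu.forward(x);

    // Parameterized activation: hyperparameters live in the config and
    // `init` allocates the inner/outer projection weights on the device.
    let swiglu = SwiGluConfig::new(8, 16).init::<B>(&device);
    let x: Tensor<B, 2> = Tensor::zeros([2, 8], &device);
    let _out = swiglu.forward(x); // last dimension becomes 16
}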

Enums§

Initializer
Enum specifying how a tensor's values should be initialized.
PaddingConfig1d
Padding configuration for 1D operators.
PaddingConfig2d
Padding configuration for 2D operators.
PaddingConfig3d
Padding configuration for 3D operators.
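
These enums are plain configuration values passed into layer configs elsewhere in burn::nn. A hedged sketch of constructing them follows; the variant names shown match recent Burn releases but should be checked against the documentation for the version in use.

use burn::nn::{Initializer, PaddingConfig2d};

fn main() {
    // How a layer's weight tensor should be filled at init time.
    let init = Initializer::KaimingUniform { gain: 1.0, fan_out_only: false };

    // Padding for a 2D operator: preserve the spatial size, or pad
    // explicitly by (height, width) amounts.
    let keep_size = PaddingConfig2d::Same;
    let one_pixel = PaddingConfig2d::Explicit(1, 1);

    // Typically threaded into a layer config, e.g. (assumed API):
    // Conv2dConfig::new([3, 16], [3, 3])
    //     .with_padding(keep_size)
    //     .with_initializer(init);
    let _ = (init, keep_size, one_pixel);
}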