Burn neural network module.
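For example, a stateless activation layer such as Relu can be constructed directly and applied to a tensor. A minimal sketch, assuming the ndarray backend feature is enabled; exact tensor-construction and backend paths may differ between Burn versions:

```rust
use burn::backend::NdArray;
use burn::nn::Relu;
use burn::tensor::Tensor;

fn main() {
    // Default device for the NdArray (CPU) backend.
    let device = Default::default();

    // A small 2x3 input tensor.
    let input =
        Tensor::<NdArray, 2>::from_floats([[-1.0, 0.0, 2.0], [0.5, -0.5, 3.0]], &device);

    // Relu is stateless, so it is built with `new` rather than a config's `init` function.
    let relu = Relu::new();
    let output = relu.forward(input);

    println!("{output}");
}
```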
Re-exports§
pub use modules::*;
Modules§
- activation - Activation layers.
- loss - Loss module.
- modules - Neural network module implementations.
Structs§
- GLU - Applies the gated linear unit function.
- Gelu - Applies the Gaussian Error Linear Units function element-wise. See also gelu
- HardSigmoid - Hard Sigmoid layer.
- HardSigmoidConfig - Configuration to create a Hard Sigmoid layer using the init function.
- LeakyRelu - Leaky ReLU layer.
- LeakyReluConfig - Configuration to create a Leaky ReLU layer using the init function.
- PRelu - Parametric ReLU layer.
- PReluConfig - Configuration to create a Parametric ReLU layer using the init function.
- PReluRecord - The record type for the module.
- PReluRecordItem - The record item type for the module.
- Relu - Applies the rectified linear unit function element-wise. See also relu
- Sigmoid - Applies the sigmoid function element-wise. See also sigmoid
- Softplus - Softplus layer.
- SoftplusConfig - Configuration to create a Softplus layer using the init function.
- SwiGlu - Applies the SwiGLU or Swish Gated Linear Unit to the input tensor. The SwiGLU activation function is defined as: SwiGLU(x) = Swish(W_inner * x + b_inner) * (W_outer * x + b_outer). A usage sketch follows this list.
- SwiGluConfig - Configuration to create a SwiGlu activation layer using the init function.
- SwiGluRecord - The record type for the module.
- SwiGluRecordItem - The record item type for the module.
- Tanh - Applies the tanh activation function element-wise. See also tanh
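As a sketch of the SwiGlu entry above: the layer is created from its configuration's init function and projects the last dimension of the input from d_input to d_output. The constructor arguments and the random-tensor helper below are assumptions based on recent Burn versions, not a definitive API reference:

```rust
use burn::backend::NdArray;
use burn::nn::SwiGluConfig;
use burn::tensor::{Distribution, Tensor};

fn main() {
    let device = Default::default();

    // SwiGLU(x) = Swish(W_inner * x + b_inner) * (W_outer * x + b_outer),
    // projecting the last input dimension (8) to the output dimension (16).
    let swiglu = SwiGluConfig::new(8, 16).init::<NdArray>(&device);

    // A batch of 4 vectors with 8 features each, drawn from a standard normal distribution.
    let input = Tensor::<NdArray, 2>::random([4, 8], Distribution::Normal(0.0, 1.0), &device);

    let output = swiglu.forward(input);
    assert_eq!(output.dims(), [4, 16]);
}
```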
Enums§
- Initializer - Enum specifying with what values a tensor should be initialized.
- PaddingConfig1d - Padding configuration for 1D operators.
- PaddingConfig2d - Padding configuration for 2D operators.
- PaddingConfig3d - Padding configuration for 3D operators.
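These enums are plain configuration values that are passed into layer configurations elsewhere in burn::nn (for example, linear or convolution configs). A brief construction sketch; the variant names shown (Normal, Same, Explicit) are assumptions based on recent Burn versions:

```rust
use burn::nn::{Initializer, PaddingConfig2d};

fn main() {
    // Draw initial parameter values from a normal distribution (mean 0.0, std 0.02).
    let _init = Initializer::Normal { mean: 0.0, std: 0.02 };

    // 2D padding: keep the spatial size ("same"), or pad each dimension explicitly.
    let _same = PaddingConfig2d::Same;
    let _explicit = PaddingConfig2d::Explicit(1, 1);
}
```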