Module activation


§Activation Functions

Functions for adding non-linearity to a neural network.

Each activation function provides a `call` method and a `derivative` method.

1. ELU

2. LeakyReLU

3. ReLU

4. SELU

5. Sigmoid

6. SoftMax

7. Tanh
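As a minimal sketch of the interface described above: the method names `call` and `derivative` come from this page, but the slice-based signatures below are assumptions for illustration (the crate's structs operate on matrices). ReLU might look like:

```rust
// Hypothetical sketch, not the crate's actual API: an activation
// exposing `call` and `derivative`, applied element-wise to a slice.
struct ReLU;

impl ReLU {
    /// Applies ReLU element-wise: max(0, x).
    fn call(&self, x: &[f64]) -> Vec<f64> {
        x.iter().map(|&v| v.max(0.0)).collect()
    }

    /// Derivative of ReLU element-wise: 1 if x > 0, else 0.
    fn derivative(&self, x: &[f64]) -> Vec<f64> {
        x.iter()
            .map(|&v| if v > 0.0 { 1.0 } else { 0.0 })
            .collect()
    }
}

fn main() {
    let relu = ReLU;
    println!("{:?}", relu.call(&[-1.0, 2.0]));       // negatives clamp to 0
    println!("{:?}", relu.derivative(&[-1.0, 2.0])); // 0 below zero, 1 above
}
```

Pairing `call` with `derivative` on one type keeps the forward pass and the backward pass of training next to each other.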

Structs§

ELU
Exponential Linear Unit (ELU) activation function.
LeakyReLU
Leaky ReLU activation function.
ReLU
Rectified Linear Unit (ReLU) activation function.
SELU
Scaled Exponential Linear Unit (SELU).
Sigmoid
Sigmoid activation function.
SoftMax
Softmax function (normalized exponential function).
Tanh
Hyperbolic Tangent (Tanh) activation function.

Traits§

Function
A trait for activation functions and other operations that can be applied to matrices.
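A sketch of how such a trait could tie the structs together, under the same assumptions as above (slice-based signatures for brevity instead of the matrix operations the trait actually targets; the trait and method names follow this page, everything else is illustrative):

```rust
/// Hypothetical sketch of a `Function`-style trait: one method to
/// apply the function and one for its derivative, both element-wise.
trait Function {
    fn call(&self, x: &[f64]) -> Vec<f64>;
    fn derivative(&self, x: &[f64]) -> Vec<f64>;
}

struct Sigmoid;

impl Function for Sigmoid {
    /// sigma(x) = 1 / (1 + e^(-x))
    fn call(&self, x: &[f64]) -> Vec<f64> {
        x.iter().map(|&v| 1.0 / (1.0 + (-v).exp())).collect()
    }

    /// sigma'(x) = sigma(x) * (1 - sigma(x)), reusing `call`.
    fn derivative(&self, x: &[f64]) -> Vec<f64> {
        self.call(x).iter().map(|&s| s * (1.0 - s)).collect()
    }
}

fn main() {
    // The trait lets a network layer hold any activation behind
    // `dyn Function` and swap implementations freely.
    let act: Box<dyn Function> = Box::new(Sigmoid);
    println!("{:?}", act.call(&[0.0]));       // sigmoid(0) = 0.5
    println!("{:?}", act.derivative(&[0.0])); // 0.5 * (1 - 0.5) = 0.25
}
```

Because every activation implements the same trait, layers can store a `Box<dyn Function>` and switch between ReLU, Sigmoid, and the rest without code changes.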