Module leaf::layers::activation::relu

Applies the nonlinear Rectified Linear Unit.

Non-linear activation function: y = max(0, x)

ReLU is generally the preferred choice over Sigmoid or TanH, since the max operation it uses is usually faster to compute than the exponentiation required by a Sigmoid layer.
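As an illustrative sketch only (not Leaf's actual implementation, which operates on its own tensor types and backends), an element-wise ReLU over a slice of `f32` values could look like this:

```rust
/// Element-wise ReLU: maps every value x to max(0, x).
/// Hypothetical standalone helper for illustration; Leaf applies the same
/// rule to whole blobs/tensors via its compute backend.
fn relu(input: &[f32]) -> Vec<f32> {
    input.iter().map(|&x| x.max(0.0)).collect()
}

fn main() {
    let input = [-2.0_f32, -0.5, 0.0, 1.5, 3.0];
    let output = relu(&input);
    // Negative inputs are clamped to 0; non-negative inputs pass through unchanged.
    assert_eq!(output, vec![0.0, 0.0, 0.0, 1.5, 3.0]);
    println!("{:?}", output);
}
```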

Structs

ReLU

ReLU Activation Layer