Module leaf::layers::activation

Provides nonlinear activation methods.

Activation layers take an input tensor, apply the activation operation, and produce an output tensor. Thanks to the nonlinearity of the activation methods, the network can learn and detect nonlinear patterns in (complex) datasets.
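The following is a minimal, framework-independent sketch of what an activation layer's forward step does; it uses plain slices instead of Leaf's blob types and maps every element of the input through the nonlinear function to produce the output.

```rust
// Hypothetical illustration (not Leaf's API): the forward pass of an
// activation layer applies a nonlinear function elementwise to the input
// tensor and writes the result to an output tensor of the same shape.
fn forward<F: Fn(f32) -> f32>(input: &[f32], activation: F) -> Vec<f32> {
    input.iter().map(|&x| activation(x)).collect()
}

fn main() {
    let input = vec![-2.0, -0.5, 0.0, 1.5];
    // ReLU as the activation: max(0, x).
    let output = forward(&input, |x| x.max(0.0));
    println!("{:?}", output); // [0.0, 0.0, 0.0, 1.5]
}
```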

The activation operation used should depend on the task at hand. For binary classification, a step function might be very useful. For more complex tasks, continuous activation functions such as Sigmoid, TanH, or ReLU should be used. In most cases ReLU provides the best results.
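For reference, these are the usual definitions of the functions named above, written here as plain Rust functions for illustration rather than as Leaf layers:

```rust
// Common activation functions (illustrative definitions, not Leaf's code).
fn step(x: f32) -> f32 {
    if x >= 0.0 { 1.0 } else { 0.0 }
}

fn sigmoid(x: f32) -> f32 {
    1.0 / (1.0 + (-x).exp())
}

fn tanh(x: f32) -> f32 {
    x.tanh()
}

fn relu(x: f32) -> f32 {
    x.max(0.0)
}

fn main() {
    for x in [-1.0f32, 0.0, 1.0] {
        println!(
            "x = {x}: step = {}, sigmoid = {:.3}, tanh = {:.3}, relu = {}",
            step(x), sigmoid(x), tanh(x), relu(x)
        );
    }
}
```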

If you supply the same blob as input and output to a layer via the LayerConfig, computations will be done in-place, requiring less memory.
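A minimal sketch of the in-place idea, independent of Leaf's LayerConfig: the activation overwrites its input buffer instead of allocating a separate output buffer.

```rust
// Hypothetical illustration of in-place activation: the input buffer is
// mutated directly, so no second tensor has to be allocated.
fn relu_in_place(buffer: &mut [f32]) {
    for value in buffer.iter_mut() {
        *value = value.max(0.0);
    }
}

fn main() {
    let mut blob = vec![-1.0, 2.0, -3.0, 4.0];
    relu_in_place(&mut blob); // `blob` serves as both input and output.
    println!("{:?}", blob); // [0.0, 2.0, 0.0, 4.0]
}
```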

The activation function is also sometimes called the transfer function.

Reexports

pub use self::relu::ReLU;
pub use self::sigmoid::Sigmoid;
pub use self::tanh::TanH;

Modules

relu

Applies the nonlinear Rectified Linear Unit.

sigmoid

Applies the nonlinear Log-Sigmoid function.

tanh

Applies the nonlinear TanH function.