Module tsuga::activation_functions

Activation functions which can be applied element-wise or to subsets of the network's matrices

Functions

relu

Rectified Linear Unit (ReLU) function

relu_prime

Derivative of the ReLU function
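The ReLU pair can be sketched element-wise as below. These standalone signatures are illustrative only; tsuga's actual functions operate on the network's matrices, but the per-element math is the same: `relu(x) = max(0, x)`, with derivative 1 for positive inputs and 0 otherwise.

```rust
// Illustrative element-wise ReLU (not tsuga's actual signature,
// which applies this over a matrix).
fn relu(x: f64) -> f64 {
    x.max(0.0)
}

// Derivative of ReLU: 1.0 for positive inputs, 0.0 otherwise.
fn relu_prime(x: f64) -> f64 {
    if x > 0.0 { 1.0 } else { 0.0 }
}

fn main() {
    println!("relu(-2.0) = {}", relu(-2.0));       // clamped to 0.0
    println!("relu(3.5) = {}", relu(3.5));         // passed through
    println!("relu_prime(3.5) = {}", relu_prime(3.5));
}
```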

sigmoid

Applies the sigmoid logistic function

sigmoid_prime

Derivative of the sigmoid function
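The sigmoid pair follows the standard formulas: `σ(x) = 1 / (1 + e⁻ˣ)` and `σ'(x) = σ(x)(1 − σ(x))`. A minimal element-wise sketch (again, not tsuga's matrix-level signature):

```rust
// Illustrative element-wise sigmoid (tsuga applies this across a matrix).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// Derivative expressed in terms of the sigmoid itself,
// which avoids recomputing the exponential twice.
fn sigmoid_prime(x: f64) -> f64 {
    let s = sigmoid(x);
    s * (1.0 - s)
}

fn main() {
    println!("sigmoid(0.0) = {}", sigmoid(0.0));           // 0.5
    println!("sigmoid_prime(0.0) = {}", sigmoid_prime(0.0)); // 0.25
}
```

Expressing the derivative as `s * (1 - s)` is the usual trick in backpropagation: the forward pass already computed `s`, so the gradient is nearly free.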

softmax

Applies the softmax function in-place on a mutable two-dimensional array, normalizing each row so that its values are non-negative and sum to 1.0
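A row-wise softmax can be sketched as follows. This version uses a plain `Vec<Vec<f64>>` rather than tsuga's actual two-dimensional array type, and subtracts each row's maximum before exponentiating, a common numerical-stability step (whether tsuga does this internally is an assumption, not confirmed by the source):

```rust
// Illustrative in-place row-wise softmax on a Vec<Vec<f64>>;
// tsuga's version operates on its own 2-D array type.
fn softmax(matrix: &mut Vec<Vec<f64>>) {
    for row in matrix.iter_mut() {
        // Subtract the row maximum before exponentiating so that
        // large inputs don't overflow to infinity.
        let max = row.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
        let sum: f64 = row.iter().map(|x| (x - max).exp()).sum();
        for x in row.iter_mut() {
            *x = (*x - max).exp() / sum;
        }
    }
}

fn main() {
    let mut m = vec![vec![1.0, 2.0, 3.0], vec![0.0, 0.0, 0.0]];
    softmax(&mut m);
    for row in &m {
        let total: f64 = row.iter().sum();
        println!("row = {:?}, sum = {}", row, total); // each row sums to ~1.0
    }
}
```

After the call, every row is a probability distribution: all entries are in (0, 1] and each row sums to 1.0 (up to floating-point rounding), which is what makes softmax the usual choice for a classifier's output layer.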