Module hal::activations

Functions

get_activation

Helper to get the correct activation using a string

get_derivative

Helper to get the correct activation derivative using a string

is_smooth

Helper to determine whether a function is smooth or non-smooth
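
The three helpers above dispatch on the activation's name. A minimal sketch of that kind of string-keyed lookup is below; the scalar `fn(f64) -> f64` signature, the `Option` return, and the exact name strings are illustrative assumptions, not the crate's actual API, which operates on its own array type.

```rust
// Illustrative scalar activations used only to make the dispatch compile.
fn lrelu(x: f64) -> f64 { if x > 0.0 { x } else { 0.01 * x } }
fn sigmoid(x: f64) -> f64 { 1.0 / (1.0 + (-x).exp()) }
fn tanh_act(x: f64) -> f64 { x.tanh() }

// Look up an activation by name; `get_derivative` would follow the same pattern.
fn get_activation(name: &str) -> Option<fn(f64) -> f64> {
    match name {
        "lrelu"   => Some(lrelu),
        "sigmoid" => Some(sigmoid),
        "tanh"    => Some(tanh_act),
        _         => None,
    }
}

// Smooth activations (sigmoid, tanh, softmax) map to true;
// piecewise-linear ones (relu, lrelu) map to false.
fn is_smooth(name: &str) -> bool {
    matches!(name, "sigmoid" | "tanh" | "softmax")
}
```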

lrelu

Returns the lrelu activated value: max(0.01*x, x)

lrelu_derivative

Returns the derivative of lrelu [assumes that lrelu has already been applied]: 0.01 for x <= 0, 1 otherwise
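
A scalar sketch of the two lrelu entries above; the crate's versions work element-wise on its array type, so the plain `f64` signatures here are an assumption for illustration.

```rust
// Leaky ReLU: max(0.01*x, x).
fn lrelu(x: f64) -> f64 {
    if x > 0.0 { x } else { 0.01 * x }
}

// Derivative evaluated on the already-activated value a = lrelu(x):
// 0.01 where the input was <= 0, 1 otherwise. Because lrelu preserves
// sign, checking a <= 0 recovers the sign of the original input.
fn lrelu_derivative(a: f64) -> f64 {
    if a <= 0.0 { 0.01 } else { 1.0 }
}
```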

ones

Returns a linear activation [no non-linearity]

ones_derivative

Returns the derivative of a linear activation (all 1's)
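
As a scalar sketch of the linear (identity) activation and its derivative, with the same illustrative `f64` signatures as above:

```rust
// Linear activation: pass the input through unchanged.
fn ones(x: f64) -> f64 {
    x
}

// The derivative of the identity is 1 everywhere, regardless of the
// (already-activated) value.
fn ones_derivative(_a: f64) -> f64 {
    1.0
}
```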

relu

Returns the relu activated value: max(0, x)

relu_derivative

Returns the derivative of relu [assumes that relu has already been applied]: 0 for x <= 0, 1 otherwise
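
A scalar sketch of relu and its derivative under the same illustrative assumptions:

```rust
// ReLU: max(0, x).
fn relu(x: f64) -> f64 {
    x.max(0.0)
}

// Derivative evaluated on the already-activated value a = relu(x):
// 0 where the input was <= 0 (a is 0 there), 1 otherwise.
fn relu_derivative(a: f64) -> f64 {
    if a <= 0.0 { 0.0 } else { 1.0 }
}
```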

sigmoid

Returns the sigmoid activated value: 1.0/(1.0 + exp(-1.0 * x))

sigmoid_derivative

Returns the derivative of sigmoid [assumes that sigmoid has already been applied]: x * (1 - x)
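
A scalar sketch of sigmoid and its derivative; the `f64` signatures are again illustrative only.

```rust
// Sigmoid: 1 / (1 + exp(-x)).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// Derivative expressed in terms of the already-activated value a = sigmoid(x):
// sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)) = a * (1 - a).
fn sigmoid_derivative(a: f64) -> f64 {
    a * (1.0 - a)
}
```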

softmax

Returns the softmax activated value: exp(x_i) / sum(exp(x))

softmax_derivative

Returns the derivative of softmax [assumes that softmax has already been applied]: x * (1 - x)
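
A sketch of the two softmax entries on a plain slice of `f64`. The max-subtraction step is the usual numerical-stability trick and may or may not match the crate's implementation; the signatures are illustrative assumptions.

```rust
// Softmax over a slice: exp(x_i) / sum_j exp(x_j). Subtracting the maximum
// first avoids overflow in exp() and does not change the result.
fn softmax(xs: &[f64]) -> Vec<f64> {
    let m = xs.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = xs.iter().map(|&x| (x - m).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

// Diagonal of the softmax Jacobian in terms of the already-applied values a_i:
// d a_i / d x_i = a_i * (1 - a_i), matching the element-wise x * (1 - x) form
// documented above (off-diagonal terms -a_i * a_j are ignored here).
fn softmax_derivative(a: &[f64]) -> Vec<f64> {
    a.iter().map(|&ai| ai * (1.0 - ai)).collect()
}
```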

tanh

Returns the tanh activated value

tanh_derivative

Returns the derivative of tanh [assumes that tanh has already been applied]: 1 - x*x
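
A scalar sketch of tanh and its derivative under the same illustrative assumptions:

```rust
// Tanh activation.
fn tanh(x: f64) -> f64 {
    x.tanh()
}

// Derivative in terms of the already-activated value a = tanh(x):
// tanh'(x) = 1 - tanh(x)^2 = 1 - a*a.
fn tanh_derivative(a: f64) -> f64 {
    1.0 - a * a
}
```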