Module alumina::ops::activ

Structs

BeLU

BeLU - PSISOF - Bent Linear Unit - Huber loss function superimposed with a parameterised (learnable) linear activation
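
As a rough illustration of that one-line description, here is a plain-Rust sketch that reads it literally: a Huber-loss curve plus a learnable linear term. The composition, the parameter names (w, delta), and the defaults are assumptions for illustration, not alumina's implementation.

```rust
/// Hypothetical bent linear unit: a Huber-loss curve (quadratic near zero,
/// linear in the tails) superimposed with a learnable linear term `w * x`.
/// `delta` controls where the quadratic region ends. Illustrative only.
fn belu(x: f64, w: f64, delta: f64) -> f64 {
    let huber = if x.abs() <= delta {
        0.5 * x * x
    } else {
        delta * (x.abs() - 0.5 * delta)
    };
    huber + w * x
}
```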

ELUFunc
GenericActivation

GenericActivation - NSISOF - Generic over a function which, given an input value, returns an output value and the derivative of the output w.r.t. the input.
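
A minimal plain-Rust sketch of that contract (the function name and the use of f32 are illustrative assumptions, not alumina's types):

```rust
/// Illustrative shape of such a function: one input in,
/// output value and derivative of the output w.r.t. the input out.
fn square_with_grad(x: f32) -> (f32, f32) {
    (x * x, 2.0 * x) // f(x) = x², f'(x) = 2x
}
```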

IdentityFunc
LeakyReLU

LeakyReLU - NSISOF - left side slope is a fixed small number, default 0.01
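
That rule in plain Rust, using the stated default slope of 0.01 (a standalone sketch, not alumina's API):

```rust
/// Leaky ReLU: identity above zero, a small fixed slope below zero.
fn leaky_relu(x: f32, slope: f32) -> f32 {
    if x > 0.0 { x } else { slope * x }
}

fn main() {
    assert_eq!(leaky_relu(2.0, 0.01), 2.0);
    assert_eq!(leaky_relu(-2.0, 0.01), -0.02);
}
```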

LinearToSrgbFunc
LogisticFunc
ReLUFunc
SignedLn1pFunc
SoftMax

SoftMax - NSISOF - exponentiates and normalises the input so that the outputs are positive and sum to 1
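
A standalone sketch of the standard softmax, with the usual subtract-the-max trick for numerical stability (an illustration, not alumina's implementation):

```rust
/// Standard softmax: exp(x_i - max) / Σ exp(x_j - max).
/// Subtracting the max avoids overflow without changing the result.
fn softmax(xs: &[f64]) -> Vec<f64> {
    let max = xs.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = xs.iter().map(|&x| (x - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}
```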

SrgbToLinearFunc
TanhFunc

Traits

ActivationFunc

Used to define graph operations where the effect of the input on the output is entirely separable, i.e. each output element depends only on the corresponding input element.
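
Separability means both passes reduce to independent per-element maps. A standalone sketch of that idea, with names and signatures that are assumptions rather than alumina's actual ActivationFunc API:

```rust
/// Forward pass: apply f independently to each element, keeping the
/// local derivative f'(x) alongside each output.
fn forward(f: impl Fn(f32) -> (f32, f32), input: &[f32]) -> (Vec<f32>, Vec<f32>) {
    input.iter().map(|&x| f(x)).unzip() // (outputs, local derivatives)
}

/// Backward pass: chain rule per element, scaling each incoming output
/// gradient by the local derivative saved during the forward pass.
fn backward(local_derivs: &[f32], output_grads: &[f32]) -> Vec<f32> {
    local_derivs
        .iter()
        .zip(output_grads)
        .map(|(&d, &g)| d * g)
        .collect()
}
```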

Type Definitions

ELU

ELU - NSISOF - Exponential Linear Unit - output equals input above 0 and exp(input) − 1 below 0, so the left side saturates smoothly towards −1
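
A sketch assuming the standard ELU definition with the scale alpha fixed at 1; alumina's exact parameterisation may differ:

```rust
/// ELU (standard form, alpha = 1): x above zero, e^x - 1 below zero.
/// The left side approaches -1 smoothly rather than having a fixed slope.
fn elu(x: f64) -> f64 {
    if x > 0.0 { x } else { x.exp() - 1.0 }
}
```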

Identity

Identity - NSISOF - output equals input

LinearToSrgb
Logistic

Logistic - NSISOF - sigmoid that compresses the input to an output between 0 and 1
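
The logistic function in plain Rust (a standalone sketch):

```rust
/// Logistic sigmoid: 1 / (1 + e^-x), mapping all of R into (0, 1).
fn logistic(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}
```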

ReLU

ReLU - NSISOF - above 0 grad is 1, below 0 grad is 0
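
The same rule in plain Rust (a standalone sketch, not alumina's code):

```rust
/// ReLU: max(0, x). Gradient is 1 above zero and 0 below zero.
fn relu(x: f32) -> f32 {
    x.max(0.0)
}
```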

SignedLn1p

SignedLn1p - NSISOF - Ln1p(x) above 0, -Ln1p(-x) below 0
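
That piecewise definition in plain Rust, using the standard library's ln_1p (a standalone sketch):

```rust
/// Signed log1p: odd (symmetric about the origin) and unbounded,
/// but grows only logarithmically in both directions.
fn signed_ln1p(x: f64) -> f64 {
    if x >= 0.0 { x.ln_1p() } else { -((-x).ln_1p()) }
}
```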

SrgbToLinear
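
LinearToSrgb and SrgbToLinear carry no summaries here; assuming they apply the standard sRGB transfer function and its inverse, standalone sketches look like this (constants from the sRGB specification; that alumina uses exactly these is an assumption):

```rust
/// Standard sRGB encode: linear-light value in [0, 1] to gamma-encoded.
fn linear_to_srgb(x: f64) -> f64 {
    if x <= 0.003_130_8 {
        12.92 * x
    } else {
        1.055 * x.powf(1.0 / 2.4) - 0.055
    }
}

/// Standard sRGB decode: gamma-encoded value in [0, 1] to linear light.
fn srgb_to_linear(s: f64) -> f64 {
    if s <= 0.040_45 {
        s / 12.92
    } else {
        ((s + 0.055) / 1.055).powf(2.4)
    }
}
```
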
Tanh

Tanh - NSISOF - sigmoid that compresses the input to an output between -1 and 1
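
Tanh and its conveniently cheap derivative in plain Rust (a standalone sketch):

```rust
/// Tanh maps R into (-1, 1); its derivative is 1 - tanh(x)².
fn tanh_with_grad(x: f64) -> (f64, f64) {
    let y = x.tanh();
    (y, 1.0 - y * y)
}
```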