The Selu op takes one input tensor $X$, an argument $alpha$, an argument $scale$, and produces one output tensor $Y$ of the same shape as $X$. The op performs the element-wise Selu operation, defined as
|
$$y = selu(x) = \begin{cases} scale * (\alpha e^{x} - \alpha) & x < 0 \\ scale * x & \text{otherwise} \end{cases}$$
|
The default value of $alpha$ is 1.6732632423543772848170429916717 and the default value of $scale$ is 1.0507009873554804934193349852946.
See [Self-Normalizing Neural Networks](https://arxiv.org/abs/1706.02515) for more information.
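
As a quick illustration of the piecewise formula above (a minimal NumPy sketch, not the Caffe2 operator itself; the function name `selu` here is just for this example), the element-wise computation with the default arguments looks like:

```python
import numpy as np

def selu(x,
         alpha=1.6732632423543772848170429916717,
         scale=1.0507009873554804934193349852946):
    """Element-wise SELU matching the definition above."""
    x = np.asarray(x, dtype=np.float64)
    # For x < 0: scale * (alpha * e^x - alpha) == scale * alpha * (e^x - 1)
    # Otherwise: scale * x
    return np.where(x < 0, scale * alpha * (np.exp(x) - 1.0), scale * x)

X = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
Y = selu(X)  # Y has the same shape as X
print(Y)
```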
|
Github Links:

- https://github.com/pytorch/pytorch/blob/master/caffe2/operators/selu_op.h
- https://github.com/pytorch/pytorch/blob/master/caffe2/operators/selu_op.cc