This op implements the exponential linear unit (ELU) activation function as described in [Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)](https://arxiv.org/abs/1511.07289).
|
The op takes an input tensor $X$ of arbitrary shape, computes the elementwise ELU operation, and returns an output tensor $Y$ of the same shape as $X$.
|
The `alpha` parameter may be passed as an argument, but defaults to 1.
|
The ELU operation is defined as

$$y = f(x) = \begin{cases}\alpha(e^x - 1) & x < 0 \\ x & \text{otherwise}\end{cases}$$
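The piecewise definition above can be sketched as a small NumPy reference implementation (this `elu` helper is illustrative only, not the Caffe2 API):

```python
import numpy as np

def elu(x, alpha=1.0):
    # alpha * (exp(x) - 1) where x < 0, identity elsewhere
    return np.where(x < 0, alpha * (np.exp(x) - 1.0), x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(elu(x))
```

Note that ELU is continuous at 0 (both branches evaluate to 0 there), and for negative inputs it saturates toward $-\alpha$ rather than cutting off abruptly like ReLU.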
|
Github Links:
- https://github.com/pytorch/pytorch/blob/master/caffe2/operators/elu_op.h
- https://github.com/pytorch/pytorch/blob/master/caffe2/operators/elu_op.cc