The LeakyRelu op takes one input tensor $X$ and an argument $\alpha$, and produces one output tensor $Y$ of the same shape as $X$. The op performs the element-wise leaky ReLU operation, defined as

$$y = LeakyRelu(x) = \begin{cases}\alpha x & x < 0 \\ x & \text{otherwise}\end{cases}$$

The default value of $\alpha$ is 0.01.
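
For illustration, here is a minimal sketch of the same element-wise computation in NumPy, together with one way to run the operator through the Caffe2 Python workspace API. The NumPy function is a reference re-implementation of the formula above, not the operator's actual C++ implementation, and the blob/operator names (`"X"`, `"Y"`) are arbitrary choices for the example.

```python
import numpy as np
from caffe2.python import core, workspace


def leaky_relu_reference(x, alpha=0.01):
    # Element-wise: alpha * x where x < 0, x otherwise.
    return np.where(x < 0, alpha * x, x)


# Run the Caffe2 LeakyRelu operator on a small random tensor.
X = np.random.randn(2, 3).astype(np.float32)
op = core.CreateOperator("LeakyRelu", ["X"], ["Y"], alpha=0.01)
workspace.FeedBlob("X", X)
workspace.RunOperatorOnce(op)
Y = workspace.FetchBlob("Y")

# The operator's output should match the element-wise reference.
assert np.allclose(Y, leaky_relu_reference(X, alpha=0.01))
```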

GitHub Links:

- https://github.com/pytorch/pytorch/blob/master/caffe2/operators/leaky_relu_op.h
- https://github.com/pytorch/pytorch/blob/master/caffe2/operators/leaky_relu_op.cc