Crate caffe2op_relu

Structs

  • Applies the rectified linear unit operation to the input data element-wise. The Relu operation takes one input $X$, produces one output $Y$, and is defined as:

    $$Y = \max(0, X)$$

    GitHub links:
    - https://github.com/pytorch/pytorch/blob/master/caffe2/operators/relu_op.h
    - https://github.com/pytorch/pytorch/blob/master/caffe2/operators/relu_op.cc

    (A minimal forward-pass sketch follows this list.)
  • ReluGradient takes both Y and dY and uses them to compute dX according to the chain rule and the derivative of the rectified linear function. (A gradient sketch follows this list.)
  • Relu takes one input data tensor and produces one output data tensor, where the capped rectified linear function, y = min(max(0, x), n), is applied to the tensor element-wise. (A sketch of this capped variant follows this list.)
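
A minimal sketch of what the forward operator computes, assuming plain `f32` slices in place of the crate's tensor types (`relu_forward` is an illustrative name, not the crate's API):

```rust
// A standalone sketch; `relu_forward` is an illustrative name and plain
// slices stand in for the crate's tensor types.
fn relu_forward(x: &[f32]) -> Vec<f32> {
    // Y = max(0, X), applied element-wise.
    x.iter().map(|&v| v.max(0.0)).collect()
}

fn main() {
    let x = [-2.0_f32, -0.5, 0.0, 1.5, 3.0];
    assert_eq!(relu_forward(&x), vec![0.0, 0.0, 0.0, 1.5, 3.0]);
}
```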
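
The backward pass can be sketched the same way. Because $Y = \max(0, X)$, we have $Y > 0$ exactly where $X > 0$, so $dY/dX$ is 1 where $Y > 0$ and 0 elsewhere; the chain rule then gives $dX = dY \cdot [Y > 0]$. Again, `relu_gradient` and the slice types are assumptions, not the crate's API:

```rust
// A sketch of the backward pass; `relu_gradient` is an illustrative name.
// Since Y = max(0, X), Y > 0 exactly where X > 0, so dY/dX is 1 where
// Y > 0 and 0 elsewhere. The chain rule then gives dX = dY * [Y > 0].
fn relu_gradient(y: &[f32], dy: &[f32]) -> Vec<f32> {
    y.iter()
        .zip(dy)
        .map(|(&y_i, &dy_i)| if y_i > 0.0 { dy_i } else { 0.0 })
        .collect()
}

fn main() {
    let y = [0.0_f32, 0.0, 1.5, 3.0]; // forward-pass output
    let dy = [0.1_f32, 0.2, 0.3, 0.4]; // upstream gradient
    assert_eq!(relu_gradient(&y, &dy), vec![0.0, 0.0, 0.3, 0.4]);
}
```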

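For the capped variant, a sketch under the same assumptions (`relu_n` is a hypothetical name; the cap is passed explicitly here as a parameter):

```rust
// A sketch of the capped variant, y = min(max(0, x), n); `relu_n` is an
// illustrative name, not the crate's API.
fn relu_n(x: &[f32], n: f32) -> Vec<f32> {
    x.iter().map(|&v| v.max(0.0).min(n)).collect()
}

fn main() {
    let x = [-1.0_f32, 2.0, 8.0];
    assert_eq!(relu_n(&x, 6.0), vec![0.0, 2.0, 6.0]);
}
```
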
Functions