Crate caffe2op_sigmoid
Structs
- Apply the Sigmoid function element-wise to the input tensor. This is often used as a non-linear activation function in a neural network. The sigmoid function is defined as $$Sigmoid(x) = \frac{1}{1+\exp(-x)}$$ (a sketch of the computation follows this list). GitHub link: https://github.com/pytorch/pytorch/blob/master/caffe2/operators/sigmoid_op.cc
- SigmoidGradient takes both Y and dY and uses them to compute dX according to the chain rule and the derivative of the sigmoid function (see the sketch below).
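
For illustration, here is a minimal Rust sketch of both the element-wise forward pass and the gradient rule described above. The function names, slice-based signatures, and the `main` example are assumptions made for this sketch; the crate's actual operators run on Caffe2 tensors.

```rust
/// Element-wise sigmoid: Y = 1 / (1 + exp(-X)).
fn sigmoid_forward(x: &[f32], y: &mut [f32]) {
    assert_eq!(x.len(), y.len());
    for (xi, yi) in x.iter().zip(y.iter_mut()) {
        *yi = 1.0 / (1.0 + (-*xi).exp());
    }
}

/// Sigmoid gradient via the chain rule: since dY/dX = Y * (1 - Y),
/// the backward pass computes dX = dY * Y * (1 - Y).
fn sigmoid_gradient(y: &[f32], dy: &[f32], dx: &mut [f32]) {
    assert_eq!(y.len(), dy.len());
    assert_eq!(y.len(), dx.len());
    for i in 0..y.len() {
        dx[i] = dy[i] * y[i] * (1.0 - y[i]);
    }
}

fn main() {
    let x = [-2.0_f32, 0.0, 2.0];
    let mut y = [0.0_f32; 3];
    sigmoid_forward(&x, &mut y);

    // Upstream gradient of all ones, as with a simple sum loss.
    let dy = [1.0_f32; 3];
    let mut dx = [0.0_f32; 3];
    sigmoid_gradient(&y, &dy, &mut dx);

    println!("Y  = {:?}", y);  // approximately [0.1192, 0.5, 0.8808]
    println!("dX = {:?}", dx); // approximately [0.1050, 0.25, 0.1050]
}
```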