Reverse-mode automatic differentiation (backpropagation)
Reverse-mode AD is efficient for computing derivatives when the number of output variables is small (typically 1 for optimization). It builds a computational graph and then propagates derivatives backwards.
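For illustration, here is a minimal, self-contained sketch of the record-forward / sweep-backward idea behind a tape-based reverse-mode AD. It does not use this module's types or API; the `Tape` and `Node` names, their fields, and the `gradients` method are hypothetical stand-ins for the `ComputationGraph` and `ReverseVariable` items listed below.

```rust
// Minimal tape-based reverse-mode AD sketch (illustrative only).
#[derive(Clone, Copy)]
struct Node {
    // Parent node indices and the local partials d(node)/d(parent).
    parents: [usize; 2],
    partials: [f64; 2],
}

struct Tape {
    nodes: Vec<Node>,
    values: Vec<f64>,
}

impl Tape {
    fn new() -> Self {
        Tape { nodes: Vec::new(), values: Vec::new() }
    }

    // Record an input (leaf) variable; leaves have zero partials.
    fn var(&mut self, value: f64) -> usize {
        self.push(value, [0, 0], [0.0, 0.0])
    }

    fn push(&mut self, value: f64, parents: [usize; 2], partials: [f64; 2]) -> usize {
        self.nodes.push(Node { parents, partials });
        self.values.push(value);
        self.nodes.len() - 1
    }

    fn add(&mut self, a: usize, b: usize) -> usize {
        let v = self.values[a] + self.values[b];
        self.push(v, [a, b], [1.0, 1.0])
    }

    fn mul(&mut self, a: usize, b: usize) -> usize {
        let (va, vb) = (self.values[a], self.values[b]);
        self.push(va * vb, [a, b], [vb, va])
    }

    fn sin(&mut self, a: usize) -> usize {
        let va = self.values[a];
        // Second parent slot is unused for unary ops (partial is 0.0).
        self.push(va.sin(), [a, a], [va.cos(), 0.0])
    }

    // Backward sweep: seed the output adjoint with 1.0 and propagate
    // adjoints to parents in reverse recording order.
    fn gradients(&self, output: usize) -> Vec<f64> {
        let mut adjoints = vec![0.0; self.nodes.len()];
        adjoints[output] = 1.0;
        for i in (0..=output).rev() {
            let node = self.nodes[i];
            for k in 0..2 {
                adjoints[node.parents[k]] += node.partials[k] * adjoints[i];
            }
        }
        adjoints
    }
}

fn main() {
    // f(x, y) = x * y + sin(x), so df/dx = y + cos(x) and df/dy = x.
    let mut tape = Tape::new();
    let x = tape.var(0.5);
    let y = tape.var(4.2);
    let xy = tape.mul(x, y);
    let sx = tape.sin(x);
    let f = tape.add(xy, sx);
    let grads = tape.gradients(f);
    println!("df/dx = {}", grads[x]); // 4.2 + cos(0.5)
    println!("df/dy = {}", grads[y]); // 0.5
}
```

Note that a single backward sweep yields the derivatives of one output with respect to every input, which is why reverse mode is the natural choice for scalar objectives with many parameters.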
Structs§
- ComputationGraph - Computational graph for reverse-mode AD
- ReverseADOptions - Options for reverse-mode automatic differentiation
- ReverseVariable - Variable in the computational graph for reverse-mode AD
Functions§
- add - Addition operation on computation graph
- cos - Cosine operation on computation graph
- div - Division operation on computation graph
- exp - Exponential operation on computation graph
- is_reverse_mode_efficient - Check if reverse mode is preferred for the given problem dimensions
- ln - Natural logarithm operation on computation graph
- mul - Multiplication operation on computation graph
- powi - Power operation (x^n) on computation graph
- reverse_gauss_newton_hessian - Gauss-Newton Hessian approximation using reverse-mode AD
- reverse_gradient - Compute gradient using reverse-mode automatic differentiation
- reverse_gradient_with_tape - Simple reverse-mode gradient computation using a basic tape
- reverse_hessian - Compute Hessian using reverse-mode automatic differentiation
- reverse_vjp - Vector-Jacobian product using reverse-mode AD
- sin - Sine operation on computation graph
- sub - Subtraction operation on computation graph
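As a rough rule of thumb for the dimension check above: reverse mode costs roughly one backward sweep per output, while forward mode costs roughly one sweep per input, so reverse mode tends to win when a function has many inputs and few outputs. The sketch below encodes that rule as a hypothetical helper; the actual criterion used by `is_reverse_mode_efficient` may differ.

```rust
// Hypothetical dimension heuristic (assumption, not this module's exact test):
// prefer reverse mode when there are fewer outputs than inputs.
fn prefers_reverse_mode(n_inputs: usize, n_outputs: usize) -> bool {
    n_outputs < n_inputs
}
```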