Module reverse_mode


Reverse-mode automatic differentiation (backpropagation)

Reverse-mode AD is efficient when the number of output variables is small relative to the number of inputs (typically a single scalar objective in optimization): a forward pass records the computational graph, and a single backward pass propagates adjoints through it to obtain the full gradient.
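As a minimal, self-contained sketch of the record-then-backpropagate technique described above (illustrative only; the `Tape` and `Node` names here are not this crate's API, which exposes `ComputationGraph` and `ReverseVariable` instead):

```rust
/// One recorded operation: indices of its parent nodes and the local
/// partial derivatives d(node)/d(parent) saved during the forward pass.
#[derive(Clone, Copy)]
struct Node {
    parents: [usize; 2],
    partials: [f64; 2],
}

/// A tape of nodes in forward (topological) order, plus their values.
struct Tape {
    nodes: Vec<Node>,
    values: Vec<f64>,
}

impl Tape {
    fn new() -> Self {
        Tape { nodes: Vec::new(), values: Vec::new() }
    }

    /// Introduce an input variable (a leaf with zero partials).
    fn var(&mut self, value: f64) -> usize {
        self.push(value, [0, 0], [0.0, 0.0])
    }

    fn push(&mut self, value: f64, parents: [usize; 2], partials: [f64; 2]) -> usize {
        self.nodes.push(Node { parents, partials });
        self.values.push(value);
        self.nodes.len() - 1
    }

    fn mul(&mut self, a: usize, b: usize) -> usize {
        let (va, vb) = (self.values[a], self.values[b]);
        // d(a*b)/da = b, d(a*b)/db = a
        self.push(va * vb, [a, b], [vb, va])
    }

    fn add(&mut self, a: usize, b: usize) -> usize {
        self.push(self.values[a] + self.values[b], [a, b], [1.0, 1.0])
    }

    fn sin(&mut self, a: usize) -> usize {
        let va = self.values[a];
        self.push(va.sin(), [a, a], [va.cos(), 0.0])
    }

    /// Backward sweep: seed the output adjoint with 1.0, then visit
    /// nodes in reverse order, accumulating adjoints into each parent.
    fn gradient(&self, output: usize) -> Vec<f64> {
        let mut adjoints = vec![0.0; self.nodes.len()];
        adjoints[output] = 1.0;
        for i in (0..self.nodes.len()).rev() {
            let node = self.nodes[i];
            adjoints[node.parents[0]] += node.partials[0] * adjoints[i];
            adjoints[node.parents[1]] += node.partials[1] * adjoints[i];
        }
        adjoints
    }
}

fn main() {
    // y = x1 * x2 + sin(x1), so dy/dx1 = x2 + cos(x1) and dy/dx2 = x1.
    let mut t = Tape::new();
    let x1 = t.var(0.5);
    let x2 = t.var(2.0);
    let p = t.mul(x1, x2);
    let s = t.sin(x1);
    let y = t.add(p, s);
    let g = t.gradient(y);
    println!("dy/dx1 = {}", g[x1]);
    println!("dy/dx2 = {}", g[x2]);
}
```

One backward sweep produces the adjoints of every node at once, which is why the cost scales with the number of outputs rather than the number of inputs.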

Structs

ComputationGraph
Computational graph for reverse-mode AD
ReverseADOptions
Options for reverse-mode automatic differentiation
ReverseVariable
Variable in the computational graph for reverse-mode AD

Functions

add
Addition operation on the computation graph
cos
Cosine operation on the computation graph
div
Division operation on the computation graph
exp
Exponential operation on the computation graph
is_reverse_mode_efficient
Check whether reverse mode is preferred for the given problem dimensions
ln
Natural logarithm operation on the computation graph
mul
Multiplication operation on the computation graph
powi
Integer power operation (x^n) on the computation graph
reverse_gauss_newton_hessian
Gauss–Newton Hessian approximation using reverse-mode AD
reverse_gradient
Compute the gradient using reverse-mode automatic differentiation
reverse_gradient_with_tape
Simple reverse-mode gradient computation using a basic tape
reverse_hessian
Compute the Hessian using reverse-mode automatic differentiation
reverse_vjp
Vector-Jacobian product using reverse-mode AD
sin
Sine operation on the computation graph
sub
Subtraction operation on the computation graph
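The `is_reverse_mode_efficient` check above follows a standard cost argument: reverse mode costs roughly one backward sweep per output, while forward mode costs roughly one pass per input. A plausible sketch of such a heuristic (hypothetical; the crate's actual rule may differ):

```rust
/// Hypothetical mode-selection heuristic, illustrative only.
/// Reverse mode amortizes over inputs (one sweep yields the whole
/// gradient of one output), so it wins when outputs < inputs.
fn is_reverse_mode_efficient(n_inputs: usize, n_outputs: usize) -> bool {
    n_outputs < n_inputs
}

fn main() {
    // Scalar objective over many parameters: reverse mode is preferred.
    assert!(is_reverse_mode_efficient(1_000, 1));
    // Many outputs, few inputs: forward mode is cheaper.
    assert!(!is_reverse_mode_efficient(2, 1_000));
    println!("heuristic checks passed");
}
```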