Reverse mode automatic differentiation (backpropagation)
Reverse mode AD is efficient for computing gradients when the number of outputs is small compared to the number of inputs: a single backward pass yields the derivative of one output with respect to all inputs.
Structs§
- ReverseAD
- Reverse mode automatic differentiation engine
- Tape
- Reverse mode AD tape for recording operations
- TapeNode
- Node in the computation graph
Enums§
- CheckpointStrategy
- Checkpointing strategy for memory-efficient gradient computation
- Operation
- Operations that can be recorded on the tape
Functions§
- reverse_gradient
- Compute gradient using reverse mode AD (convenience function)
- reverse_jacobian
- Compute Jacobian using reverse mode AD (convenience function)