Automatic differentiation for exact gradient and Hessian computation
This module provides automatic differentiation capabilities for optimization, supporting both forward-mode and reverse-mode AD for efficient and exact derivative computation.
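Forward-mode AD evaluates the function on dual numbers, so derivatives are exact (no finite-difference truncation error). The following is a minimal, self-contained sketch of the idea; the `Dual` type and `derivative_at` helper here are local illustrations, not necessarily the API of this module's `dual_numbers::Dual`.

```rust
// Illustrative dual-number type: `val` carries the function value,
// `eps` carries the derivative, propagated by the chain rule.
#[derive(Clone, Copy)]
struct Dual {
    val: f64,
    eps: f64,
}

impl Dual {
    // Seed a variable with derivative 1 (d x / d x = 1).
    fn variable(v: f64) -> Self {
        Dual { val: v, eps: 1.0 }
    }

    fn sin(self) -> Self {
        // chain rule: (sin u)' = u' * cos u
        Dual { val: self.val.sin(), eps: self.eps * self.val.cos() }
    }
}

impl std::ops::Add for Dual {
    type Output = Dual;
    fn add(self, rhs: Dual) -> Dual {
        Dual { val: self.val + rhs.val, eps: self.eps + rhs.eps }
    }
}

impl std::ops::Mul for Dual {
    type Output = Dual;
    fn mul(self, rhs: Dual) -> Dual {
        Dual {
            val: self.val * rhs.val,
            // product rule: (uv)' = u'v + uv'
            eps: self.eps * rhs.val + self.val * rhs.eps,
        }
    }
}

// Exact derivative of f(x) = x*x + sin(x), i.e. f'(x) = 2x + cos(x).
fn derivative_at(x: f64) -> f64 {
    let x = Dual::variable(x);
    let y = x * x + x.sin();
    y.eps
}

fn main() {
    println!("f'(2) = {}", derivative_at(2.0)); // 4 + cos(2)
}
```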
Re-exports§
pub use dual_numbers::Dual;
pub use dual_numbers::DualNumber;
pub use forward_mode::forward_gradient;
pub use forward_mode::forward_hessian_diagonal;
pub use forward_mode::ForwardADOptions;
pub use reverse_mode::reverse_gradient;
pub use reverse_mode::reverse_hessian;
pub use reverse_mode::ReverseADOptions;
pub use tape::ComputationTape;
pub use tape::TapeNode;
pub use tape::Variable;
Modules§
- dual_numbers - Dual numbers for forward-mode automatic differentiation
- forward_mode - Forward-mode automatic differentiation
- reverse_mode - Reverse-mode automatic differentiation (backpropagation)
- tape - Computational tape for reverse-mode automatic differentiation
Structs§
- ADResult - Result of automatic differentiation computation
- AutoDiffOptions - Options for automatic differentiation
- FunctionWrapper - Wrapper for regular functions to make them compatible with AD
Enums§
- ADMode - Automatic differentiation mode selection
Traits§
- AutoDiffFunction - Function trait for automatic differentiation
Functions§
- autodiff - Main automatic differentiation function
- create_ad_gradient - Create a gradient function using automatic differentiation
- create_ad_hessian - Create a Hessian function using automatic differentiation
- optimize_ad_mode - Optimize AD mode selection based on problem characteristics
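Mode selection typically follows a standard rule of thumb: forward mode costs roughly one pass per input, reverse mode roughly one sweep per output. The sketch below illustrates that heuristic only; the criteria actually used by `optimize_ad_mode`, and the local `Mode` enum and `choose_mode` function, are assumptions for illustration.

```rust
#[derive(Debug, PartialEq)]
enum Mode {
    Forward, // cost scales with the number of inputs
    Reverse, // cost scales with the number of outputs
}

// Pick the mode whose cost-determining dimension is smaller.
fn choose_mode(n_inputs: usize, n_outputs: usize) -> Mode {
    if n_inputs <= n_outputs {
        Mode::Forward
    } else {
        Mode::Reverse
    }
}

fn main() {
    // Scalar objective with many parameters: reverse mode (backprop) wins.
    println!("{:?}", choose_mode(1_000, 1));
}
```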