# AAD: Adjoint Automatic Differentiation
A Rust library for reverse-mode automatic differentiation (AD), enabling efficient gradient computation for scalar-valued functions.
## Features
- Intuitive API: Write equations as naturally as with primitive `f64` operations.
  - Operator overloading (`+`, `*`, `sin`, `ln`, etc.) for seamless expression building.
  - Inspired by `rustograd` and `RustQuant_autodiff`.
- High Performance: Optimized for minimal runtime overhead.
  - Benchmarks show competitive performance, often outperforming alternatives in gradient computation (see Benchmarks).
- Zero Dependencies: The core library has no external dependencies.
  - `RustQuant_autodiff` includes extra dependencies, which may require additional system setup when installing on Linux.
  - (Optional: `criterion` and `RustQuant_autodiff` are used for benchmarking only.)
- Extensive Math Support:
  - Trigonometric (`sin`, `cos`, `tanh`), exponential (`exp`, `powf`), logarithmic (`ln`, `log10`), and more.
  - Full list in Supported Operations.
## Installation
Add to your `Cargo.toml`:

```toml
[dependencies]
aad = "0.3.0"
```
## Quick Start
```rust
use aad::Tape;
use aad::ScalarLike;
```
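A typical tape-based workflow looks like the sketch below: register the inputs on a `Tape`, build the expression with ordinary operators and math functions, then run a reverse sweep to read off the gradient. The `aad::` import path is inferred from the package name, and the method names used here (`default`, `new_variable`, `gradients`, `value`) are illustrative placeholders rather than confirmed API; see the crate documentation for the exact calls.

```rust
use aad::{ScalarLike, Tape};

fn main() {
    // NOTE: the method names below are illustrative assumptions about a
    // tape-based API, not confirmed names from this crate.
    let tape = Tape::default();

    // Record the input variables on the tape.
    let x = tape.new_variable(2.0);
    let y = tape.new_variable(3.0);

    // Build the expression with ordinary operators and math functions.
    let z = x * y + x.sin();

    // Reverse-mode sweep: propagate adjoints from z back to the inputs.
    let grads = z.gradients();
    println!("z     = {}", z.value());
    println!("dz/dx = {}", grads[&x]); // = y + cos(x)
    println!("dz/dy = {}", grads[&y]); // = x
}
```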
## Supported Operations
### Basic Arithmetic
- `+`, `-`, `*`, `/`, negation
- Assignment operators (`+=`, `*=`, etc.)
### Mathematical Functions
- Exponential: `exp`, `powf`, `sqrt`, `hypot`
- Logarithmic: `ln`, `log`, `log2`, `log10`
- Trigonometric: `sin`, `cos`, `tan`, `asin`, `acos`, `sinh`, `cosh`
- Other: `abs`, `recip`, `cbrt`
See `math.rs` for full details.
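For reference, the plain-`f64` snippet below (independent of this crate) works out the product-rule derivative of `exp(x) * ln(x)` and compares it against a central finite difference, the usual sanity check for any AD result.

```rust
fn main() {
    // f(x) = exp(x) * ln(x)
    let f = |x: f64| x.exp() * x.ln();
    // Product rule: f'(x) = exp(x) * ln(x) + exp(x) / x
    let df = |x: f64| x.exp() * x.ln() + x.exp() / x;

    let x = 1.5_f64;
    let h = 1e-6;
    // Central finite difference as an independent check.
    let fd = (f(x + h) - f(x - h)) / (2.0 * h);

    println!("analytic          : {:.9}", df(x));
    println!("finite difference : {:.9}", fd);
    assert!((df(x) - fd).abs() < 1e-5);
}
```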
## Benchmarks
## Design Notes
- Tape-based: All operations are recorded to a `Tape` for efficient reverse-mode traversal (see the sketch below).
- Lightweight: Variables are `Copy`-enabled structs with a minimal memory footprint.
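To make the tape idea concrete, here is a minimal, self-contained sketch of tape-based reverse-mode AD. It is not this crate's implementation; it only illustrates recording each operation's local partial derivatives and then sweeping the tape backwards to accumulate adjoints.

```rust
// Minimal illustration of tape-based reverse-mode AD.
// This is NOT the aad crate's implementation; it only demonstrates the idea
// of recording each operation and sweeping the tape backwards.

#[derive(Clone, Copy)]
struct Node {
    // Indices of the (up to two) parent nodes and the local partial
    // derivatives of this node's value with respect to each parent.
    parents: [usize; 2],
    partials: [f64; 2],
}

struct MiniTape {
    nodes: Vec<Node>,
    values: Vec<f64>,
}

/// A variable is just an index into the tape, so it can be `Copy`.
#[derive(Clone, Copy)]
struct Var(usize);

impl MiniTape {
    fn new() -> Self {
        Self { nodes: Vec::new(), values: Vec::new() }
    }

    fn push(&mut self, value: f64, parents: [usize; 2], partials: [f64; 2]) -> Var {
        self.nodes.push(Node { parents, partials });
        self.values.push(value);
        Var(self.nodes.len() - 1)
    }

    fn var(&mut self, value: f64) -> Var {
        // A leaf node points at itself with zero partials.
        let i = self.nodes.len();
        self.push(value, [i, i], [0.0, 0.0])
    }

    fn add(&mut self, a: Var, b: Var) -> Var {
        let v = self.values[a.0] + self.values[b.0];
        self.push(v, [a.0, b.0], [1.0, 1.0])
    }

    fn mul(&mut self, a: Var, b: Var) -> Var {
        let (va, vb) = (self.values[a.0], self.values[b.0]);
        self.push(va * vb, [a.0, b.0], [vb, va])
    }

    fn sin(&mut self, a: Var) -> Var {
        let va = self.values[a.0];
        self.push(va.sin(), [a.0, a.0], [va.cos(), 0.0])
    }

    /// Reverse sweep: accumulate adjoints from `output` back to every node.
    fn gradients(&self, output: Var) -> Vec<f64> {
        let mut adjoints = vec![0.0; self.nodes.len()];
        adjoints[output.0] = 1.0;
        for i in (0..self.nodes.len()).rev() {
            let node = self.nodes[i];
            let adj = adjoints[i];
            adjoints[node.parents[0]] += node.partials[0] * adj;
            adjoints[node.parents[1]] += node.partials[1] * adj;
        }
        adjoints
    }
}

fn main() {
    // z = x * y + sin(x), so dz/dx = y + cos(x) and dz/dy = x.
    let mut t = MiniTape::new();
    let x = t.var(2.0);
    let y = t.var(3.0);
    let xy = t.mul(x, y);
    let sx = t.sin(x);
    let z = t.add(xy, sx);

    let g = t.gradients(z);
    println!("dz/dx = {}", g[x.0]); // 3 + cos(2)
    println!("dz/dy = {}", g[y.0]); // 2
}
```

Operator overloading in the real library hides the explicit `add`/`mul` calls, but the backward sweep over the recorded tape is the core of adjoint (reverse-mode) AD.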
## Contributing
Contributions are welcome! Open an issue or PR for feature requests, bug fixes, or documentation improvements.
## License
MIT License. See LICENSE for details.