Automatic differentiation using Bevy ECS.
bevy_autodiff implements automatic differentiation using symbolic graph
differentiation, with Bevy ECS as the computational graph backend.
Variables are ECS entities, operations are components, and derivatives
are computed by applying the chain rule symbolically with constant folding.
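To illustrate the idea of symbolic chain-rule differentiation with constant folding, here is a minimal self-contained sketch in plain Rust. It uses a toy expression tree, not the crate's ECS-backed representation; the names (`Expr`, `diff`, `fold_add`, `fold_mul`) are illustrative only:

```rust
// Toy expression tree over a single variable x (illustrative, not bevy_autodiff's types).
#[derive(Clone, Debug, PartialEq)]
enum Expr {
    Const(f64),
    Var, // the variable x
    Add(Box<Expr>, Box<Expr>),
    Mul(Box<Expr>, Box<Expr>),
}

use Expr::*;

/// d/dx, applying the sum and product rules, folding constants as we go.
fn diff(e: &Expr) -> Expr {
    match e {
        Const(_) => Const(0.0),
        Var => Const(1.0),
        Add(a, b) => fold_add(diff(a), diff(b)),
        // Product rule: (ab)' = a'b + ab'
        Mul(a, b) => fold_add(
            fold_mul(diff(a), (**b).clone()),
            fold_mul((**a).clone(), diff(b)),
        ),
    }
}

/// Build a + b, simplifying constant and zero operands.
fn fold_add(a: Expr, b: Expr) -> Expr {
    match (a, b) {
        (Const(x), Const(y)) => Const(x + y),
        (Const(c), e) | (e, Const(c)) if c == 0.0 => e,
        (a, b) => Add(Box::new(a), Box::new(b)),
    }
}

/// Build a * b, simplifying constant, zero, and one operands.
fn fold_mul(a: Expr, b: Expr) -> Expr {
    match (a, b) {
        (Const(x), Const(y)) => Const(x * y),
        (Const(c), _) | (_, Const(c)) if c == 0.0 => Const(0.0),
        (Const(c), e) | (e, Const(c)) if c == 1.0 => e,
        (a, b) => Mul(Box::new(a), Box::new(b)),
    }
}

fn main() {
    // d/dx (3 * x): the product rule produces 0*x + 3*1,
    // which constant folding collapses to the constant 3.
    let e = Mul(Box::new(Const(3.0)), Box::new(Var));
    assert_eq!(diff(&e), Const(3.0));
}
```

In the crate itself the derivative is a new entity subgraph rather than a tree, but the chain-rule-plus-folding recursion is the same shape.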
§Core Concepts
- ECS as computation graph: variables are entities, operations are components.
- Symbolic differentiation: `AutoDiff::differentiate` creates new entities representing the derivative graph via the chain rule.
- Successive differentiation: for d²f/dxdy, differentiate f w.r.t. x, then w.r.t. y.
- `CompiledGraph`: flattens the ECS graph into a `Vec<NodeOp>` for fast repeated evaluation.
- Reverse-mode gradient: `CompiledGraph::gradient` computes all partial derivatives in a single backward pass, independent of input count.
- Forward-mode partials: `AutoDiff::compile_order` pre-compiles symbolic derivative subgraphs for higher-order or mixed partial derivatives.
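The flattened-tape and reverse-mode points can be sketched in plain Rust, independent of the crate's actual `NodeOp` and `CompiledGraph` types (the `Op` enum and `eval_and_gradient` below are hypothetical): a forward pass evaluates the tape in order, then one backward sweep accumulates every partial derivative.

```rust
// Illustrative flattened tape; each node refers only to earlier nodes.
#[derive(Clone, Copy)]
enum Op {
    Input(usize),      // read the i-th input value
    Mul(usize, usize), // product of two earlier nodes
    Add(usize, usize), // sum of two earlier nodes
}

/// Evaluate the last node of the tape and compute d(output)/d(input_k)
/// for every input in a single reverse pass.
fn eval_and_gradient(tape: &[Op], inputs: &[f64]) -> (f64, Vec<f64>) {
    // Forward pass: evaluate each node in tape (topological) order.
    let mut value = vec![0.0; tape.len()];
    for (i, op) in tape.iter().enumerate() {
        value[i] = match *op {
            Op::Input(k) => inputs[k],
            Op::Mul(a, b) => value[a] * value[b],
            Op::Add(a, b) => value[a] + value[b],
        };
    }

    // Reverse pass: seed d(output)/d(output) = 1, push adjoints backward.
    let mut adjoint = vec![0.0; tape.len()];
    *adjoint.last_mut().unwrap() = 1.0;
    let mut grad = vec![0.0; inputs.len()];
    for (i, op) in tape.iter().enumerate().rev() {
        let a_i = adjoint[i];
        match *op {
            Op::Input(k) => grad[k] += a_i,
            Op::Mul(a, b) => {
                adjoint[a] += a_i * value[b]; // d(ab)/da = b
                adjoint[b] += a_i * value[a]; // d(ab)/db = a
            }
            Op::Add(a, b) => {
                adjoint[a] += a_i;
                adjoint[b] += a_i;
            }
        }
    }
    (*value.last().unwrap(), grad)
}

fn main() {
    // f(x, y) = x*x + y*y at (1, 2): value 5, gradient [2x, 2y] = [2, 4]
    let tape = [
        Op::Input(0),  // node 0: x
        Op::Input(1),  // node 1: y
        Op::Mul(0, 0), // node 2: x²
        Op::Mul(1, 1), // node 3: y²
        Op::Add(2, 3), // node 4: f
    ];
    let (f, grad) = eval_and_gradient(&tape, &[1.0, 2.0]);
    println!("{f} {grad:?}"); // 5 [2.0, 4.0]
}
```

Note the backward sweep touches each node once regardless of how many inputs there are, which is why reverse mode is the right choice for gradients with many inputs and one output.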
§Example

```rust
use bevy_autodiff::AutoDiff;

let mut ad = AutoDiff::new();

// Create input variables
let x = ad.var(2.0).unwrap();
let y = ad.var(3.0).unwrap();

// Build computation graph: f = x * y
let f = ad.mul(x, y);
assert_eq!(ad.eval(f).unwrap(), 6.0);

// Symbolic differentiation creates new graph entities
let dfdx = ad.differentiate(f, x).unwrap();
assert_eq!(ad.eval(dfdx).unwrap(), 3.0); // df/dx = y = 3

// Higher-order: d²f/dxdy = 1
let d2fdxdy = ad.differentiate(dfdx, y).unwrap();
assert_eq!(ad.eval(d2fdxdy).unwrap(), 1.0);
```
§Reverse-mode gradient

```rust
use bevy_autodiff::AutoDiff;

let mut ad = AutoDiff::new();
let x = ad.var(1.0).unwrap();
let y = ad.var(2.0).unwrap();
let x2 = ad.square(x);
let y2 = ad.square(y);
let f = ad.add(x2, y2); // x² + y²

// Compile the primal only, then use reverse mode for the gradient
let mut cg = ad.compile_primal(f, &[x, y]).unwrap();
cg.eval(&[1.0, 2.0]).unwrap();
assert_eq!(cg.value(), 5.0);

let grad = cg.gradient();
assert_eq!(grad, &[2.0, 4.0]); // [2x, 2y]
```

§Modules
- `gpu` (crate feature `wgpu`): GPU batch evaluation via wgpu.
- `macros`: Ergonomic macros for building computation graphs.
- `ops`: Operator overloading for the `Var` type.
§Macros
- `expr`: Transforms a natural Rust expression into AutoDiff method calls.
§Structs
- `AutoDiff`: The main autodiff context for building and evaluating computation graphs.
- `CompiledGraph`: A compiled computation graph for fast repeated evaluation.
- `Dependencies`: Bitmask tracking which input variables affect this variable.
- `IsConstant`: Marker component indicating a variable is a constant. Constants have fixed values and zero derivatives with respect to all inputs.
- `IsInput`: Marker component indicating a variable is an input (leaf node). Input variables have user-specified values and are the sources for derivative computation.
- `Value`: Stores the numerical value of a variable.
- `Var`: A lightweight handle to a variable in the computation graph.
- `Variable`: Marker component indicating an entity is a variable in the computation graph.
§Enums
- `AutoDiffError`: Errors that can occur during autodiff operations.
- `BinaryOp`: Binary operation types.
- `NodeOp`: A node in the flattened computation graph.
- `UnaryOp`: Unary operation types.
§Traits
- `DiffNum`: A numeric type that supports the operations needed for automatic differentiation.
- `Float`: A numeric type that can be stored in computation graphs.
§Functions
- `count_operations`: Counts the number of operations in a computation graph.
- `to_dot`: Generates a DOT graph representation of the computation graph.
- `validate_graph`: Validates the computation graph for common issues.
§Attribute Macros
- `autodiff` (crate feature `proc-macros`): Transforms a function to work with AutoDiff.