Crate numdiff

Automatic and numerical differentiation.

§Overview

This crate implements two different methods for evaluating derivatives in Rust:

  1. Automatic differentiation (forward-mode using first-order dual numbers).
  2. Numerical differentiation (using forward difference and central difference approximations).
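As a concrete illustration of item 1, forward-mode automatic differentiation propagates derivatives through arithmetic using dual numbers. The following is a minimal self-contained sketch of the idea; it is not numdiff's actual Dual type, whose API differs.

```rust
use std::ops::{Add, Mul};

// Minimal first-order dual number: value plus derivative coefficient.
// Illustrative stand-in only, not numdiff's `Dual` struct.
#[derive(Clone, Copy, Debug)]
struct Dual {
    re: f64, // function value
    du: f64, // derivative (coefficient of the infinitesimal part)
}

impl Add for Dual {
    type Output = Dual;
    fn add(self, rhs: Dual) -> Dual {
        Dual { re: self.re + rhs.re, du: self.du + rhs.du }
    }
}

impl Mul for Dual {
    type Output = Dual;
    fn mul(self, rhs: Dual) -> Dual {
        // Product rule: (a + b eps)(c + d eps) = ac + (ad + bc) eps.
        Dual { re: self.re * rhs.re, du: self.re * rhs.du + self.du * rhs.re }
    }
}

// f(x) = x^2 + x, so f'(x) = 2x + 1.
fn f(x: Dual) -> Dual {
    x * x + x
}

fn main() {
    // Seed the derivative part with 1 to differentiate with respect to x.
    let y = f(Dual { re: 3.0, du: 1.0 });
    println!("f(3) = {}, f'(3) = {}", y.re, y.du); // 12 and 7
}
```

Evaluating f once on a dual input yields both the value and the exact derivative, with no truncation error, which is the property the macros below exploit.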

This crate provides generic functions (for numerical differentiation) and macros (for automatic differentiation) to evaluate various derivatives of the following classes of functions:

  • Univariate, scalar-valued functions ($f:\mathbb{R}\to\mathbb{R}$)
  • Univariate, vector-valued functions ($\mathbf{f}:\mathbb{R}\to\mathbb{R}^{m}$)
  • Multivariate, scalar-valued functions ($f:\mathbb{R}^{n}\to\mathbb{R}$)
  • Multivariate, vector-valued functions ($\mathbf{f}:\mathbb{R}^{n}\to\mathbb{R}^{m}$)

These functions and macros are made generic over the choice of vector representation, as long as the vector type implements the linalg_traits::Vector trait. See the linalg_traits documentation for more information.

§Automatic Differentiation (Forward-Mode)

§1st-Order Derivatives

| Derivative Type | Function Type | Macro to Generate Derivative Function |
|---|---|---|
| derivative | $f:\mathbb{R}\to\mathbb{R}$ | get_sderivative! |
| derivative | $\mathbf{f}:\mathbb{R}\to\mathbb{R}^{m}$ | get_vderivative! |
| partial derivative | $f:\mathbb{R}^{n}\to\mathbb{R}$ | get_spartial_derivative! |
| partial derivative | $\mathbf{f}:\mathbb{R}^{n}\to\mathbb{R}^{m}$ | get_vpartial_derivative! |
| gradient | $f:\mathbb{R}^{n}\to\mathbb{R}$ | get_gradient! |
| directional derivative | $f:\mathbb{R}^{n}\to\mathbb{R}$ | get_directional_derivative! |
| Jacobian | $\mathbf{f}:\mathbb{R}^{n}\to\mathbb{R}^{m}$ | get_jacobian! |

§2nd-Order Derivatives

| Derivative Type | Function Type | Macro to Generate Derivative Function |
|---|---|---|
| 2nd derivative | $f:\mathbb{R}\to\mathbb{R}$ | get_sderivative2! |
| 2nd derivative | $\mathbf{f}:\mathbb{R}\to\mathbb{R}^{m}$ | get_vderivative2! |
| Hessian | $f:\mathbb{R}^{n}\to\mathbb{R}$ | get_shessian! |
| Hessian | $\mathbf{f}:\mathbb{R}^{n}\to\mathbb{R}^{m}$ | get_vhessian! |
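Second derivatives require carrying second-order information through the arithmetic, which is what hyper-dual numbers do. Below is a minimal self-contained sketch of the idea for a univariate function; it is not numdiff's actual HyperDual type, whose API differs.

```rust
use std::ops::Mul;

// Minimal second-order hyper-dual number: value, two first-order parts,
// and one cross (second-order) part. Illustrative stand-in only, not
// numdiff's `HyperDual` struct.
#[derive(Clone, Copy, Debug)]
struct HyperDual {
    re: f64,  // function value
    e1: f64,  // first derivative (first seed)
    e2: f64,  // first derivative (second seed)
    e12: f64, // second derivative
}

impl Mul for HyperDual {
    type Output = HyperDual;
    fn mul(self, rhs: HyperDual) -> HyperDual {
        // Product rule applied to each component; the e12 component
        // carries the second-order cross terms.
        HyperDual {
            re: self.re * rhs.re,
            e1: self.re * rhs.e1 + self.e1 * rhs.re,
            e2: self.re * rhs.e2 + self.e2 * rhs.re,
            e12: self.re * rhs.e12 + self.e1 * rhs.e2
                + self.e2 * rhs.e1 + self.e12 * rhs.re,
        }
    }
}

// Seed both first-order parts with 1 so e12 accumulates f''(x).
fn seed(x: f64) -> HyperDual {
    HyperDual { re: x, e1: 1.0, e2: 1.0, e12: 0.0 }
}

// f(x) = x^3, so f''(x) = 6x.
fn f(x: HyperDual) -> HyperDual {
    x * x * x
}

fn main() {
    let y = f(seed(2.0));
    println!("f(2) = {}, f''(2) = {}", y.re, y.e12); // 8 and 12
}
```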

§Passing Runtime Parameters

Often, we want to differentiate functions that also depend on parameters defined at runtime. However, the automatic differentiation macros operate at compile time, so we cannot simply “capture” these parameters using closures. To solve this problem, all automatic differentiation macros expect the function being differentiated to accept not only the point at which it is differentiated, but also a runtime parameter of an arbitrary type (e.g. a &[f64], or a &T where T is some custom struct).

Examples are included for each macro (for example, here is the example for the get_jacobian! macro): get_jacobian! - Example Passing Custom Parameter Types

§Limitations

  • These macros only work on functions that are generic both over the type of scalar and the type of vector.
    • Consequently, these macros do not work on closures.
  • Constants (e.g. 5.0_f64) must be constructed with linalg_traits::Scalar::new; if a function has the generic parameter S: Scalar, write S::new(5.0) instead of 5.0_f64.
  • When defining functions that operate on generic scalars (to make them compatible with automatic differentiation), we cannot write an assignment operation such as 1.0 += x when x: S where S: Scalar.
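The "generic over the scalar type" requirement can be sketched as follows. The trait below is a hypothetical stand-in for linalg_traits::Scalar (whose actual definition differs); the point is the S::new(5.0) pattern for constants.

```rust
use std::ops::{Add, Mul};

// Hypothetical minimal scalar trait with a `new` constructor; the real
// `linalg_traits::Scalar` trait has more bounds, but the pattern is the same.
trait Scalar: Copy + Add<Output = Self> + Mul<Output = Self> {
    fn new(value: f64) -> Self;
}

impl Scalar for f64 {
    fn new(value: f64) -> Self {
        value
    }
}

// Generic over the scalar type, so the same function body also works when S
// is a dual-number type. Note the constant is S::new(5.0), not 5.0_f64,
// since 5.0_f64 * x would not type-check for a non-f64 scalar.
fn f<S: Scalar>(x: S) -> S {
    x * x + S::new(5.0) * x
}

fn main() {
    println!("{}", f(2.0_f64)); // 4 + 10 = 14
}
```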

§Alternatives

Some crates in the Rust ecosystem already implement dual numbers. Originally, I intended to build the automatic differentiation functions in this crate on top of one of those implementations. However, each crate had certain shortcomings that ultimately led me to provide a custom implementation of dual numbers in this crate. The alternatives are described below.

§num-dual
§autodj
  • Can only differentiate functions written using custom types, such as DualF64.
  • Multivariate functions, especially those with a dynamic number of variables, can be extremely clunky (see this example).
§autodiff

§Finite Difference Methods

§Central Difference Approximations

§First-Order Derivatives

| Derivative Type | Function Type | Function to Approximate Derivative |
|---|---|---|
| derivative | $f:\mathbb{R}\to\mathbb{R}$ | central_difference::sderivative() |
| derivative | $\mathbf{f}:\mathbb{R}\to\mathbb{R}^{m}$ | central_difference::vderivative() |
| partial derivative | $f:\mathbb{R}^{n}\to\mathbb{R}$ | central_difference::spartial_derivative() |
| partial derivative | $\mathbf{f}:\mathbb{R}^{n}\to\mathbb{R}^{m}$ | central_difference::vpartial_derivative() |
| gradient | $f:\mathbb{R}^{n}\to\mathbb{R}$ | central_difference::gradient() |
| directional derivative | $f:\mathbb{R}^{n}\to\mathbb{R}$ | central_difference::directional_derivative() |
| Jacobian | $\mathbf{f}:\mathbb{R}^{n}\to\mathbb{R}^{m}$ | central_difference::jacobian() |
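The central difference formula underlying these functions is $f'(x) \approx (f(x+h) - f(x-h))/(2h)$, which is second-order accurate in the step size $h$. A minimal sketch (numdiff's actual functions choose the step size internally):

```rust
// Central difference approximation of f'(x): (f(x + h) - f(x - h)) / (2h).
// Illustrative only; the crate's central_difference::sderivative() has a
// different signature and picks h itself.
fn central_difference(f: impl Fn(f64) -> f64, x: f64, h: f64) -> f64 {
    (f(x + h) - f(x - h)) / (2.0 * h)
}

fn main() {
    // d/dx sin(x) at x = 1 is cos(1) ≈ 0.5403.
    let df = central_difference(|x| x.sin(), 1.0, 1e-6);
    println!("{df}");
}
```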

§Second-Order Derivatives

| Derivative Type | Function Type | Function to Approximate Derivative |
|---|---|---|
| Hessian | $f:\mathbb{R}^{n}\to\mathbb{R}$ | central_difference::shessian() |
| Hessian | $\mathbf{f}:\mathbb{R}^{n}\to\mathbb{R}^{m}$ | central_difference::vhessian() |

§Forward Difference Approximations

§First-Order Derivatives

| Derivative Type | Function Type | Function to Approximate Derivative |
|---|---|---|
| derivative | $f:\mathbb{R}\to\mathbb{R}$ | forward_difference::sderivative() |
| derivative | $\mathbf{f}:\mathbb{R}\to\mathbb{R}^{m}$ | forward_difference::vderivative() |
| partial derivative | $f:\mathbb{R}^{n}\to\mathbb{R}$ | forward_difference::spartial_derivative() |
| partial derivative | $\mathbf{f}:\mathbb{R}^{n}\to\mathbb{R}^{m}$ | forward_difference::vpartial_derivative() |
| gradient | $f:\mathbb{R}^{n}\to\mathbb{R}$ | forward_difference::gradient() |
| directional derivative | $f:\mathbb{R}^{n}\to\mathbb{R}$ | forward_difference::directional_derivative() |
| Jacobian | $\mathbf{f}:\mathbb{R}^{n}\to\mathbb{R}^{m}$ | forward_difference::jacobian() |

§Second-Order Derivatives

| Derivative Type | Function Type | Function to Approximate Derivative |
|---|---|---|
| Hessian | $f:\mathbb{R}^{n}\to\mathbb{R}$ | forward_difference::shessian() |
| Hessian | $\mathbf{f}:\mathbb{R}^{n}\to\mathbb{R}^{m}$ | forward_difference::vhessian() |
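The forward difference formula is $f'(x) \approx (f(x+h) - f(x))/h$: one fewer function evaluation per derivative than central differences, at the cost of first-order (rather than second-order) accuracy. A minimal sketch, including a gradient built by perturbing one coordinate at a time (the same idea behind the gradient function above, though numdiff's signatures differ):

```rust
// Forward difference approximation of f'(x): (f(x + h) - f(x)) / h.
fn forward_difference(f: impl Fn(f64) -> f64, x: f64, h: f64) -> f64 {
    (f(x + h) - f(x)) / h
}

// Gradient of a multivariate, scalar-valued function: reuse f(x) and
// perturb one coordinate at a time, so n + 1 evaluations total.
fn gradient(f: impl Fn(&[f64]) -> f64, x: &[f64], h: f64) -> Vec<f64> {
    let fx = f(x);
    (0..x.len())
        .map(|i| {
            let mut xp = x.to_vec();
            xp[i] += h;
            (f(&xp) - fx) / h
        })
        .collect()
}

fn main() {
    // f(x, y) = x^2 + 3y has gradient (2x, 3); at (2, 1) that is (4, 3).
    let g = gradient(|v| v[0] * v[0] + 3.0 * v[1], &[2.0, 1.0], 1e-7);
    println!("{g:?}");
}
```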

§Passing Runtime Parameters

For the finite difference methods, we can simply capture any runtime parameters using closures.

Examples are included for each function (for example, here is the example for the central_difference::jacobian() function): central_difference::jacobian() - Example Passing Runtime Parameters
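In contrast to the automatic differentiation macros, a closure works directly here. A minimal sketch (the helper below is illustrative, not the crate's API):

```rust
// Central difference approximation of f'(x); illustrative helper only.
fn central_difference(f: impl Fn(f64) -> f64, x: f64, h: f64) -> f64 {
    (f(x + h) - f(x - h)) / (2.0 * h)
}

fn main() {
    // Parameters known only at runtime.
    let a = 3.0;
    let b = 0.5;
    // f(x) = a x^2 + b; the closure captures a and b, so f'(x) = 2 a x.
    let f = move |x: f64| a * x * x + b;
    let df = central_difference(f, 2.0, 1e-6);
    println!("{df}"); // close to 12
}
```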

§Modules

- central_difference: Central difference approximations.
- constants: Constants used for numerical differentiation.
- forward_difference: Forward difference approximations.

§Macros

- get_directional_derivative: Get a function that returns the directional derivative of the provided multivariate, scalar-valued function.
- get_gradient: Get a function that returns the gradient of the provided multivariate, scalar-valued function.
- get_jacobian: Get a function that returns the Jacobian of the provided multivariate, vector-valued function.
- get_sderivative: Get a function that returns the derivative of the provided univariate, scalar-valued function.
- get_sderivative2: Get a function that returns the second derivative of the provided univariate, scalar-valued function.
- get_shessian: Get a function that returns the Hessian of the provided multivariate, scalar-valued function.
- get_spartial_derivative: Get a function that returns the partial derivative of the provided multivariate, scalar-valued function.
- get_vderivative: Get a function that returns the derivative of the provided univariate, vector-valued function.
- get_vderivative2: Get a function that returns the second derivative of the provided univariate, vector-valued function.
- get_vhessian: Get a function that returns the Hessian of the provided multivariate, vector-valued function.
- get_vpartial_derivative: Get a function that returns the partial derivative of the provided multivariate, vector-valued function.

§Structs

- Dual: First-order dual number.
- HyperDual: Second-order hyper-dual number.

§Traits

- DualVector: Trait to create a vector of dual numbers.
- HyperDualVector: Trait to create a vector of hyper-dual numbers.