Module optim


Riemannian optimization algorithms.

Deprecated: this module has moved to descend::riemannian (enable the riemannian feature in the descend crate). The symbols here remain for backward compatibility but will be removed in a future release.

Gradient-based optimization on manifolds differs from Euclidean optimization in two ways:

  1. Gradients live in tangent spaces, not in the ambient space.
  2. Updates follow geodesics (via exp_map), not straight lines.

This module provides riemannian_sgd_step and riemannian_adam_step, which operate on any type implementing Manifold.
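The two differences above can be sketched concretely. The following is a minimal, self-contained illustration, not the crate's actual API: the Manifold trait here is a hypothetical stand-in (the real trait lives in the descend crate and is generic over point and tangent types), implemented for the unit sphere, where both the tangent projection and the exponential map have closed forms.

```rust
// Hypothetical stand-in for the Manifold trait this module expects;
// the real trait is defined in the descend crate.
trait Manifold {
    /// Project an ambient-space gradient onto the tangent space at `point`.
    fn project_tangent(&self, point: &[f64; 3], grad: &[f64; 3]) -> [f64; 3];
    /// Follow the geodesic from `point` in direction `v` (exponential map).
    fn exp_map(&self, point: &[f64; 3], v: &[f64; 3]) -> [f64; 3];
}

struct UnitSphere;

impl Manifold for UnitSphere {
    fn project_tangent(&self, p: &[f64; 3], g: &[f64; 3]) -> [f64; 3] {
        // Tangent projection on the sphere: g - <g, p> p
        let dot = p[0] * g[0] + p[1] * g[1] + p[2] * g[2];
        [g[0] - dot * p[0], g[1] - dot * p[1], g[2] - dot * p[2]]
    }
    fn exp_map(&self, p: &[f64; 3], v: &[f64; 3]) -> [f64; 3] {
        // Closed-form exponential map: exp_p(v) = cos(|v|) p + sin(|v|) v/|v|
        let norm = (v[0] * v[0] + v[1] * v[1] + v[2] * v[2]).sqrt();
        if norm < 1e-12 {
            return *p;
        }
        let (s, c) = norm.sin_cos();
        [
            c * p[0] + s * v[0] / norm,
            c * p[1] + s * v[1] / norm,
            c * p[2] + s * v[2] / norm,
        ]
    }
}

/// One Riemannian SGD step: project the Euclidean gradient into the
/// tangent space, scale by -lr, then follow the geodesic via exp_map.
fn riemannian_sgd_step<M: Manifold>(
    m: &M,
    point: &[f64; 3],
    grad: &[f64; 3],
    lr: f64,
) -> [f64; 3] {
    let tangent = m.project_tangent(point, grad);
    let step = [-lr * tangent[0], -lr * tangent[1], -lr * tangent[2]];
    m.exp_map(point, &step)
}

fn main() {
    let sphere = UnitSphere;
    let p = [1.0, 0.0, 0.0];
    let g = [0.0, 1.0, 0.0]; // Euclidean gradient at p
    let q = riemannian_sgd_step(&sphere, &p, &g, 0.1);
    let norm = (q[0] * q[0] + q[1] * q[1] + q[2] * q[2]).sqrt();
    println!("new point: {:?}", q);
    // Because the update follows a geodesic, the iterate stays on the sphere.
    assert!((norm - 1.0).abs() < 1e-10);
}
```

Note that a naive Euclidean update `p - lr * g` would leave the sphere, which is exactly the problem exp_map solves.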

§References

  • Bonnabel (2013), “Stochastic Gradient Descent on Riemannian Manifolds” – foundational convergence analysis for Riemannian SGD.
  • Becigneul & Ganea (2019), “Riemannian Adaptive Optimization Methods” (ICLR) – extends Adam to Riemannian manifolds using parallel transport for moment vectors.
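The moment-transport idea from the second reference can also be sketched: after the point moves, Adam's moment vectors live in the wrong tangent space and must be carried to the new point. The snippet below is a hedged illustration using vector transport by tangent projection on the unit sphere (a common cheap substitute for exact parallel transport); the function name is hypothetical and not part of this module, whose Adam state lives in RiemannianAdamState.

```rust
/// Vector transport by projection on the unit sphere: map a tangent vector
/// `v` (tangent at the previous iterate) into the tangent space at the new
/// point `q` by removing its normal component: v - <v, q> q.
fn transport_by_projection(q: &[f64; 3], v: &[f64; 3]) -> [f64; 3] {
    let dot = v[0] * q[0] + v[1] * q[1] + v[2] * q[2];
    [v[0] - dot * q[0], v[1] - dot * q[1], v[2] - dot * q[2]]
}

fn main() {
    let q = [0.0, 0.0, 1.0];
    let stale_moment = [1.0, 0.0, 0.5]; // tangent at the old point, not at q
    let m = transport_by_projection(&q, &stale_moment);
    // The transported moment is orthogonal to q, i.e. tangent at q,
    // so the next Adam update direction is again a valid tangent vector.
    let dot = m[0] * q[0] + m[1] * q[1] + m[2] * q[2];
    assert!(dot.abs() < 1e-12);
    println!("transported moment: {:?}", m);
}
```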

Structs§

RiemannianAdamState (Deprecated)
State for the Riemannian Adam optimizer.

Functions§

geodesic_distance (Deprecated)
Geodesic distance between two points on a manifold.
riemannian_adam_step (Deprecated)
Riemannian Adam step (Becigneul & Ganea, 2019).
riemannian_sgd_step (Deprecated)
Riemannian SGD step.