Riemannian optimization algorithms.
Deprecated: this module has moved to descend::riemannian (enable the
riemannian feature in the descend crate). The symbols here remain for
backward compatibility but will be removed in a future release.
Gradient-based optimization on manifolds differs from Euclidean optimization in two ways:
- Gradients live in tangent spaces, not in the ambient space.
- Updates follow geodesics (via exp_map), not straight lines.
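Concretely, where Euclidean gradient descent takes the straight-line step x_{t+1} = x_t - η∇f(x_t), the Riemannian step replaces the gradient with its tangent-space counterpart and the straight line with a geodesic. A sketch in standard notation (not taken verbatim from this crate's docs):

```latex
x_{t+1} = \exp_{x_t}\!\big(-\eta \,\operatorname{grad} f(x_t)\big),
\qquad \operatorname{grad} f(x_t) \in T_{x_t}\mathcal{M},
```

where exp_{x_t} is the exponential map at x_t and T_{x_t}M is the tangent space there.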
This module provides riemannian_sgd_step and riemannian_adam_step
that operate on any type implementing Manifold.
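A minimal sketch of what such a step looks like, using the unit sphere S² as the manifold. The trait and function signatures below are illustrative assumptions, not this crate's actual API:

```rust
// Hypothetical shapes for illustration only; the real `Manifold` trait and
// `riemannian_sgd_step` signature in this crate may differ.
type Vec3 = [f64; 3];

fn dot(a: &Vec3, b: &Vec3) -> f64 {
    a.iter().zip(b.iter()).map(|(x, y)| x * y).sum()
}

fn norm(a: &Vec3) -> f64 {
    dot(a, a).sqrt()
}

trait Manifold {
    /// Project an ambient-space gradient onto the tangent space at `p`.
    fn project(&self, p: &Vec3, g: &Vec3) -> Vec3;
    /// Exponential map: follow the geodesic from `p` along tangent vector `v`.
    fn exp_map(&self, p: &Vec3, v: &Vec3) -> Vec3;
}

struct Sphere;

impl Manifold for Sphere {
    fn project(&self, p: &Vec3, g: &Vec3) -> Vec3 {
        // Remove the component normal to the sphere: g - <g, p> p.
        let c = dot(g, p);
        [g[0] - c * p[0], g[1] - c * p[1], g[2] - c * p[2]]
    }

    fn exp_map(&self, p: &Vec3, v: &Vec3) -> Vec3 {
        // Great-circle geodesic: cos(|v|) p + sin(|v|) v / |v|.
        let t = norm(v);
        if t < 1e-12 {
            return *p;
        }
        let (s, c) = (t.sin() / t, t.cos());
        [c * p[0] + s * v[0], c * p[1] + s * v[1], c * p[2] + s * v[2]]
    }
}

/// One Riemannian SGD step: project the gradient, scale by -lr,
/// then follow the geodesic.
fn riemannian_sgd_step<M: Manifold>(m: &M, p: &Vec3, ambient_grad: &Vec3, lr: f64) -> Vec3 {
    let g = m.project(p, ambient_grad);
    m.exp_map(p, &[-lr * g[0], -lr * g[1], -lr * g[2]])
}

fn main() {
    // Minimize f(x) = <x, a> on the sphere; the minimum is at x = -a.
    let a: Vec3 = [0.0, 0.0, 1.0];
    let mut x: Vec3 = [1.0, 0.0, 0.0];
    for _ in 0..200 {
        // The ambient gradient of <x, a> is simply a.
        x = riemannian_sgd_step(&Sphere, &x, &a, 0.1);
    }
    println!("x ≈ {:?}", x); // converges toward [0.0, 0.0, -1.0]
}
```

Note that the iterates stay exactly on the sphere at every step: because the update moves along a geodesic via exp_map, no re-normalization back onto the manifold is needed, which is the point of the Riemannian formulation.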
References§
- Bonnabel (2013), “Stochastic Gradient Descent on Riemannian Manifolds” – foundational convergence analysis for Riemannian SGD.
- Becigneul & Ganea (2019), “Riemannian Adaptive Optimization Methods” (ICLR) – extends Adam to Riemannian manifolds using parallel transport for moment vectors.
Structs§
- RiemannianAdamState Deprecated - State for the Riemannian Adam optimizer.
Functions§
- geodesic_distance Deprecated - Geodesic distance between two points on a manifold.
- riemannian_adam_step Deprecated - Riemannian Adam step (Becigneul & Ganea, 2019).
- riemannian_sgd_step Deprecated - Riemannian SGD step.