Journal
IMA JOURNAL OF NUMERICAL ANALYSIS
Volume 39, Issue 1, Pages 1-33
Publisher
OXFORD UNIV PRESS
DOI: 10.1093/imanum/drx080
Keywords
complexity; gradient descent; trust-region method; Riemannian optimization; optimization on manifolds
Funding
- Fonds Speciaux de Recherche (FSR) at UCLouvain
- Chaire Havas 'Chaire Economie et gestion des nouvelles donnees'
- ERC Starting Grant SIPA
- Inria
- ENS
- National Science Foundation [DMS-1719558]
- Natural Environment Research Council [NE/L012146/1]
Abstract
We consider the minimization of a cost function f on a manifold M using Riemannian gradient descent and Riemannian trust regions (RTR). We focus on satisfying necessary optimality conditions within a tolerance ε. Specifically, we show that, under Lipschitz-type assumptions on the pullbacks of f to the tangent spaces of M, both of these algorithms produce points with Riemannian gradient smaller than ε in O(1/ε^2) iterations. Furthermore, RTR returns a point where, in addition, the least eigenvalue of the Riemannian Hessian is larger than -ε in O(1/ε^3) iterations. There are no assumptions on initialization. The rates match their (sharp) unconstrained counterparts as a function of the accuracy ε (up to constants) and hence are sharp in that sense. These are the first deterministic results for global rates of convergence to approximate first- and second-order Karush-Kuhn-Tucker points on manifolds. They apply in particular to optimization constrained to compact submanifolds of R^n, under simpler assumptions.
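To make the abstract's setting concrete, the following is a minimal sketch (not the paper's implementation) of Riemannian gradient descent on the simplest compact submanifold of R^n, the unit circle S^1 in R^2. The cost, step size, starting point, and stopping tolerance are all illustrative assumptions: we minimize the linear cost f(p) = <a, p> with a = (1, 0), whose global minimizer on the circle is p* = (-1, 0). The Riemannian gradient is the Euclidean gradient projected onto the tangent space at p, and normalization back to the circle serves as the retraction; the loop stops at approximate first-order stationarity, i.e. when the Riemannian gradient norm drops below ε.

```python
import math

def rgd_on_circle(p, a=(1.0, 0.0), step=0.5, eps=1e-8, max_iter=10000):
    """Illustrative Riemannian gradient descent on the unit circle S^1,
    minimizing f(p) = <a, p>. All parameter values are assumptions."""
    # Normalize the starting point onto the circle.
    n = math.hypot(p[0], p[1])
    p = (p[0] / n, p[1] / n)
    for _ in range(max_iter):
        inner = a[0] * p[0] + a[1] * p[1]
        # Riemannian gradient: project the Euclidean gradient a onto the
        # tangent space at p (the orthogonal complement of p).
        g = (a[0] - inner * p[0], a[1] - inner * p[1])
        if math.hypot(g[0], g[1]) < eps:
            break  # approximate first-order stationarity reached
        # Step in the tangent direction, then retract back to the circle
        # by renormalizing (the metric-projection retraction).
        q = (p[0] - step * g[0], p[1] - step * g[1])
        n = math.hypot(q[0], q[1])
        p = (q[0] / n, q[1] / n)
    return p

p = rgd_on_circle((0.3, 0.95))
print(p)  # converges near the minimizer (-1, 0)
```

The projection-then-retract structure is exactly what makes the method "Riemannian": iterates stay on the manifold by construction, so no penalty or feasibility restoration is needed, and the paper's O(1/ε^2) bound counts iterations of precisely this kind of loop.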