Article

Global rates of convergence for nonconvex optimization on manifolds

Journal

IMA JOURNAL OF NUMERICAL ANALYSIS
Volume 39, Issue 1, Pages 1-33

Publisher

OXFORD UNIV PRESS
DOI: 10.1093/imanum/drx080

Keywords

complexity; gradient descent; trust-region method; Riemannian optimization; optimization on manifolds

Funding

  1. Fonds Speciaux de Recherche (FSR) at UCLouvain
  2. Chaire Havas 'Chaire Economie et gestion des nouvelles donnees'
  3. ERC Starting Grant SIPA
  4. Inria
  5. ENS
  6. National Science Foundation [DMS-1719558]
  7. Natural Environment Research Council [NE/L012146/1]
  8. NERC [NE/L012146/1] Funding Source: UKRI

Abstract

We consider the minimization of a cost function f on a manifold M using Riemannian gradient descent and Riemannian trust regions (RTR). We focus on satisfying necessary optimality conditions within a tolerance epsilon. Specifically, we show that, under Lipschitz-type assumptions on the pullbacks of f to the tangent spaces of M, both of these algorithms produce points with Riemannian gradient smaller than epsilon in O(1/epsilon^2) iterations. Furthermore, RTR returns a point where, additionally, the least eigenvalue of the Riemannian Hessian is larger than -epsilon in O(1/epsilon^3) iterations. There are no assumptions on initialization. The rates match their (sharp) unconstrained counterparts as a function of the accuracy epsilon (up to constants), and hence are sharp in that sense. These are the first deterministic results for global rates of convergence to approximate first- and second-order Karush-Kuhn-Tucker points on manifolds. They apply in particular to optimization constrained to compact submanifolds of R^n, under simpler assumptions.
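To make the setting concrete, here is a minimal sketch (not the paper's code) of Riemannian gradient descent on the unit sphere, a compact submanifold of R^n, minimizing the Rayleigh quotient f(x) = x^T A x. The function name, step-size choice, and test matrix are illustrative assumptions; the Riemannian gradient is obtained by projecting the Euclidean gradient onto the tangent space, and the retraction is metric projection (normalization), stopping once the gradient norm falls below a tolerance epsilon as in the first-order guarantee above.

```python
import numpy as np

def riemannian_gradient_descent(A, x0, eps=1e-8, max_iter=50000):
    """Minimize f(x) = x^T A x over the unit sphere (illustrative sketch)."""
    step = 1.0 / (2.0 * np.linalg.norm(A, 2))   # safe step from the gradient Lipschitz constant
    x = x0 / np.linalg.norm(x0)                 # start on the manifold
    for k in range(max_iter):
        egrad = 2.0 * A @ x                     # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x         # project onto the tangent space at x
        if np.linalg.norm(rgrad) < eps:         # approximate first-order condition
            return x, k
        x = x - step * rgrad                    # step along the negative Riemannian gradient
        x = x / np.linalg.norm(x)               # retract back onto the sphere
    return x, max_iter

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = B + B.T                                     # symmetric test matrix (assumption)
x, iters = riemannian_gradient_descent(A, rng.standard_normal(5))
print(x @ A @ x)                                # should approach the smallest eigenvalue of A
```

With a random initialization, the iterates generically converge to a minimizer, where f equals the smallest eigenvalue of A; the O(1/epsilon^2) bound above guarantees the gradient tolerance is met regardless of where the method starts.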
