Article

ON THE COMPLEXITY OF STEEPEST DESCENT, NEWTON'S AND REGULARIZED NEWTON'S METHODS FOR NONCONVEX UNCONSTRAINED OPTIMIZATION PROBLEMS

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 20, Issue 6, Pages 2833-2852

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/090774100

Keywords

nonlinear optimization; unconstrained optimization; steepest-descent method; Newton's method; trust-region methods; cubic regularization; global complexity bounds; global rate of convergence

Funding

  1. Royal Society [14265]
  2. EPSRC [EP/E053351/1, EP/G038643/1]
  3. Sciences et Technologies pour l'Aeronautique et l'Espace (STAE) Foundation (Toulouse, France) within the Reseau Thematique de Recherche Avancee (RTRA)
  4. EPSRC [EP/F005369/1] Funding Source: UKRI
  5. Engineering and Physical Sciences Research Council [EP/F005369/1, EP/E053351/1] Funding Source: researchfish

Abstract

It is shown that the steepest-descent and Newton's methods for unconstrained nonconvex optimization may, under standard assumptions, both require a number of iterations and function evaluations arbitrarily close to O(ε^(-2)) to drive the norm of the gradient below ε. This shows that the known upper bound of O(ε^(-2)) evaluations for steepest descent is tight, and that Newton's method may be as slow as the steepest-descent method in the worst case. The improved evaluation complexity bound of O(ε^(-3/2)) evaluations known for cubically regularized Newton's methods is also shown to be tight.
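The complexity bounds above count iterations (equivalently, gradient evaluations) needed to reach ||∇f(x)|| ≤ ε. As a minimal illustration of the iteration being analyzed, the sketch below implements fixed-step steepest descent with that stopping test; the step size and the quadratic test function are illustrative assumptions, not the worst-case construction used in the paper.

```python
import math

def steepest_descent(grad, x0, eps=1e-6, step=0.1, max_iter=100000):
    """Fixed-step steepest descent: x_{k+1} = x_k - step * grad(x_k).

    Stops when the gradient norm drops to eps or below, and returns
    (final iterate, number of iterations used) so the evaluation count
    discussed in the complexity bounds can be observed directly.
    """
    x = list(x0)
    for k in range(max_iter):
        g = grad(x)
        if math.sqrt(sum(gi * gi for gi in g)) <= eps:
            return x, k
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x, max_iter

# Illustrative run on f(x) = 0.5 * ||x||^2, whose gradient is x itself.
x_final, iters = steepest_descent(lambda x: x, [1.0, 1.0],
                                  eps=1e-3, step=0.1)
```

On this strongly convex quadratic the method converges linearly, so the iteration count grows only like log(1/ε); the paper's point is that on suitably constructed nonconvex instances the count can instead approach the ε^(-2) worst case.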

