Article

Performance of first-order methods for smooth convex minimization: a novel approach

Journal

MATHEMATICAL PROGRAMMING
Volume 145, Issue 1-2, Pages 451-482

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s10107-013-0653-0

Keywords

Performance of first-order algorithms; Rate of convergence; Complexity; Smooth convex minimization; Duality; Semidefinite relaxations; Fast gradient schemes; Heavy Ball method

Funding

  1. Israel Science Foundation (ISF) [998-12]

Abstract

We introduce a novel approach for analyzing the worst-case performance of first-order black-box optimization methods. We focus on smooth unconstrained convex minimization over the Euclidean space. Our approach relies on the observation that, by definition, the worst-case behavior of a black-box optimization method is itself an optimization problem, which we call the performance estimation problem (PEP). We formulate and analyze the PEP for two classes of first-order algorithms. We first apply this approach to the classical gradient method and derive a new and tight analytical bound on its performance. We then consider a broader class of first-order black-box methods, which, among others, includes the so-called heavy-ball method and the fast gradient schemes. We show that for this broader class, it is possible to derive new bounds on the performance of these methods by solving an adequately relaxed convex semidefinite PEP. Finally, we present an efficient procedure for finding optimal step sizes, which yields a first-order black-box method that achieves the best worst-case performance.
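As a concrete illustration of the kind of guarantee the abstract refers to: for the gradient method with step size 1/L on an L-smooth convex function, the paper derives the tight worst-case bound f(x_N) - f* ≤ L‖x0 - x*‖² / (4N + 2). The sketch below is only a numerical sanity check of that inequality on a simple quadratic, not an implementation of the PEP itself; the function choice and constants are illustrative assumptions.

```python
def gradient_method(grad, x0, step, n_steps):
    """Plain gradient descent with a fixed step size."""
    x = x0
    for _ in range(n_steps):
        x = x - step * grad(x)
    return x

# Illustrative example: f(x) = (mu/2) x^2 is L-smooth for any L >= mu,
# with minimizer x* = 0 and f* = 0.
L, mu = 1.0, 0.3
f = lambda x: 0.5 * mu * x * x
grad = lambda x: mu * x

x0, N = 5.0, 10
xN = gradient_method(grad, x0, step=1.0 / L, n_steps=N)

gap = f(xN) - 0.0                 # f(x_N) - f*
bound = L * x0**2 / (4 * N + 2)   # tight worst-case bound from the paper
print(gap <= bound)
```

The PEP approach asks the converse question: over all L-smooth convex functions and all starting points with ‖x0 - x*‖ bounded, how large can the left-hand side be? Solving (a relaxation of) that maximization as a semidefinite program is what produces bounds of this form.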
