Article

Universal gradient methods for convex optimization problems

Journal

MATHEMATICAL PROGRAMMING
Volume 152, Issue 1-2, Pages 381-404

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s10107-014-0790-0

Keywords

Convex optimization; Black-box methods; Complexity bounds; Optimal methods; Weakly smooth functions

Funding

  1. Action de recherche concertée grant from the Direction de la recherche scientifique - Communauté française de Belgique [ARC 04/09-315]
  2. Laboratory of Structural Methods of Data Analysis in Predictive Modelling, through RF government grant [11.G34.31.0073]
  3. RFBR [13-01-12007 ofi_m, 14-01-00722-a]

Abstract

In this paper, we present new methods for black-box convex minimization. They do not need to know in advance the actual level of smoothness of the objective function. Their only essential input parameter is the required accuracy of the solution. At the same time, for each particular problem class they automatically ensure the best possible rate of convergence. We confirm our theoretical results by encouraging numerical experiments, which demonstrate that the fast rate of convergence, typical for smooth optimization problems, can sometimes be achieved even on nonsmooth problem instances.
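The key mechanism behind this adaptivity is a backtracking search on the step size that only uses the target accuracy, not any smoothness constant. Below is a minimal sketch of that idea in the spirit of the paper's universal primal gradient method, assuming a Euclidean setup; it is not the paper's exact algorithm or notation, and the function and parameter names are illustrative.

```python
import numpy as np

def universal_gradient_method(f, grad_f, x0, eps, L0=1.0, max_iter=1000):
    """Sketch of a universal primal gradient iteration (after Nesterov, 2015).

    The only essential input is the target accuracy `eps`: the inverse step
    size L is adapted by backtracking on a quadratic upper model with eps/2
    slack, so no smoothness parameters need to be known in advance.
    """
    x, L = np.asarray(x0, dtype=float), float(L0)
    for _ in range(max_iter):
        fx, gx = f(x), grad_f(x)
        while True:
            x_new = x - gx / L                       # gradient step with trial L
            d = x_new - x
            # Accept the step if the relaxed descent inequality holds:
            # f(x+) <= f(x) + <g, x+ - x> + (L/2)||x+ - x||^2 + eps/2
            if f(x_new) <= fx + gx @ d + 0.5 * L * (d @ d) + 0.5 * eps:
                break
            L *= 2.0                                 # backtrack: double L
        x, L = x_new, L / 2.0                        # relax L for the next iteration
    return x

# Illustrative usage on a nonsmooth objective, |x1| + 0.5*x2^2, with a subgradient:
f = lambda x: abs(x[0]) + 0.5 * x[1] ** 2
g = lambda x: np.array([np.sign(x[0]), x[1]])
x_approx = universal_gradient_method(f, g, x0=[3.0, -2.0], eps=1e-3)
```

For smooth objectives the accepted L settles near the true Lipschitz constant, while for nonsmooth ones the eps/2 slack lets the same inequality hold with a larger L; this single test is what lets one method cover both regimes without knowing which one applies.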
