Article

EFFICIENCY OF COORDINATE DESCENT METHODS ON HUGE-SCALE OPTIMIZATION PROBLEMS

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 22, Issue 2, Pages 341-362

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/100802001

Keywords

convex optimization; coordinate relaxation; worst-case efficiency estimates; fast gradient schemes; Google problem

Funding

  1. Direction de la recherche scientifique - Communauté française de Belgique [ARC 04/09-315]
  2. Office of Naval Research grant [N000140811104]
  3. Efficiently Computable Compressed Sensing
  4. Laboratory of Structural Methods of Data Analysis in Predictive Modelling, through the RF government grant [11.G34.31.0073]

Abstract

In this paper we propose new methods for solving huge-scale optimization problems. For problems of this size, even the simplest full-dimensional vector operations are very expensive. Hence, we propose to apply an optimization technique based on random partial updates of the decision variables. For these methods, we prove global estimates for the rate of convergence. Surprisingly, for certain classes of objective functions, our results are better than the standard worst-case bounds for deterministic algorithms. We present constrained and unconstrained versions of the method and its accelerated variant. Our numerical tests confirm the high efficiency of this technique on problems of very large size.
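To make the idea of random partial updates concrete, here is a minimal sketch of randomized coordinate descent on a convex quadratic f(x) = (1/2) x^T A x - b^T x with A symmetric positive definite. It illustrates the general technique only, not the paper's exact constrained or accelerated variants; the step size 1/L_i along coordinate i uses the coordinate-wise Lipschitz constant L_i = A[i, i]. The function name and test setup below are illustrative assumptions.

```python
import numpy as np

def random_coordinate_descent(A, b, iterations=10000, seed=0):
    """Minimize 0.5 * x^T A x - b^T x by updating one random coordinate per step."""
    rng = np.random.default_rng(seed)
    n = b.shape[0]
    x = np.zeros(n)
    L = np.diag(A)  # coordinate-wise Lipschitz constants of the gradient
    for _ in range(iterations):
        i = rng.integers(n)       # pick one coordinate uniformly at random
        g_i = A[i] @ x - b[i]     # partial derivative of f along coordinate i
        x[i] -= g_i / L[i]        # update only the chosen coordinate
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    M = rng.standard_normal((50, 50))
    A = M @ M.T + 50 * np.eye(50)  # well-conditioned SPD test matrix
    b = rng.standard_normal(50)
    x = random_coordinate_descent(A, b)
    print("residual norm:", np.linalg.norm(A @ x - b))
```

Each iteration touches only one row of A, so the per-step cost is O(n) rather than O(n^2); this is the property that makes such schemes attractive at huge scale, where even one full gradient evaluation is expensive.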
