Article

EFFICIENCY OF COORDINATE DESCENT METHODS ON HUGE-SCALE OPTIMIZATION PROBLEMS

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 22, Issue 2, Pages 341-362

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/100802001

Keywords

convex optimization; coordinate relaxation; worst-case efficiency estimates; fast gradient schemes; Google problem

Funding

  1. Direction de la recherche scientifique - Communaute francaise de Belgique [ARC 04/09-315]
  2. Office of Naval Research grant [N000140811104]
  3. Efficiently Computable Compressed Sensing
  4. Laboratory of Structural Methods of Data Analysis in Predictive Modelling, through the RF government grant [11.G34.31.0073]


In this paper we propose new methods for solving huge-scale optimization problems. For problems of this size, even the simplest full-dimensional vector operations are very expensive. Hence, we propose to apply an optimization technique based on random partial updates of the decision variables. For these methods, we prove global estimates for the rate of convergence. Surprisingly, for certain classes of objective functions, our results are better than the standard worst-case bounds for deterministic algorithms. We present constrained and unconstrained versions of the method and an accelerated variant. Our numerical tests confirm the high efficiency of this technique on problems of very large size.
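The random-partial-update idea the abstract describes can be illustrated by a minimal randomized coordinate descent sketch. This is an assumption-laden toy, not the paper's implementation: it minimizes a hypothetical quadratic f(x) = ½xᵀAx − bᵀx, picking one coordinate uniformly at random per step and taking an exact step of length 1/L_i, where L_i = A[i][i] is the coordinate-wise Lipschitz constant of the i-th partial derivative. Each iteration touches only one row of A, which is the point for huge-scale problems.

```python
import random

def coordinate_descent(A, b, iters=20000, seed=0):
    """Hedged sketch of randomized coordinate descent (not the paper's code).

    Minimizes f(x) = 0.5 * x^T A x - b^T x for symmetric positive-definite A.
    """
    n = len(b)
    rng = random.Random(seed)
    x = [0.0] * n
    for _ in range(iters):
        i = rng.randrange(n)
        # Partial derivative df/dx_i = (A x)_i - b_i, using only row i of A.
        g = sum(A[i][j] * x[j] for j in range(n)) - b[i]
        # Exact coordinate step: step size 1/L_i with L_i = A[i][i].
        x[i] -= g / A[i][i]
    return x

# Tiny usage example on hypothetical data (not from the paper):
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = coordinate_descent(A, b)
# x approaches the solution of A x = b, i.e. roughly [1/11, 7/11]
```

The quadratic case makes the per-step cost explicit: one inner product with a single row of A, versus a full matrix-vector product for a full-gradient method.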

