Journal
SIAM JOURNAL ON OPTIMIZATION
Volume 22, Issue 2, Pages 341-362
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/100802001
Keywords
convex optimization; coordinate relaxation; worst-case efficiency estimates; fast gradient schemes; Google problem
Funding
- Direction de la recherche scientifique - Communauté française de Belgique [ARC 04/09-315]
- Office of Naval Research grant [N000140811104]
- Efficiently Computable Compressed Sensing
- Laboratory of Structural Methods of Data Analysis in Predictive Modelling, through the RF government grant [11.G34.31.0073]
Abstract
In this paper we propose new methods for solving huge-scale optimization problems. For problems of this size, even the simplest full-dimensional vector operations are very expensive. Hence, we propose to apply an optimization technique based on random partial updates of the decision variables. For these methods, we prove global estimates for the rate of convergence. Surprisingly, for certain classes of objective functions, our results are better than the standard worst-case bounds for deterministic algorithms. We present constrained and unconstrained versions of the method, as well as an accelerated variant. Our numerical tests confirm the high efficiency of this technique on very large problems.
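The abstract's core idea, updating one randomly chosen coordinate per iteration instead of the full gradient, can be sketched for a quadratic objective. This is an illustrative toy, not the paper's exact algorithm: the test problem, matrix `A`, and uniform coordinate sampling are assumptions for demonstration; the per-coordinate step size 1/L_i uses the diagonal entry A[i, i] as the coordinate Lipschitz constant.

```python
import numpy as np

# Illustrative sketch of randomized coordinate descent (assumed setup, not
# the paper's exact method): minimize f(x) = 0.5 * x^T A x - b^T x for a
# symmetric positive definite A. Each iteration touches a single randomly
# chosen coordinate, so no full-dimensional vector operation is needed
# beyond the one inner product A[i] @ x.

rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # well-conditioned SPD matrix (assumption)
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)       # exact minimizer, for reference only

x = np.zeros(n)
for _ in range(20000):
    i = rng.integers(n)              # pick one coordinate uniformly at random
    grad_i = A[i] @ x - b[i]         # partial derivative of f w.r.t. x_i
    x[i] -= grad_i / A[i, i]         # step 1/L_i, with L_i = A[i, i]

print(np.linalg.norm(x - x_star))    # residual error; should be small
```

In a genuinely huge-scale setting one would additionally exploit sparsity so that `A[i] @ x` costs only the number of nonzeros in row i, which is the regime the paper targets.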