Article

Parallel Selective Algorithms for Nonconvex Big Data Optimization

Journal

IEEE TRANSACTIONS ON SIGNAL PROCESSING
Volume 63, Issue 7, Pages 1874-1889

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TSP.2015.2399858

Keywords

Parallel optimization; variable selection; distributed methods; Jacobi method; LASSO; sparse solution

Funding

  1. MIUR project PLATINO [PON01_01007]
  2. U.S. National Science Foundation [CMS 1218717]
  3. NSF CAREER Award [1254739]
  4. NSF Directorate for Engineering, Division of Electrical, Communications & Cyber Systems [1555850]

We propose a decomposition framework for the parallel optimization of the sum of a differentiable (possibly nonconvex) function and a (block) separable nonsmooth, convex one. The latter term is usually employed to enforce structure in the solution, typically sparsity. Our framework is very flexible and includes both fully parallel Jacobi schemes and Gauss-Seidel (i.e., sequential) ones, as well as virtually all possibilities in between with only a subset of variables updated at each iteration. Our theoretical convergence results improve on existing ones, and numerical results on LASSO, logistic regression, and some nonconvex quadratic problems show that the new method consistently outperforms existing algorithms.
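The abstract's selective Jacobi idea can be illustrated with a minimal sketch: all blocks compute a candidate proximal-gradient update in parallel, but only a subset of promising variables is actually updated each iteration. The code below is a generic illustration for LASSO, not the paper's exact algorithm; the function names, the step-size choice, and the greedy selection rule (largest candidate change) are assumptions for the sketch.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1; this is what enforces sparsity.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def selective_jacobi_step(A, b, x, lam, step, frac=0.5):
    """One selective Jacobi-type iteration for min 0.5||Ax-b||^2 + lam||x||_1.

    All coordinates form a candidate update in parallel (fully Jacobi),
    but only the fraction `frac` with the largest candidate change is
    applied -- an illustrative stand-in for the paper's variable selection.
    """
    grad = A.T @ (A @ x - b)                             # gradient of the smooth part
    cand = soft_threshold(x - step * grad, step * lam)   # parallel per-coordinate candidates
    score = np.abs(cand - x)                             # proxy for per-coordinate progress
    k = max(1, int(frac * x.size))
    sel = np.argsort(score)[-k:]                         # update only the top-k coordinates
    x_new = x.copy()
    x_new[sel] = cand[sel]
    return x_new

# Usage: a few iterations on a tiny synthetic sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x = np.zeros(20)
step = 1.0 / np.linalg.norm(A, 2) ** 2                   # 1/L for the quadratic part
for _ in range(200):
    x = selective_jacobi_step(A, b, x, lam=0.1, step=step)
```

With `frac=1.0` the scheme reduces to a fully parallel (ISTA-like) Jacobi update; with `frac` near `1/n` it behaves like a greedy Gauss-Seidel sweep, which is the "everything in between" flexibility the abstract describes.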

