Article

LINEAR CONVERGENCE OF THE ALTERNATING DIRECTION METHOD OF MULTIPLIERS FOR A CLASS OF CONVEX OPTIMIZATION PROBLEMS

Journal

SIAM JOURNAL ON NUMERICAL ANALYSIS
Volume 54, Issue 2, Pages 625-640

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/140974237

Keywords

global linear convergence; alternating direction method; piecewise linear multifunctions; alternating proximal gradient method; error bound

Funding

  1. NSFC [11371102, 11371197, 11431002]
  2. NSFC Key Project [91330201]


The numerical success of the alternating direction method of multipliers (ADMM) has inspired much attention to analyzing its theoretical convergence rate. While several iteration-complexity results imply a sublinear convergence rate in the general case, only a few results exist for special cases such as linear programming, quadratic programming, and nonlinear programming with strongly convex functions. In this paper, we consider the convergence rate of ADMM when applied to convex optimization problems in which the subdifferentials of the underlying functions are piecewise linear multifunctions; this class includes LASSO, a well-known regression model in statistics, as a special case. We prove that, due to its inherent polyhedral structure, a recent global error bound holds for this class of problems. Based on this error bound, we derive a linear rate of convergence for ADMM. We also consider the proximal-based ADMM and derive its linear convergence rate.
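For concreteness, the sketch below illustrates how ADMM specializes to the LASSO problem mentioned in the abstract, minimize (1/2)||Ax - b||^2 + lam*||x||_1: an x-update given by a ridge-type linear solve, a z-update given by soft-thresholding (the proximal map of the l1 norm), and a dual update. This is a minimal illustration of the standard scaled-form ADMM, not the paper's analysis; the function name `admm_lasso`, the parameter choices, and the stopping rule are illustrative assumptions.

```python
# A minimal sketch of scaled-form ADMM for LASSO:
#   minimize (1/2)*||A x - b||^2 + lam*||x||_1
# All names, defaults, and the stopping rule are illustrative assumptions.
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=500, tol=1e-8):
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                      # scaled dual variable
    AtA = A.T @ A
    Atb = A.T @ b
    # Factor (A^T A + rho*I) once; it is reused in every x-update.
    L = np.linalg.cholesky(AtA + rho * np.eye(n))
    for _ in range(n_iter):
        # x-update: ridge-type linear solve via the cached Cholesky factor
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: soft-thresholding (prox of (lam/rho)*||.||_1)
        v = x + u
        z_new = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # u-update: (scaled) dual ascent on the constraint x - z = 0
        u = u + x - z_new
        if np.linalg.norm(z_new - z) < tol:   # crude stopping rule
            z = z_new
            break
        z = z_new
    return z

# Example usage on synthetic data
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = admm_lasso(A, b, lam=0.1)
```

The polyhedral structure the paper exploits enters through the two nonsmooth pieces here: the l1 norm and the quadratic loss both have piecewise linear subdifferential maps, which is what places LASSO in the problem class covered by the linear-convergence result.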
