Journal
SIAM JOURNAL ON NUMERICAL ANALYSIS
Volume 54, Issue 2, Pages 625-640
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/140974237
Keywords
global linear convergence; alternating direction method; piecewise linear multifunctions; alternating proximal gradient method; error bound
Funding
- NSFC [11371102, 11371197, 11431002]
- NSFC Key Project [91330201]
The numerical success of the alternating direction method of multipliers (ADMM) has attracted much attention to the analysis of its theoretical convergence rate. While several iteration-complexity results imply a sublinear convergence rate in the general case, only a few results exist for special cases such as linear programming, quadratic programming, and nonlinear programming with strongly convex functions. In this paper, we study the convergence rate of ADMM when applied to convex optimization problems in which the subdifferentials of the underlying functions are piecewise linear multifunctions; this class includes LASSO, a well-known regression model in statistics, as a special case. We prove that, owing to its inherent polyhedral structure, a recent global error bound holds for this class of problems. Based on this error bound, we derive a linear rate of convergence for ADMM. We also consider the proximal-based ADMM and derive its linear convergence rate.
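To make the LASSO special case concrete, the following is a minimal sketch of the standard ADMM iteration for the splitting min (1/2)||Ax - b||^2 + lam*||z||_1 subject to x = z. The update rules (least-squares x-step, soft-thresholding z-step, dual ascent u-step) are the textbook ADMM scheme, not code from the paper; the penalty parameter `rho` and iteration count are illustrative choices.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    # ADMM for: min_x (1/2)||Ax - b||^2 + lam*||z||_1  s.t.  x = z.
    # rho and n_iter are illustrative; the paper's linear-rate result
    # says the iterates converge geometrically for this problem class.
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Factor (A^T A + rho I) once; it is reused in every x-update.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(n_iter):
        # x-step: solve (A^T A + rho I) x = A^T b + rho (z - u).
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-step: proximal step on the l1 term.
        z = soft_threshold(x + u, lam / rho)
        # u-step: scaled dual update.
        u = u + x - z
    return z
```

On a noise-free sparse recovery instance, a couple hundred iterations typically suffice for the iterates to settle, consistent with the geometric convergence established in the paper.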
Authors