Article

Convergence of ADMM for multi-block nonconvex separable optimization models

Journal

FRONTIERS OF MATHEMATICS IN CHINA
Volume 12, Issue 5, Pages 1139-1162

Publisher

HIGHER EDUCATION PRESS
DOI: 10.1007/s11464-017-0631-6

Keywords

Nonconvex optimization; separable structure; alternating direction method of multipliers (ADMM); Kurdyka-Lojasiewicz inequality

Funding

  1. Priority Academic Program Development of Jiangsu Higher Education Institutions
  2. Jiangsu Planned Projects for Postdoctoral Research Funds [1501071B]
  3. Foundation of Jiangsu Key Lab for NSLSCS [201601]
  4. National Natural Science Foundation of China [11371015]
  5. [11625105]
  6. [11371197]
  7. [11431002]
  8. [11501301]

Abstract

For minimization problems whose objective is the sum of two functions without coupled variables, subject to linear constraints, the alternating direction method of multipliers (ADMM) has proven efficient and its convergence is well understood. When the number of separable functions exceeds two, or when one of the functions is nonconvex, ADMM or its direct extension may fail to converge. In this paper, we consider multi-block separable optimization problems with linear constraints in which the component functions need not be convex. Under the assumption that the associated function satisfies the Kurdyka-Lojasiewicz inequality, we prove that any cluster point of the sequence of iterates generated by ADMM is a critical point, under the mild condition that the penalty parameter is sufficiently large. We also present sufficient conditions guaranteeing sublinear and linear rates of convergence of the algorithm.
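As a point of reference, the problem class and the direct multi-block extension of ADMM described above can be sketched as follows. The notation (p blocks, component functions f_i, matrices A_i, right-hand side b, penalty parameter beta, multiplier lambda) is chosen here for illustration and need not match the paper's; the scheme below is the standard Gauss-Seidel extension, and the paper's exact updates and sign conventions may differ.

\[
\min_{x_1,\dots,x_p}\ \sum_{i=1}^{p} f_i(x_i)
\quad \text{s.t.} \quad \sum_{i=1}^{p} A_i x_i = b,
\]
with augmented Lagrangian
\[
L_\beta(x_1,\dots,x_p,\lambda)
= \sum_{i=1}^{p} f_i(x_i)
- \Big\langle \lambda,\ \sum_{i=1}^{p} A_i x_i - b \Big\rangle
+ \frac{\beta}{2}\,\Big\| \sum_{i=1}^{p} A_i x_i - b \Big\|^2 .
\]
One iteration of the directly extended ADMM updates the blocks in Gauss-Seidel order and then the multiplier:
\[
x_i^{k+1} \in \arg\min_{x_i}\
L_\beta\big(x_1^{k+1},\dots,x_{i-1}^{k+1},\,x_i,\,x_{i+1}^{k},\dots,x_p^{k},\,\lambda^{k}\big),
\qquad i = 1,\dots,p,
\]
\[
\lambda^{k+1} = \lambda^{k} - \beta\Big(\sum_{i=1}^{p} A_i x_i^{k+1} - b\Big).
\]
Here \(\beta\) is the penalty parameter referred to in the abstract; the convergence results for the nonconvex case are stated under the condition that \(\beta\) is taken sufficiently large.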
