Article

Global Convergence of Unmodified 3-Block ADMM for a Class of Convex Minimization Problems

Journal

JOURNAL OF SCIENTIFIC COMPUTING
Volume 76, Issue 1, Pages 69-88

Publisher

SPRINGER/PLENUM PUBLISHERS
DOI: 10.1007/s10915-017-0612-7

Keywords

ADMM; Global convergence; Convex minimization; Regularized least squares decomposition

Funding

  1. Department of Mathematics at UC Davis
  2. NSF [CMMI-1462408]

Abstract

The alternating direction method of multipliers (ADMM) has been successfully applied to solve structured convex optimization problems due to its superior practical performance. The convergence properties of the 2-block ADMM have been studied extensively in the literature. Specifically, it has been proven that the 2-block ADMM globally converges for any value of the penalty parameter. In this sense, the 2-block ADMM allows the penalty parameter to be free, i.e., there is no need to restrict its value to ensure convergence when implementing this algorithm. However, for the 3-block ADMM, Chen et al. (Math Program 155:57-79, 2016) recently constructed a counter-example showing that it can diverge if no further condition is imposed. Existing sufficient conditions guaranteeing the convergence of the 3-block ADMM usually require the penalty parameter to be smaller than a certain bound, which is typically either difficult to compute or too small to yield a practical algorithm. In this paper, we show that the 3-block ADMM still globally converges with any penalty parameter if, in addition to some other mild conditions, the third function in the objective is smooth and strongly convex with condition number in [1, 1.0798). This requirement covers an important class of problems that we call regularized least squares decomposition (RLSD) in this paper.
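To make the setting concrete, the following is a minimal sketch of an unmodified 3-block ADMM on a toy RLSD-type instance. It is not the paper's algorithm analysis or experiments: the specific problem (an l1 block, a ridge block, and a strongly convex quadratic third block with condition number 1, coupled by x1 + x2 + x3 = b), the sign conventions, and the parameter values are all illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (element-wise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm3_rlsd(b, lam=0.1, mu=1.0, beta=1.0, iters=500):
    """Unmodified 3-block ADMM on the toy problem
         min  lam*||x1||_1 + (mu/2)*||x2||^2 + (1/2)*||x3||^2
         s.t. x1 + x2 + x3 = b.
    The third block is smooth and strongly convex with condition
    number 1, which lies in the interval [1, 1.0798).
    """
    n = b.size
    x1 = np.zeros(n)
    x2 = np.zeros(n)
    x3 = np.zeros(n)
    y = np.zeros(n)  # dual variable for the coupling constraint
    for _ in range(iters):
        # Block updates in Gauss-Seidel order, each minimizing the
        # augmented Lagrangian over one block with the others fixed.
        x1 = soft_threshold(b - x2 - x3 - y / beta, lam / beta)
        x2 = beta * (b - x1 - x3 - y / beta) / (mu + beta)
        x3 = beta * (b - x1 - x2 - y / beta) / (1.0 + beta)
        # Dual ascent on the residual of the coupling constraint.
        y = y + beta * (x1 + x2 + x3 - b)
    return x1, x2, x3, y

rng = np.random.default_rng(0)
b = rng.normal(size=20)
x1, x2, x3, y = admm3_rlsd(b)
print(np.linalg.norm(x1 + x2 + x3 - b))  # primal residual, near zero
```

Each block subproblem here has a closed form because the coupling matrices are identities; in a general RLSD instance the x-updates would be proximal or linear-system solves instead.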

