Journal
JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS
Volume 346, Issue -, Pages 171-183
Publisher
ELSEVIER SCIENCE BV
DOI: 10.1016/j.cam.2018.07.007
Keywords
Batch images alignment; Non-smooth convex minimization; Alternating direction method of multipliers; Accelerated proximal gradient algorithm; Symmetric Gauss-Seidel
Funding
- Major State Basic Research Development Program of China (973 Program) [2015CB856003]
- National Natural Science Foundation of China [11471102, 11471101]
With the advent of the approach known as robust alignment by sparse and low-rank decomposition (RASL), a number of linearly correlated images can be aligned accurately and robustly despite significant corruptions and occlusions. This alignment task can be formulated as a sequence of 3-block convex minimization problems, each of which can be solved efficiently by the accelerated proximal gradient method (APG) or, alternatively, by the directly extended alternating direction method of multipliers (ADMM). However, although the directly extended ADMM often performs well in numerical computations, it may diverge. Ideally, one should find an algorithm that enjoys both a theoretical convergence guarantee and superior numerical efficiency over the directly extended ADMM. We achieve this goal by using the symmetric Gauss-Seidel iteration based ADMM (sGS-ADMM), which only needs to update one of the variables twice; surprisingly, this small change is enough to guarantee convergence. The convergence of sGS-ADMM follows directly from relating it to the classical 2-block ADMM equipped with a pair of specially designed semi-proximal terms. Beyond this, we also add a rank-correction term to the model in order to obtain alignment results of higher accuracy. Numerical experiments over a wide range of realistic misalignments demonstrate that sGS-ADMM is at least two times faster than RASL and APG for the vast majority of the tested problems. (C) 2018 Elsevier B.V. All rights reserved.
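The key algorithmic idea mentioned in the abstract, updating one block an extra time in a symmetric Gauss-Seidel sweep, can be illustrated on a toy problem. The sketch below is not the paper's implementation: it applies the sGS-ADMM update order to a simple separable 3-block quadratic program with one coupling constraint, chosen so that every subproblem has a closed-form solution. The function name, penalty parameter `sigma`, and step size `tau` are assumptions made for illustration.

```python
# Illustrative sketch (not the paper's code): sGS-ADMM on the toy problem
#   min 1/2 (x1-a1)^2 + 1/2 (x2-a2)^2 + 1/2 (x3-a3)^2
#   s.t. x1 + x2 + x3 = b.
# The optimum is xi = ai + (b - a1 - a2 - a3)/3.

def sgs_admm(a1, a2, a3, b, sigma=1.0, tau=1.0, iters=200):
    x1 = x2 = x3 = 0.0
    lam = 0.0  # multiplier for the coupling constraint
    # Closed-form minimizer of the augmented Lagrangian in one block,
    # given the current sum of the other two blocks.
    step = lambda ai, others: (ai + lam + sigma * (b - others)) / (1.0 + sigma)
    for _ in range(iters):
        # backward sweep over the first group (x1, x2): x2 first
        x2 = step(a2, x1 + x3)
        x1 = step(a1, x2 + x3)
        # forward sweep: x2 is updated a second time -- the extra
        # update that makes the 3-block scheme provably convergent
        x2 = step(a2, x1 + x3)
        # last block, then the multiplier
        x3 = step(a3, x1 + x2)
        lam -= tau * sigma * (x1 + x2 + x3 - b)
    return x1, x2, x3

print(sgs_admm(1.0, 2.0, 3.0, 9.0))  # approaches the optimum (2, 3, 4)
```

The scheme relates to a 2-block ADMM by grouping (x1, x2) into one block; the repeated x2 update plays the role of the semi-proximal term mentioned in the abstract.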