Article

On the Convergence of Primal-Dual Hybrid Gradient Algorithm

Journal

SIAM JOURNAL ON IMAGING SCIENCES
Volume 7, Issue 4, Pages 2526-2537

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/140963467

Keywords

primal-dual hybrid gradient algorithm; image restoration; total variation; saddle-point problem; convex optimization; convergence rate

Funding

  1. International Centre of Management Science and Engineering, and Department of Mathematics, Nanjing University, Nanjing, China [210093]
  2. NSFC [91130007]
  3. MOEC [20110091110004]

Abstract

The primal-dual hybrid gradient algorithm (PDHG) has been widely used, especially for some basic image processing models. In the literature, PDHG's convergence was established only under some restrictive conditions on its step sizes. In this paper, we revisit PDHG's convergence in the context of a saddle-point problem and try to better understand how to choose its step sizes. More specifically, we show by an extremely simple example that PDHG is not necessarily convergent even when the step sizes are fixed as tiny constants. We then show that PDHG with constant step sizes is indeed convergent if one of the functions of the saddle-point problem is strongly convex, a condition that does hold for some variational models in imaging. With this additional condition, we also establish a worst-case convergence rate measured by the iteration complexity for PDHG with constant step sizes.
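The abstract's positive result concerns PDHG with constant step sizes applied to a saddle-point problem whose primal function is strongly convex. A minimal sketch of that setting follows, on a hypothetical toy instance (not taken from the paper): the bilinear saddle-point problem min_x max_y (1/2)||x - b||^2 + <Kx, y> - (1/2)||y||^2, where the primal term (1/2)||x - b||^2 is strongly convex, matching the paper's sufficient condition. The matrix K, vector b, and step sizes below are illustrative choices.

```python
import numpy as np

# Hypothetical toy saddle-point problem (illustrative, not from the paper):
#   min_x max_y  (1/2)||x - b||^2 + <Kx, y> - (1/2)||y||^2
K = np.array([[1.0, 2.0],
              [0.0, 1.0]])
b = np.array([1.0, -1.0])

# Closed-form saddle point from the optimality conditions:
#   x* = (I + K^T K)^{-1} b,   y* = K x*.
x_star = np.linalg.solve(np.eye(2) + K.T @ K, b)
y_star = K @ x_star

def pdhg(K, b, tau=0.2, sigma=0.2, iters=2000):
    """PDHG with constant step sizes tau and sigma (no extrapolation step)."""
    x = np.zeros(len(b))
    y = np.zeros(K.shape[0])
    for _ in range(iters):
        # Dual proximal ascent step; the prox of (1/2)||.||^2 is a rescaling.
        y = (y + sigma * (K @ x)) / (1.0 + sigma)
        # Primal proximal descent step for g(x) = (1/2)||x - b||^2.
        x = (x + tau * (b - K.T @ y)) / (1.0 + tau)
    return x, y

x, y = pdhg(K, b)
print(np.linalg.norm(x - x_star))  # distance to the saddle point shrinks with iters
```

With the strong convexity present, the iterates approach the saddle point for these constant step sizes; the paper's negative result says that without such a condition, convergence can fail even for arbitrarily small constant steps.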

