Journal
JOURNAL OF GLOBAL OPTIMIZATION
Volume 66, Issue 3, Pages 457-485
Publisher
SPRINGER
DOI: 10.1007/s10898-016-0405-9
Keywords
Nonconvex optimization; Nonsmooth optimization; Proximity operator; Majorize-Minimize algorithm; Block coordinate descent; Alternating minimization; Phase retrieval; Inverse problems
Funding
- CNRS MASTODONS project [2013MesureHD]
- CNRS Imag'in Project [2015OPTIMISME]
Abstract
A number of recent works have emphasized the prominent role played by the Kurdyka-Łojasiewicz inequality in proving the convergence of iterative algorithms for possibly nonsmooth/nonconvex optimization problems. In this work, we consider the minimization of an objective function satisfying this property, which is a sum of two terms: (i) a differentiable, but not necessarily convex, function and (ii) a function that is neither necessarily convex nor necessarily differentiable. The latter function is expressed as a separable sum of functions of blocks of variables. Such an optimization problem can be addressed with the Forward-Backward algorithm, which can be accelerated through the use of variable metrics derived from the Majorize-Minimize principle. We propose to combine this acceleration technique with an alternating minimization strategy that relies upon a flexible update rule. We give conditions under which the sequence generated by the resulting Block Coordinate Variable Metric Forward-Backward algorithm converges to a critical point of the objective function. An application to a nonconvex phase retrieval problem encountered in signal/image processing demonstrates the efficiency of the proposed optimization method.
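The iteration structure described in the abstract can be illustrated with a minimal sketch. The example below is not the authors' method: it applies a block-coordinate forward-backward scheme with a diagonal Majorize-Minimize metric to a simple convex least-squares-plus-ℓ1 problem (hypothetical toy data), whereas the paper treats the harder nonsmooth/nonconvex setting. Each block update takes a metric-scaled gradient (forward) step on the smooth term, then a proximal (backward) step on the separable nonsmooth term.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (illustrative data, not from the paper):
#   minimize F(x) = 0.5*||H x - y||^2 + lam*||x||_1
n, m = 20, 12
H = rng.standard_normal((n, m))
y = rng.standard_normal(n)
lam = 0.1

# Diagonal MM majorant of the Hessian H^T H: A = Diag(|H^T H| 1).
# A - H^T H is diagonally dominant with nonnegative diagonal,
# hence positive semidefinite, so A majorizes the curvature.
HtH = H.T @ H
a = np.abs(HtH).sum(axis=1)

def objective(x):
    r = H @ x - y
    return 0.5 * r @ r + lam * np.abs(x).sum()

def soft(v, t):
    # Proximity operator of t*|.|_1 (coordinate-wise soft threshold).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Block Coordinate Variable Metric Forward-Backward sketch:
# cyclically sweep over blocks of variables; within each block,
# the prox of lam*|.|_1 in the metric Diag(a_j) is a soft
# threshold with per-coordinate level lam / a_i.
blocks = np.array_split(np.arange(m), 4)
x = np.zeros(m)
history = [objective(x)]
for _ in range(200):
    for j in blocks:
        grad_j = H[:, j].T @ (H @ x - y)               # forward step, block j
        x[j] = soft(x[j] - grad_j / a[j], lam / a[j])  # backward (prox) step
    history.append(objective(x))
```

Because the diagonal metric majorizes the block curvature, each sweep is guaranteed not to increase the objective, which is the monotonicity underpinning the Majorize-Minimize interpretation of the variable metric.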