Article

VARIABLE METRIC INEXACT LINE-SEARCH-BASED METHODS FOR NONSMOOTH OPTIMIZATION

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 26, Issue 2, Pages 891-921

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/15M1019325

Keywords

proximal algorithms; nonsmooth optimization; generalized projection; nonconvex optimization

Funding

  1. MIUR under the FIRB project Futuro in Ricerca [RBFR12M3AC]
  2. MIUR under the PRIN project [2012MTE38N]

Abstract

We develop a new proximal-gradient method for minimizing the sum of a differentiable, possibly nonconvex, function plus a convex, possibly nondifferentiable, function. The key features of the proposed method are the definition of a suitable descent direction, based on the proximal operator associated with the convex part of the objective function, and an Armijo-like rule to determine the stepsize along this direction, ensuring a sufficient decrease of the objective function. In this framework, we especially address the possibility of adopting a metric that may change at each iteration and an inexact computation of the proximal point defining the descent direction. For the more general nonconvex case, we prove that every limit point of the sequence of iterates is stationary, while for convex objective functions we prove convergence of the whole sequence to a minimizer, under the assumption that a minimizer exists. In the latter case, assuming also that the gradient of the smooth part of the objective function is Lipschitz continuous, we give a convergence rate estimate, showing O(1/k) complexity with respect to the function values. We further discuss verifiable sufficient conditions for the inexact proximal point and present the results of two numerical tests on total-variation-based image restoration problems, showing that the proposed approach is competitive with other state-of-the-art methods.
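
To make the step structure concrete, the following is a minimal Python/NumPy sketch of one variable-metric proximal-gradient iteration with an Armijo-like backtracking line search. It is not the authors' implementation: it assumes an exactly computed proximal point, a diagonal metric, and the l1 norm as the convex term, and all names and parameter values (soft_threshold, vm_prox_grad_step, alpha, delta, beta) are illustrative choices.

    import numpy as np

    def soft_threshold(z, t):
        # Proximal operator of t*||.||_1 (componentwise soft-thresholding),
        # standing in for the convex, nondifferentiable term g.
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def vm_prox_grad_step(x, f, grad_f, alpha=1.0, D=None,
                          delta=0.5, beta=1e-4, max_backtracks=30):
        # One step for the model objective F(x) = f(x) + ||x||_1:
        # build a descent direction from a (scaled) proximal point, then
        # backtrack on the stepsize until a sufficient-decrease test holds.
        g = lambda z: np.abs(z).sum()
        if D is None:
            D = np.ones_like(x)        # diagonal scaling playing the role of the variable metric
        gfx = grad_f(x)
        # Proximal point in the scaled metric:
        #   y = argmin_u ||u||_1 + <gfx, u - x> + (1/(2*alpha)) * sum_i D_i (u_i - x_i)^2,
        # which for a diagonal metric reduces to componentwise soft-thresholding.
        y = soft_threshold(x - alpha * gfx / D, alpha / D)
        d = y - x                      # descent direction
        # Predicted decrease used in the Armijo-like sufficient-decrease test.
        h = gfx @ d + g(y) - g(x)
        F = lambda z: f(z) + g(z)
        Fx, lam = F(x), 1.0
        for _ in range(max_backtracks):
            if F(x + lam * d) <= Fx + beta * lam * h:
                break
            lam *= delta               # shrink the stepsize along d
        return x + lam * d

    # Example: one step on a small quadratic-plus-l1 problem.
    A = np.array([[1.0, 0.0], [0.0, 2.0]])
    b = np.array([1.0, -1.0])
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad_f = lambda x: A.T @ (A @ x - b)
    x_new = vm_prox_grad_step(np.zeros(2), f, grad_f, alpha=0.4)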
