Journal
MATHEMATICS OF OPERATIONS RESEARCH
Volume 42, Issue 2, Pages 330-348
Publisher
INFORMS
DOI: 10.1287/moor.2016.0817
Keywords
first-order methods; composite nonsmooth convex minimization; descent lemma; proximal-gradient algorithms; complexity; Bregman distance; multiplicative Poisson linear inverse problems
Funding
- Natural Sciences and Engineering Research Council of Canada
- Air Force Office of Scientific Research, Air Force Material Command, U.S. Air Force [FA9550-14-1-0056]
- Israel Science Foundation [ISF-998-12]
- German-Israel Foundation [GIF-G1253304.6-2014]
Abstract
The proximal gradient method and its variants constitute one of the most attractive families of first-order algorithms for minimizing the sum of two convex functions, one of which is nonsmooth. However, it requires the differentiable part of the objective to have a Lipschitz continuous gradient, which precludes its use in many applications. In this paper we introduce a framework that circumvents the intricate question of Lipschitz continuity of gradients by using an elegant and easy-to-check convexity condition which captures the geometry of the constraints. This condition translates into a new descent lemma, which in turn leads to a natural derivation of the proximal-gradient scheme with Bregman distances. We then identify a new notion of asymmetry measure for Bregman distances, which is central in determining the relevant step size. These novelties allow us to prove a global sublinear rate of convergence and, as a by-product, global pointwise convergence. This opens a path to a broad spectrum of problems arising in key applications which were, until now, considered out of reach for proximal gradient methods. We illustrate this potential by showing how our results can be applied to build new and simple schemes for Poisson inverse problems.
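The Bregman proximal-gradient idea described in the abstract can be illustrated with a minimal sketch for a Poisson linear inverse problem. This is an assumption-laden illustration, not the paper's exact scheme: the function name and problem setup are made up for the example, the kernel is the Burg entropy h(x) = -Σ log x_j, and the relative-smoothness constant L = ||b||_1 is the standard choice for this pairing.

```python
import numpy as np

def bregman_prox_grad_poisson(A, b, x0, n_iter=500):
    # Illustrative sketch (not the paper's exact algorithm) of a Bregman
    # proximal-gradient step for the Poisson linear inverse problem
    #     min_{x > 0}  f(x) = sum_i [ (Ax)_i - b_i * log (Ax)_i ],
    # with the Burg entropy kernel h(x) = -sum_j log x_j.
    # f is smooth relative to h with constant L = ||b||_1 (assumed here).
    L = np.sum(b)
    step = 0.99 / L  # slightly below 1/L keeps iterates strictly positive
    x = x0.astype(float).copy()
    for _ in range(n_iter):
        Ax = A @ x
        grad = A.T @ (1.0 - b / Ax)
        # Mirror step: grad h(x+) = grad h(x) - step * grad f(x).
        # Since grad h(x) = -1/x, this solves in closed form to the
        # multiplicative update below.
        x = x / (1.0 + step * x * grad)
    return x
```

With the Burg entropy kernel, the Bregman step admits the closed-form multiplicative update above, so each iteration costs only two matrix-vector products; no Lipschitz constant for the gradient of f (which does not exist here) is ever needed.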
Authors
Heinz H. Bauschke; Jérôme Bolte; Marc Teboulle