Journal
SIAM REVIEW
Volume 65, Issue 2, Pages 375-435
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/20M1379344
Keywords
large-scale convex optimization; nonsmooth optimization; splitting; proximal algorithm; primal-dual algorithm
Convex nonsmooth optimization problems, whose solutions live in very high dimensional spaces, have become ubiquitous. To solve them, the class of first-order algorithms known as proximal splitting algorithms is particularly well suited: these algorithms consist of simple operations, handling the terms in the objective function separately. In this overview, we demystify a selection of recent proximal splitting algorithms: we present them within a unified framework, which consists of applying splitting methods for monotone inclusions in primal-dual product spaces, with well-chosen metrics. Along the way, we easily derive new variants of the algorithms and revisit existing convergence results, extending the parameter ranges in several cases. In particular, we emphasize that when the smooth term in the objective function is quadratic, e.g., for least-squares problems, convergence is guaranteed with larger values of the relaxation parameter than previously known. Such larger values are usually beneficial for the convergence speed in practice.
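To make the abstract's ingredients concrete, the following is a minimal sketch (not the paper's unified primal-dual framework) of one classical proximal splitting method, the relaxed forward-backward (proximal gradient) iteration, applied to a least-squares plus l1 problem. The problem data `A`, `b`, `lam` and the step-size and relaxation parameters are illustrative choices for this toy instance, not values taken from the paper.

```python
# Sketch of relaxed forward-backward splitting for
#     min_x  0.5 * ||A x - b||^2 + lam * ||x||_1,
# a least-squares problem with a quadratic smooth term, as mentioned
# in the abstract. Pure Python, no external dependencies.

def mat_vec(A, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1, applied componentwise."""
    return [max(abs(vi) - t, 0.0) * (1.0 if vi >= 0 else -1.0) for vi in v]

def prox_grad_l1(A, b, lam, gamma, rho, iters):
    """Relaxed forward-backward iteration:
         y_k     = prox_{gamma*lam*||.||_1}( x_k - gamma * A^T (A x_k - b) )
         x_{k+1} = x_k + rho * (y_k - x_k)
       The smooth (gradient) step and the nonsmooth (prox) step handle
       the two objective terms separately; rho is the relaxation parameter.
    """
    At = transpose(A)
    x = [0.0] * len(A[0])
    for _ in range(iters):
        r = [ai - bi for ai, bi in zip(mat_vec(A, x), b)]  # residual A x - b
        g = mat_vec(At, r)                                 # gradient A^T r
        y = soft_threshold([xi - gamma * gi for xi, gi in zip(x, g)],
                           gamma * lam)
        x = [xi + rho * (yi - xi) for xi, yi in zip(x, y)]  # relaxation step
    return x

# Tiny illustrative instance; gamma is chosen below 2 / ||A^T A||,
# and rho > 1 (over-relaxation), which here speeds up convergence.
A = [[1.0, 0.0], [0.0, 2.0]]
b = [1.0, 2.0]
x = prox_grad_l1(A, b, lam=0.1, gamma=0.2, rho=1.5, iters=500)
print([round(xi, 3) for xi in x])  # close to the minimizer [0.9, 0.975]
```

On this separable toy problem the minimizer can be checked by hand (componentwise soft-thresholded least-squares solutions), and over-relaxation with rho = 1.5 reaches it faster than the unrelaxed rho = 1 iteration, illustrating the practical benefit the abstract refers to.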