Article

Extragradient Method in Optimization: Convergence and Complexity

Journal

Publisher

SPRINGER/PLENUM PUBLISHERS
DOI: 10.1007/s10957-017-1200-6

Keywords

Extragradient; Descent method; Forward-Backward splitting; Kurdyka-Lojasiewicz inequality; Complexity; First-order method; LASSO problem

Funding

  1. Air Force Office of Scientific Research, Air Force Material Command [FA9550-15-1-0500]
  2. Fondecyt [1140829]
  3. Basal Project CMM Universidad de Chile

Abstract

We consider the extragradient method to minimize the sum of two functions, the first smooth and the second convex. Under the Kurdyka-Łojasiewicz assumption, we prove that the sequence produced by the extragradient method converges to a critical point of the problem and has finite length. The analysis is extended to the case when both functions are convex. In this case we provide a sublinear convergence rate, as for gradient-based methods. Furthermore, we show that the recent small-prox complexity result can be applied to this method. Studying the extragradient method also provides an occasion to describe an exact line search scheme for proximal decomposition methods. We detail the implementation of this scheme for the one-norm regularized least squares problem, and the numerical results suggest that combining nonaccelerated methods with exact line search can be a competitive choice.
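The abstract gives no pseudocode. As a rough illustration only, the sketch below applies a Korpelevich-style extragradient scheme with proximal (forward-backward) steps to the LASSO problem mentioned in the abstract: a prediction step computes an intermediate point, and the correction step reuses the gradient evaluated there. The constant step size `s` strictly below 1/L, the function names, and the fixed iteration count are assumptions for this sketch; the paper's exact update rule and its line search scheme may differ.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def extragradient_lasso(A, b, lam, n_iter=500):
    # Extragradient-type forward-backward sketch for
    #   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    # Prediction: proximal-gradient step to an intermediate point y.
    # Correction: proximal-gradient step from x using the gradient at y.
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of grad f
    s = 0.5 / L                     # step size kept strictly below 1/L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad_x = A.T @ (A @ x - b)
        y = soft_threshold(x - s * grad_x, s * lam)   # prediction step
        grad_y = A.T @ (A @ y - b)
        x = soft_threshold(x - s * grad_y, s * lam)   # correction step
    return x
```

For a sanity check, taking A as the identity makes the LASSO solution the soft-thresholded data, `soft_threshold(b, lam)`, which the iteration approaches geometrically.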

