Article

On the iteration complexity analysis of Stochastic Primal-Dual Hybrid Gradient approach with high probability

Journal

NEUROCOMPUTING
Volume 307, Pages 78-90

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.neucom.2018.03.066

Keywords

Stochastic Primal-Dual Hybrid Gradient; Iteration complexity; High probability; Graph-guided regularized logistic regression

Funding

  1. National Natural Science Foundation of China [61303264, 61202482]

Abstract

In this paper, we propose a stochastic Primal-Dual Hybrid Gradient (PDHG) approach for solving a wide spectrum of regularized stochastic minimization problems in which the regularization term is composed with a linear operator. Solving this kind of problem is recognized as challenging for two reasons: the proximal mapping associated with the regularization term has no closed-form solution because of the imposed linear composition, and the per-iteration cost of computing the full gradient of the expected objective function is prohibitive when the number of input data samples is large. Our new approach overcomes these issues by exploiting the special structure of the regularization term and sampling a few data points at each iteration. Rather than analyzing convergence in expectation, we provide a detailed iteration complexity analysis, with high probability, for both uniformly and nonuniformly averaged iterates; this strongly supports the good practical performance of the proposed approach. Numerical experiments demonstrate the efficiency of stochastic PDHG, which outperforms other competing algorithms, as predicted by the high-probability convergence analysis.
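The scheme described above, a stochastic gradient step on the data-fit term combined with a dual step that handles the linearly composed regularizer, can be sketched for graph-guided l1-regularized logistic regression, the test problem named in the keywords. This is a minimal illustrative sketch, not the authors' exact algorithm: the step sizes `tau` and `sigma`, the batch size, the extrapolation step, and the uniform averaging are assumptions; the key structural points are that the l1-term's dual prox reduces to a box projection (so no prox of the composed term is ever needed) and that only a minibatch gradient is computed per iteration.

```python
import numpy as np

def stochastic_pdhg(X, y, A, lam=0.1, tau=0.05, sigma=0.05,
                    batch=10, iters=2000, seed=0):
    """Sketch of a stochastic PDHG loop for
        min_w (1/n) * sum_i logistic_loss(w; x_i, y_i) + lam * ||A w||_1.
    The l1 term composed with A has no closed-form prox, so it is handled
    through a dual variable; the data-fit gradient is estimated from a
    small minibatch at each iteration. Step sizes are illustrative."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    u = np.zeros(A.shape[0])   # dual variable for the lam*||A w||_1 term
    w_avg = np.zeros(d)
    for k in range(1, iters + 1):
        idx = rng.choice(n, size=batch, replace=False)
        Xb, yb = X[idx], y[idx]
        # stochastic gradient of the average logistic loss on the minibatch
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        g = Xb.T @ (p - yb) / batch
        w_prev = w
        w = w - tau * (g + A.T @ u)                       # primal step
        w_bar = 2.0 * w - w_prev                          # extrapolation
        # dual prox of (lam*||.||_1)^* is projection onto the box [-lam, lam]
        u = np.clip(u + sigma * (A @ w_bar), -lam, lam)
        w_avg += (w - w_avg) / k                          # uniform averaging
    return w_avg
```

For a graph-guided penalty, `A` encodes the edges of the feature graph; for a chain graph it is the usual difference matrix with rows `(…, 1, -1, …)`, so `||A w||_1` penalizes differences between neighboring coefficients.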
