4.4 Article

Convergence analysis of the stochastic reflected forward-backward splitting algorithm

Journal

OPTIMIZATION LETTERS
Volume 16, Issue 9, Pages 2649-2679

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s11590-021-01844-8

Keywords

Monotone inclusion; Stochastic optimization; Reflected method; Duality; Primal-dual algorithm; Ergodic convergence

Funding

  1. University of Transport and Communications (UTC) [T2022-CB-008]


We propose a novel stochastic algorithm for solving monotone inclusions given as the sum of a maximal monotone operator and a monotone, Lipschitzian operator, and we analyze its convergence. The algorithm requires only unbiased estimates of the Lipschitzian operator. It achieves a convergence rate of O(log(n)/n) in expectation in the strongly monotone case, and almost sure convergence in the general case. Furthermore, when applied to convex-concave saddle point problems, we derive a convergence rate for the primal-dual gap, obtaining the rate O(1/n) in the deterministic setting.
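Below is a minimal numerical sketch (Python/NumPy) of a reflected forward-backward-type iteration driven by an unbiased, noisy oracle for the Lipschitzian operator, in the spirit of the abstract. It is illustrative only: the concrete operators (a soft-thresholding resolvent for the maximal monotone part and an affine, strongly monotone Lipschitzian part), the noise model, the decreasing step-size schedule, and the ergodic averaging are assumptions chosen for the example and are not taken from the paper.

    import numpy as np

    # Sketch only: a reflected forward-backward-type iteration with a stochastic
    # (unbiased, noisy) oracle for the Lipschitzian operator B. The exact update
    # rule and assumptions of the paper may differ.

    rng = np.random.default_rng(0)
    d = 50

    # B(x) = M x - b with M = P^T P + mu*I (strongly monotone, Lipschitz).
    P = rng.standard_normal((d, d)) / np.sqrt(d)
    mu = 0.1
    M = P.T @ P + mu * np.eye(d)
    b = rng.standard_normal(d)

    def B(x):                        # deterministic operator
        return M @ x - b

    def B_noisy(x, sigma=0.1):       # unbiased estimate: B(x) + zero-mean noise
        return B(x) + sigma * rng.standard_normal(d)

    lam = 0.05
    def prox_g(x, gamma):            # resolvent of A = d(lam*||.||_1): soft-thresholding
        return np.sign(x) * np.maximum(np.abs(x) - gamma * lam, 0.0)

    L = np.linalg.norm(M, 2)         # Lipschitz constant of B

    x_prev = np.zeros(d)
    x = np.zeros(d)
    x_bar = np.zeros(d)              # running (ergodic) average of the iterates

    for n in range(1, 5001):
        gamma = 1.0 / (mu * n + 2.0 * L)          # decreasing steps (strongly monotone case)
        y = 2.0 * x - x_prev                       # reflection step
        x_prev, x = x, prox_g(x - gamma * B_noisy(y), gamma)
        x_bar += (x - x_bar) / n

    # Reference solution via the deterministic forward-backward fixed-point iteration.
    x_star = x_bar.copy()
    for _ in range(10000):
        x_star = prox_g(x_star - B(x_star) / L, 1.0 / L)
    print("distance to reference solution:", np.linalg.norm(x_bar - x_star))

The step sizes shrink like 1/(mu*n), which is the schedule one would expect to pair with the O(log(n)/n) expected rate stated for the strongly monotone case; with a fixed step and no noise, the same loop reduces to a deterministic reflected forward-backward scheme.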

