Article

Stochastic projective splitting

Journal

COMPUTATIONAL OPTIMIZATION AND APPLICATIONS

Publisher

SPRINGER
DOI: 10.1007/s10589-023-00528-6

Keywords

Proximal operator splitting; Monotone inclusions; Convex optimization; Stochastic gradient descent

Abstract

This paper presents a new stochastic variant of the projective splitting algorithm for inclusion problems involving maximal monotone operators. The method is aimed primarily at applications in machine learning.
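For concreteness, the problem class can be written as a monotone inclusion. The notation below (the space \mathcal{H}, operators A_i, and stepsize \rho) is ours rather than the paper's; the resolvent is the standard one:

\text{find } z \in \mathcal{H} \ \text{such that} \ 0 \in \sum_{i=1}^{n} A_i(z),

where each A_i : \mathcal{H} \rightrightarrows \mathcal{H} is maximal monotone and one of them is additionally Lipschitz continuous. The resolvent used to process the remaining operators is J_{\rho A_i} = (\mathrm{Id} + \rho A_i)^{-1} with \rho > 0; when A_i = \partial g_i for a closed convex function g_i, this reduces to the proximal operator of \rho g_i.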
We present a new, stochastic variant of the projective splitting (PS) family of algorithms for inclusion problems involving the sum of any finite number of maximal monotone operators. This new variant uses a stochastic oracle to evaluate one of the operators, which is assumed to be Lipschitz continuous, and (deterministic) resolvents to process the remaining operators. Our proposal is the first version of PS with such stochastic capabilities. We envision the primary application being machine learning (ML) problems, with the method's stochastic features facilitating mini-batch sampling of datasets. Since it uses a monotone operator formulation, the method can handle not only Lipschitz-smooth loss minimization, but also min-max and noncooperative game formulations, with better convergence properties than the gradient descent-ascent methods commonly applied in such settings. The proposed method can handle any number of constraints and nonsmooth regularizers via projection and proximal operators. We prove almost-sure convergence of the iterates to a solution and a convergence rate result for the expected residual, and close with numerical experiments on a distributionally robust sparse logistic regression problem.
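As a rough illustration of the ingredients the abstract describes (a forward step on the Lipschitz operator, a resolvent step on the other operator, and a projection onto a separating hyperplane), the following NumPy sketch implements a basic two-operator projective splitting iteration. It is a simplified deterministic instance under our own assumptions, not the paper's stochastic algorithm: the stochastic variant would replace the exact evaluations of the Lipschitz operator with mini-batch oracle estimates, and the function names, problem data, and stepsize choice here are all illustrative.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator (resolvent) of t * ||.||_1: componentwise shrinkage.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ps_step(z, w, grad_B, lam, rho):
    # One projective-splitting iteration for 0 in A(z) + B(z), where
    # A = subdifferential of lam*||.||_1 (processed by its resolvent) and
    # B is Lipschitz continuous (processed by forward steps). (z, w) is the
    # current primal-dual pair; w estimates the dual variable for A, and the
    # dual variable for B is -w, since the duals must sum to zero.

    # Resolvent step on A: yields a pair (xA, yA) in the graph of A.
    xA = soft_threshold(z + rho * w, rho * lam)
    yA = (z + rho * w - xA) / rho  # yA is a subgradient of lam*||.||_1 at xA

    # Forward steps on B: yields a pair (xB, yB) in the graph of B. The
    # paper's stochastic variant would use mini-batch estimates of B here.
    xB = z - rho * (grad_B(z) + w)
    yB = grad_B(xB)

    # phi is affine in (z, w); its zero-sublevel halfspace contains every
    # primal-dual solution. Project (z, w) onto that halfspace.
    phi = np.dot(z - xA, yA - w) + np.dot(z - xB, yB + w)
    gz, gw = yA + yB, xA - xB  # gradients of phi with respect to z and w
    denom = np.dot(gz, gz) + np.dot(gw, gw)
    if phi > 0 and denom > 0:
        z = z - (phi / denom) * gz
        w = w - (phi / denom) * gw
    return z, w

# Hypothetical usage on a small lasso-type problem, f(z) = 0.5*||Xz - y||^2:
rng = np.random.default_rng(0)
X, y = rng.standard_normal((50, 20)), rng.standard_normal(50)
L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the gradient of f
z, w = np.zeros(20), np.zeros(20)
for _ in range(500):
    z, w = ps_step(z, w, lambda v: X.T @ (X @ v - y), lam=0.1, rho=0.9 / L)

The projection step is what distinguishes projective splitting from forward-backward-style methods: each iteration builds a point in the graph of every operator and then projects the primal-dual iterate onto a halfspace separating it from the solution set, which is what allows the method to process any number of operators.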

