4.6 Article

Stochastic projective splitting

Journal

Computational Optimization and Applications

Publisher

SPRINGER
DOI: 10.1007/s10589-023-00528-6

Keywords

Proximal operator splitting; Monotone inclusions; Convex optimization; Stochastic gradient descent

This paper presents a new stochastic variant of the projective splitting algorithm for inclusion problems involving maximal monotone operators. The primary envisioned application is machine learning, where the method's stochastic oracle enables mini-batch sampling of datasets.
We present a new, stochastic variant of the projective splitting (PS) family of algorithms for inclusion problems involving the sum of any finite number of maximal monotone operators. This new variant uses a stochastic oracle to evaluate one of the operators, which is assumed to be Lipschitz continuous, and (deterministic) resolvents to process the remaining operators. Our proposal is the first version of PS with such stochastic capabilities. We envision the primary application being machine learning (ML) problems, with the method's stochastic features facilitating mini-batch sampling of datasets. Since it uses a monotone operator formulation, the method can handle not only Lipschitz-smooth loss minimization, but also min-max and noncooperative game formulations, with better convergence properties than the gradient descent-ascent methods commonly applied in such settings. The proposed method can handle any number of constraints and nonsmooth regularizers via projection and proximal operators. We prove almost-sure convergence of the iterates to a solution and a convergence rate result for the expected residual, and close with numerical experiments on a distributionally robust sparse logistic regression problem.
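
To make that division of labor concrete, the sketch below illustrates the two primitives the abstract contrasts: a stochastic (mini-batch) oracle for the Lipschitz-continuous smooth-loss operator, and a deterministic resolvent, i.e., a proximal operator, for a nonsmooth l1 regularizer. The loop is plain proximal stochastic gradient on a toy sparse logistic regression problem, not the paper's projective-splitting algorithm; the function names and parameter values are illustrative assumptions.

```python
# Minimal sketch (NOT the paper's method): it only shows the two kinds of
# operator evaluations the abstract mentions -- a stochastic gradient oracle
# and a deterministic resolvent -- wired into a simple proximal SGD loop.
import numpy as np

rng = np.random.default_rng(0)

def stochastic_grad_oracle(w, X, y, batch_size=32):
    """Mini-batch gradient of the logistic loss (the Lipschitz operator)."""
    idx = rng.choice(len(y), size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]                    # labels assumed in {-1, +1}
    sigma = 1.0 / (1.0 + np.exp(yb * (Xb @ w)))
    return -(Xb.T @ (yb * sigma)) / batch_size

def prox_l1(v, step, lam):
    """Resolvent of lam*||.||_1: soft-thresholding (deterministic)."""
    return np.sign(v) * np.maximum(np.abs(v) - step * lam, 0.0)

# Synthetic sparse logistic regression data (toy illustration only).
n, d = 1000, 50
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:5] = 1.0
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(n))

w = np.zeros(d)
step, lam = 0.1, 0.01
for t in range(2000):
    g = stochastic_grad_oracle(w, X, y)        # stochastic oracle call
    w = prox_l1(w - step * g, step, lam)       # deterministic resolvent call
```

In the paper's setting, these same two oracle types are combined through projective splitting's separating-hyperplane construction, which also accommodates additional resolvents for constraints and further regularizers, rather than the simple forward-backward update used above.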
