4.6 Article

A SINGLE TIMESCALE STOCHASTIC APPROXIMATION METHOD FOR NESTED STOCHASTIC OPTIMIZATION

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 30, Issue 1, Pages 960-979

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/18M1230542

Keywords

stochastic approximation; compositional optimization; stochastic gradient; stochastic variational inequality; machine learning

Funding

  1. AFOSR [FA9550-19-1-0203 608]
  2. NSF [CMMI-1653435, DMS-1907522]

Abstract

We study constrained nested stochastic optimization problems in which the objective function is a composition of two smooth functions whose exact values and derivatives are not available. We propose a single-timescale stochastic approximation algorithm, called the nested averaged stochastic approximation (NASA) method, to find an approximate stationary point of the problem. The algorithm maintains two auxiliary averaged sequences (filters) that estimate the gradient of the composite objective function and the value of the inner function. By means of a special Lyapunov function, we show that NASA achieves a sample complexity of O(1/ε²) for finding an ε-approximate stationary point, thereby outperforming all extant methods for nested stochastic approximation. Our method and its analysis are the same for both unconstrained and constrained problems, without requiring batch samples for constrained nonconvex stochastic optimization. We also present a simplified parameter-free variant of the NASA method for solving constrained single-level stochastic optimization problems, and we prove the same complexity result for both unconstrained and constrained problems.
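To give a rough feel for the single-timescale idea described in the abstract, the sketch below runs a plain (unconstrained) compositional stochastic gradient loop with two running-average filters, one for the inner function value and one for the composite gradient, all driven by a single step-size sequence. The quadratic test problem, the noise model, and the step sizes are hypothetical choices made for this illustration; this is not the authors' exact NASA update, which in particular handles constraints through a projection/prox subproblem and uses separate averaging constants for the two filters.

    # Illustrative single-timescale sketch for min_x f(g(x)) with two averaged
    # filters, in the spirit of the abstract.  The problem instance (linear inner
    # map, quadratic outer map), the noise model, and the step sizes are
    # hypothetical; this is NOT the authors' exact NASA algorithm.
    import numpy as np

    rng = np.random.default_rng(0)
    d, m = 5, 3                        # dimensions of x and of the inner map g(x)

    A = rng.standard_normal((m, d))    # inner map g(x) = A x (observed with noise)
    B = rng.standard_normal((m, m))
    Q = B.T @ B + np.eye(m)            # outer map f(u) = 0.5 u^T Q u (smooth)

    def sample_inner(x):
        """Noisy value and Jacobian of g at x (hypothetical oracle)."""
        val_noise = 0.1 * rng.standard_normal(m)
        jac_noise = 0.1 * rng.standard_normal((m, d))
        return A @ x + val_noise, A + jac_noise

    def sample_outer_grad(u):
        """Noisy gradient of f at u (hypothetical oracle)."""
        return Q @ u + 0.1 * rng.standard_normal(m)

    x = rng.standard_normal(d)
    u = np.zeros(m)                    # filter for the inner value g(x)
    z = np.zeros(d)                    # filter for the composite gradient

    for k in range(1, 5001):
        tau = 1.0 / np.sqrt(k)         # one step-size sequence drives everything

        # Move x along the current composite-gradient estimate
        # (unconstrained here; NASA instead solves a projection/prox subproblem).
        x = x - tau * z

        # One sample of the inner map and of the outer gradient at the filter u.
        g_val, g_jac = sample_inner(x)
        f_grad = sample_outer_grad(u)

        # Averaged filters updated with the same O(tau) weight: single timescale.
        u = (1 - tau) * u + tau * g_val
        z = (1 - tau) * z + tau * (g_jac.T @ f_grad)

    true_grad = A.T @ (Q @ (A @ x))    # exact composite gradient at the final x
    print("||gradient filter - true gradient|| =", np.linalg.norm(z - true_grad))

Because both filters and the iterate x are updated with weights proportional to the same step size τ_k, no timescale separation is needed; the filters simply track the inner value and the composite gradient as x moves, which is the mechanism the abstract attributes to the two auxiliary averaged sequences.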
