Article

Distributed Variable Sample-Size Stochastic Optimization With Fixed Step-Sizes

Journal

IEEE TRANSACTIONS ON AUTOMATIC CONTROL
Volume 67, Issue 10, Pages 5630-5637

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TAC.2022.3179216

Keywords

Convergence; Stochastic processes; Cost function; Convex functions; Costs; Complexity theory; Distributed algorithms; Distributed optimization; multiagent systems; stochastic optimization; variance reduction

Funding

  1. National Natural Science Foundation of China [62003239, 62003240, 61733018]
  2. Shanghai Sailing Program
  3. Fundamental Research Funds for the Central Universities
  4. Shanghai Municipal Science and Technology Major Project [2021SHZDZX0100]
  5. Shanghai Municipal Commission of Science and Technology [19511132101]


Abstract
In this article, we consider distributed stochastic optimization over randomly switching networks, where agents collaboratively minimize the average of all agents' local expectation-valued convex cost functions. Owing to the stochasticity of gradient observations, the distributedness of local functions, and the randomness of communication topologies, distributed algorithms with an exact convergence guarantee under fixed step-sizes have not yet been achieved. This work incorporates a variance reduction scheme into the distributed stochastic gradient tracking algorithm, where local gradients are estimated by averaging a variable number of sampled gradients. For an independently and identically distributed random network, we show that all agents' iterates converge almost surely to the same optimal solution under fixed step-sizes. When the global cost function is strongly convex and the sample size increases at a geometric rate, we prove that the iterates converge geometrically to the unique optimal solution, and we establish the iteration, oracle, and communication complexities. The algorithm's performance, including rate and complexity analysis, is further investigated with constant step-sizes and a polynomially increasing sample size. Finally, the empirical performance of the algorithm is illustrated with numerical examples.
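The scheme described in the abstract can be sketched numerically. The following is a minimal illustration, not the authors' implementation: each agent holds a scalar quadratic cost f_i(x) = (x - b_i)^2 / 2 with additive Gaussian gradient noise, the network switches i.i.d. between two doubly stochastic (Metropolis) mixing matrices, and each agent runs gradient tracking with a fixed step-size while the per-iteration sample size grows geometrically. All names (noisy_grad, W_ring, W_star) and the specific constants (alpha = 0.1, growth rate 1.05) are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4                                    # number of agents
b = np.array([1.0, 3.0, -2.0, 6.0])      # local costs f_i(x) = 0.5 * (x - b_i)^2
x_star = b.mean()                        # minimizer of the average cost

def noisy_grad(x, b_i, n_samples):
    # variance reduction: average n_samples noisy gradients (x - b_i) + N(0, 1)
    noise = rng.normal(0.0, 1.0, size=n_samples).mean()
    return (x - b_i) + noise

# two doubly stochastic Metropolis mixing matrices: a ring graph and a star graph
W_ring = np.zeros((n, n))
for i in range(n):
    W_ring[i, i] = 1/3
    W_ring[i, (i + 1) % n] = 1/3
    W_ring[i, (i - 1) % n] = 1/3
W_star = np.diag([1/4] + [3/4] * (n - 1))
W_star[0, 1:] = 1/4
W_star[1:, 0] = 1/4

alpha = 0.1                              # fixed step-size
x = np.zeros(n)                          # agents' iterates
Nk = 1
g_prev = np.array([noisy_grad(x[i], b[i], Nk) for i in range(n)])
y = g_prev.copy()                        # gradient trackers

for k in range(1, 201):
    # i.i.d. switching network: pick a mixing matrix at random each iteration
    W = W_ring if rng.random() < 0.5 else W_star
    x = W @ x - alpha * y                # consensus step plus tracked-gradient descent
    Nk = int(np.ceil(1.05 ** k))         # geometrically increasing sample size
    g_new = np.array([noisy_grad(x[i], b[i], Nk) for i in range(n)])
    y = W @ y + g_new - g_prev           # gradient tracking update
    g_prev = g_new

print(np.max(np.abs(x - x_star)))        # worst-case distance to the optimum
```

Because both mixing matrices are doubly stochastic, the trackers preserve the invariant that their sum equals the sum of the latest gradient estimates, which is what lets a fixed step-size drive all agents to the common minimizer as the averaged-gradient variance shrinks.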

