Journal
2018 IEEE CONFERENCE ON DECISION AND CONTROL (CDC)
Volume -, Issue -, Pages 5753-5758
Publisher
IEEE
Keywords
Consensus networks; convex optimization; first-order methods; gradient descent; heavy-ball method; Nesterov's accelerated gradient method; stochastic performance; variance amplification
Funding
- National Science Foundation [ECCS-1809833]
- Air Force Office of Scientific Research [FA9550-16-1-0009]
Abstract
We study the performance of accelerated first-order optimization algorithms in the presence of additive white stochastic disturbances. For strongly convex quadratic problems, we explicitly evaluate the steady-state variance of the optimization variable in terms of the eigenvalues of the Hessian of the objective function. We demonstrate that, as the condition number grows, the variance amplification of both Nesterov's accelerated method and Polyak's heavy-ball method becomes significantly larger than that of standard gradient descent. In the context of distributed computation over networks, we examine the role of network topology and spatial dimensions in the performance of these first-order algorithms. For d-dimensional tori, we establish an explicit asymptotic dependence of the variance amplification on the network size and the corresponding condition number. Our results demonstrate the detrimental influence of acceleration on the amplification of stochastic disturbances and suggest that metrics other than convergence rate must be considered when evaluating the performance of optimization algorithms.
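The comparison described in the abstract can be illustrated numerically. The sketch below is not the paper's derivation: it treats each Hessian eigenvalue as an independent scalar mode, assumes white noise entering through the gradient evaluation, picks a hypothetical log-spaced spectrum and the conventional stepsize/momentum parameters for each method, and computes the steady-state variance of each mode by solving the discrete Lyapunov equation of the corresponding linear recursion.

```python
import numpy as np

def steady_state_var(A, B):
    """Steady-state variance of the first state of x_{k+1} = A x_k + B w_k,
    with w_k zero-mean unit-variance white noise, obtained by solving the
    discrete Lyapunov equation P = A P A^T + B B^T via vectorization."""
    n = A.shape[0]
    Q = (B @ B.T).reshape(-1)
    # Row-major vec identity: vec(A P A^T) = (A kron A) vec(P)
    P = np.linalg.solve(np.eye(n * n) - np.kron(A, A), Q).reshape(n, n)
    return P[0, 0]

def gd_var(lam, alpha):
    # Gradient descent mode: x+ = (1 - alpha*lam) x - alpha w
    return steady_state_var(np.array([[1 - alpha * lam]]),
                            np.array([[alpha]]))

def hb_var(lam, alpha, beta):
    # Heavy-ball mode: x+ = (1 + beta - alpha*lam) x - beta x_prev - alpha w
    A = np.array([[1 + beta - alpha * lam, -beta], [1.0, 0.0]])
    return steady_state_var(A, np.array([[alpha], [0.0]]))

def na_var(lam, alpha, beta):
    # Nesterov mode: x+ = (1+beta)(1-alpha*lam) x - beta(1-alpha*lam) x_prev - alpha w
    r = 1 - alpha * lam
    A = np.array([[(1 + beta) * r, -beta * r], [1.0, 0.0]])
    return steady_state_var(A, np.array([[alpha], [0.0]]))

m, L = 1.0, 1.0e4             # strong convexity / smoothness constants
kappa = L / m                 # condition number
eigs = np.logspace(0, 4, 50)  # hypothetical Hessian spectrum in [m, L]

# Conventional parameter choices for each method (an assumption here)
J_gd = sum(gd_var(lam, 2 / (L + m)) for lam in eigs)
J_hb = sum(hb_var(lam, 4 / (np.sqrt(L) + np.sqrt(m)) ** 2,
                  ((np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)) ** 2)
           for lam in eigs)
J_na = sum(na_var(lam, 1 / L,
                  (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1))
           for lam in eigs)

print(f"J_gd = {J_gd:.3e}, J_hb = {J_hb:.3e}, J_na = {J_na:.3e}")
```

With a large condition number, both accelerated methods exhibit noticeably larger total steady-state variance than gradient descent, consistent with the abstract's claim; the exact ratios depend on the assumed spectrum and parameters.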