Journal
SIAM JOURNAL ON OPTIMIZATION
Volume 26, Issue 1, Pages 57-95
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/15M1009597
Keywords
convex optimization; first-order methods; Nesterov's method; Heavy-ball method; proximal gradient methods; semidefinite programming; integral quadratic constraints; control theory
Funding
- NSF CISE Expeditions award [CCF-1139158]
- LBNL award [7076018]
- DARPA XData award [FA8750-12-2-0331]
- AFOSR [FA9550-12-1-0339, FA9550-13-1-0138]
- NASA [NRA NNX12AM55A]
- ONR [N00014-11-1-0723, N00014-13-1-0129]
- NSF [CCF-1148243]
- Sloan Research Fellowship
This paper develops a new framework to analyze and design iterative optimization algorithms built on the notion of integral quadratic constraints (IQCs) from robust control theory. IQCs provide sufficient conditions for the stability of complicated interconnected systems, and these conditions can be checked by semidefinite programming. We discuss how to adapt IQC theory to study optimization algorithms, proving new inequalities about convex functions and providing a version of IQC theory adapted for use by optimization researchers. Using these inequalities, we derive numerical upper bounds on convergence rates for the Gradient method, the Heavy-ball method, Nesterov's accelerated method, and related variants by solving small, simple semidefinite programming problems. We also briefly show how these techniques can be used to search for optimization algorithms with desired performance characteristics, establishing a new methodology for algorithm design.
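To make the approach described in the abstract concrete, the sketch below specializes the IQC analysis to the one-dimensional gradient method x⁺ = x − α∇f(x) on an m-strongly convex, L-smooth function. The sector (co-coercivity) constraint on the gradient yields a 2×2 linear matrix inequality in a Lyapunov variable p and a multiplier λ; the smallest ρ for which the LMI is feasible is an upper bound on the convergence rate. The function name, the grid search over λ (in place of a proper SDP solver), and the bisection tolerances are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rate_bound_gradient(m, L, alpha):
    """Smallest rho for which the sector-IQC LMI is feasible (bisection).

    Scalar-state gradient method: A = 1, B = -alpha.
    By homogeneity in (p, lambda) we fix p = 1 and require
        [[(A^2 - rho^2) + lam*s11,  A*B + lam*s12],
         [A*B + lam*s12,            B^2 + lam*s22]]  <=  0  (NSD)
    where S = [[-2mL, m+L], [m+L, -2]] encodes the sector
    constraint satisfied by the gradient of an m-strongly
    convex, L-smooth function.
    """
    A, B = 1.0, -alpha
    s11, s12, s22 = -2.0 * m * L, m + L, -2.0
    # Grid over the IQC multiplier lambda (a stand-in for an SDP solver).
    lams = np.linspace(0.0, 1.0, 200001)

    def feasible(rho):
        w11 = (A**2 - rho**2) + lams * s11
        w12 = A * B + lams * s12
        w22 = B**2 + lams * s22
        det = w11 * w22 - w12**2
        # A 2x2 symmetric matrix is NSD iff both diagonals are <= 0
        # and the determinant is >= 0 (small tolerances for roundoff).
        return np.any((w11 <= 1e-12) & (w22 <= 1e-12) & (det >= -1e-12))

    lo, hi = 0.0, 1.0  # rho = 1 is always feasible here
    while hi - lo > 1e-5:
        mid = 0.5 * (lo + hi)
        if feasible(mid):
            hi = mid
        else:
            lo = mid
    return hi

rho = rate_bound_gradient(m=1.0, L=10.0, alpha=1.0 / 10.0)
print(rho)  # approximately 1 - m/L = 0.9, the known rate for step size 1/L
```

For the step size α = 1/L the certified bound matches the classical contraction factor 1 − m/L, illustrating the abstract's claim that these small semidefinite feasibility problems recover (and numerically certify) convergence rates for first-order methods.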