Article

Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization

Journal

IEEE Transactions on Information Theory
Volume 58, Issue 5, Pages 3235-3249

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TIT.2011.2182178

Keywords

Computational learning theory; convex optimization; Fano's inequality; information-based complexity; minimax analysis; oracle complexity

Funding

  1. National Science Foundation [DMS-0707060, DMS-0830410, DMS-0605165, DMS-0907632]
  2. DARPA [HR0011-08-2-0002]
  3. Microsoft
  4. Air Force Office of Scientific Research [AFOSR-09NL184]
  5. Directorate for Computer & Information Science & Engineering, Division of Computing and Communication Foundations, National Science Foundation [1115788]


Relative to the large literature on upper bounds on the complexity of convex optimization, relatively less attention has been paid to the fundamental hardness of these problems. Given the extensive use of convex optimization in machine learning and statistics, gaining an understanding of these complexity-theoretic issues is important. In this paper, we study the complexity of stochastic convex optimization in an oracle model of computation. We introduce a new notion of discrepancy between functions, and use it to reduce problems of stochastic convex optimization to statistical parameter estimation, which can be lower bounded using information-theoretic methods. Using this approach, we improve upon known results and obtain tight minimax complexity estimates for various function classes.
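The information-theoretic machinery behind such reductions typically rests on Fano's inequality, one of the keywords listed above. As a hedged sketch of the standard argument (not necessarily the exact form used in this paper): if an index V is drawn uniformly from a packing of M well-separated functions and Y^n denotes the n oracle responses observed by the algorithm, then any estimator \hat{V} of the index satisfies

```latex
\mathbb{P}\bigl[\hat{V} \neq V\bigr] \;\ge\; 1 - \frac{I(V; Y^{n}) + \log 2}{\log M}.
```

Keeping the mutual information I(V; Y^n) small while the packing size M grows forces a nonvanishing error probability for identifying V, which, via the separation of the packed functions, translates into a minimax lower bound on optimization error.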

