Journal
IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 58, Issue 5, Pages 3235-3249
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2011.2182178
Keywords
Computational learning theory; convex optimization; Fano's inequality; information-based complexity; minimax analysis; oracle complexity
Funding
- National Science Foundation [DMS-0707060, DMS-0830410, DARPA-HR0011-08-2-0002, DMS-0605165, DMS-0907632]
- Microsoft
- Air Force Office of Scientific Research [AFOSR-09NL184]
- Division of Computing and Communication Foundations
- Directorate for Computer & Information Science & Engineering [1115788] Funding Source: National Science Foundation
Relative to the large literature on upper bounds on the complexity of convex optimization, less attention has been paid to the fundamental hardness of these problems. Given the extensive use of convex optimization in machine learning and statistics, an understanding of these complexity-theoretic issues is important. In this paper, we study the complexity of stochastic convex optimization in an oracle model of computation. We introduce a new notion of discrepancy between functions, and use it to reduce problems of stochastic convex optimization to statistical parameter estimation, which can be lower bounded using information-theoretic methods. Using this approach, we improve upon known results and obtain tight minimax complexity estimates for various function classes.
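To make the oracle model concrete, the following is a minimal sketch (not the paper's construction) of the standard setting the abstract refers to: an algorithm may only query a stochastic first-order oracle that returns an unbiased, noisy gradient, and lower bounds in the paper quantify how many such queries any method needs. All names and parameters here are illustrative assumptions.

```python
import random

def stochastic_oracle(x, mean=0.3, noise_std=1.0, rng=random):
    # Stochastic first-order oracle for f(x) = 0.5 * (x - mean)^2:
    # returns the true gradient (x - mean) plus zero-mean Gaussian noise,
    # so the oracle answer is an unbiased estimate of the gradient.
    return (x - mean) + rng.gauss(0.0, noise_std)

def projected_sgd(oracle, x0=0.0, steps=5000, lo=-1.0, hi=1.0):
    # Projected stochastic gradient descent with step size 1/t,
    # a standard choice for strongly convex objectives.
    x = x0
    for t in range(1, steps + 1):
        x -= oracle(x) / t
        x = max(lo, min(hi, x))  # project back onto the feasible set [lo, hi]
    return x

rng = random.Random(0)
x_hat = projected_sgd(lambda x: stochastic_oracle(x, rng=rng))
# x_hat should be close to the true minimizer 0.3; the oracle complexity
# question is how fast any algorithm's error can shrink with the query budget.
print(abs(x_hat - 0.3))
```

The paper's lower bounds show that, over natural convex function classes, no algorithm interacting with such an oracle can estimate the minimizer faster than an information-theoretic rate, established by reduction to parameter estimation and Fano's inequality.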