Journal
ANNALS OF STATISTICS
Volume 37, Issue 6A, Pages 3498-3528
Publisher
INST MATHEMATICAL STATISTICS
DOI: 10.1214/09-AOS683
Keywords
Model selection; sparse recovery; high dimensionality; concave penalty; regularized least squares; weak oracle property
Funding
- NSF [DMS-08-06030, DMS-09-06784]
- USC's James H. Zumberge Faculty Research and Innovation Fund
- Directorate For Mathematical & Physical Sciences
- Division Of Mathematical Sciences [0806030] Funding Source: National Science Foundation
Abstract
Model selection and sparse recovery are two important problems for which many regularization methods have been proposed. We study the properties of regularization methods in both problems under the unified framework of regularized least squares with concave penalties. For model selection, we establish conditions under which a regularized least squares estimator enjoys a nonasymptotic property, called the weak oracle property, where the dimensionality can grow exponentially with sample size. For sparse recovery, we present a sufficient condition that ensures the recoverability of the sparsest solution. In particular, we approach both problems by considering a family of penalties that give a smooth homotopy between L-0 and L-1 penalties. We also propose the sequentially and iteratively reweighted squares (SIRS) algorithm for sparse recovery. Numerical studies support our theoretical results and demonstrate the advantage of our new methods for model selection and sparse recovery.
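The idea of reweighted least squares for sparse recovery can be illustrated with a minimal sketch. This is not the paper's SIRS algorithm itself, but a generic iteratively reweighted ridge scheme under assumed settings: each step solves a weighted regularized least squares problem whose weights grow as coefficients shrink, mimicking the effect of a concave penalty between L-0 and L-1. The function name, tolerances, and weight rule are illustrative choices, not taken from the paper.

```python
import numpy as np

def reweighted_sparse_ls(X, y, n_iter=50, eps=1e-3, lam=1e-4):
    """Illustrative iteratively reweighted least squares for sparse recovery.

    Hypothetical sketch (not the paper's SIRS): each iteration solves a
    weighted ridge problem where small coefficients receive large weights
    and are pushed further toward zero, approximating a concave penalty.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary LS start
    for _ in range(n_iter):
        # Concave-penalty surrogate: weight ~ 1 / (|beta| + eps),
        # so near-zero coefficients are penalized most heavily.
        w = 1.0 / (np.abs(beta) + eps)
        # Solve the weighted ridge system (X'X + lam * diag(w)) beta = X'y.
        beta = np.linalg.solve(X.T @ X + lam * np.diag(w), X.T @ y)
    return beta

# Noiseless toy example: 3 nonzero coefficients out of 20.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
beta_true = np.zeros(20)
beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true
beta_hat = reweighted_sparse_ls(X, y)
```

In this noiseless, overdetermined setting the reweighting leaves the large coefficients essentially untouched while driving the spurious ones toward zero; the interesting regime studied in the paper is the high-dimensional one, where the choice of penalty governs whether the sparsest solution is recovered.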