Journal
ANNALS OF STATISTICS
Volume 36, Issue 6, Pages 2717-2756
Publisher
INST MATHEMATICAL STATISTICS
DOI: 10.1214/07-AOS559
Keywords
Covariance matrices; correlation matrices; adjacency matrices; eigenvalues of covariance matrices; multivariate statistical analysis; high-dimensional inference; random matrix theory; sparsity; beta-sparsity
Funding
- NSF [DMS-06-05169]
- SANSI
Estimating covariance matrices is a problem of fundamental importance in multivariate statistics. In practice it is increasingly frequent to work with data matrices X of dimension n x p, where p and n are both large. Results from random matrix theory show very clearly that in this setting, standard estimators like the sample covariance matrix perform in general very poorly. In this large-n, large-p setting, practitioners are sometimes willing to assume that many elements of the population covariance matrix are equal to 0, and hence that this matrix is sparse. We develop an estimator to handle this situation. The estimator is shown to be consistent in operator norm when, for instance, p is asymptotic to n as n -> infinity. In other words, the largest singular value of the difference between the estimator and the population covariance matrix goes to zero. This implies consistency of all the eigenvalues and consistency of the eigenspaces associated with isolated eigenvalues. We also propose a notion of sparsity for matrices that is compatible with spectral analysis and is independent of the ordering of the variables.
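To illustrate the kind of estimator the abstract describes, the sketch below applies entrywise hard thresholding to the sample covariance matrix of a large n x p data matrix with a sparse population covariance. This is a minimal illustration of the general thresholding idea in this literature, not the paper's exact estimator; the threshold level sqrt(log p / n) is a common choice assumed here, and the tridiagonal Sigma is a made-up example.

```python
import numpy as np

def threshold_covariance(X, t=None):
    """Hard-threshold the sample covariance of an n x p data matrix X.

    Entries of the sample covariance with absolute value below t are set
    to 0. The default level t = sqrt(log p / n) is a common choice in the
    sparse-covariance literature (an assumption here, not the paper's
    specific tuning).
    """
    n, p = X.shape
    S = np.cov(X, rowvar=False)  # p x p sample covariance
    if t is None:
        t = np.sqrt(np.log(p) / n)
    return np.where(np.abs(S) >= t, S, 0.0)

# Example: sparse (tridiagonal) population covariance, p comparable to n.
rng = np.random.default_rng(0)
n, p = 2000, 200
Sigma = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

S = np.cov(X, rowvar=False)
S_hat = threshold_covariance(X)

# Operator-norm error, i.e. the largest singular value of the difference,
# which is the notion of consistency used in the abstract.
err_sample = np.linalg.norm(S - Sigma, 2)
err_thresh = np.linalg.norm(S_hat - Sigma, 2)
print("sample cov error:", err_sample)
print("thresholded error:", err_thresh)
```

When the population matrix is genuinely sparse, thresholding zeroes out the many small noise entries of the sample covariance, which is what drives the operator-norm improvement over the raw sample covariance in this regime.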