Article

SPARSISTENCY AND RATES OF CONVERGENCE IN LARGE COVARIANCE MATRIX ESTIMATION

Journal

ANNALS OF STATISTICS
Volume 37, Issue 6B, Pages 4254-4278

Publisher

INST MATHEMATICAL STATISTICS
DOI: 10.1214/09-AOS720

Keywords

Covariance matrix; high-dimensionality; consistency; nonconcave penalized likelihood; sparsistency; asymptotic normality

Funding

  1. NSF [DMS-03-54223, DMS-07-04337]
  2. NIH [R01-GM072611]

Abstract

This paper studies the sparsistency and rates of convergence for estimating sparse covariance and precision matrices based on penalized likelihood with nonconvex penalty functions. Here, sparsistency refers to the property that all parameters that are zero are actually estimated as zero with probability tending to one. Depending on the application, sparsity may occur a priori in the covariance matrix, its inverse or its Cholesky decomposition. We study these three sparsity exploration problems under a unified framework with a general penalty function. We show that the rates of convergence for these problems under the Frobenius norm are of order (s_n log p_n / n)^{1/2}, where s_n is the number of nonzero elements, p_n is the size of the covariance matrix and n is the sample size. This explicitly spells out that the contribution of high dimensionality is merely a logarithmic factor. The conditions on the rate at which the tuning parameter lambda_n goes to 0 are made explicit and compared across different penalties. As a result, for the L1-penalty, to guarantee sparsistency and the optimal rate of convergence, the number of nonzero elements must be small: s'_n = O(p_n) at most, among O(p_n^2) parameters, for estimating a sparse covariance or correlation matrix, a sparse precision or inverse correlation matrix, or a sparse Cholesky factor, where s'_n is the number of nonzero off-diagonal entries. In contrast, with the SCAD or hard-thresholding penalty functions, there is no such restriction.
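To make the contrast between the L1 and SCAD penalties concrete, the following sketch evaluates the SCAD penalty of Fan and Li (2001), which the abstract refers to, elementwise (e.g. on the off-diagonal entries of a covariance or precision estimate). The function name `scad_penalty` and the default a = 3.7 (the value suggested by Fan and Li) are illustrative choices, not from this paper:

```python
import numpy as np

def scad_penalty(t, lam, a=3.7):
    """SCAD penalty p_lam(t), applied elementwise.

    Behaves like the L1 penalty lam*|t| near zero, transitions
    quadratically for lam < |t| <= a*lam, and is constant at
    (a+1)*lam^2/2 beyond a*lam, so large entries are not penalized
    further (unlike L1, which shrinks all entries).
    """
    t = np.abs(np.asarray(t, dtype=float))
    # Quadratic transition piece, valid on lam < |t| <= a*lam.
    quad = -(t**2 - 2 * a * lam * t + lam**2) / (2 * (a - 1))
    return np.where(t <= lam, lam * t,
                    np.where(t <= a * lam, quad, (a + 1) * lam**2 / 2))
```

The flat tail is what removes the s'_n = O(p_n) restriction in the abstract: large nonzero entries incur a bounded penalty, so they introduce no extra bias, whereas the L1 penalty's bias grows with the magnitude of the entries.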

