Article

Shrinkage tuning parameter selection with a diverging number of parameters

Publisher

WILEY-BLACKWELL
DOI: 10.1111/j.1467-9868.2008.00693.x

Keywords

Bayesian information criterion; Diverging number of parameters; Lasso; Smoothly clipped absolute deviation

Funding

  1. National Natural Science Foundation of China [10771006, 10801086, 70621061, 70831003]
  2. National University of Singapore
  3. National University of Singapore Risk Management Institute

Abstract

Contemporary statistical research frequently deals with problems involving a diverging number of parameters. For those problems, various shrinkage methods (e.g. the lasso and smoothly clipped absolute deviation) are found to be particularly useful for variable selection. Nevertheless, the desirable performances of those shrinkage methods heavily hinge on an appropriate selection of the tuning parameters. With a fixed predictor dimension, Wang and co-workers have demonstrated that the tuning parameters selected by a Bayesian information criterion type criterion can identify the true model consistently. In this work, similar results are further extended to the situation with a diverging number of parameters for both unpenalized and penalized estimators. Consequently, our theoretical results further enlarge not only the scope of applicability of the traditional Bayesian information criterion type criteria but also that of those shrinkage estimation methods.
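As a rough illustration of the idea (a sketch, not the authors' method or code), the snippet below scores a grid of lasso tuning parameters with the fixed-dimension BIC-type criterion log(RSS_lambda/n) + df_lambda * log(n)/n and picks the minimizer. The use of scikit-learn's Lasso, the toy data, and taking the number of nonzero coefficients as df_lambda are all illustrative assumptions; the paper's modified criterion additionally inflates the penalty so that selection remains consistent when the number of parameters diverges.

import numpy as np
from sklearn.linear_model import Lasso

def bic_path(X, y, lambdas):
    """BIC-type score, log(RSS/n) + df * log(n)/n, for each candidate lambda."""
    n = X.shape[0]
    scores = []
    for lam in lambdas:
        fit = Lasso(alpha=lam, max_iter=10_000).fit(X, y)
        rss = np.sum((y - fit.predict(X)) ** 2)
        df = np.count_nonzero(fit.coef_)  # size of the selected model
        scores.append(np.log(rss / n) + df * np.log(n) / n)
    return np.asarray(scores)

# Toy usage: choose the tuning parameter that minimizes the criterion.
rng = np.random.default_rng(0)
n, p = 200, 50                    # in the paper, p is allowed to grow with n
X = rng.standard_normal((n, p))
y = X[:, :3] @ np.array([3.0, 1.5, 2.0]) + rng.standard_normal(n)
lambdas = np.logspace(-3, 0, 30)
best_lam = lambdas[np.argmin(bic_path(X, y, lambdas))]
print(f"selected lambda: {best_lam:.4f}")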
