Article

Multiple predicting K-fold cross-validation for model selection

Journal

JOURNAL OF NONPARAMETRIC STATISTICS
Volume 30, Issue 1, Pages 197-215

Publisher

TAYLOR & FRANCIS LTD
DOI: 10.1080/10485252.2017.1404598

Keywords

Cross-validation; K-fold cross-validation; model selection; tuning parameter selection

Funding

  1. National Research Foundation of Korea [NRF-2017R1C1B5017431]
  2. Korea University [K1705711]

Abstract

K-fold cross-validation (CV) is widely adopted as a model selection criterion. In K-fold CV, (K - 1) folds are used for model construction and the hold-out fold is allocated to model validation, so model construction receives more emphasis than the validation procedure. However, some studies have shown that placing more emphasis on the validation procedure may improve model selection. Specifically, leave-m-out CV with n samples can achieve variable-selection consistency when m/n approaches 1. In this study, a new CV method is proposed within the framework of K-fold CV. The proposed method uses (K - 1) folds of the data for model validation, while the remaining fold is used for model construction. This yields (K - 1) predicted values for each observation, which are averaged to produce a final predicted value. Model selection based on these averaged predicted values then reduces variation in the assessment due to the averaging. The variable-selection consistency of the suggested method is established, and its advantage over K-fold CV with finite samples is examined under linear, non-linear, and high-dimensional models.
