Article

Adaptive model selection

Journal

JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
Volume 97, Issue 457, Pages 210-221

Publisher

AMER STATISTICAL ASSOC
DOI: 10.1198/016214502753479356

Keywords

adaptive penalty; false discovery rate; optimal prediction; parametric and nonparametric regression; variable selection; wavelets


Most model selection procedures use a fixed penalty penalizing an increase in the size of a model. These nonadaptive selection procedures perform well in only one type of situation. For instance, the Bayesian information criterion (BIC), with its large penalty, performs well for small models and poorly for large models, whereas Akaike's information criterion (AIC) does just the opposite. This article proposes an adaptive model selection procedure that uses a data-adaptive complexity penalty based on a concept of generalized degrees of freedom. The proposed procedure, combining the benefits of a class of nonadaptive procedures, approximates the best performance of this class of procedures across a variety of different situations. This class includes many well-known procedures, such as AIC, BIC, Mallows's Cp, and the risk inflation criterion (RIC). The proposed procedure is applied to wavelet thresholding in nonparametric regression and to variable selection in least squares regression. Simulation results and an asymptotic analysis support the effectiveness of the proposed procedure.
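The nonadaptive criteria named in the abstract (AIC, BIC, Mallows's Cp, RIC) can all be viewed as fixed-penalty rules that select the model minimizing RSS + λ·k·σ², where k is the model size and λ is a constant. The sketch below illustrates only this fixed-penalty class, not the authors' adaptive procedure (which chooses the penalty data-adaptively via generalized degrees of freedom). Function names, the best-subset search, and the simulated data are illustrative assumptions, not part of the paper.

```python
import numpy as np
from itertools import combinations


def penalized_best_subset(X, y, sigma2, lam):
    """Best-subset least squares selection minimizing RSS + lam * k * sigma2,
    where k is the number of selected predictors.

    Illustrative fixed penalties (not the paper's adaptive choice):
      lam = 2          -> AIC / Mallows's Cp-type penalty
      lam = log(n)     -> BIC-type penalty
      lam = 2 * log(p) -> RIC-type penalty
    """
    n, p = X.shape
    best_score, best_subset = np.inf, ()
    for k in range(p + 1):
        for subset in combinations(range(p), k):
            if k == 0:
                rss = float(np.sum(y ** 2))          # null model: no predictors
            else:
                Xs = X[:, subset]
                beta, _, _, _ = np.linalg.lstsq(Xs, y, rcond=None)
                rss = float(np.sum((y - Xs @ beta) ** 2))
            score = rss + lam * k * sigma2           # fixed-penalty criterion
            if score < best_score:
                best_score, best_subset = score, subset
    return best_subset, best_score


if __name__ == "__main__":
    # Hypothetical simulated example: 3 true signals among 8 candidate predictors.
    rng = np.random.default_rng(0)
    n, p, sigma2 = 100, 8, 1.0
    X = rng.standard_normal((n, p))
    beta_true = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0])
    y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

    for name, lam in [("AIC/Cp", 2.0), ("BIC", np.log(n)), ("RIC", 2 * np.log(p))]:
        subset, _ = penalized_best_subset(X, y, sigma2, lam)
        print(f"{name:6s} (lam = {lam:.2f}): selected columns {subset}")
```

Each fixed λ favors a different regime (small λ for large true models, large λ for sparse ones); the adaptive procedure proposed in the article aims to track whichever member of this class performs best in the situation at hand.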

