Journal
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
Volume 102, Issue 480, Pages 1289-1299
Publisher
AMER STATISTICAL ASSOC
DOI: 10.1198/016214507000000950
Keywords
bootstrap; computational complexity; robust prediction; stepwise algorithm; Winsorization
Abstract
In this article we consider the problem of building a linear prediction model when the number of candidate predictors is large and the data possibly contain anomalies that are difficult to visualize and clean. We want to predict the nonoutlying cases; therefore, we need a method that is simultaneously robust and scalable. We consider the stepwise least angle regression (LARS) algorithm, which is computationally very efficient but sensitive to outliers. We introduce two different approaches to robustify LARS. The plug-in approach replaces the classical correlations in LARS with robust correlation estimates. The cleaning approach first transforms the data set by shrinking the outliers toward the bulk of the data (which we call multivariate Winsorization) and then applies LARS to the transformed data. We show that the plug-in approach is time-efficient and scalable and that the bootstrap can be used to stabilize its results. We recommend using bootstrapped robustified LARS to sequence a number of candidate predictors to form a reduced set from which a more refined model can be selected.
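To make the plug-in idea concrete, the sketch below shows one simple way to obtain a robust correlation estimate by Winsorizing each variable coordinatewise before computing the Pearson correlation. This is a minimal illustration of the general technique, not the paper's exact estimator; the cutoff constant `c` and the coordinatewise (rather than multivariate) shrinkage are assumptions made for brevity.

```python
import numpy as np

def winsorize_1d(x, c=2.0):
    # Shrink values farther than c robust-scale units from the median
    # back to the boundary (univariate Winsorization).
    med = np.median(x)
    mad = 1.4826 * np.median(np.abs(x - med))  # MAD, consistent for Gaussian data
    return np.clip(x, med - c * mad, med + c * mad)

def winsorized_corr(x, y, c=2.0):
    # Robust correlation estimate: Pearson correlation of the
    # coordinatewise-Winsorized data. (Illustrative only; the paper's
    # multivariate Winsorization shrinks outliers jointly, not per coordinate.)
    return np.corrcoef(winsorize_1d(x, c), winsorize_1d(y, c))[0, 1]
```

A plug-in robustification of a correlation-driven stepwise algorithm would substitute an estimate like `winsorized_corr` wherever the classical sample correlation is used, so that a few gross outliers cannot dominate the variable-selection path.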