Journal: ELECTRONIC JOURNAL OF STATISTICS
Volume 7, Pages 3124-3169
Publisher: INST MATHEMATICAL STATISTICS
DOI: 10.1214/14-EJS875
Keywords
Lasso; irrepresentable condition; Lasso+mLS and Lasso+Ridge; sparsity; asymptotic unbiasedness; asymptotic normality; residual bootstrap
We study the asymptotic properties of Lasso+mLS and Lasso+Ridge under the sparse high-dimensional linear regression model: Lasso selects predictors and then modified Least Squares (mLS) or Ridge estimates their coefficients. First, we propose a valid inference procedure for parameter estimation based on a residual bootstrap after Lasso+mLS and Lasso+Ridge. Second, we derive the asymptotic unbiasedness of Lasso+mLS and Lasso+Ridge. More specifically, we show that their biases decay at an exponential rate and that they can achieve the oracle convergence rate of s/n (where s is the number of nonzero regression coefficients and n is the sample size) for mean squared error (MSE). Third, we show that Lasso+mLS and Lasso+Ridge are asymptotically normal. They have an oracle property in the sense that they can select the true predictors with probability converging to 1, and the estimates of the nonzero parameters have the same asymptotic normal distribution that they would have if the zero parameters were known in advance. In fact, our analysis is not limited to adopting Lasso in the selection stage, but is applicable to any other model selection criterion with exponentially decaying probability of selecting a wrong model.
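The two-stage procedure and the residual bootstrap described in the abstract can be sketched as follows. This is an illustrative toy implementation, not the authors' code: the Lasso solver (plain proximal gradient / ISTA), the simulated data, and all parameter values (n, p, s, the penalty level lam, the number of bootstrap replicates) are assumptions chosen for the demonstration, and "mLS" is approximated here by an ordinary least-squares refit on the Lasso-selected support.

```python
# Illustrative sketch (not the paper's code): two-stage Lasso+mLS with a
# residual-bootstrap interval, on simulated sparse high-dimensional data.
import numpy as np

rng = np.random.default_rng(0)
n, p, s = 200, 50, 3                     # sample size, dimension, sparsity
beta_true = np.zeros(p)
beta_true[:s] = 5.0                      # strong nonzero signals
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.5 * rng.standard_normal(n)

def soft(z, t):
    """Soft-thresholding operator, the prox of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, iters=1000):
    """Minimize (1/2n)||y - Xb||^2 + lam ||b||_1 by proximal gradient."""
    n = len(y)
    step = 1.0 / np.linalg.eigvalsh(X.T @ X / n)[-1]
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ b - y) / n
        b = soft(b - step * grad, step * lam)
    return b

def lasso_mls(X, y, lam):
    """Stage 1: Lasso selects a support; stage 2: least-squares refit on it."""
    support = np.flatnonzero(lasso_ista(X, y, lam))
    coef = np.linalg.lstsq(X[:, support], y, rcond=None)[0]
    return support, coef

lam = 0.1                                # roughly sigma * sqrt(2 log(p) / n)
support, coef = lasso_mls(X, y, lam)

# Residual bootstrap: resample centered refit residuals, regenerate y,
# and re-run the entire two-stage procedure on each bootstrap sample.
fitted = X[:, support] @ coef
resid = y - fitted
resid -= resid.mean()
boot = []
for _ in range(50):
    y_star = fitted + rng.choice(resid, size=n, replace=True)
    s_star, c_star = lasso_mls(X, y_star, lam)
    full = np.zeros(p)
    full[s_star] = c_star
    boot.append(full[0])                 # track the first (nonzero) coefficient
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"selected support: {support}, 95% CI for beta_1: [{lo:.2f}, {hi:.2f}]")
```

Refitting after selection removes most of the Lasso's shrinkage bias on the selected coefficients, which is the mechanism behind the exponential bias decay and the oracle s/n MSE rate claimed in the abstract; swapping the least-squares refit for a small-penalty Ridge refit on `X[:, support]` gives the Lasso+Ridge variant.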