Article

Random weighting in LASSO regression

Journal

ELECTRONIC JOURNAL OF STATISTICS
Volume 16, Issue 1, Pages 3430-3481

Publisher

INST MATHEMATICAL STATISTICS-IMS
DOI: 10.1214/22-EJS2020

Keywords

Random weights; weighted likelihood bootstrap; weighted Bayesian bootstrap; LASSO; bootstrap; perturbation bootstrap; consistency; model selection consistency

Funding

  1. University of Wisconsin Institute for the Foundations of Data Science
  2. US National Science Foundation [1740707, 2023239]
  3. National Institutes of Health [P01CA250972]
  4. Direct For Mathematical & Physical Scien
  5. Division Of Mathematical Sciences [2023239] Funding Source: National Science Foundation
  6. Division of Computing and Communication Foundations
  7. Direct For Computer & Info Scie & Enginr [1740707] Funding Source: National Science Foundation

Abstract

We establish statistical properties of random-weighting methods in LASSO regression under different regularization parameters λ_n and suitable regularity conditions. The random-weighting methods in view concern repeated optimization of a randomized objective function, motivated by the need for computationally efficient uncertainty quantification in contemporary estimation settings. In the context of LASSO regression, we repeatedly assign analyst-drawn random weights to terms in the objective function, and optimize to obtain a sample of random-weighting estimators. We show that existing approaches have conditional model selection consistency and conditional asymptotic normality at different growth rates of λ_n as n → ∞. We propose an extension to the available random-weighting methods and establish that the resulting samples attain conditional sparse normality and conditional consistency in a growing-dimension setting. We illustrate the proposed methodology using synthetic and benchmark data sets, and we discuss the relationship of the results to approximate nonparametric Bayesian analysis and to perturbation bootstrap methods.
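The core procedure the abstract describes — repeatedly drawing analyst-chosen random weights, then solving a weighted LASSO to obtain a sample of random-weighting estimators — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: Exp(1) weights (as in the weighted Bayesian bootstrap), a synthetic sparse regression, and a plain coordinate-descent solver are all assumptions made here for demonstration.

```python
import numpy as np

def weighted_lasso_cd(X, y, w, lam, n_sweeps=200):
    """Coordinate descent for the weighted LASSO:
    minimize (1/2n) * sum_i w_i * (y_i - x_i' b)^2 + lam * ||b||_1.
    Illustrative solver, not the paper's algorithm."""
    n, p = X.shape
    beta = np.zeros(p)
    Xw = X * w[:, None]                      # each row of X scaled by its weight
    z = (Xw * X).sum(axis=0) / n             # z_j = (1/n) sum_i w_i x_ij^2
    r = y - X @ beta                         # full residual
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * beta[j]           # partial residual excluding coord j
            rho = (Xw[:, j] @ r) / n         # weighted correlation with residual
            # soft-thresholding update for coordinate j
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z[j]
            r -= X[:, j] * beta[j]
    return beta

# Synthetic sparse-regression example (hypothetical data, not from the paper)
rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 0.0])
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# Random-weighting sample: redraw Exp(1) weights, re-solve, collect estimators
B, lam = 200, 0.1
draws = np.empty((B, p))
for b in range(B):
    w = rng.exponential(1.0, size=n)         # analyst-drawn Exp(1) weights
    draws[b] = weighted_lasso_cd(X, y, w, lam)
```

The collected `draws` then serve as a conditional sample for uncertainty quantification, e.g. coordinate-wise quantiles as interval estimates, in the spirit of the weighted Bayesian bootstrap and perturbation bootstrap the abstract mentions.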

