Article

Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension

Journal

JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
Volume 107, Issue 497, Pages 214-222

Publisher

TAYLOR & FRANCIS INC
DOI: 10.1080/01621459.2012.656014

Keywords

Penalized quantile regression; SCAD; Sparsity; Ultra-high-dimensional data

Funding

  1. NSF, Directorate for Mathematical & Physical Sciences, Division of Mathematical Sciences [DMS-0348869, DMS-0905561, DMS-1007603, DMS-1055210]
  2. NIH/NCI [R01 CA-149569]
  3. NNSF of China [11028103, 10911120395]
  4. NIDA, NIH [R21 DA024260, P50 DA10075]

Abstract

Ultra-high-dimensional data often display heterogeneity due to either heteroscedastic variance or other forms of non-location-scale covariate effects. To accommodate heterogeneity, we advocate a more general interpretation of sparsity, which assumes that only a small number of covariates influence the conditional distribution of the response variable, given all candidate covariates; however, the sets of relevant covariates may differ when we consider different segments of the conditional distribution. In this framework, we investigate the methodology and theory of nonconvex penalized quantile regression in ultra-high dimension. The proposed approach has two distinctive features: (1) it enables us to explore the entire conditional distribution of the response variable, given the ultra-high-dimensional covariates, and provides a more realistic picture of the sparsity pattern; (2) it requires substantially weaker conditions than alternative methods in the literature, which greatly alleviates the difficulty of model checking in the ultra-high dimension. In the theoretical development, it is challenging to deal with both the nonsmooth loss function and the nonconvex penalty function in the ultra-high-dimensional parameter space. We introduce a novel sufficient optimality condition that relies on a convex differencing representation of the penalized loss function and the subdifferential calculus. Exploiting this optimality condition enables us to establish the oracle property for sparse quantile regression in the ultra-high dimension under relaxed conditions. The proposed method greatly enhances existing tools for ultra-high-dimensional data analysis. Monte Carlo simulations demonstrate the usefulness of the proposed procedure. The real-data example we analyze shows that the new approach reveals substantially more information than alternative methods. This article has online supplementary material.
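
For readers who want the formulation behind the abstract, a minimal sketch of the penalized objective follows; the notation (n observations, p candidate covariates, quantile level tau, tuning parameters lambda and a) is generic rather than taken from the article itself.

    \min_{\beta \in \mathbb{R}^p} \; \frac{1}{n} \sum_{i=1}^{n} \rho_\tau\!\left(y_i - x_i^{\top}\beta\right) + \sum_{j=1}^{p} p_\lambda\!\left(|\beta_j|\right),
    \qquad \rho_\tau(u) = u\left\{\tau - I(u < 0)\right\},

where \rho_\tau is the quantile check loss and p_\lambda is a nonconvex penalty such as SCAD, commonly specified through its derivative

    p_\lambda'(t) = \lambda \left\{ I(t \le \lambda) + \frac{(a\lambda - t)_{+}}{(a - 1)\lambda}\, I(t > \lambda) \right\}, \qquad t > 0, \ a > 2.

The toy script below illustrates the heterogeneity idea from the abstract: the set of selected covariates can change with the quantile level. It uses scikit-learn's QuantileRegressor, which fits L1-penalized (lasso) quantile regression, as a convex stand-in rather than the SCAD-penalized estimator studied in the article; the simulated data, variable names, and tuning values are purely illustrative.

    import numpy as np
    from sklearn.linear_model import QuantileRegressor

    rng = np.random.default_rng(0)
    n, p = 300, 50
    X = rng.standard_normal((n, p))
    X[:, 1] = rng.uniform(0.0, 1.0, size=n)  # positive covariate driving the noise scale

    # Toy heteroscedastic model: X[:, 0] shifts the location of y, while X[:, 1]
    # scales the noise, so X[:, 1] matters in the tails but not at the median.
    y = 1.0 + 2.0 * X[:, 0] + (0.5 + 1.5 * X[:, 1]) * rng.standard_normal(n)

    for tau in (0.25, 0.50, 0.75):
        # L1-penalized quantile regression (convex surrogate for a SCAD-type penalty).
        fit = QuantileRegressor(quantile=tau, alpha=0.02, solver="highs").fit(X, y)
        active = np.flatnonzero(np.abs(fit.coef_) > 1e-8)
        print(f"tau = {tau:.2f}: selected covariates {active.tolist()}")

At the median the noise-scale covariate should tend to drop out, while at the outer quantiles it should tend to be retained, mirroring the quantile-dependent sparsity pattern the article advocates.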

Authors

Lan Wang, Yichao Wu, and Runze Li
