Article

Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection

Journal

Journal of the American Statistical Association
Volume 107, Issue 500, Pages 1533-1545

Publisher

American Statistical Association
DOI: 10.1080/01621459.2012.734178

Keywords

Group lasso penalty; Low rank matrix approximation; Multivariate regression; Penalized least squares; Sparsity; Stiefel manifold

Funding

  1. National Science Foundation (NSF), Division of Mathematical Sciences, Directorate for Mathematical & Physical Sciences [DMS-0907170, DMS-1007618, DMS-1208952]
  2. King Abdullah University of Science and Technology (KAUST) [KUS-CI-016-04]


Reduced-rank regression is an effective method for predicting multiple response variables from the same set of predictor variables. It reduces the number of model parameters and takes advantage of interrelations among the response variables, and hence improves predictive accuracy. We propose to select relevant variables for reduced-rank regression by using a sparsity-inducing penalty. We apply a group-lasso type penalty that treats each row of the matrix of regression coefficients as a group, and show that this penalty satisfies certain desirable invariance properties. We develop two numerical algorithms to solve the penalized regression problem and establish the asymptotic consistency of the proposed method. In particular, the manifold structure of the reduced-rank regression coefficient matrix is considered and studied in our theoretical analysis. In our simulation study and real data analysis, the new method is compared with several existing variable selection methods for multivariate regression and exhibits competitive performance in prediction and variable selection.
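The criterion described in the abstract can be written as minimizing ||Y - X B A'||_F^2 + lambda * sum_j ||b_j||_2 over a coefficient factor B (p x r) and an orthonormal factor A (q x r, a point on the Stiefel manifold), where b_j is the j-th row of B; a row shrunk exactly to zero removes predictor j from all responses at once. The sketch below illustrates one plausible way to compute such a fit by alternating a group-lasso update of B with an orthogonal Procrustes update of A. It is a minimal illustration under these assumptions, not the authors' published algorithms; the function name sparse_rrr, the least-squares initialization, and the stopping rule are hypothetical choices made for the example.

    import numpy as np

    def sparse_rrr(X, Y, rank, lam, n_iter=100, tol=1e-6):
        """Row-sparse reduced-rank regression by alternating minimization (illustrative sketch).

        Minimizes ||Y - X B A^T||_F^2 + lam * sum_j ||B[j, :]||_2 over
        B (p x rank) and A (q x rank) with orthonormal columns.
        """
        p = X.shape[1]
        # Initialize A from the leading right singular vectors of the unpenalized fit.
        C0 = np.linalg.lstsq(X, Y, rcond=None)[0]
        _, _, Vt = np.linalg.svd(X @ C0, full_matrices=False)
        A = Vt[:rank].T                       # q x rank, orthonormal columns
        B = np.zeros((p, rank))
        col_norms = (X ** 2).sum(axis=0)      # ||x_j||^2 for each predictor
        for _ in range(n_iter):
            B_old = B.copy()
            # B-step: group lasso with working response Z = Y A; each row of B is a group.
            Z = Y @ A
            for j in range(p):
                if col_norms[j] == 0:
                    continue
                R_j = Z - X @ B + np.outer(X[:, j], B[j])   # partial residual excluding x_j
                s_j = X[:, j] @ R_j                          # length-rank score for row j
                s_norm = np.linalg.norm(s_j)
                shrink = max(0.0, 1.0 - lam / (2.0 * s_norm)) if s_norm > 0 else 0.0
                B[j] = shrink * s_j / col_norms[j]           # groupwise soft-thresholding
            # A-step: orthogonal Procrustes update, A = U V^T from the SVD of Y^T X B.
            U, _, Vt = np.linalg.svd(Y.T @ (X @ B), full_matrices=False)
            A = U @ Vt
            if np.linalg.norm(B - B_old) <= tol * (1.0 + np.linalg.norm(B_old)):
                break
        return B, A    # fitted coefficient matrix is C = B @ A.T; zero rows of B drop predictors

A call such as sparse_rrr(X, Y, rank=2, lam=5.0) would return a B whose all-zero rows mark excluded predictors; in practice the rank and lambda would be chosen by a model-selection criterion such as cross-validation.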

