Article

Penalized principal logistic regression for sparse sufficient dimension reduction

Journal

Computational Statistics & Data Analysis
Volume 111, Pages 48-58

Publisher

Elsevier Science BV
DOI: 10.1016/j.csda.2016.12.003

Keywords

Max-SCAD penalty; Principal logistic regression; Sparse sufficient dimension reduction; Sufficient dimension reduction

Funding

  1. National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) [2015R1C1A1A01054913]


Sufficient dimension reduction (SDR) is a successful tool for reducing the dimensionality of predictors by finding the central subspace, the minimal subspace of the predictors that preserves all the regression information. When the predictor dimension is large, it is often assumed that only a small number of predictors are informative. In this regard, sparse SDR is desired, as it achieves variable selection and dimension reduction simultaneously. We propose principal logistic regression (PLR) as a new SDR tool and further develop its penalized version for sparse SDR. Asymptotic analysis shows that the penalized PLR enjoys the oracle property. Numerical investigation supports the advantageous performance of the proposed methods. © 2016 Elsevier B.V. All rights reserved.
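
Illustrative sketch

The abstract describes the method only at a high level. As a rough illustration of the general idea, the sketch below follows the common "principal machine" recipe: dichotomize the response at several cut points, fit a regularized logistic regression for each binary label, and take the leading singular directions of the stacked coefficient vectors as a basis of the estimated central subspace. This is a simplified stand-in, not the authors' penalized (max-SCAD) estimator; the function and parameter names (plr_directions_sketch, n_slices, n_directions) are illustrative assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression

def plr_directions_sketch(X, y, n_slices=5, n_directions=2, C=1.0):
    """Rough PLR-style SDR sketch: slice the response, fit logistic
    regressions, and extract leading directions from the coefficients."""
    # Standardize predictors so coefficient vectors are comparable across slices.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)

    # Interior quantiles of the response serve as dividing points.
    cuts = np.quantile(y, np.linspace(0, 1, n_slices + 2)[1:-1])

    coefs = []
    for c in cuts:
        labels = (y <= c).astype(int)            # binary label for this slice
        clf = LogisticRegression(C=C, solver="lbfgs", max_iter=1000)
        clf.fit(Xs, labels)
        coefs.append(clf.coef_.ravel())

    # Leading right-singular vectors of the stacked coefficient matrix
    # span the estimated dimension-reduction subspace.
    _, _, Vt = np.linalg.svd(np.vstack(coefs), full_matrices=False)
    return Vt[:n_directions].T                   # p x n_directions basis estimate

# Toy usage: y depends on X only through the single direction X[:, 0] - X[:, 1].
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 10))
y = (X[:, 0] - X[:, 1]) + 0.1 * rng.standard_normal(300)
B_hat = plr_directions_sketch(X, y, n_directions=1)

A sparse variant would replace the ridge-style penalty above with a sparsity-inducing penalty, such as the max-SCAD penalty named in the keywords, so that entire predictor rows of the coefficient matrix are shrunk to zero; that step is omitted in this sketch.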

