Article

Penalized principal logistic regression for sparse sufficient dimension reduction

Journal

COMPUTATIONAL STATISTICS & DATA ANALYSIS
Volume 111, Issue -, Pages 48-58

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.csda.2016.12.003

Keywords

Max-SCAD penalty; Principal logistic regression; Sparse sufficient dimension reduction; Sufficient dimension reduction

Funding

  1. National Research Foundation of Korea (NRF) - Korea government (MSIP) [2015R1C1A1A01054913]
  2. National Research Foundation of Korea [2015R1C1A1A01054913] Funding Source: Korea Institute of Science & Technology Information (KISTI), National Science & Technology Information Service (NTIS)

Abstract

Sufficient dimension reduction (SDR) is a successful tool for reducing the dimensionality of predictors by finding the central subspace, a minimal subspace of predictors that preserves all the regression information. When the predictor dimension is large, it is often assumed that only a small number of predictors is informative. In this regard, sparse SDR is desired to achieve variable selection and dimension reduction simultaneously. We propose principal logistic regression (PLR) as a new SDR tool and further develop its penalized version for sparse SDR. Asymptotic analysis shows that the penalized PLR enjoys the oracle property. Numerical investigation supports the advantageous performance of the proposed methods. (C) 2016 Elsevier B.V. All rights reserved.
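The sketch below is a minimal, illustrative interpretation of the PLR idea described in the abstract, not the authors' exact estimator: dichotomize the response at several cut points, fit one logistic regression per dichotomy, and take the leading principal directions of the stacked coefficient vectors as an estimated basis of the central subspace. The function name, the quantile cut points, and the use of scikit-learn's LogisticRegression are assumptions made for illustration; the paper's sparse version additionally applies a max-SCAD penalty for variable selection, which this sketch omits.

```python
# Illustrative PLR-style SDR sketch (hypothetical simplification, not the
# authors' penalized estimator).
import numpy as np
from sklearn.linear_model import LogisticRegression

def plr_sdr_directions(X, y, n_slices=5, n_directions=2):
    """Estimate SDR directions from slice-wise logistic-regression coefficients."""
    X = np.asarray(X, dtype=float)
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)           # standardize predictors
    cuts = np.quantile(y, np.linspace(0.1, 0.9, n_slices))
    coefs = []
    for c in cuts:
        z = (y > c).astype(int)                          # dichotomized response
        if z.min() == z.max():                           # skip degenerate slices
            continue
        # very weak regularization stands in for an unpenalized fit
        clf = LogisticRegression(C=1e6, max_iter=1000).fit(Xs, z)
        coefs.append(clf.coef_.ravel())
    B = np.vstack(coefs)                                 # (n_slices x p) coefficient matrix
    # leading right singular vectors span the estimated central subspace
    _, _, Vt = np.linalg.svd(B, full_matrices=False)
    return Vt[:n_directions].T                           # (p x n_directions) basis

# Toy single-index example: y depends on x only through x1 + x2
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = (X[:, 0] + X[:, 1]) ** 2 + 0.1 * rng.normal(size=500)
print(plr_sdr_directions(X, y, n_directions=1))
```

In this toy example the recovered direction should load mainly on the first two predictors, consistent with the single-index structure; a sparse variant would drive the remaining loadings exactly to zero.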

