Article

A GENERAL THEORY FOR NONLINEAR SUFFICIENT DIMENSION REDUCTION: FORMULATION AND ESTIMATION

Journal

ANNALS OF STATISTICS
Volume 41, Issue 1, Pages 221-249

Publisher

INST MATHEMATICAL STATISTICS
DOI: 10.1214/12-AOS1071

Keywords

Dimension reduction sigma-field; exhaustiveness; generalized sliced average variance estimator; generalized sliced inverse regression estimator; heteroscedastic conditional covariance operator; sufficient and complete dimension reduction classes; unbiasedness

Funding

  1. NSF [DMS-08-06058, DMS-11-06815]
  2. Division Of Mathematical Sciences
  3. Direct For Mathematical & Physical Scien [1106815] Funding Source: National Science Foundation

Abstract

In this paper we introduce a general theory for nonlinear sufficient dimension reduction, and explore its ramifications and scope. This theory subsumes recent work employing reproducing kernel Hilbert spaces, and reveals many parallels between linear and nonlinear sufficient dimension reduction. Using these parallels we analyze the properties of existing methods and develop new ones. We begin by characterizing dimension reduction at the general level of sigma-fields and proceed to that of classes of functions, leading to the notions of sufficient, complete and central dimension reduction classes. We show that, when it exists, the complete and sufficient class coincides with the central class, and can be unbiasedly and exhaustively estimated by a generalized sliced inverse regression estimator (GSIR). When completeness does not hold, this estimator captures only part of the central class. However, in these cases we show that a generalized sliced average variance estimator (GSAVE) can capture a larger portion of the class. Both estimators require no numerical optimization because they can be computed by spectral decomposition of linear operators. Finally, we compare our estimators with existing methods by simulation and on actual data sets.
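The abstract notes that both estimators avoid numerical optimization because they reduce to a spectral decomposition of linear operators. As an illustration of that point, here is a minimal NumPy sketch of a GSIR-style estimator. This is not the authors' implementation: the Gaussian kernel, the bandwidths `gamma_x`/`gamma_y`, the Tychonoff regularization `eps`, and the function name `gsir_sketch` are all assumptions made for the example.

```python
import numpy as np

def _center(K):
    """Doubly center a Gram matrix: H K H with H = I - 11'/n."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def _gauss_gram(Z, gamma):
    """Gaussian-kernel Gram matrix exp(-gamma * ||z_i - z_j||^2)."""
    sq = np.sum(Z * Z, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T))

def gsir_sketch(X, Y, d, gamma_x=0.5, gamma_y=0.5, eps=1e-2):
    """Return estimated sufficient predictors (n x d) at the sample points.

    Illustrative only: kernel choice, bandwidths, and the regularization
    scheme are assumptions, not the paper's prescription.
    """
    n = X.shape[0]
    Gx = _center(_gauss_gram(X, gamma_x))
    Gy = _center(_gauss_gram(Y, gamma_y))
    # Tychonoff-regularized factors standing in for inverse covariance
    # operators applied to covariance operators.
    Rx = np.linalg.solve(Gx + n * eps * np.eye(n), Gx)
    Ry = np.linalg.solve(Gy + n * eps * np.eye(n), Gy)
    # Candidate operator in Gram-matrix coordinates; its leading
    # eigenvectors are coefficient vectors of the estimated
    # dimension reduction functions f_j(.) = sum_i c_ij k(., x_i).
    M = Rx @ Ry
    vals, vecs = np.linalg.eig(M)
    order = np.argsort(-vals.real)[:d]
    C = vecs[:, order].real
    return Gx @ C  # f_j evaluated at the n sample points
```

The key structural point the abstract makes survives even in this toy version: the entire estimate comes from one eigendecomposition of an n-by-n matrix, with no iterative optimization.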

