Review

A brief review of linear sufficient dimension reduction through optimization

Journal

Journal of Statistical Planning and Inference
Volume 211, Pages 154-161

Publisher

Elsevier
DOI: 10.1016/j.jspi.2020.06.006

Keywords

Conditional independence; Distance covariance; Loss minimization; Mutual information; Response transformation

Abstract

In this paper, we review three families of methods in linear sufficient dimension reduction through optimization. Through minimization of general loss functions, we cast classical methods, such as ordinary least squares and sliced inverse regression, and modern methods, such as principal support vector machines and principal quantile regression, under a unified framework. Then we review sufficient dimension reduction methods through maximizing dependence measures, which include the distance covariance, the Hilbert-Schmidt independence criterion, the martingale difference divergence, and the expected conditional difference. Last but not least, we provide an information-theoretic perspective for the third family of sufficient dimension reduction methods.
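As a concrete illustration of the dependence-maximization family mentioned in the abstract, the Python sketch below estimates a single linear sufficient dimension reduction direction by maximizing the sample distance covariance between the projection X beta and the response Y. This is a minimal sketch under simplifying assumptions (one direction, a biased V-statistic estimate of distance covariance, and a generic derivative-free optimizer), not the authors' implementation; the names sdr_direction_dcov and distance_covariance_sq, and the synthetic example, are introduced here purely for illustration.

import numpy as np
from scipy.optimize import minimize

def _double_center(d):
    # Double-center a pairwise distance matrix (subtract row/column means, add grand mean).
    return d - d.mean(axis=0, keepdims=True) - d.mean(axis=1, keepdims=True) + d.mean()

def distance_covariance_sq(u, y):
    # Biased (V-statistic) estimate of the squared distance covariance
    # between two univariate samples u and y.
    a = _double_center(np.abs(u[:, None] - u[None, :]))
    b = _double_center(np.abs(y[:, None] - y[None, :]))
    return (a * b).mean()

def sdr_direction_dcov(X, y, n_restarts=5, seed=0):
    # Estimate one linear SDR direction by maximizing the sample distance
    # covariance between the projection X @ beta and the response y.
    rng = np.random.default_rng(seed)
    _, p = X.shape

    def neg_dcov(beta):
        beta = beta / np.linalg.norm(beta)  # only the direction matters, not the scale
        return -distance_covariance_sq(X @ beta, y)

    best = None
    for _ in range(n_restarts):
        res = minimize(neg_dcov, rng.standard_normal(p), method="Nelder-Mead")
        if best is None or res.fun < best.fun:
            best = res
    return best.x / np.linalg.norm(best.x)

# Synthetic single-index example: y depends on X only through X @ beta_true.
rng = np.random.default_rng(1)
n, p = 200, 5
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, -1.0, 0.0, 0.0, 0.0]) / np.sqrt(2.0)
y = np.sin(X @ beta_true) + 0.1 * rng.standard_normal(n)
print("estimated direction:", np.round(sdr_direction_dcov(X, y), 3))

In this toy single-index model the response depends on X only through X beta_true, so the recovered direction should align with beta_true up to sign; multi-dimensional reductions would instead optimize over a p-by-d basis matrix subject to an identifiability constraint.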
