Article

Stochastic mutual information gradient estimation for dimensionality reduction networks

Journal

INFORMATION SCIENCES
Volume 570, Pages 298-305

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2021.04.066

Keywords

Feature projection; Dimensionality reduction; Neural networks; Information theoretic learning; Mutual information; Stochastic gradient estimation; MMINet

Funding

  1. NSF [IIS-1149570, CNS-1544895, IIS-1715858]
  2. NIH [R01DC009834]
  3. DHHS [90RE5017-02-01]

Abstract

This study introduces an information theoretic feature transformation protocol as an end-to-end neural network training approach, achieving supervised dimensionality reduction with experimental evaluation on high-dimensional biological data sets.

Feature ranking and selection is a widely used approach in various applications of supervised dimensionality reduction in discriminative machine learning. Nevertheless, there is significant evidence that feature ranking and selection algorithms based on any single criterion can lead to sub-optimal solutions for class separability. In that regard, we introduce an information theoretic feature transformation protocol as an end-to-end neural network training approach. We present a dimensionality reduction network (MMINet) training procedure based on a stochastic estimate of the mutual information gradient. The network projects high-dimensional features onto an output feature space where the lower dimensional representations carry maximum mutual information with their associated class labels. Furthermore, we formulate the training objective so that it can be estimated non-parametrically, with no distributional assumptions. We experimentally evaluate our method on high-dimensional biological data sets and relate it to conventional feature selection algorithms, which form a special case of our approach. (c) 2021 Elsevier Inc. All rights reserved.
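
The abstract does not give implementation details, so the following is only a minimal sketch of the general idea it describes: a projection network trained by stochastic gradients of a non-parametric, kernel-based plug-in estimate of I(z; y) computed on mini-batches. All names (projector, parzen_entropy, train_step), the 100-to-2 projection, and the fixed-width Gaussian kernel are illustrative assumptions, not the authors' MMINet estimator.

```python
# Minimal sketch, not the authors' MMINet implementation: train a projection
# network so that its low-dimensional output z = f(x) carries high mutual
# information I(z; y) with discrete class labels y. Here I(z; y) = H(z) - H(z|y)
# is estimated non-parametrically on each mini-batch with a Gaussian-kernel
# (Parzen) plug-in, and autograd supplies the stochastic gradient.
import math
import torch
import torch.nn as nn

def pairwise_sq_dists(z):
    # Squared Euclidean distances between all pairs of rows of z.
    sq = (z ** 2).sum(dim=1, keepdim=True)
    return sq + sq.t() - 2.0 * z @ z.t()

def parzen_entropy(z, sigma=1.0):
    # Resubstitution (plug-in) estimate of the differential entropy H(z)
    # using an isotropic Gaussian kernel of width sigma.
    n, d = z.shape
    log_kernels = -pairwise_sq_dists(z) / (2.0 * sigma ** 2)
    log_p = (torch.logsumexp(log_kernels, dim=1)
             - math.log(n)
             - 0.5 * d * math.log(2.0 * math.pi * sigma ** 2))
    return -log_p.mean()

def mutual_information(z, y, sigma=1.0):
    # For discrete labels: I(z; y) = H(z) - sum_c p(c) * H(z | y = c).
    h_z = parzen_entropy(z, sigma)
    h_z_given_y = 0.0
    for c in torch.unique(y):
        z_c = z[y == c]
        if z_c.shape[0] > 1:
            h_z_given_y = h_z_given_y + (z_c.shape[0] / z.shape[0]) * parzen_entropy(z_c, sigma)
    return h_z - h_z_given_y

# Hypothetical dimensions: project 100-dimensional inputs to 2 dimensions.
projector = nn.Linear(100, 2, bias=False)
optimizer = torch.optim.Adam(projector.parameters(), lr=1e-3)

def train_step(x_batch, y_batch):
    # One stochastic update: ascend the mini-batch mutual information estimate.
    optimizer.zero_grad()
    z = projector(x_batch)
    loss = -mutual_information(z, y_batch)  # maximize MI = minimize its negative
    loss.backward()
    optimizer.step()
    return -loss.item()
```

Note that with a fixed kernel width this plug-in estimate can be inflated simply by rescaling the projection, so a practical implementation would normalize the output or adapt sigma to the data; the paper's actual non-parametric estimator and its gradient computation should be taken from the article itself.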
