Article

Ensemble Estimation of Generalized Mutual Information With Applications to Genomics

Journal

IEEE Transactions on Information Theory
Volume 67, Issue 9, Pages 5963-5996

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TIT.2021.3100108

Keywords

convergence; estimation; random variables; feature extraction; entropy; density measurement; kernel; mutual information; nonparametric estimation; central limit theorem; single-cell data; feature selection; minimax rate

Funding

  1. U.S. Army Research Office [W911NF1910269, W911NF1510479]
  2. National Nuclear Security Administration, U.S. Department of Energy [DE-NA0003921]
  3. U.S. Department of Defense (DOD) [W911NF1510479]

Abstract

This paper introduces generalized mutual information measures and proposes an ensemble estimator, GENIE, for estimating mutual information between continuous and mixed discrete-continuous variables. The estimator achieves the parametric 1/N mean squared error convergence rate and handles the mixed-variable settings that commonly arise in practice.
Mutual information is a measure of the dependence between random variables that has been used successfully in myriad applications across many fields. Generalized mutual information measures that go beyond classical Shannon mutual information have also received much interest in these applications. We derive the mean squared error convergence rates of kernel density-based plug-in estimators of general mutual information measures between two multidimensional random variables X and Y for two cases: 1) X and Y are continuous; 2) X and Y may have a mixture of discrete and continuous components. Using the derived rates, we propose an ensemble estimator of these information measures, called GENIE, obtained by taking a weighted sum of the plug-in estimators with varied bandwidths. The resulting ensemble estimators achieve the 1/N parametric mean squared error convergence rate when the conditional densities of the continuous variables are sufficiently smooth. To the best of our knowledge, this is the first nonparametric mutual information estimator known to achieve the parametric convergence rate for the mixture case, which frequently arises in applications (e.g., variable selection in classification). The estimator is simple to implement: it requires only simple plug-in estimators and the solution to an offline convex optimization problem. A central limit theorem is also derived for the ensemble estimators, and minimax rates are derived for the continuous case. We demonstrate the ensemble estimator for the mixed case on simulated data and apply the proposed estimator to analyze gene relationships in single-cell data.
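The abstract describes GENIE operationally: form kernel density plug-in estimates of the mutual information at several bandwidths, then combine them with weights chosen by an offline convex program designed to cancel lower-order bias terms. The following Python sketch illustrates this recipe for scalar X and Y with Shannon mutual information. It is not the authors' implementation: the function names, the bandwidth grid, the bandwidth scaling N^(-1/(d+1)), and the particular bias-cancellation constraints are all illustrative assumptions; the paper's derived bias expansions determine the correct constraint set.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import gaussian_kde

    def plugin_mi(x, y, factor):
        """KDE plug-in estimate of Shannon MI I(X;Y) at one bandwidth factor."""
        xy = np.vstack([x, y])
        f_xy = gaussian_kde(xy, bw_method=factor)  # joint density estimate
        f_x = gaussian_kde(x, bw_method=factor)    # marginal density estimates
        f_y = gaussian_kde(y, bw_method=factor)
        # Average the log density ratio over the samples (the plug-in principle).
        return float(np.mean(np.log(f_xy(xy) / (f_x(x) * f_y(y)))))

    def ensemble_weights(params, d):
        """Offline convex program: minimize ||w||^2 subject to sum(w) = 1 and
        illustrative bias-cancellation constraints sum_l w_l * l^(i/d) = 0."""
        L = np.asarray(params, dtype=float)
        cons = [{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}]
        for i in range(1, d):  # assumed constraint set, not the paper's exact one
            cons.append({"type": "eq", "fun": lambda w, i=i: np.sum(w * L ** (i / d))})
        start = np.full(len(L), 1.0 / len(L))
        return minimize(lambda w: np.sum(w ** 2), start, constraints=cons).x

    def ensemble_mi(x, y, params=(1.0, 1.5, 2.0, 2.5, 3.0)):
        """Weighted sum of KDE plug-in estimators with varied bandwidths."""
        d = 2                               # joint dimension of (X, Y) in this toy case
        scale = len(x) ** (-1.0 / (d + 1))  # assumed bandwidth decay rate in N
        ests = np.array([plugin_mi(x, y, l * scale) for l in params])
        return float(ensemble_weights(params, d) @ ests)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x = rng.normal(size=500)
        y = x + 0.5 * rng.normal(size=500)  # correlated pair; true MI ~ 0.80 nats
        print(ensemble_mi(x, y))

Because the weights depend only on the bandwidth grid and not on the data, the convex program can be solved once offline, which is what makes the ensemble step essentially free at estimation time.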
