Article

Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties

Journal

ENTROPY
Volume 14, Issue 6, Pages 1103-1126

Publisher

MDPI AG
DOI: 10.3390/e14061103

Keywords

mutual information; non-Gaussianity; maximum entropy distributions; non-Gaussian noise

Funding

Fundação para a Ciência e a Tecnologia (FCT) [PEst-OE/CTE/LA0019/2011]

Abstract

The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X and Y, compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds that increasingly approach the true MI. In particular, for bivariate distributions with standard Gaussian marginals, this yields a decomposition of the MI into two positive terms: the Gaussian MI (I_g), depending upon the Gaussian correlation, i.e., the correlation between the 'Gaussianized' variables, and a non-Gaussian MI (I_ng), coinciding with the joint negentropy and depending upon nonlinear correlations. Joint moments of a prescribed total order p are bounded within a compact set defined by Schwarz-like inequalities, where I_ng grows from zero at the 'Gaussian manifold', where the moments are those of Gaussian distributions, towards infinity at the set's boundary, where a deterministic relationship holds. Sources of joint non-Gaussianity are systematized by estimating I_ng between the input and output of a nonlinear synthetic channel contaminated by multiplicative and non-Gaussian additive noises, over a full range of signal-to-noise ratio (snr) variances. We study the effect of varying snr on I_g and I_ng under several signal/noise scenarios.
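As a rough illustration of the decomposition (a sketch of ours, not code from the paper), the snippet below estimates the Gaussian term I_g = -(1/2) ln(1 - rho_g^2), where rho_g is the correlation between the Gaussianized variables, for a toy nonlinear channel. The rank-based Gaussianization and the channel itself are assumptions made for this example; the non-Gaussian term I_ng is not computed here, since it would require a full MI or joint-entropy estimator.

```python
import numpy as np
from scipy.stats import norm

def gaussianize(x):
    """Rank-transform a sample and map it through the inverse normal CDF,
    giving it a standard Gaussian marginal (a 'Gaussianized' variable)."""
    ranks = np.argsort(np.argsort(x)) + 1        # ranks 1..n
    return norm.ppf(ranks / (len(x) + 1.0))      # avoid ppf(0) and ppf(1)

def gaussian_mi(x, y):
    """Gaussian MI term I_g = -0.5 * ln(1 - rho_g**2), with rho_g the
    correlation between the Gaussianized variables."""
    rho_g = np.corrcoef(gaussianize(x), gaussianize(y))[0, 1]
    return -0.5 * np.log(1.0 - rho_g ** 2)

# Hypothetical nonlinear channel: output = input + quadratic distortion + noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
y = x + 0.5 * x**2 + 0.5 * rng.standard_normal(x.size)
print(f"I_g ~ {gaussian_mi(x, y):.3f} nats")
# I_ng would be the gap between a full MI estimate (e.g., a k-nearest-
# neighbour estimator) and I_g; it captures the nonlinear dependence.
```

Note that because Gaussianization is a monotone transform of each variable separately, a purely even nonlinearity (e.g., y = x^2 with symmetric x) gives rho_g near zero, so I_g is near zero and essentially all of the dependence is carried by I_ng.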
