Article; Proceedings Paper

Representation of mutual information via input estimates

Journal

IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 53, Issue 2, Pages 453-470

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TIT.2006.889728

Keywords

computation of mutual information; extrinsic information; input estimation; low-density parity-check (LDPC) codes; minimum mean square error (MMSE); mutual information; soft channel decoding


A relationship between information theory and estimation theory was recently shown for the Gaussian channel, relating the derivative of mutual information with the minimum mean-square error. This paper generalizes the link between information theory and estimation theory to arbitrary channels, giving representations of the derivative of mutual information as a function of the conditional marginal input distributions given the outputs. We illustrate the use of this representation in the efficient numerical computation of the mutual information achieved by inputs such as specific codes or natural language.
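The Gaussian-channel result the abstract builds on is the I-MMSE relation, dI/dsnr = mmse(snr)/2. As a minimal illustrative sketch (not code from the paper), the relation can be checked numerically for a scalar Gaussian channel with Gaussian input, where both sides have closed forms:

```python
import math

# Scalar Gaussian channel Y = sqrt(snr)*X + N, with X ~ N(0,1), N ~ N(0,1).
# For Gaussian input (closed forms, in nats):
#   I(snr)    = 0.5 * ln(1 + snr)
#   mmse(snr) = 1 / (1 + snr)
# I-MMSE relation: dI/dsnr = mmse(snr) / 2.

def mutual_info(snr):
    """Mutual information I(X; Y) in nats for Gaussian input."""
    return 0.5 * math.log(1.0 + snr)

def mmse(snr):
    """Minimum mean-square error of estimating X from Y."""
    return 1.0 / (1.0 + snr)

snr, h = 2.0, 1e-6
# Central-difference approximation of dI/dsnr at snr = 2.
numeric_deriv = (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)
print(numeric_deriv, mmse(snr) / 2.0)  # both ~ 1/6
```

For non-Gaussian inputs (e.g. the code-constrained inputs the paper considers) no such closed form exists, which is where the paper's representation via conditional marginal input distributions becomes useful.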

Authors

Daniel P. Palomar; Sergio Verdú
