Journal
IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 53, Issue 2, Pages 453-470
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2006.889728
Keywords
computation of mutual information; extrinsic information; input estimation; low-density parity-check (LDPC) codes; minimum mean square error (MMSE); mutual information; soft channel decoding
A relationship between information theory and estimation theory was recently shown for the Gaussian channel, relating the derivative of mutual information with the minimum mean-square error. This paper generalizes the link between information theory and estimation theory to arbitrary channels, giving representations of the derivative of mutual information as a function of the conditional marginal input distributions given the outputs. We illustrate the use of this representation in the efficient numerical computation of the mutual information achieved by inputs such as specific codes or natural language.
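The Gaussian-channel result that the abstract generalizes states that the derivative of the mutual information with respect to SNR equals half the minimum mean-square error of estimating the input from the output. A minimal numerical sketch of that relation (not the paper's general-channel representation) for a BPSK input over a scalar Gaussian channel, where the conditional-mean estimator is E[X|Y=y] = tanh(sqrt(snr)·y); all function names and parameters below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.choice([-1.0, 1.0], size=n)   # equiprobable BPSK input
noise = rng.standard_normal(n)        # unit-variance Gaussian noise

def mutual_info(snr):
    """Monte Carlo estimate of I(X;Y) in nats for Y = sqrt(snr)*X + N."""
    y = np.sqrt(snr) * x + noise
    lp_pos = -(y - np.sqrt(snr)) ** 2 / 2          # log p(y|x=+1) up to a constant
    lp_neg = -(y + np.sqrt(snr)) ** 2 / 2          # log p(y|x=-1) up to a constant
    lp_given_x = np.where(x > 0, lp_pos, lp_neg)
    lp_marginal = np.logaddexp(lp_pos, lp_neg) - np.log(2.0)
    return np.mean(lp_given_x - lp_marginal)

def mmse(snr):
    """Monte Carlo estimate of E[(X - E[X|Y])^2] with E[X|Y=y] = tanh(sqrt(snr)*y)."""
    y = np.sqrt(snr) * x + noise
    return np.mean((x - np.tanh(np.sqrt(snr) * y)) ** 2)

# Finite-difference derivative of mutual information vs. half the MMSE;
# reusing the same samples (common random numbers) keeps the difference low-variance.
snr, h = 1.0, 0.05
d_mi = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
print(f"dI/dsnr ~ {d_mi:.4f},  MMSE/2 ~ {mmse(snr) / 2:.4f}")
```

With a few hundred thousand samples the two printed quantities agree to within Monte Carlo and finite-difference error, illustrating the I-MMSE identity that the paper extends beyond the Gaussian channel.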