Journal
COMMUNICATIONS IN STATISTICS-THEORY AND METHODS
Volume 51, Issue 13, Pages 4358-4369
Publisher
TAYLOR & FRANCIS INC
DOI: 10.1080/03610926.2020.1813305
Keywords
Additive noise channels; entropy power inequality; Fisher information; differential entropy
Funding
- Graduate office of the University of Isfahan
The entropy power inequality (EPI) for the convolution of two independent random variables was first proposed by Shannon (1948). In practice, however, there are many situations in which the random variables involved are not independent. In this article, considering additive noise channels, it is shown that, under some conditions, the EPI holds even when the random variables involved are dependent. On the way to the main result, a lower bound for the Fisher information of the output signal is obtained, which is useful in its own right. An example is also provided to illustrate the result.
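For context, the classical EPI referred to in the abstract can be stated as follows (a standard formulation, not taken from the article itself): for independent random variables $X$ and $Y$ with densities,

```latex
% Classical entropy power inequality (Shannon, 1948),
% for independent X and Y with differential entropies h(X), h(Y):
e^{2 h(X+Y)} \;\geq\; e^{2 h(X)} + e^{2 h(Y)},
% equivalently, in terms of the entropy power
% N(X) = \frac{1}{2\pi e}\, e^{2 h(X)}:
N(X+Y) \;\geq\; N(X) + N(Y),
% with equality iff X and Y are Gaussian with proportional variances.
```

The article's contribution is to identify conditions under which this inequality remains valid when $X$ and $Y$ are dependent, as in an additive noise channel whose noise is correlated with the input.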