Computer Science – Information Theory
Scientific paper
2007-01-08
5 pages, accepted for presentation at the IEEE International Symposium on Information Theory 2007
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) seems to be an exception: available information theoretic proofs of the EPI hinge on integral representations of differential entropy using either Fisher's information (FI) or the minimum mean-square error (MMSE). In this paper, we first present a unified view of the proofs via FI and MMSE, showing that they are essentially dual versions of the same proof, and then fill the gap by providing a new, simple proof of the EPI, which is based solely on the properties of mutual information and sidesteps both the FI and MMSE representations.
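For reference, a minimal LaTeX sketch of the inequality and of the two classical integral representations alluded to above; these are stated as standard textbook facts under assumed normalizations, not in the paper's own notation.

% Minimal sketch (assumed normalizations; notation may differ from the paper)
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Shannon's entropy power inequality for independent random vectors X, Y in R^n:
\[
  N(X+Y) \;\ge\; N(X) + N(Y),
  \qquad
  N(X) \triangleq \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)} .
\]
% De Bruijn's identity links differential entropy to Fisher's information J,
% the basis of the FI route:
\[
  \frac{d}{dt}\, h\!\left(X + \sqrt{t}\, Z\right)
  = \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\, Z\right),
  \qquad Z \sim \mathcal{N}(0, \mathrm{I}) \ \text{independent of } X .
\]
% The I-MMSE relation of Guo, Shamai and Verdu plays the dual role on the MMSE side:
\[
  \frac{d}{d\gamma}\, I\!\left(X;\, \sqrt{\gamma}\, X + Z\right)
  = \tfrac{1}{2}\, \operatorname{mmse}(X,\gamma).
\]
\end{document}

The paper's contribution is a proof that avoids both identities and works directly with mutual information; the block above only records the standard representations being sidestepped.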