Entropy power inequality for a family of discrete random variables
Computer Science – Information Theory
Scientific paper
2010-12-02
18 pages, 1 figure
It is known that the Entropy Power Inequality (EPI) always holds for independent random variables that have densities. Not much work has been done to identify discrete distributions for which the inequality holds when the differential entropy is replaced by the discrete entropy. Harremoës and Vignat showed that it holds for the pair (B(m,p), B(n,p)), m, n \in \mathbb{N} (where B(n,p) is a Binomial distribution with n trials, each with success probability p), for p = 0.5. In this paper, we considerably expand the set of Binomial distributions for which the inequality holds and, in particular, identify n_0(p) such that for all m, n \geq n_0(p), the EPI holds for (B(m,p), B(n,p)). We further show that the EPI holds for discrete random variables that can be expressed as the sum of n independent and identically distributed (IID) discrete random variables, for n sufficiently large.
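The claim in the abstract can be illustrated numerically. Below is a minimal sketch (not from the paper) that assumes the discrete EPI takes the usual form e^{2H(X+Y)} \geq e^{2H(X)} + e^{2H(Y)}, with H the Shannon entropy in nats, and uses the fact that the sum of independent B(m,p) and B(n,p) variables is B(m+n,p). The function names and the particular values of m, n, p are illustrative choices, not taken from the paper.

```python
# Minimal numerical check of the discrete EPI
#   e^{2H(X+Y)} >= e^{2H(X)} + e^{2H(Y)}
# for X ~ B(m,p), Y ~ B(n,p) independent, so that X + Y ~ B(m+n,p).
# H denotes the discrete (Shannon) entropy in nats.

from math import comb, exp, log


def binomial_entropy(n: int, p: float) -> float:
    """Discrete entropy H(B(n,p)) in nats."""
    h = 0.0
    for k in range(n + 1):
        pk = comb(n, k) * p**k * (1 - p) ** (n - k)
        if pk > 0:
            h -= pk * log(pk)
    return h


def discrete_epi_holds(m: int, n: int, p: float) -> bool:
    """Check e^{2H(B(m+n,p))} >= e^{2H(B(m,p))} + e^{2H(B(n,p))}."""
    return exp(2 * binomial_entropy(m + n, p)) >= (
        exp(2 * binomial_entropy(m, p)) + exp(2 * binomial_entropy(n, p))
    )


if __name__ == "__main__":
    # p = 0.5 corresponds to the Harremoes-Vignat case; other values of p
    # probe pairs from the larger family considered in the paper.
    for p in (0.5, 0.3, 0.1):
        for m, n in ((1, 1), (5, 7), (20, 30)):
            print(f"p={p}, m={m}, n={n}: EPI holds -> {discrete_epi_holds(m, n, p)}")
```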
Smarajit Das
Siddharth Muthukrishnan
Naresh Sharma