Monotonicity, thinning and discrete versions of the Entropy Power Inequality
Computer Science – Information Theory
Scientific paper
2009-09-03
IEEE Transactions on Information Theory, Vol. 56, No. 11, 2010, pp. 5387–5395
9 pages (revised to take account of referees' comments)
DOI: 10.1109/TIT.2010.2070570
We consider the entropy of sums of independent discrete random variables, in analogy with Shannon's Entropy Power Inequality, where equality holds for normals. In our case, infinite divisibility suggests that equality should hold for Poisson variables. We show that some natural analogues of the Entropy Power Inequality do not in fact hold, but propose an alternative formulation which does always hold. The key to many proofs of Shannon's Entropy Power Inequality is the behaviour of entropy on scaling of continuous random variables. We believe that Rényi's operation of thinning discrete random variables plays a similar role to scaling, and give a sharp bound on how the entropy of ultra log-concave random variables behaves on thinning. In the spirit of the monotonicity results established by Artstein, Ball, Barthe and Naor, we prove a stronger version of concavity of entropy, which implies a strengthened form of our discrete Entropy Power Inequality.
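For context, the classical ingredients named in the abstract can be sketched as follows. These are the standard textbook formulations, stated here only for orientation; the paper's own statements may differ in detail. Here $h$ denotes differential entropy, $\alpha \in [0,1]$ the thinning parameter, and $T_\alpha$ the thinning map.

% Shannon's Entropy Power Inequality: for independent real-valued random
% variables X and Y with differential entropies h(X) and h(Y),
\[
  e^{2h(X+Y)} \;\geq\; e^{2h(X)} + e^{2h(Y)},
\]
% with equality exactly when X and Y are Gaussian.

% Rényi's alpha-thinning of a random variable X on {0, 1, 2, ...}:
% each of the X unit "points" is retained independently with probability alpha,
\[
  T_\alpha X \;=\; \sum_{i=1}^{X} B_i, \qquad
  B_1, B_2, \ldots \ \text{i.i.d. } \mathrm{Bernoulli}(\alpha),
  \ \text{independent of } X.
\]

% Ultra log-concavity of a mass function p on {0, 1, 2, ...} (the class for
% which the abstract's thinning bound is stated): for all x >= 1,
\[
  x\, p(x)^2 \;\geq\; (x+1)\, p(x+1)\, p(x-1),
\]
% equivalently, p(x)/\pi_\lambda(x) is log-concave, where \pi_\lambda is a
% Poisson(\lambda) mass function.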
Oliver Johnson
Yaming Yu