Computer Science – Information Theory
Scientific paper
2009-04-08
IEEE International Symposium on Information Theory (ISIT), June 28 – July 3, 2009, pp. 144–148
To be presented at ISIT09
10.1109/ISIT.2009.5205880
Building on the recent work of Johnson (2007) and Yu (2008), we prove that entropy is a concave function with respect to the thinning operation T_a. That is, if X and Y are independent random variables on Z_+ with ultra-log-concave probability mass functions, then H(T_a X + T_{1-a} Y) >= a H(X) + (1-a) H(Y), 0 <= a <= 1, where H denotes the discrete entropy. This is a discrete analogue of the inequality h(sqrt(a) X + sqrt(1-a) Y) >= a h(X) + (1-a) h(Y), 0 <= a <= 1 (h denotes the differential entropy), which holds for continuous X and Y with finite variances and is equivalent to Shannon's entropy power inequality. As a consequence we establish a special case of a conjecture of Shepp and Olkin (1981).
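The inequality in the abstract can be checked numerically. The sketch below (my own illustration, not code from the paper) implements binomial thinning T_a, which keeps each of the n units of a mass point at n independently with probability a, and verifies H(T_a X + T_{1-a} Y) >= a H(X) + (1-a) H(Y) for two truncated Poisson distributions, which are ultra-log-concave; the truncation point N = 40 is an assumption chosen so the neglected tail mass is negligible.

```python
import math

def thin(pmf, a):
    # Binomial thinning T_a: a mass point at n is spread over k = 0..n
    # with Binomial(n, a) weights.
    out = [0.0] * len(pmf)
    for n, p in enumerate(pmf):
        for k in range(n + 1):
            out[k] += p * math.comb(n, k) * a**k * (1 - a)**(n - k)
    return out

def convolve(p, q):
    # pmf of the sum of two independent Z_+-valued random variables
    out = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            out[i + j] += pi * qj
    return out

def entropy(pmf):
    # discrete (Shannon) entropy in nats
    return -sum(p * math.log(p) for p in pmf if p > 0)

def poisson(lam, N=40):
    # truncated Poisson pmf; Poisson laws are ultra-log-concave
    return [math.exp(-lam) * lam**k / math.factorial(k) for k in range(N)]

a = 0.3
X = poisson(2.0)
Y = poisson(5.0)
lhs = entropy(convolve(thin(X, a), thin(Y, 1 - a)))
rhs = a * entropy(X) + (1 - a) * entropy(Y)
print(lhs >= rhs)  # the concavity inequality holds for this example
```

Thinning a Poisson(lam) variable by a gives (up to truncation) Poisson(a*lam), so here the left side is approximately the entropy of Poisson(0.3*2 + 0.7*5) = Poisson(4.1).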
Oliver Johnson
Yaming Yu