Sharp Bounds on the Entropy of the Poisson Law and Related Quantities
Computer Science – Information Theory
Scientific paper
2010-01-17
IEEE Trans. Inform. Theory 56 (2010) 2299-2306
DOI: 10.1109/TIT.2010.2044057
One of the difficulties in calculating the capacity of certain Poisson channels is that H(λ), the entropy of the Poisson distribution with mean λ, is not available in a simple form. In this work we derive upper and lower bounds for H(λ) that are asymptotically tight and easy to compute. The derivation of such bounds involves only simple probabilistic and analytic tools. This complements the asymptotic expansions of Knessl (1998), Jacquet and Szpankowski (1999), and Flajolet (1999). The same method yields tight bounds on the relative entropy D(n, p) between a binomial and a Poisson distribution, thus refining the work of Harremoës and Ruzankin (2004). Bounds on the entropy of the binomial also follow easily.
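
As a quick numerical companion to the quantities named in the abstract (this sketch is not taken from the paper), the Python snippet below computes H(λ) in nats by summing -p_k log p_k over a truncated range, compares it with the familiar Gaussian leading term (1/2) log(2πeλ) of the large-λ expansion, and evaluates D(Binomial(n, p) || Poisson(np)) by direct summation over k = 0, ..., n. The function names, the truncation cutoff, and the example parameters are ad hoc choices made here for illustration.

import math

def poisson_entropy(lam):
    """Entropy (in nats) of Poisson(lam), by summing -p_k * log(p_k) over a truncated range."""
    # Truncate well beyond the mean; the neglected tail mass is negligible for moderate lam.
    kmax = int(lam + 20 * math.sqrt(lam) + 25)
    log_p = -lam                      # log P(X = 0)
    h = -math.exp(log_p) * log_p
    for k in range(1, kmax + 1):
        log_p += math.log(lam / k)    # P(X = k) = P(X = k - 1) * lam / k
        h -= math.exp(log_p) * log_p
    return h

def gaussian_leading_term(lam):
    """Leading term (1/2) * log(2 * pi * e * lam) of the large-lam expansion of H(lam)."""
    return 0.5 * math.log(2.0 * math.pi * math.e * lam)

def kl_binomial_poisson(n, p):
    """D(Binomial(n, p) || Poisson(n * p)) in nats, by summing over k = 0..n."""
    lam = n * p
    d = 0.0
    for k in range(n + 1):
        log_bin = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                   + k * math.log(p) + (n - k) * math.log1p(-p))
        log_poi = -lam + k * math.log(lam) - math.lgamma(k + 1)
        d += math.exp(log_bin) * (log_bin - log_poi)
    return d

for lam in (1.0, 10.0, 100.0):
    print(lam, poisson_entropy(lam), gaussian_leading_term(lam))
print(kl_binomial_poisson(100, 0.05))

Comparing poisson_entropy with gaussian_leading_term shows the gap shrinking as λ grows, which is the large-λ regime that the paper's asymptotically tight bounds address.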
Authors: Jose A. Adell, Alberto Lekuona, Yaming Yu