Monotonic Convergence in an Information-Theoretic Law of Small Numbers
Computer Science – Information Theory
Scientific paper
2008-10-29
IEEE Transactions on Information Theory 55 (2009) 5412--5422
minor changes; references added
10.1109/TIT.2009.2032727
An "entropy increasing to the maximum" result analogous to the entropic central limit theorem (Barron 1986; Artstein et al. 2004) is obtained in the discrete setting. This involves the thinning operation and a Poisson limit. Monotonic convergence in relative entropy is established for general discrete distributions, while monotonic increase of Shannon entropy is proved for the special class of ultra-log-concave distributions. Overall we extend the parallel between the information-theoretic central limit theorem and law of small numbers explored by Kontoyiannis et al. (2005) and Harremo\"es et al.\ (2007, 2008). Ingredients in the proofs include convexity, majorization, and stochastic orders.