Compound Poisson Approximation via Information Functionals
Mathematics – Probability
Scientific paper
2010-04-21
Electronic Journal of Probability, Vol 15, Paper no. 42, pages 1344-1369, 2010
27 pages
An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Let $P_{S_n}$ be the distribution of a sum $S_n=\sum_{i=1}^n Y_i$ of independent integer-valued random variables $Y_i$. Nonasymptotic bounds are derived for the distance between $P_{S_n}$ and an appropriately chosen compound Poisson law. In the case where all $Y_i$ have the same conditional distribution given $\{Y_i\neq 0\}$, a bound on the relative entropy distance between $P_{S_n}$ and the compound Poisson distribution is derived, based on the data-processing property of relative entropy and earlier Poisson approximation results. When the $Y_i$ have arbitrary distributions, corresponding bounds are derived in terms of the total variation distance. The main technical ingredient is the introduction of two "information functionals," and the analysis of their properties. These information functionals play a role analogous to that of the classical Fisher information in normal approximation. Detailed comparisons are made between the resulting inequalities and related bounds.
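To make the setting of the abstract concrete, the following Python sketch (not taken from the paper; all function names and the example distribution are illustrative assumptions) compares the exact law of $S_n=\sum_{i=1}^n Y_i$ for i.i.d. integer-valued $Y_i$ with the compound Poisson law $\mathrm{CP}(np, Q)$, where $p=P(Y\neq 0)$ and $Q$ is the conditional law of $Y$ given $\{Y\neq 0\}$, reporting the total variation distance between the two.

# Minimal numerical sketch: law of S_n = Y_1 + ... + Y_n for i.i.d.
# integer-valued Y_i versus the compound Poisson law CP(n*p, Q),
# where p = P(Y != 0) and Q is the law of Y given {Y != 0}.
import numpy as np

def convolve_pmfs(p, q):
    """Convolution of two pmfs supported on {0, 1, 2, ...}."""
    return np.convolve(p, q)

def law_of_sum(y_pmf, n):
    """Exact pmf of S_n for i.i.d. Y_i ~ y_pmf."""
    out = np.array([1.0])          # point mass at 0
    for _ in range(n):
        out = convolve_pmfs(out, y_pmf)
    return out

def compound_poisson_pmf(lam, q_pmf, support_size):
    """pmf of CP(lam, Q) = sum_k e^{-lam} lam^k / k! * Q^{*k}, truncated."""
    out = np.zeros(support_size)
    conv = np.array([1.0])         # Q^{*0} = delta_0
    weight = np.exp(-lam)          # Poisson(lam) weight at k = 0
    k = 0
    while weight > 1e-15 or k <= lam:
        m = min(len(conv), support_size)
        out[:m] += weight * conv[:m]
        k += 1
        weight *= lam / k
        conv = convolve_pmfs(conv, q_pmf)
        if k > 10 * (lam + 10):    # safety cap on the truncation
            break
    return out

def total_variation(p, q):
    """d_TV(P, Q) = (1/2) * sum_x |P(x) - Q(x)|."""
    m = max(len(p), len(q))
    p = np.pad(p, (0, m - len(p)))
    q = np.pad(q, (0, m - len(q)))
    return 0.5 * np.abs(p - q).sum()

# Example: Y = 0 with probability 1 - p, and 1 or 2 with probability p/2 each.
n, p = 50, 0.05
y_pmf = np.array([1 - p, p / 2, p / 2])
q_pmf = np.array([0.0, 0.5, 0.5])   # conditional law of Y given Y != 0

exact = law_of_sum(y_pmf, n)
approx = compound_poisson_pmf(n * p, q_pmf, len(exact))
print("d_TV(P_{S_n}, CP(np, Q)) ~", total_variation(exact, approx))

In this regime (small p, moderate n) the computed total variation distance is small, which is the kind of nonasymptotic closeness the paper's bounds quantify.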
Andrew D. Barbour
Oliver Johnson
Ioannis Kontoyiannis
Mokshay Madiman