Generalized Entropy Power Inequalities and Monotonicity Properties of Information

Computer Science – Information Theory

Scientific paper


Details

13 pages. Many minor modifications from first version, plus a section on refined results. This is almost but not exactly ident…

New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of $n$ independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of i.i.d. summands and in the more general setting of independent summands with variance-standardized sums.
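For context, an illustrative sketch of the kind of inequality the abstract describes (notation assumed here, following standard conventions; not quoted from the paper). The classical entropy power inequality of Shannon and Stam states that for independent random variables $X$ and $Y$ with densities, writing $h$ for differential entropy and $N(X) = \frac{1}{2\pi e}\, e^{2h(X)}$ for the entropy power,

```latex
% Classical entropy power inequality (Shannon--Stam):
N(X + Y) \;\ge\; N(X) + N(Y).

% Subset-sum generalizations take the following shape: if
% $\mathcal{C}$ is a collection of subsets of $\{1,\dots,n\}$
% in which each index $i$ appears in at most $r$ sets, then
N\!\left(\sum_{i=1}^{n} X_i\right)
  \;\ge\; \frac{1}{r} \sum_{s \in \mathcal{C}}
  N\!\left(\sum_{i \in s} X_i\right).

% Monotonicity in the CLT follows by taking $\mathcal{C}$ to be the
% $n$ leave-one-out subsets, so that $r = n-1$. For i.i.d. summands
% with $S_n = X_1 + \cdots + X_n$ this gives
% $N(S_n) \ge \tfrac{n}{n-1} N(S_{n-1})$, and since $N(aX) = a^2 N(X)$,
h\!\left(\frac{S_n}{\sqrt{n}}\right)
  \;\ge\; h\!\left(\frac{S_{n-1}}{\sqrt{n-1}}\right).
```

That is, the entropy of the standardized sums is non-decreasing in $n$, which is the monotonicity property referred to in the abstract.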
