Computer Science – Information Theory
Scientific paper
2006-05-11
IEEE Transactions on Information Theory, Vol. 53(7), pp. 2317-2329, July 2007
13 pages. Many minor modifications from the first version, plus a section on refined results. This is almost but not exactly identical to the published version.
New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of $n$ independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of i.i.d. summands and in the more general setting of independent summands with variance-standardized sums.
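As a rough illustration of the type of inequality the abstract describes (a sketch only; the precise hypotheses, coefficients, and the general fractional-covering form are as stated in the paper itself): for independent random variables $X_1,\dots,X_n$ with well-defined differential entropies $h(\cdot)$, and a collection $\mathcal{C}$ of subsets of $\{1,\dots,n\}$ in which each index appears in at most $r$ of the sets, a subset-sum entropy power inequality takes the form

```latex
% Subset-sum entropy power inequality (sketch; coefficients as in the paper):
e^{2h(X_1+\cdots+X_n)} \;\ge\; \frac{1}{r}\sum_{\mathsf{s}\in\mathcal{C}}
  e^{2h\left(\sum_{i\in\mathsf{s}} X_i\right)}.

% Specializing to the leave-one-out collection
% \mathcal{C} = \{\{1,\dots,n\}\setminus\{j\} : 1\le j\le n\}, where r = n-1,
% yields monotonicity of entropy along the CLT for i.i.d. summands:
h\!\left(\frac{X_1+\cdots+X_n}{\sqrt{n}}\right) \;\ge\;
h\!\left(\frac{X_1+\cdots+X_{n-1}}{\sqrt{n-1}}\right).
```

The second display is the "monotonicity of information" statement: the entropy of the standardized sum can only increase as more i.i.d. summands are added, approaching the Gaussian maximum.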
Andrew Barron
Mokshay Madiman
Generalized Entropy Power Inequalities and Monotonicity Properties of Information