Computer Science – Information Theory
Scientific paper
2006-12-17
Submitted to the IEEE Transactions on Information Theory
Several proofs of the monotonicity of the non-Gaussianness (the divergence with respect to a Gaussian random variable with the same second-order statistics) of the sum of n independent and identically distributed (i.i.d.) random variables have been published. We give an upper bound on the rate of decrease of the non-Gaussianness that is proportional to 1/n for large n. The proof is based on the relationship between the non-Gaussianness and the minimum mean-square error (MMSE) and the causal minimum mean-square error (CMMSE) in the time-continuous Gaussian channel.
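The claimed behavior, that the non-Gaussianness D(S_n ‖ G) of the sum of n i.i.d. variables decreases monotonically at a rate bounded by a constant times 1/n, can be illustrated numerically. The sketch below is only an illustration, not the paper's MMSE-based proof: it takes Uniform(0,1) summands, whose sum has the closed-form Irwin–Hall density, and computes the divergence to a Gaussian with matching mean and variance by direct numerical integration.

```python
import math

def irwin_hall_pdf(x, n):
    # Density of the sum of n i.i.d. Uniform(0,1) variables (Irwin-Hall):
    # f(x) = 1/(n-1)! * sum_k (-1)^k C(n,k) (x-k)^(n-1), 0 < x < n.
    if x <= 0.0 or x >= n:
        return 0.0
    s = 0.0
    for k in range(int(math.floor(x)) + 1):
        s += (-1) ** k * math.comb(n, k) * (x - k) ** (n - 1)
    return s / math.factorial(n - 1)

def non_gaussianness(n, grid=20000):
    # D(f_n || g): divergence of the sum's density f_n from the Gaussian g
    # with identical second-order statistics (mean n/2, variance n/12),
    # approximated by a Riemann sum over (0, n).
    mu, var = n / 2.0, n / 12.0
    dx = n / grid
    d = 0.0
    for i in range(1, grid):
        x = i * dx
        f = irwin_hall_pdf(x, n)
        if f > 0.0:
            g = math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)
            d += f * math.log(f / g) * dx
    return d

if __name__ == "__main__":
    # The divergence shrinks monotonically as more i.i.d. terms are summed.
    for n in (2, 4, 8):
        print(n, non_gaussianness(n))
```

Note that for the uniform distribution the third cumulant vanishes, so the observed decay is in fact faster than the general O(1/n) upper bound; the monotone decrease itself is what the printed values exhibit.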
On the Decrease Rate of the Non-Gaussianness of the Sum of Independent Random Variables