On the Decrease Rate of the Non-Gaussianness of the Sum of Independent Random Variables

Computer Science – Information Theory

Scientific paper

Details

Submitted to the IEEE Transactions on Information Theory

Several proofs have been published of the monotonicity of the non-Gaussianness (the divergence with respect to a Gaussian random variable with identical second-order statistics) of the sum of n independent and identically distributed (i.i.d.) random variables. We give an upper bound on the decrease rate of the non-Gaussianness that is proportional to the inverse of n, for large n. The proof is based on the relationship between non-Gaussianness and the minimum mean-square error (MMSE) and causal minimum mean-square error (CMMSE) in the continuous-time Gaussian channel.
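To make the abstract's quantities concrete, the following sketch fixes one possible notation; the zero-mean assumption, the normalization of the sum, the symbols D_n, mmse, cmmse, and the constant C are assumptions made here for illustration, not taken from the paper.

% Assumed setup: X_1, X_2, ... i.i.d., zero mean, variance sigma^2,
% S_n the normalized partial sum (divergence is invariant to this scaling).
Let $S_n = n^{-1/2}\sum_{i=1}^{n} X_i$ and define the non-Gaussianness as the divergence to the Gaussian law with the same second-order statistics,
\[
  D_n \;\triangleq\; D\!\bigl(P_{S_n}\,\big\|\,\mathcal{N}(0,\sigma^2)\bigr).
\]
On one natural reading, the stated upper bound on the decrease rate takes the form
\[
  D_n - D_{n+1} \;\le\; \frac{C}{n} \qquad \text{for large } n,
\]
with $C$ a distribution-dependent constant. The proof route named in the abstract rests on the I-MMSE relation of Guo, Shamai and Verd\'u and on Duncan's causal-MMSE identity for the Gaussian channel, which read
\[
  \frac{d}{d\gamma}\, I\!\bigl(X;\sqrt{\gamma}\,X+N\bigr) \;=\; \tfrac{1}{2}\,\mathrm{mmse}(\gamma),
  \qquad
  I\!\bigl(X_0^T; Y_0^T\bigr) \;=\; \tfrac{1}{2}\int_0^T \mathrm{cmmse}(t)\,dt ,
\]
where $N\sim\mathcal{N}(0,1)$ is independent noise and $Y_0^T$ is the output of the continuous-time Gaussian channel driven by $X_0^T$. Comparing the non-causal MMSE and the causal CMMSE through these identities is what links the divergence to the estimation-theoretic quantities mentioned in the abstract.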

Profile ID: LFWR-SCP-O-609489
