Computer Science – Information Theory
Scientific paper
2010-02-03
Submitted to the IEEE Transactions on Information Theory
The minimum mean square error (MMSE) of estimating a non-Gaussian signal observed at the output of an additive white Gaussian noise channel is analyzed. First, a fairly general time-continuous channel model is assumed, for which the behavior of the non-Gaussianness of the channel's output for small signal-to-noise ratio q is established. Then, the channel input is assumed to be a (normalized) sum of N narrowband, mutually independent waves. It is shown that as N goes to infinity, for any fixed q (no matter how large), both the CMMSE and the MMSE converge to the signal energy at a rate proportional to 1/N. Finally, a known result for the MMSE in the one-dimensional case for small q is used to show that the first four terms in the Taylor expansion of the non-Gaussianness of the channel's output are equal to zero.
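As an illustrative sketch (not taken from the paper), the scalar version of this setup can be simulated numerically: a non-Gaussian input X is observed through an AWGN channel Y = sqrt(q)·X + Z, and the MMSE is estimated by Monte Carlo. A binary ±1 input is used here purely as an assumed example of a non-Gaussian signal, since its conditional-mean estimator has the known closed form E[X | Y = y] = tanh(sqrt(q)·y).

```python
import numpy as np

def mmse_binary(q, n=200_000, seed=0):
    """Monte Carlo estimate of the MMSE for a unit-energy binary
    input X in {-1, +1} observed as Y = sqrt(q)*X + Z, Z ~ N(0, 1).

    Illustrative sketch only; the paper treats a more general
    time-continuous channel model.
    """
    rng = np.random.default_rng(seed)
    x = rng.choice([-1.0, 1.0], size=n)   # non-Gaussian, unit-energy input
    z = rng.standard_normal(n)            # white Gaussian noise sample
    y = np.sqrt(q) * x + z                # channel output
    x_hat = np.tanh(np.sqrt(q) * y)       # conditional mean E[X | Y]
    return np.mean((x - x_hat) ** 2)      # empirical MMSE

# The MMSE decreases from the signal energy (here 1) toward 0 as the
# signal-to-noise ratio q grows.
for q in (0.1, 1.0, 10.0):
    print(q, mmse_binary(q))
```

At small q the estimator can extract little information and the MMSE stays near the signal energy; at large q it approaches zero, in line with the small-q and convergence behaviors the abstract discusses.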
Title: Some Relations between Divergence Derivatives and Estimation in Gaussian channels