Computer Science – Information Theory
Scientific paper
2004-12-23
This paper deals with arbitrarily distributed finite-power input signals observed through an additive Gaussian noise channel. It presents a new formula connecting the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the output: the derivative of the mutual information (in nats) with respect to the signal-to-noise ratio (SNR) equals half the MMSE, regardless of the input statistics. This relationship holds for both scalar and vector signals, and for both discrete-time and continuous-time noncausal MMSE estimation. This fundamental information-theoretic result has an unexpected consequence in continuous-time nonlinear estimation: for any input signal with finite power, the causal filtering MMSE achieved at a given SNR equals the average of the noncausal smoothing MMSE achieved over a channel whose signal-to-noise ratio is uniformly distributed between 0 and that SNR.
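The abstract's two claims can be checked numerically in the one case with simple closed forms: a scalar Gaussian channel with standard Gaussian input, where I(snr) = (1/2) log(1 + snr) nats and mmse(snr) = 1/(1 + snr). A minimal sketch (not from the paper; the function names and the finite-difference/midpoint-rule checks are illustrative choices):

```python
import math

# Closed forms for the scalar Gaussian channel with standard Gaussian input.
def mutual_info(snr):
    return 0.5 * math.log1p(snr)   # I(snr) in nats

def mmse(snr):
    return 1.0 / (1.0 + snr)       # noncausal (smoothing) MMSE

snr, h = 2.0, 1e-6

# Claim 1 (I-MMSE): dI/dsnr = mmse(snr) / 2, via central difference.
deriv = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
assert abs(deriv - mmse(snr) / 2) < 1e-8

# Claim 2: the causal filtering MMSE equals the average of the noncausal
# MMSE over SNRs uniform on [0, snr]; for this input that average is
# log(1 + snr) / snr, approximated here by a midpoint Riemann sum.
n = 100_000
avg_mmse = sum(mmse(snr * (k + 0.5) / n) for k in range(n)) / n
assert abs(avg_mmse - math.log1p(snr) / snr) < 1e-6
```

For non-Gaussian inputs neither side has a closed form, which is what makes the identity useful: it lets mutual information be computed by integrating an estimation-theoretic quantity.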
Dongning Guo
Shlomo Shamai
Sergio Verdú