Computer Science – Information Theory
Scientific paper
2010-01-22
This paper has been withdrawn by the author due to significant modifications.
Provable lower bounds are presented for the information rate I(X; X+S+N), where X is a symbol drawn from a fixed, finite-size alphabet, S is a discrete-valued random variable (RV), and N is a Gaussian RV. The information rate I(X; X+S+N) serves as a tight lower bound on the capacity of intersymbol interference (ISI) channels corrupted by Gaussian noise. The new bounds can be calculated with a reasonable computational load and achieve a level of tightness comparable to the well-known conjectured lower bound of Shamai and Laroia over a wide range of finite-ISI channels of practical interest. Computing the presented bounds requires evaluating the magnitude sum of the precursor ISI terms, as well as identifying the dominant terms among them, as seen at the output of the minimum mean-squared error (MMSE) decision feedback equalizer (DFE).
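The quantity I(X; X+S+N) can be evaluated numerically once the distributions of X, S, and N are fixed, since Y = X+S+N has a Gaussian-mixture density. As a rough illustration of the kind of computation involved (a minimal sketch with a toy setup of my own choosing, not the bounds or channel parameters from the paper): BPSK input X, a small discrete residual-ISI variable S, and Gaussian N, with I(X; Y) estimated by Monte Carlo averaging of log2 p(y|x)/p(y).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy parameters (not from the paper):
# BPSK input X, small discrete residual-ISI term S, Gaussian noise N.
X_vals = np.array([-1.0, 1.0]);        X_p = np.array([0.5, 0.5])
S_vals = np.array([-0.2, 0.0, 0.2]);   S_p = np.array([0.25, 0.5, 0.25])
sigma = 0.5  # noise standard deviation

def gauss(y, mu, sigma):
    """Gaussian density with mean mu and std sigma, evaluated at y."""
    return np.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

def mi_monte_carlo(n=200_000):
    """Monte Carlo estimate of I(X; X+S+N) in bits per symbol."""
    x = rng.choice(X_vals, size=n, p=X_p)
    s = rng.choice(S_vals, size=n, p=S_p)
    y = x + s + sigma * rng.standard_normal(n)
    # p(y|x): Gaussian mixture over S; p(y): mixture over both X and S.
    p_y_given_x = sum(ps * gauss(y, x + sv, sigma)
                      for sv, ps in zip(S_vals, S_p))
    p_y = sum(px * ps * gauss(y, xv + sv, sigma)
              for xv, px in zip(X_vals, X_p)
              for sv, ps in zip(S_vals, S_p))
    # I(X;Y) = E[ log2 p(Y|X) / p(Y) ]
    return float(np.mean(np.log2(p_y_given_x / p_y)))

print(f"I(X; X+S+N) ≈ {mi_monte_carlo():.3f} bits/symbol")
```

For binary X the estimate is bounded by 1 bit per symbol; the practical difficulty the paper addresses is that for real ISI channels the precursor-ISI term S at the MMSE-DFE output has a large (or unbounded) support, which is why bounding rather than direct evaluation is needed.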
Seongwook Jeong
Jaekyun Moon