Feedback Can Double the Prelog of Some Memoryless Gaussian Networks
Computer Science – Information Theory
Scientific paper
2010-03-31
16 pages
We exhibit two memoryless Gaussian networks in which the capacity gains afforded by feedback are unbounded in the signal-to-noise ratio (SNR). The networks are instances of the Gaussian broadcast channel and the two-user Gaussian interference channel. To demonstrate the capacity gains we propose and analyze a novel feedback coding scheme. For the broadcast channel with two receivers it is shown that if the noise sequences at the two receivers are perfectly anticorrelated, then, at high SNR, feedback asymptotically doubles the sum-capacity. The same holds if the noise sequences are perfectly correlated, provided that they have unequal variances. This result extends to the multi-receiver broadcast channel: if the noise sequences are all different and have a rank-one covariance matrix, then, at high SNR, feedback asymptotically multiplies the sum-capacity by the number of receivers. However, as we show, these multiplicative gains collapse when the feedback is noisy. For the two-receiver Gaussian broadcast channel with noise-free feedback we also derive the high-SNR asymptotic sum-capacity. The expansion is exact in the sense that, as the SNR tends to infinity, the difference between the sum-capacity and our asymptotic expression tends to zero. If the noise sequences are perfectly anticorrelated, or if they are perfectly correlated and of unequal variances, then the asymptotic expansion is the same as if the transmitter communicated to the two receivers over two parallel Gaussian channels. Otherwise, the asymptotic expansion is the same as if the receivers could cooperate. For the two-user interference channel it is shown that if the noises experienced by the two receivers are perfectly correlated or perfectly anticorrelated, then for most channel gains feedback doubles the high-SNR sum-capacity.
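To make the "doubling" concrete, the following is a schematic high-SNR comparison; the notation (transmit power P, receiver noise variances N_1 and N_2) and the no-feedback baseline are standard assumptions stated here for illustration, not expressions quoted from the paper:

\begin{align*}
  % Schematic only; exact conditions and constants are given in the paper.
  % P: transmit power, N_1, N_2: receiver noise variances.
  C_{\mathrm{sum}}^{\mathrm{no\text{-}fb}}(P)
    &= \tfrac{1}{2}\log\!\Bigl(1 + \tfrac{P}{\min(N_1, N_2)}\Bigr)
    \quad\text{(prelog } \tfrac{1}{2}\text{)},\\
  C_{\mathrm{sum}}^{\mathrm{fb}}(P)
    &\approx \tfrac{1}{2}\log\!\Bigl(1 + \tfrac{P}{N_1}\Bigr)
      + \tfrac{1}{2}\log\!\Bigl(1 + \tfrac{P}{N_2}\Bigr)
    \quad\text{(prelog } 1\text{)}.
\end{align*}

The first line is the well-known no-feedback sum-capacity of the scalar Gaussian broadcast channel (the capacity to the stronger receiver); the second is the "two parallel Gaussian channels" expansion referred to in the abstract, with the approximation holding up to terms that vanish as P tends to infinity. The feedback expression has twice the prelog of the no-feedback one, which is the sense in which feedback doubles the high-SNR sum-capacity.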
Gastpar Michael
Lapidoth Amos
Steinberg Yossef
Wigger Michele