Computer Science – Information Theory
Scientific paper
2010-04-27
5 pages, 1 figure, submitted to ITW 2010
Over binary input channels, the uniform distribution is a universal prior, in the sense that it maximizes the worst-case mutual information over all binary input channels, guaranteeing at least 94.2% of the capacity. In this paper, we address a similar question, but with respect to a universal generalized linear decoder. We look for the best collection of finitely many a posteriori metrics that maximizes the worst-case mismatched mutual information achieved by decoding with these metrics (instead of with an optimal decoder, such as the Maximum Likelihood (ML) decoder tuned to the true channel). It is shown that for binary input and output channels, two metrics suffice to achieve the same performance as an optimal decoder. In particular, this implies that there exists a generalized linear decoder which achieves at least 94.2% of the compound capacity on any compound set, without knowledge of the underlying set.
Emmanuel Abbe
Rethnakaran Pulikkoonattu
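
As a rough numerical illustration of the 94.2% figure quoted in the abstract (the constant is usually identified with e·ln 2/2 ≈ 0.9421), the Python/NumPy sketch below sweeps binary-input, binary-output channels, compares the uniform-input mutual information against the capacity found by a grid search over input priors, and reports the worst ratio encountered. The channel parameterization, grid resolutions, and function names are illustrative choices of ours, not anything specified in the paper.

import numpy as np

def mutual_info(p, W):
    # I(X;Y) in bits for input prior p (length 2) and channel matrix W,
    # where W[x, y] = P(Y = y | X = x).
    joint = p[:, None] * W                    # P(X = x, Y = y)
    py = joint.sum(axis=0)                    # marginal P(Y = y)
    mask = joint > 0
    indep = (p[:, None] * py[None, :])[mask]  # P(x) P(y) on the support
    return float((joint[mask] * np.log2(joint[mask] / indep)).sum())

def capacity(W, grid=201):
    # Capacity of a binary-input channel via a grid search over the input prior.
    qs = np.linspace(0.0, 1.0, grid)
    return max(mutual_info(np.array([q, 1.0 - q]), W) for q in qs)

uniform = np.array([0.5, 0.5])
worst = 1.0
# Sweep binary-input, binary-output channels W = [[1-a, a], [b, 1-b]];
# a (resp. b) is the crossover probability for input 0 (resp. input 1).
for a in np.linspace(0.0, 0.99, 34):
    for b in np.linspace(0.0, 0.99, 34):
        W = np.array([[1.0 - a, a], [b, 1.0 - b]])
        C = capacity(W)
        if C > 1e-6:  # skip (near-)degenerate channels with identical rows
            worst = min(worst, mutual_info(uniform, W) / C)

# The ratio stays above roughly 0.942 and gets closest to it at the very noisy
# Z-channel-like corners (a near 0 and b near 1, or vice versa).
print(f"worst-case I_uniform / C over the sweep: {worst:.4f}")

The sweep is only a sanity check under these assumptions: it approaches, but does not exactly attain, the worst case, since the extremal Z-channels lie at the boundary of the parameter grid.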