On an Improvement over Rényi's Equivocation Bound
Computer Science – Information Theory
Scientific paper
2006-08-22
8 pages, 6 figures. To be presented at the 44th Annual Allerton Conference on Communication, Control, and Computing, September 2006.
We consider the problem of estimating the probability of error in multi-hypothesis testing when the MAP criterion is used. This probability, also known as the Bayes risk, is an important measure in many communication and information theory problems. In general, the exact Bayes risk can be difficult to obtain, and many upper and lower bounds are known in the literature. One such upper bound is the equivocation bound due to Rényi, which is of great philosophical interest because it connects the Bayes risk to the conditional entropy. Here we give a simple derivation of an improved equivocation bound. We then give some typical examples of problems where these bounds can be of use. We first consider a binary hypothesis testing problem for which the exact Bayes risk is difficult to derive; in such problems, bounds are of interest. Furthermore, using the bounds on the Bayes risk derived in the paper and a random coding argument, we prove a lower bound on the equivocation that is valid for most random codes over memoryless channels.
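As a numerical illustration of the quantities discussed in the abstract (not taken from the paper itself), the Python sketch below computes the exact MAP error probability (Bayes risk) and the equivocation H(X|Y) for a discrete joint distribution, and checks an equivocation-type upper bound of the form P_e <= 1 - 2^(-H(X|Y)). The specific form of the bound, the function name, and the example distribution are assumptions made here for illustration only.

```python
import numpy as np

def map_error_and_equivocation(p_xy):
    """Hypothetical helper (not from the paper).

    p_xy: 2-D array of joint probabilities, rows = hypotheses X,
          columns = observations Y. Returns the exact MAP error
          probability and the conditional entropy H(X|Y) in bits.
    """
    p_xy = np.asarray(p_xy, dtype=float)
    p_y = p_xy.sum(axis=0)                      # marginal of the observation Y

    # MAP decoding picks argmax_x p(x|y); its error probability is
    # P_e = 1 - sum_y max_x p(x, y).
    p_error = 1.0 - p_xy.max(axis=0).sum()

    # Equivocation H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y), with the
    # convention 0 log 0 = 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        p_x_given_y = np.where(p_y > 0, p_xy / p_y, 0.0)
        log_terms = np.where(p_xy > 0, np.log2(p_x_given_y), 0.0)
    h_x_given_y = -(p_xy * log_terms).sum()
    return p_error, h_x_given_y

# Example: a random 3-hypothesis problem with 4 possible observations.
rng = np.random.default_rng(0)
p_xy = rng.random((3, 4))
p_xy /= p_xy.sum()

pe, hxy = map_error_and_equivocation(p_xy)
print(f"MAP error (Bayes risk):   {pe:.4f}")
print(f"Equivocation H(X|Y):      {hxy:.4f} bits")
# Assumed equivocation-type bound; expected to be >= the Bayes risk.
print(f"Assumed bound 1 - 2^-H:   {1 - 2 ** (-hxy):.4f}")
```

For an uninformative channel with M equiprobable hypotheses, H(X|Y) = log2 M and the quantity 1 - 2^(-H(X|Y)) equals 1 - 1/M, which matches the MAP error in that case; this is what makes an equivocation bound of this shape plausible, though the reader should consult the paper for the actual statement and proof.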
Nandakishore Santhi
Alexander Vardy