Computer Science – Information Theory
Scientific paper
2009-06-06
Generalised Pinsker Inequalities
21 pages, 3 figures, accepted to COLT 2009
We generalise the classical Pinsker inequality, which relates variational divergence to Kullback-Leibler (KL) divergence, in two ways: we consider arbitrary f-divergences in place of KL divergence, and we assume knowledge of a sequence of values of generalised variational divergences. We then develop a best possible inequality for this doubly generalised situation. Specialising our result to the classical case provides a new and tight explicit bound relating KL to variational divergence (solving a problem posed by Vajda some 40 years ago). The solution relies on exploiting a connection between divergences and the Bayes risk of a learning problem via an integral representation.
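For orientation, the classical Pinsker inequality the abstract refers to states that for distributions P, Q, the variational divergence V(P,Q) = Σ|p_i − q_i| is bounded by √(2·KL(P‖Q)) (with KL in nats). The sketch below is purely illustrative and is not the paper's method; it numerically checks this classical bound on random discrete distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_divergence(p, q):
    """KL(P||Q) in nats; assumes p and q are strictly positive distributions."""
    return float(np.sum(p * np.log(p / q)))

def variational_divergence(p, q):
    """V(P,Q) = sum_i |p_i - q_i|, taking values in [0, 2]."""
    return float(np.sum(np.abs(p - q)))

# Numerically check the classical Pinsker inequality V <= sqrt(2 * KL)
# on random strictly positive distributions. (The paper's contribution
# is a tight bound in the converse direction, which this does not show.)
for _ in range(1000):
    p = rng.dirichlet(np.ones(5))
    q = rng.dirichlet(np.ones(5))
    assert variational_divergence(p, q) <= np.sqrt(2 * kl_divergence(p, q)) + 1e-12
```

Note that the converse direction (bounding KL above by a function of V) is impossible without further information, since KL(P‖Q) can be infinite while V stays bounded; the tight bound referred to in the abstract addresses exactly this asymmetry.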
Mark D. Reid
Robert C. Williamson