Information, Divergence and Risk for Binary Experiments
Statistics – Machine Learning
Scientific paper
2009-01-05
89 pages, 9 figures
We unify f-divergences, Bregman divergences, surrogate loss bounds (regret bounds), proper scoring rules, matching losses, cost curves, ROC curves and information. We do this by systematically studying integral and variational representations of these objects, and in doing so identify their primitives, all of which are related to cost-sensitive binary classification. As well as clarifying the relationships between generative and discriminative views of learning, the new machinery leads to tight and more general surrogate loss bounds and generalised Pinsker inequalities relating f-divergences to variational divergence. The new viewpoint illuminates existing algorithms: it provides a new derivation of Support Vector Machines in terms of divergences and relates the Maximum Mean Discrepancy to Fisher Linear Discriminants. It also suggests new techniques for estimating f-divergences.
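The generalised Pinsker inequalities mentioned in the abstract sharpen the classical bound relating KL divergence to variational divergence. As a minimal illustration of the classical case only (the code, function names and test setup below are ours, not from the paper), the following Python sketch checks the standard Pinsker inequality V(P,Q) <= sqrt(2 KL(P||Q)), where V is the variational divergence sum |p_i - q_i| taking values in [0, 2], on random discrete distributions:

```python
import numpy as np

def kl_divergence(p, q):
    """KL(P||Q) in nats for discrete distributions, assuming q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def variational_divergence(p, q):
    """Variational divergence V(P,Q) = sum_i |p_i - q_i|, taking values in [0, 2]."""
    return float(np.sum(np.abs(np.asarray(p, dtype=float) - np.asarray(q, dtype=float))))

rng = np.random.default_rng(0)
for _ in range(5):
    p, q = rng.dirichlet(np.ones(4)), rng.dirichlet(np.ones(4))
    v, kl = variational_divergence(p, q), kl_divergence(p, q)
    # Classical Pinsker bound: V <= sqrt(2 * KL).
    print(f"V = {v:.4f}  <=  sqrt(2*KL) = {np.sqrt(2 * kl):.4f}")
    assert v <= np.sqrt(2 * kl) + 1e-12
```

The paper's generalised Pinsker inequalities extend this kind of bound from KL to arbitrary f-divergences; the sketch above shows only the classical special case.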
Mark D. Reid
Robert C. Williamson