Information, Divergence and Risk for Binary Experiments

Statistics – Machine Learning

Scientific paper

Details

89 pages, 9 figures

We unify f-divergences, Bregman divergences, surrogate loss bounds (regret bounds), proper scoring rules, matching losses, cost curves, ROC curves and information. We do this by systematically studying integral and variational representations of these objects, and in so doing identify their primitives, all of which relate to cost-sensitive binary classification. As well as clarifying relationships between generative and discriminative views of learning, the new machinery leads to tight and more general surrogate loss bounds and generalised Pinsker inequalities relating f-divergences to variational divergence. The new viewpoint illuminates existing algorithms: it provides a new derivation of Support Vector Machines in terms of divergences and relates Maximum Mean Discrepancy to Fisher Linear Discriminants. It also suggests new techniques for estimating f-divergences.
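The generalised Pinsker inequalities mentioned in the abstract extend the classical bound relating Kullback-Leibler divergence to variational divergence. As a point of reference, the following Python sketch checks only the classical inequality KL(P||Q) >= 2 V(P,Q)^2, where V(P,Q) = (1/2)||P - Q||_1 is the total variation distance. The function names and the choice of normalisation for V are ours, for illustration; the paper's generalisation to other f-divergences is not reproduced here.

    import numpy as np

    def kl_divergence(p, q):
        """Kullback-Leibler divergence KL(P||Q) for discrete distributions.

        Conventions: terms with p_i = 0 contribute 0; the divergence is
        infinite if q_i = 0 while p_i > 0.
        """
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0
        if np.any(q[mask] == 0):
            return np.inf
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def total_variation(p, q):
        """Total variation distance V(P,Q) = (1/2) * ||P - Q||_1, in [0, 1]."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return 0.5 * float(np.sum(np.abs(p - q)))

    # Spot-check the classical Pinsker inequality KL(P||Q) >= 2 * V(P,Q)^2
    # on random distribution pairs over a small alphabet.
    rng = np.random.default_rng(0)
    for _ in range(1000):
        p = rng.dirichlet(np.ones(5))
        q = rng.dirichlet(np.ones(5))
        assert kl_divergence(p, q) >= 2 * total_variation(p, q) ** 2
    print("Pinsker inequality held on all samples.")

Sampling from a Dirichlet distribution yields arbitrary points on the probability simplex, so the loop exercises the inequality across a range of distribution pairs rather than a single hand-picked example.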
