On surrogate loss functions and $f$-divergences
Mathematics – Statistics Theory
Scientific paper
2005-10-25
Annals of Statistics 2009, Vol. 37, No. 2, 876-904
Published at http://dx.doi.org/10.1214/08-AOS595 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics
10.1214/08-AOS595
The goal of binary classification is to estimate a discriminant function $\gamma$ from observations of covariate vectors and corresponding binary labels. We consider an elaboration of this problem in which the covariates are not available directly but are transformed by a dimensionality-reducing quantizer $Q$. We present conditions on loss functions such that empirical risk minimization yields Bayes consistency when both the discriminant function and the quantizer are estimated. These conditions are stated in terms of a general correspondence between loss functions and a class of functionals known as Ali-Silvey or $f$-divergence functionals. Whereas this correspondence was established by Blackwell [Proc. 2nd Berkeley Symp. Probab. Statist. 1 (1951) 93--102. Univ. California Press, Berkeley] for the 0--1 loss, we extend the correspondence to the broader class of surrogate loss functions that play a key role in the general theory of Bayes consistency for binary classification. Our result makes it possible to pick out the (strict) subset of surrogate loss functions that yield Bayes consistency for joint estimation of the discriminant function and the quantizer.
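As background, recall the standard form of an Ali-Silvey ($f$-)divergence; the notation below ($\mu$ and $\pi$ for the two class-conditional measures, $f$ for the generating convex function) is illustrative and not necessarily the paper's own:
$$ I_f(\mu, \pi) \;=\; \int f\!\left(\frac{d\mu}{d\pi}\right) d\pi, \qquad f \text{ convex with } f(1) = 0. $$
Familiar special cases include $f(t) = t\log t$, which yields the Kullback–Leibler divergence, and $f(t) = \tfrac{1}{2}|t-1|$, which yields the variational (total variation) distance; the correspondence studied in the paper associates each surrogate loss with a divergence of this form (for instance, the hinge loss with the variational distance).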
Michael I. Jordan
XuanLong Nguyen
Martin J. Wainwright