Computer Science – Learning
Scientific paper
2011-04-12
Proc. 2011 International Joint Conference on Neural Networks (IJCNN'2011), San Jose, CA (July 30 - Aug. 5, 2011), pp. 1141 - 11
Revised submission to IJCNN'2011
A fundamental result of statistical learning theory states that a concept class is PAC learnable if and only if it is a uniform Glivenko-Cantelli class, if and only if the VC dimension of the class is finite. However, the theorem is only valid under special measurability assumptions on the class, under which PAC learnability is in fact consistent. Otherwise, there is a classical example, constructed under the Continuum Hypothesis by Dudley and Durst and further adapted by Blumer, Ehrenfeucht, Haussler, and Warmuth, of a concept class of VC dimension one which is neither uniform Glivenko-Cantelli nor consistently PAC learnable. We show that, rather surprisingly, under an additional set-theoretic hypothesis which is much milder than the Continuum Hypothesis (Martin's Axiom), PAC learnability is equivalent to finite VC dimension for every concept class.
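As a concrete illustration of the combinatorial notion at the heart of the abstract (not part of the paper itself), the VC dimension of a finite concept class over a finite domain can be computed by brute force: find the largest point set every subset of which is cut out by some concept. The threshold class used below is a hypothetical example chosen for its well-known VC dimension of one.

```python
from itertools import combinations

def shatters(concepts, points):
    """True if every subset of `points` equals the intersection of
    some concept with `points` (i.e. the class shatters `points`)."""
    realized = {frozenset(c & set(points)) for c in concepts}
    return len(realized) == 2 ** len(points)

def vc_dimension(concepts, domain):
    """Brute-force VC dimension: largest k such that some k-point
    subset of the domain is shattered by the class."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(concepts, pts) for pts in combinations(domain, k)):
            d = k
        else:
            break
    return d

# Threshold concepts {x <= t} on {0,1,2,3}: any single point is
# shattered, but no pair is (the subset {larger point only} is never
# realized), so the VC dimension is 1.
domain = [0, 1, 2, 3]
thresholds = [frozenset(x for x in domain if x <= t) for t in range(-1, 4)]
print(vc_dimension(thresholds, domain))  # → 1
```

By the theorem the abstract discusses, finiteness of this quantity is (under suitable measurability assumptions, or under Martin's Axiom per the paper's result) exactly what separates PAC learnable classes from unlearnable ones.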