Decision trees are PAC-learnable from most product distributions: a smoothed analysis
Computer Science – Learning
Scientific paper
2008-12-04
We consider the problem of PAC-learning decision trees, i.e., learning a decision tree over the n-dimensional hypercube from independent random labeled examples. Despite significant effort, no polynomial-time algorithm is known for learning polynomial-sized decision trees (even trees of any super-constant size), even when examples are assumed to be drawn from the uniform distribution on {0,1}^n. We give an algorithm that learns arbitrary polynomial-sized decision trees for most product distributions. In particular, consider a random product distribution where the bias of each bit is chosen independently and uniformly from, say, [.49, .51]. Then with high probability over the parameters of the product distribution and the random examples drawn from it, the algorithm will learn any tree. More generally, in the spirit of smoothed analysis, we consider an arbitrary product distribution whose parameters are specified only up to a [-c, c] accuracy (perturbation), for an arbitrarily small positive constant c.
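The smoothed example-generation model described above can be made concrete with a short sketch: pick each bit's bias independently and uniformly from a small interval around 1/2, draw examples with independent bits, and label them with a fixed decision tree. The Python snippet below is a minimal illustration of that setup only (not the paper's learning algorithm); the function names and the tuple encoding of a tree are hypothetical choices made for this sketch, with c = 0.01 matching the [.49, .51] example.

    import random

    def make_biases(n, c=0.01, rng=random):
        """One draw of the random product distribution: each bit's bias is
        chosen independently and uniformly from [0.5 - c, 0.5 + c]."""
        return [0.5 + rng.uniform(-c, c) for _ in range(n)]

    def sample_example(biases, rng=random):
        """Draw x in {0,1}^n with independent bits; bit i is 1 with probability biases[i]."""
        return [1 if rng.random() < p else 0 for p in biases]

    def eval_tree(tree, x):
        """Evaluate a decision tree encoded as nested tuples (var, left, right)
        with 0/1 integer leaves: follow the branch selected by x[var]."""
        while not isinstance(tree, int):
            var, left, right = tree
            tree = right if x[var] else left
        return tree

    # Example: a small tree over 4 variables and a few labeled examples drawn
    # from one perturbed product distribution.
    tree = (0, (1, 0, 1), (2, 1, (3, 0, 1)))
    biases = make_biases(n=4, c=0.01)
    sample = [(x, eval_tree(tree, x)) for x in (sample_example(biases) for _ in range(5))]
    print(sample)

Each call to make_biases corresponds to one draw of the distribution's parameters; the "with high probability" statement in the abstract is over both that draw and the labeled examples subsequently sampled from it.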
Adam Tauman Kalai
Shang-Hua Teng
Profile ID: LFWR-SCP-O-135896