On sample complexity for computational pattern recognition
Computer Science – Learning
Scientific paper
2005-02-17
In the statistical setting of the pattern recognition problem, the number of examples required to approximate an unknown labelling function is linear in the VC dimension of the target learning class. In this work we ask whether such bounds exist if we restrict our attention to computable pattern recognition methods, assuming that the unknown labelling function is also computable. We find that in this case the number of examples required for a computable method to approximate the labelling function is not only non-linear, but grows faster (in the VC dimension of the class) than any computable function. No time or space constraints are placed on the predictors or target functions; the only resource we consider is the training examples. The task of pattern recognition is considered in conjunction with another learning problem, data compression. An impossibility result for the task of data compression allows us to estimate the sample complexity for pattern recognition.
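For reference, the bound alluded to in the first sentence is the classical (unrestricted) PAC sample complexity result; the notation below ($\mathcal{C}$, $d$, $m$, $\epsilon$, $\delta$) is standard and is introduced here only for illustration, not taken from the paper. For a concept class $\mathcal{C}$ of VC dimension $d$, in the realizable case,

$$ m(\epsilon, \delta) \;=\; O\!\left( \frac{1}{\epsilon} \left( d \log\frac{1}{\epsilon} + \log\frac{1}{\delta} \right) \right) $$

labelled examples suffice to obtain, with probability at least $1-\delta$, a hypothesis with error at most $\epsilon$; the dependence on $d$ is linear. The result described above is that no analogous bound computable in $d$ exists once both the learner and the target labelling function are required to be computable.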