Is the k-NN classifier in high dimensions affected by the curse of dimensionality?
Scientific paper – Statistics – Machine Learning
2011-10-19
17 pages, LaTeX2e
There is an increasing body of evidence suggesting that exact nearest neighbour search in high-dimensional spaces is affected by the curse of dimensionality at a fundamental level. Does this necessarily mean that the same is true for k-nearest-neighbour-based learning algorithms such as the k-NN classifier? We analyse this question at a number of levels and show that the answer is different at every layer that we peel. As our first main result, we show the consistency of a k approximate nearest neighbour classifier; however, the performance of this classifier in very high dimensions is provably unstable. As our second major result, we point out that the existing model of statistical learning is oblivious to the dimension of the domain, and so every learning problem admits a universally consistent reduction to the one-dimensional case.
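To make the distance-concentration phenomenon behind this discussion concrete, the following is a minimal NumPy sketch, not taken from the paper: an exact k-NN classifier run on synthetic Gaussian data of increasing dimension. The function name knn_predict, the per-coordinate class separation of 0.1, and the sample sizes are all illustrative assumptions. The script prints the test accuracy alongside the ratio of the farthest to the nearest training distance from a fixed point, a ratio that tends to 1 as the dimension grows.

import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Classify each test point by majority vote among its k nearest training points."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)   # exact Euclidean distances
        nearest = np.argsort(dists)[:k]               # indices of the k nearest neighbours
        preds.append(np.bincount(y_train[nearest]).argmax())
    return np.array(preds)

rng = np.random.default_rng(0)
for d in (2, 20, 200, 2000):
    # Two Gaussian classes separated by a small shift in every coordinate
    # (purely illustrative data; nothing here comes from the paper).
    n = 600
    y = rng.integers(0, 2, size=n)
    X = rng.normal(size=(n, d)) + 0.1 * y[:, None]
    X_tr, y_tr, X_te, y_te = X[:400], y[:400], X[400:], y[400:]
    acc = (knn_predict(X_tr, y_tr, X_te, k=5) == y_te).mean()
    # Distance concentration: the ratio of the farthest to the nearest distance
    # from a fixed query point shrinks towards 1 as the dimension grows.
    dists = np.linalg.norm(X_tr[1:] - X_tr[0], axis=1)
    print(f"d={d:5d}  test accuracy={acc:.2f}  max/min distance={dists.max()/dists.min():.2f}")

The shrinking distance contrast illustrates why exact nearest neighbour search becomes difficult in high dimensions, while the behaviour of the classifier itself at a fixed sample size is a separate question; distinguishing the two is precisely the point the abstract raises.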