Improved Bound for the Nystrom's Method and its Application to Kernel Classification
Computer Science – Learning
Scientific paper
2011-11-09
We develop three approaches for analyzing the approximation bound of the Nystrom method: one based on matrix perturbation theory, one based on a concentration inequality for integral operators, and one based on the incoherence measure introduced in compressive sensing. The new analysis improves the approximation error of the Nystrom method from $O(m^{-1/4})$ to $O(m^{-1/2})$, and further to $O(m^{-p})$ when the eigenvalues of the kernel matrix follow a $p$-power law, which explains why the Nystrom method works very well for kernel matrices with skewed eigenvalue distributions. We also develop a kernel classification approach based on the Nystrom method and derive its generalization performance. We show that when the eigenvalues of the kernel matrix follow a $p$-power law, the number of support vectors can be reduced to $O(N^{2/(p+1)})$ without seriously sacrificing generalization performance.
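As a concrete illustration (not taken from the paper itself), the sketch below shows the standard Nystrom construction that the bounds above refer to: sample $m$ landmark columns of the kernel matrix, form $C = K_{:,S}$ and $W = K_{S,S}$, and approximate $K \approx C W^{+} C^{\top}$. The RBF kernel, the synthetic data, and the sample sizes are illustrative assumptions; the spectral-norm error printed at the end is the quantity whose decay rate in $m$ the analysis controls.

```python
# Minimal sketch of the standard Nystrom low-rank approximation of a kernel
# matrix. The RBF kernel, synthetic data, and sample sizes m are illustrative
# assumptions, not choices made in the paper.
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Pairwise RBF kernel k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

def nystrom_approximation(X, m, gamma=0.1, seed=None):
    # Sample m landmark points uniformly at random, then approximate
    # K ~= C W^+ C^T, where C = K[:, S] and W = K[S, S].
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    idx = rng.choice(N, size=m, replace=False)
    C = rbf_kernel(X, X[idx], gamma)            # N x m block of K
    W = C[idx, :]                                # m x m landmark block
    return C @ np.linalg.pinv(W) @ C.T           # N x N approximation

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 10))
    K = rbf_kernel(X, X)
    for m in (25, 50, 100, 200):
        K_hat = nystrom_approximation(X, m, seed=1)
        err = np.linalg.norm(K - K_hat, 2)       # spectral-norm error
        print(f"m = {m:4d}  ||K - K_hat||_2 = {err:.4f}")
```

Consistent with the bounds above, the printed error shrinks as $m$ grows, and it shrinks faster when the kernel spectrum decays quickly (e.g., under a $p$-power law).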
Rong Jin
Mehrdad Mahdavi
Tianbao Yang