Improved Bound for the Nystrom's Method and its Application to Kernel Classification

Computer Science – Learning

Scientific paper

We develop three approaches for analyzing the approximation bound of the Nystrom method: one based on matrix perturbation theory, one based on a concentration inequality for integral operators, and one based on the incoherence measure introduced in compressive sensing. The new analysis improves the approximation error of the Nystrom method from $O(m^{-1/4})$ to $O(m^{-1/2})$, and further to $O(m^{-p})$ when the eigenvalues of the kernel matrix follow a $p$-power law, which explains why the Nystrom method works very well for kernel matrices with skewed eigenvalue spectra. We also develop a kernel classification approach based on the Nystrom method and derive its generalization performance. We show that when the eigenvalues of the kernel matrix follow a $p$-power law, the number of support vectors can be reduced to $O(N^{2/(p+1)})$ without seriously sacrificing the generalization performance.
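The bounds above concern the standard Nystrom low-rank approximation, in which $m$ columns of the $N \times N$ kernel matrix $K$ are sampled to form $C \in \mathbb{R}^{N \times m}$ and $W \in \mathbb{R}^{m \times m}$, and $K$ is approximated by $C W^{+} C^{T}$. The sketch below is a minimal NumPy illustration of that construction under stated assumptions; it is not the paper's analysis or algorithm. The RBF kernel, the uniform sampling of landmarks, and names such as `nystrom_approximation` and `gamma` are illustrative choices rather than anything specified in the abstract.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel between the rows of X and Y.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def nystrom_approximation(X, m, gamma=1.0, rng=None):
    """Standard Nystrom approximation of the N x N kernel matrix.

    Samples m landmark points uniformly at random, forms C (N x m) and
    W (m x m), and returns a factor Z with K ~= Z Z^T = C W^+ C^T,
    so the full kernel matrix never has to be built.
    """
    rng = np.random.default_rng(rng)
    N = X.shape[0]
    idx = rng.choice(N, size=m, replace=False)   # sampled landmark indices
    C = rbf_kernel(X, X[idx], gamma)             # N x m block of K
    W = C[idx, :]                                # m x m kernel among landmarks
    # Pseudo-inverse square root of W via its eigendecomposition.
    vals, vecs = np.linalg.eigh(W)
    keep = vals > 1e-12                          # drop numerically zero eigenvalues
    W_inv_sqrt = vecs[:, keep] / np.sqrt(vals[keep])
    return C @ W_inv_sqrt                        # N x rank factor Z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 10))
    K = rbf_kernel(X, X)
    Z = nystrom_approximation(X, m=50, rng=0)
    err = np.linalg.norm(K - Z @ Z.T, ord=2) / np.linalg.norm(K, ord=2)
    print(f"relative spectral-norm error with m=50 landmarks: {err:.3e}")
```

Because the resulting feature map $Z$ depends only on the $m$ sampled landmarks, a linear classifier trained on $Z$ has a decision function expressed through those $m$ points alone, which is the sense in which the number of support vectors can be kept far below $N$.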
