Linear Time Feature Selection for Regularized Least-Squares

Statistics – Machine Learning

Scientific paper


Details

17 pages, 15 figures

We propose a novel algorithm for greedy forward feature selection for regularized least-squares (RLS) regression and classification, also known as the least-squares support vector machine or ridge regression. The algorithm, which we call greedy RLS, starts from the empty feature set and, on each iteration, adds the feature whose addition yields the best leave-one-out cross-validation performance. Our method is considerably faster than previously proposed methods, since its time complexity is linear in the number of training examples, the number of features in the original data set, and the desired size of the set of selected features. As a side effect, we therefore obtain a new training algorithm for learning sparse linear RLS predictors, which can be used for large-scale learning. This speed is made possible by matrix-calculus-based shortcuts for leave-one-out cross-validation and feature addition. We experimentally demonstrate the scalability of our algorithm and its ability to find good-quality feature sets.
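The greedy forward-selection loop described in the abstract can be illustrated with a short sketch. Note that this is a naive illustration only: it recomputes the ridge solution and the hat matrix for every candidate feature, so it does not achieve the paper's linear time complexity, which relies on the matrix-calculus shortcuts developed there. It does, however, use the standard closed-form leave-one-out residual for ridge regression, r_i / (1 - H_ii), where H is the hat matrix. All function names and parameters here are illustrative, not from the paper.

```python
import numpy as np

def loo_mse(X, y, lam):
    """Leave-one-out mean squared error of ridge regression on (X, y).

    Uses the closed-form shortcut: the LOO residual for example i is
    r_i / (1 - H_ii), where H = X (X^T X + lam*I)^{-1} X^T is the hat
    matrix and r = y - H y is the vector of training residuals.
    """
    n, d = X.shape
    A = X.T @ X + lam * np.eye(d)
    H = X @ np.linalg.solve(A, X.T)
    r = y - H @ y
    loo_residuals = r / (1.0 - np.diag(H))
    return np.mean(loo_residuals ** 2)

def greedy_forward_rls(X, y, k, lam=1.0):
    """Greedy forward feature selection for RLS (brute-force sketch).

    Starts from the empty feature set and, on each of k iterations,
    adds the feature whose inclusion gives the lowest LOO error.
    """
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        best_f, best_err = None, np.inf
        for f in remaining:
            err = loo_mse(X[:, selected + [f]], y, lam)
            if err < best_err:
                best_f, best_err = f, err
        selected.append(best_f)
        remaining.remove(best_f)
    return selected
```

On synthetic data where the target depends only on a few columns, the sketch recovers those columns; the paper's contribution is performing the same selection with per-iteration cost linear in the number of examples and features.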
