Iterative Feature Selection In Least Square Regression Estimation

Mathematics – Statistics Theory

Scientific paper

Details

In this paper, we focus on regression estimation in both the inductive and the transductive case. We assume that we are given a set of features (which can be a basis of functions, but not necessarily). We begin by giving a deviation inequality on the risk of the estimator in each model defined by a single feature. These models are too simple to be useful on their own, but we then show how this result motivates an iterative algorithm that performs feature selection in order to build a suitable estimator. We prove that every selected feature actually improves the performance of the estimator. We state all the estimators and results first in the inductive case, which requires knowledge of the distribution of the design, and then in the transductive case, in which this distribution need not be known.
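
The abstract only outlines the procedure, so the following is a minimal illustrative sketch of a greedy, stagewise variant of iterative feature selection for least squares regression, written in Python. It is not the paper's exact algorithm: there, the selection and stopping decisions are driven by the deviation inequality on the single-feature models, whereas this sketch substitutes a simple improvement threshold, and all names (iterative_feature_selection, X, y, tol) are assumptions made for the example.

    import numpy as np

    def iterative_feature_selection(X, y, tol=1e-4, max_iter=100):
        """Greedy sketch: X is the (n, p) matrix of feature evaluations, y the (n,) responses."""
        n, p = X.shape
        residual = y.astype(float).copy()
        coef = np.zeros(p)
        for _ in range(max_iter):
            best_j, best_gain, best_beta = None, 0.0, 0.0
            for j in range(p):
                xj = X[:, j]
                denom = xj @ xj
                if denom == 0.0:
                    continue
                # Least squares fit of the current residual on the single feature j.
                beta = (xj @ residual) / denom
                # Reduction in empirical squared risk obtained by adding this feature.
                gain = (beta ** 2) * denom / n
                if gain > best_gain:
                    best_j, best_gain, best_beta = j, gain, beta
            # Stop once no single feature improves the empirical risk by more than tol;
            # in the paper this decision rests on the deviation inequality instead.
            if best_j is None or best_gain <= tol:
                break
            coef[best_j] += best_beta
            residual -= best_beta * X[:, best_j]
        return coef

Each pass adds the single-feature model that most reduces the empirical squared risk of the current residual, mirroring the idea that every selected feature should provably improve the estimator.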
