Supervised Feature Selection via Dependence Estimation
Computer Science – Learning
Scientific paper · 2007-04-20 · 9 pages
We introduce a framework for filtering features that employs the Hilbert-Schmidt Independence Criterion (HSIC) as a measure of dependence between the features and the labels. The key idea is that good features should maximise such dependence. Feature selection for various supervised learning problems (including classification and regression) is unified under this framework, and the solutions can be approximated using a backward-elimination algorithm. We demonstrate the usefulness of our method on both artificial and real world datasets.
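The backward-elimination idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes linear kernels on both features and labels, uses the standard biased empirical HSIC estimate tr(KHLH)/(n-1)^2, and greedily drops, at each step, the feature whose removal leaves the HSIC between the remaining features and the labels highest (i.e. the least dependence-carrying feature). The function names are illustrative only.

```python
import numpy as np

def hsic(K, L):
    """Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2,
    where H = I - (1/n) 11^T is the centering matrix."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def linear_kernel(X):
    """Linear kernel matrix on the rows of X."""
    return X @ X.T

def backward_elimination(X, y, n_keep):
    """Greedy backward elimination: repeatedly discard the feature
    whose removal maximises HSIC(remaining features, labels)."""
    L = np.outer(y, y)          # linear kernel on the labels
    active = list(range(X.shape[1]))
    while len(active) > n_keep:
        scores = []
        for j in active:
            rest = [f for f in active if f != j]
            scores.append(hsic(linear_kernel(X[:, rest]), L))
        # the feature whose absence yields the largest HSIC is the
        # one contributing least dependence, so drop it
        active.pop(int(np.argmax(scores)))
    return active
```

With nonlinear kernels (e.g. Gaussian) in place of `linear_kernel`, the same loop covers classification and regression alike, which is the unification the abstract refers to.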
Justin Bedo
Karsten Borgwardt
Arthur Gretton
Alex Smola
Le Song