Resampling methods for parameter-free and robust feature selection with mutual information
Computer Science – Learning
Scientific paper
2007-09-23
Neurocomputing 70, 7-9 (2007) 1276-1288
DOI: 10.1016/j.neucom.2006.11.019
Combining the mutual information criterion with a forward feature selection strategy offers a good trade-off between optimality of the selected feature subset and computation time. However, it requires setting the parameter(s) of the mutual information estimator and deciding when to halt the forward procedure. These two choices are difficult to make because, as the dimensionality of the subset increases, the estimation of the mutual information becomes less and less reliable. This paper proposes to use resampling methods, namely K-fold cross-validation and the permutation test, to address both issues. The resampling methods provide information about the variance of the estimator, which can then be used to set the parameter automatically and to compute a threshold at which to stop the forward procedure. The procedure is illustrated on a synthetic dataset as well as on real-world examples.
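In outline, the procedure greedily adds the feature whose inclusion most increases the estimated mutual information with the output, uses K-fold resampling to gauge the variance of the (parameter-dependent) estimator and thereby set its parameter, and stops once the best remaining candidate no longer beats a permutation-based significance threshold. The Python sketch below illustrates this idea only; it is not the authors' code. The Kraskov k-NN estimator, the function names (knn_mutual_info, select_k, permutation_threshold, forward_selection), the defaults (k, n_perm, alpha, the candidate grid) and the exact parameter-selection and stopping rules are illustrative assumptions that differ in detail from the paper.

# A minimal sketch of the procedure described above, assuming a k-NN (Kraskov)
# estimator of mutual information. NOT the authors' implementation: names,
# defaults and the exact selection/stopping rules are illustrative only.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma


def knn_mutual_info(x, y, k=6):
    """Kraskov k-NN estimate of I(x; y) in nats, for continuous data."""
    x = np.atleast_2d(x.T).T          # shape (n_samples, n_features)
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    n = len(y)
    z = np.hstack([x, y])             # joint space
    # Chebyshev distance to the k-th neighbour in the joint space.
    eps = cKDTree(z).query(z, k=k + 1, p=np.inf)[0][:, -1]
    # Counts of points strictly within eps in each marginal space
    # (query_ball_point's count includes the query point itself).
    nx = cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf, return_length=True)
    ny = cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf, return_length=True)
    return max(0.0, digamma(k) + digamma(n) - np.mean(digamma(nx) + digamma(ny)))


def select_k(x, y, candidates=(2, 4, 6, 8, 10), n_folds=5, seed=0):
    """Use K-fold resampling to gauge the variance of the MI estimate for each
    candidate k and keep the most stable one. This std-minimising rule is a
    simplified stand-in for the criterion used in the paper."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), n_folds)
    stability = {k: np.std([knn_mutual_info(np.delete(x, f, axis=0),
                                            np.delete(y, f), k)
                            for f in folds])
                 for k in candidates}
    return min(stability, key=stability.get)


def permutation_threshold(X_sel, x_cand, y, k=6, n_perm=100, alpha=0.05):
    """(1 - alpha) quantile of the subset MI obtained when the candidate
    feature is replaced by a random permutation of itself (pure noise)."""
    rng = np.random.default_rng(0)
    null = [knn_mutual_info(np.column_stack([X_sel, rng.permutation(x_cand)]), y, k)
            for _ in range(n_perm)]
    return np.quantile(null, 1.0 - alpha)


def forward_selection(X, y, k=6, alpha=0.05):
    """Greedy forward selection: stop as soon as the best remaining feature
    does not increase the estimated MI more than a noise feature would."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining:
        mi, j = max((knn_mutual_info(X[:, selected + [j]], y, k), j)
                    for j in remaining)
        if mi <= permutation_threshold(X[:, selected], X[:, j], y, k, alpha=alpha):
            break
        selected.append(j)
        remaining.remove(j)
    return selected


# Hypothetical usage, with X of shape (n_samples, n_features) and y of shape (n_samples,):
#   k = select_k(X, y)
#   features = forward_selection(X, y, k=k)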
Damien François
Fabrice Rossi
Michel Verleysen
Vincent Wertz