Risk Bounds for Embedded Variable Selection in Classification Trees

Mathematics – Statistics Theory

Scientific paper


Details


The problems of model and variable selection for classification trees are considered jointly. A penalized criterion is proposed that explicitly takes into account the number of variables, and a risk bound inequality is provided for the tree classifier minimizing this criterion. This penalized criterion is compared to the one used during the pruning step of the CART algorithm, and the two criteria are shown to be similar under some specific margin assumptions. In practice, the tuning parameter of the CART penalty has to be calibrated by hold-out. Simulation studies are performed which confirm that the hold-out procedure mimics the form of the proposed penalized criterion.
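For readers unfamiliar with the calibration step mentioned above, the following is a minimal sketch (not the authors' code) of hold-out calibration of the CART cost-complexity pruning parameter, using scikit-learn's CART-style DecisionTreeClassifier. The synthetic dataset, the 50/50 split, and the candidate alpha grid taken from the pruning path are illustrative assumptions.

```python
# Hold-out calibration of the CART pruning parameter (illustrative sketch).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Illustrative synthetic data: only a few of the 20 variables are informative.
X, y = make_classification(n_samples=1000, n_features=20, n_informative=5,
                           random_state=0)
X_train, X_hold, y_train, y_hold = train_test_split(X, y, test_size=0.5,
                                                    random_state=0)

# Candidate penalty levels from the cost-complexity pruning path of a fully
# grown tree on the training half.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(
    X_train, y_train)
alphas = path.ccp_alphas

# Select the alpha (CART tuning parameter) minimizing the hold-out error.
hold_out_errors = []
for alpha in alphas:
    tree = DecisionTreeClassifier(ccp_alpha=alpha, random_state=0)
    tree.fit(X_train, y_train)
    hold_out_errors.append(1.0 - tree.score(X_hold, y_hold))

best_alpha = alphas[int(np.argmin(hold_out_errors))]
print(f"selected ccp_alpha by hold-out: {best_alpha:.5f}")
```

The hold-out error as a function of the penalty level is what the paper's simulation studies compare against the form of the proposed penalized criterion.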

