Choosing a penalty for model selection in heteroscedastic regression

Mathematics – Statistics Theory

Scientific paper

We consider the problem of choosing between several models in least-squares regression with heteroscedastic data. We prove that any penalization procedure is suboptimal when the penalty is a function of the dimension of the model, at least for some typical heteroscedastic model selection problems. In particular, Mallows' Cp is suboptimal in this framework. In contrast, optimal model selection is possible with data-driven penalties such as resampling or $V$-fold penalties. It is therefore worth estimating the shape of the penalty from the data, even at the price of a higher computational cost. Simulation experiments illustrate the existence of a trade-off between statistical accuracy and computational complexity. To conclude, we sketch some rules for choosing a penalty in least-squares regression, depending on what is known about possible variations of the noise level.
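
The following is a minimal, hypothetical sketch (not the authors' code) of the kind of comparison the abstract describes: a dimension-based Mallows' Cp penalty versus a data-driven, V-fold criterion for choosing the number of bins of a piecewise-constant (regressogram) estimator on heteroscedastic data. All function names and settings are assumptions made for illustration, and plain V-fold cross-validation is used here as a simple stand-in for the V-fold penalties studied in the paper.

    # Hypothetical illustration; not the authors' implementation.
    import numpy as np

    rng = np.random.default_rng(0)


    def simulate(n=500):
        """Heteroscedastic regression data: y = f(x) + sigma(x) * noise."""
        x = rng.uniform(0.0, 1.0, n)
        f = np.sin(4.0 * np.pi * x)
        sigma = 0.2 + 1.3 * x                      # noise level increases with x
        return x, f + sigma * rng.standard_normal(n)


    def regressogram(x, y, n_bins):
        """Piecewise-constant least-squares fit with n_bins bins on [0, 1]."""
        edges = np.linspace(0.0, 1.0, n_bins + 1)
        idx = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
        means = np.array([y[idx == b].mean() if np.any(idx == b) else y.mean()
                          for b in range(n_bins)])
        risk = np.mean((y - means[idx]) ** 2)      # empirical least-squares risk
        return risk, means, edges


    def select_cp(x, y, dims, sigma2_hat):
        """Mallows' Cp: empirical risk + 2 * sigma^2 * D / n (penalty depends on D only)."""
        n = len(y)
        crit = [regressogram(x, y, D)[0] + 2.0 * sigma2_hat * D / n for D in dims]
        return dims[int(np.argmin(crit))]


    def select_vfold(x, y, dims, V=5):
        """V-fold cross-validation criterion (data-driven alternative to Cp)."""
        n = len(y)
        folds = rng.permutation(np.arange(n) % V)
        crit = []
        for D in dims:
            err = 0.0
            for v in range(V):
                tr, te = folds != v, folds == v
                _, means, edges = regressogram(x[tr], y[tr], D)
                idx = np.clip(np.digitize(x[te], edges) - 1, 0, D - 1)
                err += np.sum((y[te] - means[idx]) ** 2)
            crit.append(err / n)
        return dims[int(np.argmin(crit))]


    if __name__ == "__main__":
        x, y = simulate()
        dims = list(range(1, 41))

        # Crude global variance estimate from the richest model, as Cp needs in practice;
        # under heteroscedasticity a single number cannot match the local noise level.
        risk_max = regressogram(x, y, dims[-1])[0]
        sigma2_hat = risk_max * len(y) / (len(y) - dims[-1])

        print("Mallows' Cp selects D =", select_cp(x, y, dims, sigma2_hat))
        print("V-fold criterion selects D =", select_vfold(x, y, dims))

Because the Cp penalty depends only on the model dimension and a single variance estimate, it cannot adapt to a noise level that varies with x, which is the mechanism behind the suboptimality result stated in the abstract; the resampled criterion estimates the shape of the penalty from the data, at the price of a higher computational cost.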
