General bound of overfitting for MLP regression models

Mathematics – Statistics Theory

Scientific paper


Multilayer perceptrons (MLPs) with one hidden layer have long been used for non-linear regression. However, in some tasks MLPs are overly powerful models, and a small mean squared error (MSE) may be due more to overfitting than to actual modelling. If the noise of the regression model is Gaussian, the overfitting of the model is entirely determined by the behavior of the likelihood ratio test statistic (LRTS); in many cases, however, the assumption of Gaussian noise is arbitrary if not false. In this paper, we present a universal bound on the overfitting of such models under weak assumptions; the bound holds without Gaussianity or identifiability assumptions. The main application of this bound is to guide the determination of the true architecture of the MLP model as the number of data points goes to infinity. As an illustration, we use this theoretical result to propose and compare effective criteria for finding the true architecture of an MLP.
