Finite size scaling of the Bayesian perceptron

Physics – Condensed Matter – Statistical Mechanics

Scientific paper


Details

RevTeX, 7 pages, 7 figures, submitted to Phys. Rev. E

DOI: 10.1103/PhysRevE.55.7434

We study numerically the properties of the Bayesian perceptron through gradient descent on the optimal cost function. The theoretical distribution of stabilities is deduced; it predicts that the optimal generalizer lies close to the boundary of the space of (error-free) solutions. The numerical simulations are in good agreement with this theoretical distribution, and the extrapolation of the generalization error to infinite input-space size agrees with the theoretical results. Finite-size corrections are negative and exhibit two different scaling regimes, depending on the size of the training set. The variance of the generalization error vanishes as $N \rightarrow \infty$, confirming the property of self-averaging.
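As a rough, hypothetical sketch of the kind of simulation the abstract describes, the snippet below trains a student perceptron by gradient descent on a smooth cost over examples labeled by a random teacher, and measures the generalization error from the student-teacher overlap ($\epsilon_g = \arccos(R)/\pi$ for spherically random inputs). The logistic cost here merely stands in for the paper's optimal Bayesian cost function, and all sizes, learning rates, and seeds are illustrative choices, not the authors' settings.

import numpy as np

def generalization_error(w, w_star):
    # For spherically random inputs, the generalization error of a
    # perceptron is eps_g = arccos(R) / pi, where R is the normalized
    # overlap between student and teacher weight vectors.
    R = w @ w_star / (np.linalg.norm(w) * np.linalg.norm(w_star))
    return np.arccos(np.clip(R, -1.0, 1.0)) / np.pi

def train_student(N=200, alpha=2.0, lr=0.5, steps=1000, seed=0):
    rng = np.random.default_rng(seed)
    P = int(alpha * N)                            # training set size P = alpha * N
    w_star = rng.standard_normal(N)               # teacher defining the rule
    X = rng.standard_normal((P, N)) / np.sqrt(N)  # inputs scaled to ~unit-norm rows
    y = np.sign(X @ w_star)                       # teacher labels

    w = rng.standard_normal(N)                    # random student initialization
    for _ in range(steps):
        margins = y * (X @ w)                     # aligned fields of the examples
        # Plain gradient descent on a smooth logistic cost; this is an
        # assumed stand-in for the paper's optimal Bayesian cost function.
        grad = -((y / (1.0 + np.exp(margins))) @ X) / P
        w -= lr * grad
    return generalization_error(w, w_star)

# Crude check of self-averaging: at fixed alpha, the sample-to-sample
# spread of eps_g should shrink as the input dimension N grows.
for N in (50, 100, 200):
    errs = [train_student(N=N, seed=s) for s in range(5)]
    print(f"N={N:4d}  mean eps_g={np.mean(errs):.3f}  std={np.std(errs):.3f}")

Repeating such runs over many disorder realizations at increasing $N$ gives a crude numerical view of self-averaging: the standard deviation of $\epsilon_g$ across samples should decrease as $N$ grows.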
