Finite size scaling of the Bayesian perceptron
Physics – Condensed Matter – Statistical Mechanics
Scientific paper
1997-03-20
RevTeX, 7 pages, 7 figures, submitted to Phys. Rev. E
10.1103/PhysRevE.55.7434
We study numerically the properties of the Bayesian perceptron through a gradient descent on the optimal cost function. The theoretical distribution of stabilities is deduced. It predicts that the optimal generalizer lies close to the boundary of the space of (error-free) solutions. The numerical simulations are in good agreement with the theoretical distribution. The extrapolation of the generalization error to infinite input-space size agrees with the theoretical results. Finite-size corrections are negative and exhibit two different scaling regimes, depending on the training-set size. The variance of the generalization error vanishes as $N \rightarrow \infty$, confirming the property of self-averaging.
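The method summarized above — gradient descent on a cost function of the pattern stabilities $\lambda^\mu = y^\mu \, \mathbf{J}\cdot\mathbf{x}^\mu / \|\mathbf{J}\|$ — can be sketched numerically. The paper's actual optimal (Bayesian) cost is not reproduced in this abstract, so the sketch below substitutes a hypothetical logistic potential $V(\lambda) = \log(1 + e^{-\lambda})$, which likewise favours large positive stabilities; all function names and parameters here are illustrative.

```python
import numpy as np

def stabilities(J, X, y):
    """Stabilities lambda^mu = y^mu (J . x^mu) / ||J|| of patterns (X, y)."""
    return y * (X @ J) / np.linalg.norm(J)

def train_perceptron(X, y, epochs=2000, lr=0.5, seed=0):
    """Gradient descent on a smooth cost of the stabilities.

    Stand-in cost (not the paper's optimal Bayesian one):
    V(lambda) = log(1 + exp(-lambda)).
    """
    P, N = X.shape
    J = np.random.default_rng(seed).normal(size=N)
    for _ in range(epochs):
        norm = np.linalg.norm(J)
        lam = y * (X @ J) / norm
        dV = -1.0 / (1.0 + np.exp(lam))  # V'(lambda)
        # chain rule: d lambda^mu / dJ = (y^mu x^mu - lambda^mu J/||J||) / ||J||
        grad = (dV[:, None]
                * (y[:, None] * X - np.outer(lam, J) / norm)
                / norm).mean(axis=0)
        J -= lr * grad
    return J / np.linalg.norm(J)  # direction is what matters; normalize
```

On a training set that is linearly separable (e.g. labels generated by a teacher perceptron), descending such a potential drives most stabilities positive, i.e. the student approaches the space of error-free solutions discussed in the abstract.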
Buhot Arnaud
Gordon Mirta B.
Torres Moreno J.-M.