Training a perceptron in a discrete weight space
Physics – Condensed Matter – Statistical Mechanics
Scientific paper
2001-02-27
10 pages, 5 figs., submitted to PRE
10.1103/PhysRevE.64.046109
On-line and batch learning of a perceptron in a discrete weight space, where each weight can take $2L+1$ different values, are examined analytically and numerically. The learning algorithm is based on training a continuous perceptron and predicting with the clipped weights. The learning is described by a new set of order parameters, composed of the overlaps between the teacher and the continuous/clipped students. Different scenarios are examined, among them on-line learning with discrete/continuous transfer functions and off-line Hebb learning. The generalization error of the clipped weights decays asymptotically as $\exp(-K\alpha^2)$ / $\exp(-e^{|\lambda|\alpha})$ in the case of on-line learning with binary/continuous activation functions, respectively, where $\alpha$ is the number of examples divided by $N$, the size of the input vector, and $K$ is a positive constant that decays linearly with $1/L$. For finite $N$ and $L$, perfect agreement between the discrete student and the teacher is obtained for $\alpha \propto \sqrt{L \ln(NL)}$. A crossover to the generalization error $\propto 1/\alpha$, which characterizes continuous weights with binary output, occurs for synaptic depth $L > O(\sqrt{N})$.
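The abstract's central idea is to train a continuous student perceptron and make predictions with its weights clipped to the $2L+1$ discrete levels $\{-L, \dots, L\}$. The sketch below illustrates that idea in Python under assumed details: the perceptron-style on-line update, the sign activation, the uniformly drawn discrete teacher, and all names (N, L, clip_weights, etc.) are assumptions for illustration, not the authors' exact algorithm.

```python
import numpy as np

# Minimal sketch (assumptions, not the paper's exact scheme): a continuous
# student perceptron is trained on-line on teacher-labelled examples, while
# predictions are made with its weights clipped to the 2L+1 integer levels
# {-L, ..., 0, ..., L}.

rng = np.random.default_rng(0)

N = 200          # input dimension (assumed value)
L = 3            # synaptic depth: each weight can take 2L+1 values
alpha_max = 20   # examples per input dimension (alpha = P / N)

def clip_weights(w, L):
    """Round each continuous weight to the nearest integer in [-L, L]."""
    return np.clip(np.rint(w), -L, L)

# Discrete teacher with weights drawn uniformly from {-L, ..., L}
teacher = rng.integers(-L, L + 1, size=N).astype(float)

student = np.zeros(N)        # continuous student weights
P = int(alpha_max * N)       # total number of on-line examples

for mu in range(P):
    x = rng.choice([-1.0, 1.0], size=N)   # random binary input
    label = np.sign(teacher @ x)          # teacher's binary output
    if np.sign(student @ x) != label:     # perceptron-style update on error
        student += label * x / np.sqrt(N)

# Generalization error of the clipped student, estimated on fresh examples
clipped = clip_weights(student, L)
X_test = rng.choice([-1.0, 1.0], size=(5000, N))
y_teacher = np.sign(X_test @ teacher)
y_clipped = np.sign(X_test @ clipped)
print("clipped-student generalization error:", np.mean(y_teacher != y_clipped))
```

Varying L, N, and alpha_max in this sketch gives a rough numerical feel for how quickly the clipped student approaches the discrete teacher, in the spirit of the scaling behaviour discussed in the abstract.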
Ido Kanter
Michal Rosen-Zvi