Non-Deterministic Learning Dynamics in Large Neural Networks due to Structural Data Bias
Physics – Condensed Matter – Disordered Systems and Neural Networks
Scientific paper
2000-07-13
J. Phys. A 33 (2000) 8703-8722
20 pages
10.1088/0305-4470/33/48/309
We study the dynamics of on-line learning in large perceptrons, for the case of training sets with a structural bias of the input vectors, by deriving exact and closed macroscopic dynamical laws using non-equilibrium statistical mechanical tools. In sharp contrast to the more conventional theories developed for homogeneously distributed or only weakly biased data, these laws are found to describe a non-trivial and persistently non-deterministic macroscopic evolution, and a generalisation error which retains both stochastic and sample-to-sample fluctuations, even for infinitely large networks. Furthermore, for the standard error-correcting microscopic algorithms (such as the perceptron learning rule) one obtains learning curves with distinct bias-induced phases. Our theoretical predictions find excellent confirmation in numerical simulations.
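To make the setting concrete, the following is a minimal simulation sketch of the kind of scenario the abstract describes: on-line perceptron learning from a finite training set whose input vectors carry a structural bias, here assumed for illustration to be a nonzero mean along a fixed direction. It is not the paper's exact formalism; all names, the form of the bias, and the parameter values are assumptions made for the example. Running it shows that the measured generalisation error keeps run-to-run (sample-to-sample) fluctuations.

# Illustrative sketch only: biased inputs, standard error-correcting perceptron rule.
import numpy as np

rng = np.random.default_rng(0)

N = 200          # input dimension
alpha = 0.5      # training-set size p = alpha * N
bias = 1.0       # strength of the structural bias of the inputs (assumed form)
eta = 0.1        # learning rate
steps = 20 * N   # number of on-line updates
runs = 5         # independent runs (fresh teacher, training set, update sequence)

def biased_inputs(p, u):
    """Inputs sharing a common component along the fixed direction u, plus noise."""
    return bias * u[None, :] + rng.standard_normal((p, N))

for run in range(runs):
    u = rng.standard_normal(N); u /= np.linalg.norm(u)    # bias direction
    B = rng.standard_normal(N); B /= np.linalg.norm(B)    # teacher vector
    train = biased_inputs(int(alpha * N), u)              # finite training set
    test = biased_inputs(5000, u)                         # fresh test sample
    y_test = np.sign(test @ B)

    J = rng.standard_normal(N) / np.sqrt(N)               # student vector
    for t in range(steps):
        xi = train[rng.integers(len(train))]              # draw a stored example
        target = np.sign(B @ xi)
        if np.sign(J @ xi) != target:                     # error-correcting update
            J += (eta / np.sqrt(N)) * target * xi

    eg = np.mean(np.sign(test @ J) != y_test)             # empirical generalisation error
    print(f"run {run}: generalisation error = {eg:.3f}")

Repeating the experiment with different seeds gives generalisation-error trajectories that do not collapse onto a single deterministic curve, which is the qualitative effect the paper analyses exactly in the large-N limit.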
Anthony C. C. Coolen
J. A. F. Heimel
H. C. Rae