Gradient descent learning in and out of equilibrium

Physics – Condensed Matter – Disordered Systems and Neural Networks

Scientific paper


Details

8 pages, submitted to the Journal of Physics A


10.1103/PhysRevE.63.061905

Relations between the out-of-thermal-equilibrium dynamical process of on-line learning and thermally equilibrated off-line learning are studied for potential-gradient-descent learning. Opper's approach to studying on-line Bayesian algorithms is extended to potential-based, or maximum-likelihood, learning. We examine the on-line learning algorithm that best approximates the off-line algorithm in the sense of least Kullback-Leibler information loss. It works by updating the weights along the gradient of an effective potential that differs from the parent off-line potential. The interpretation of this off-equilibrium dynamics bears some similarity to the cavity approach of Griniasty. We are able to analyze networks with non-smooth transfer functions, transferring the smoothness requirement to the potential.
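The abstract's central object, on-line learning by gradient descent on a potential, can be sketched in a few lines. The sketch below is illustrative only: the perceptron setup, the exponential potential, and the learning rate are placeholder assumptions, not the paper's effective potential.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a student perceptron learning a random teacher rule.
N = 100            # input dimension (illustrative choice)
eta = 0.05         # learning rate (illustrative choice)
teacher = rng.standard_normal(N)
w = rng.standard_normal(N)

def potential_grad(stability):
    # Gradient of an illustrative smooth potential V(x) = exp(-x),
    # which penalizes small or negative stabilities; this stands in
    # for a generic smooth potential, not the paper's effective one.
    return -np.exp(-stability)

for _ in range(2000):
    x = rng.standard_normal(N)
    y = np.sign(teacher @ x)                 # teacher's label for this example
    stability = y * (w @ x) / np.sqrt(N)     # alignment of student output with label
    # On-line update: move the weights along the negative gradient
    # of the potential, one example at a time.
    w -= eta * potential_grad(stability) * y * x / np.sqrt(N)

# Overlap between student and teacher weights grows as learning proceeds.
overlap = (w @ teacher) / (np.linalg.norm(w) * np.linalg.norm(teacher))
print(f"student-teacher overlap: {overlap:.2f}")
```

In this toy version the same potential drives the on-line updates; the paper's point is that the on-line algorithm best approximating thermally equilibrated off-line learning uses an *effective* potential different from the parent one.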

