Minimum Probability Flow Learning

Computer Science – Learning

Scientific paper



Updated to match ICML conference proceedings


Fitting probabilistic models to data is often difficult, due to the general intractability of the partition function and its derivatives. Here we propose a new parameter estimation technique that requires neither computing an intractable normalization factor nor sampling from the equilibrium distribution of the model. This is achieved by establishing dynamics that would transform the observed data distribution into the model distribution, and then taking as the objective the minimization of the KL divergence between the data distribution and the distribution produced by running the dynamics for an infinitesimal time. Score matching, minimum velocity learning, and certain forms of contrastive divergence are shown to be special cases of this learning technique. We demonstrate parameter estimation in Ising models, deep belief networks, and an independent component analysis model of natural scenes. In the Ising model case, the method outperforms current state-of-the-art techniques by at least an order of magnitude in learning time, with lower error in the recovered coupling parameters.
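To make the objective concrete, here is a minimal sketch of MPF parameter estimation for a small Ising model. It assumes single-bit-flip connectivity between states (a common choice; the paper permits other dynamics), the energy convention E(x) = -0.5 xᵀJx over binary states in {0, 1}, and illustrative names (ising_energy, mpf_objective, J_hat) that are not from the paper's reference implementation.

```python
import numpy as np
from scipy.optimize import minimize

# Minimum probability flow (MPF) objective, sketched under the assumptions
# above:
#   K(J) = (1/|D|) * sum_{x in D} sum_{x' ~ x} exp((E(x; J) - E(x'; J)) / 2)
# where x' ranges over the single-bit-flip neighbours of each data state x.

def ising_energy(X, J):
    """Energy E(x) = -0.5 * x^T J x for each row x of X (binary, {0, 1})."""
    return -0.5 * np.einsum('nd,de,ne->n', X, J, X)

def mpf_objective(theta, X):
    """MPF objective for a symmetric coupling matrix J, flattened in theta."""
    N, d = X.shape
    J = theta.reshape(d, d)
    J = 0.5 * (J + J.T)              # keep the couplings symmetric
    E = ising_energy(X, J)           # energies of the observed data states
    K = 0.0
    for i in range(d):
        Xf = X.copy()
        Xf[:, i] = 1 - Xf[:, i]      # flip bit i: one neighbour per data state
        K += np.exp(0.5 * (E - ising_energy(Xf, J))).sum()
    return K / N

# Usage on toy data (hypothetical example, not from the paper):
rng = np.random.default_rng(0)
X = (rng.random((500, 6)) > 0.5).astype(float)
d = X.shape[1]
result = minimize(mpf_objective, np.zeros(d * d), args=(X,), method='L-BFGS-B')
J_hat = result.x.reshape(d, d)
```

Note that the objective only ever compares a data state with its immediate neighbours under the chosen dynamics, which is why no partition function or equilibrium sampling appears anywhere in the computation.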


