Sum-Product Networks: A New Deep Architecture

Computer Science – Learning

Scientific paper

The key limiting factor in graphical model inference and learning is the complexity of the partition function. We thus ask the question: what are general conditions under which the partition function is tractable? The answer leads to a new kind of deep architecture, which we call sum-product networks (SPNs). SPNs are directed acyclic graphs with variables as leaves, sums and products as internal nodes, and weighted edges. We show that if an SPN is complete and consistent it represents the partition function and all marginals of some graphical model, and give semantics to its nodes. Essentially all tractable graphical models can be cast as SPNs, but SPNs are also strictly more general. We then propose learning algorithms for SPNs, based on backpropagation and EM. Experiments show that inference and learning with SPNs can be both faster and more accurate than with standard deep networks. For example, SPNs perform image completion better than state-of-the-art deep networks for this task. SPNs also have intriguing potential connections to the architecture of the cortex.
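
To make the architecture concrete, here is a minimal sketch of how an SPN over Boolean variables can be evaluated bottom-up, assuming leaf indicators, weighted sum nodes, and product nodes as described in the abstract. The class names, the tiny two-variable network, and the weights are illustrative assumptions, not the authors' implementation; the point is that a single upward pass yields the partition function (all indicators set to 1) as well as any marginal.

```python
# Minimal SPN evaluation sketch (illustrative, not the paper's code).

class Leaf:
    """Indicator that variable `var` takes value `value`."""
    def __init__(self, var, value):
        self.var, self.value = var, value

    def eval(self, evidence):
        # An unobserved variable has both of its indicators set to 1;
        # this is how marginalization (and the partition function)
        # falls out of one bottom-up pass.
        if self.var not in evidence:
            return 1.0
        return 1.0 if evidence[self.var] == self.value else 0.0


class Sum:
    """Weighted sum over children (a mixture over sub-networks)."""
    def __init__(self, children, weights):
        self.children, self.weights = children, weights

    def eval(self, evidence):
        return sum(w * c.eval(evidence)
                   for c, w in zip(self.children, self.weights))


class Product:
    """Product over children defined on disjoint sets of variables."""
    def __init__(self, children):
        self.children = children

    def eval(self, evidence):
        out = 1.0
        for c in self.children:
            out *= c.eval(evidence)
        return out


# Toy SPN over two Boolean variables X1, X2
# (complete and consistent by construction).
x1, nx1 = Leaf("X1", 1), Leaf("X1", 0)
x2, nx2 = Leaf("X2", 1), Leaf("X2", 0)
spn = Sum([Product([x1, x2]), Product([nx1, nx2])], [0.7, 0.3])

Z = spn.eval({})                   # partition function: all indicators at 1
p = spn.eval({"X1": 1, "X2": 1})   # unnormalized probability of an assignment
print(p / Z)                       # normalized probability, 0.7 in this example
```

The same upward pass evaluated with partial evidence gives marginals directly, which is the tractability property the abstract refers to: inference cost is linear in the number of edges of the network.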
