Large Margin Boltzmann Machines and Large Margin Sigmoid Belief Networks
Computer Science – Learning
Scientific paper
2010-03-25
Current statistical models for structured prediction make simplifying assumptions about the underlying output graph structure, such as assuming a low-order Markov chain, because exact inference becomes intractable as the tree-width of the underlying graph increases. Approximate inference algorithms, on the other hand, force one to trade off representational power against computational efficiency. In this paper, we propose two new types of probabilistic graphical models for structured prediction: large margin Boltzmann machines (LMBMs) and large margin sigmoid belief networks (LMSBNs). LMSBNs in particular admit a very fast inference algorithm for arbitrary graph structures that runs in polynomial time with high probability. This probability depends on the data distribution and is maximized during learning. The new approach overcomes the representation-efficiency trade-off of previous models and allows fast structured prediction with complicated graph structures. We present results from applying a fully connected model to multi-label scene classification and demonstrate that the proposed approach can yield significant performance gains over current state-of-the-art methods.
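The abstract's central claim is architectural: a directed (sigmoid belief network) model over the output labels admits a fast, ordered decoding pass even when every label depends on every other. As a rough illustration only, the Python sketch below decodes binary labels greedily in a fixed order, conditioning each label on the input features and the labels decoded so far; the function name, weight shapes, and the 0.5 threshold rule are assumptions for this toy example, not the paper's actual LMSBN inference procedure or its large margin training objective.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def greedy_sbn_decode(x, W_in, W_out, order=None):
        # Greedy ordered decoding of binary output labels in a sigmoid-
        # belief-network-style model: visit labels in a fixed order and set
        # each one to the value favoured by the input and the labels already
        # decoded.  Illustrative sketch only.
        #   x     : (d,)   input feature vector
        #   W_in  : (k, d) input-to-label weights
        #   W_out : (k, k) label-to-label weights (lower-triangular, so only
        #                  earlier labels in the order act as parents)
        k = W_in.shape[0]
        order = range(k) if order is None else order
        y = np.zeros(k)
        for i in order:
            a = W_in[i] @ x + W_out[i] @ y   # input term + decoded parent labels
            y[i] = 1.0 if sigmoid(a) >= 0.5 else 0.0
        return y

    # Toy usage: 3 binary labels, 4 input features, random weights.
    rng = np.random.default_rng(0)
    x = rng.normal(size=4)
    W_in = rng.normal(size=(3, 4))
    W_out = np.tril(rng.normal(size=(3, 3)), k=-1)   # acyclic label structure
    print(greedy_sbn_decode(x, W_in, W_out))

In this toy example the lower-triangular label-to-label weights simply enforce a directed acyclic structure over the labels; a fully connected model, as used in the multi-label scene classification experiments, corresponds to every earlier label in the order feeding into every later one.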
Xu Miao
Rajesh P. N. Rao