Computer Science – Learning
Scientific paper
2012-03-15
Appears in Proceedings of the Twenty-Sixth Conference on Uncertainty in Artificial Intelligence (UAI2010)
There is much interest in the Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM) as a natural Bayesian nonparametric extension of the traditional HMM. However, in many settings the HDP-HMM's strict Markovian constraints are undesirable, particularly if we wish to learn or encode non-geometric state durations. We can extend the HDP-HMM to capture such structure by drawing upon explicit-duration semi-Markovianity, which has been developed in the parametric setting to allow construction of highly interpretable models that admit natural prior information on state durations. In this paper we introduce the explicit-duration HDP-HSMM and develop posterior sampling algorithms for efficient inference in both the direct-assignment and weak-limit approximation settings. We demonstrate the utility of the model and our inference methods on synthetic data as well as in experiments on a speaker diarization problem and an example of learning the patterns in Morse code.
Matthew J. Johnson
Alan Willsky
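For readers unfamiliar with the model, the following is a minimal sketch of the explicit-duration HDP-HSMM generative process summarized in the abstract above. The notation (concentration parameters \gamma and \alpha, emission and duration priors H and G, likelihoods f and g) is a standard gloss for this family of models and is not taken from this page.

\begin{align*}
\beta &\sim \mathrm{GEM}(\gamma) && \text{(global state weights)} \\
\pi^{(i)} &\sim \mathrm{DP}(\alpha, \beta), \quad i = 1, 2, \ldots && \text{(per-state transition distributions)} \\
(\theta_i, \omega_i) &\sim H \times G && \text{(emission and duration parameters)} \\
z_s &\sim \bar{\pi}^{(z_{s-1})} && \text{(super-state sequence; self-transitions removed)} \\
D_s &\sim g(\omega_{z_s}) && \text{(explicit, possibly non-geometric duration)} \\
x_{t_s : t_s + D_s - 1} &\overset{\text{iid}}{\sim} f(\theta_{z_s}) && \text{(observations within segment } s\text{)}
\end{align*}

where $t_s = 1 + \sum_{s' < s} D_{s'}$ marks the start of segment $s$. Choosing a parametric duration distribution $g$ (e.g., Poisson or negative binomial) is what allows prior information about state durations to be encoded, in contrast to the geometric durations implied by the HDP-HMM's self-transitions.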