Substructure and Boundary Modeling for Continuous Action Recognition

Computer Science – Computer Vision and Pattern Recognition

Scientific paper


Details

Detailed version of the CVPR 2012 paper. 15 pages, 6 figures


This paper introduces a probabilistic graphical model for continuous action recognition with two novel components: a substructure transition model and a discriminative boundary model. The first component encodes a sparse, global temporal-transition prior between action primitives in a state-space model, handling the large spatio-temporal variations within an action class. The second component enforces action-duration constraints in a discriminative way to locate the transition boundaries between actions more accurately. The two components are integrated into a unified graphical structure that enables effective training and inference. Comprehensive experiments on both public and in-house datasets show that, by incorporating information that previous methods did not model explicitly or efficiently, the proposed algorithm achieves significantly improved performance for continuous action recognition.
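The interplay of a transition prior between action primitives and a duration constraint on state changes can be illustrated with a minimal sketch. This is not the paper's model (which uses a learned, discriminative boundary model within a unified graphical structure); it is a simplified Viterbi decoder over per-frame primitive scores that only permits a state switch once the current state has persisted for a minimum number of frames. All names, parameters, and the greedy duration bookkeeping are illustrative assumptions.

```python
import numpy as np

def viterbi_with_duration(log_emissions, log_trans, min_dur):
    """Illustrative sketch, not the paper's method.

    log_emissions: (T, K) per-frame log-scores for K action primitives.
    log_trans:     (K, K) log transition prior (sparse entries may be -inf).
    min_dur:       minimum number of frames a state must last before a switch.
    Returns the decoded state sequence of length T.
    """
    T, K = log_emissions.shape
    NEG = -1e18
    score = np.full((T, K), NEG)          # best log-score ending at (t, k)
    back = np.zeros((T, K), dtype=int)    # predecessor state at t-1
    dur = np.ones((T, K), dtype=int)      # run length of the current state
    score[0] = log_emissions[0]
    for t in range(1, T):
        for k in range(K):
            stay = score[t - 1, k]        # remain in state k (no transition cost)
            best_sw, best_j = NEG, k
            for j in range(K):
                # a switch j -> k is allowed only after j has lasted min_dur frames
                if j == k or dur[t - 1, j] < min_dur:
                    continue
                s = score[t - 1, j] + log_trans[j, k]
                if s > best_sw:
                    best_sw, best_j = s, j
            if best_sw > stay:
                score[t, k] = best_sw + log_emissions[t, k]
                back[t, k] = best_j
                dur[t, k] = 1             # a switch restarts the duration counter
            else:
                score[t, k] = stay + log_emissions[t, k]
                back[t, k] = k
                dur[t, k] = dur[t - 1, k] + 1
    # backtrack the best path
    path = [int(np.argmax(score[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(back[t, path[-1]])
    return path[::-1]
```

The greedy per-state duration counter is an approximation; an exact treatment would use a hidden semi-Markov formulation, which is closer in spirit to the duration modeling the abstract describes.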


