Video Event Recognition for Surveillance Applications (VERSA)

Computer Science – Computer Vision and Pattern Recognition

Scientific paper


Details

Master's Thesis, University of Nebraska at Omaha, 2008


VERSA provides a general-purpose framework for defining and recognizing events in live or recorded surveillance video streams. VERSA's approach to event recognition uses a declarative logic language to define the spatial and temporal relationships that characterize a given event or activity. Doing so requires defining certain fundamental spatial and temporal relationships, along with a high-level syntax for specifying frame templates and query parameters. Although the handling of uncertainty in the current VERSA implementation is simplistic, the language and architecture are amenable to extension using fuzzy logic or similar approaches. VERSA's high-level architecture is designed to work in XML-based, service-oriented environments: VERSA can be thought of as subscribing to the XML annotations streamed by a lower-level video analytics service that provides basic entity detection, labeling, and tracking. One or many VERSA Event Monitors could thus analyze video streams and provide alerts when certain events are detected.
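The pipeline described in the abstract can be sketched as follows: a lower-level analytics service streams per-frame XML entity annotations, and an event monitor evaluates a spatio-temporal rule against them. This is a minimal illustrative sketch, not VERSA's actual schema or language; all element names, attribute names, and thresholds below are hypothetical.

```python
# Hypothetical sketch of a VERSA-style event monitor consuming XML
# annotations from a lower-level video analytics service. The XML
# schema, landmark, and thresholds are all assumptions for illustration.
import xml.etree.ElementTree as ET

FRAME_XML = """
<frames>
  <frame t="0"><entity id="p1" label="person" x="10" y="5"/></frame>
  <frame t="1"><entity id="p1" label="person" x="3" y="5"/></frame>
  <frame t="2"><entity id="p1" label="person" x="2" y="4"/></frame>
</frames>
"""

DOOR = (0.0, 5.0)   # fixed scene landmark (hypothetical)
NEAR_RADIUS = 4.0   # spatial predicate threshold
MIN_FRAMES = 2      # temporal predicate: condition sustained this long

def near(x, y, point, radius):
    """Spatial predicate: Euclidean distance to a landmark below radius."""
    return ((x - point[0]) ** 2 + (y - point[1]) ** 2) ** 0.5 <= radius

def detect_loitering(xml_text):
    """Alert when a tracked 'person' stays near DOOR for MIN_FRAMES frames."""
    streak = {}   # entity id -> consecutive frames satisfying the predicate
    alerts = []
    for frame in ET.fromstring(xml_text):
        t = int(frame.get("t"))
        for ent in frame.findall("entity"):
            if ent.get("label") != "person":
                continue
            eid = ent.get("id")
            x, y = float(ent.get("x")), float(ent.get("y"))
            streak[eid] = streak.get(eid, 0) + 1 if near(x, y, DOOR, NEAR_RADIUS) else 0
            if streak[eid] == MIN_FRAMES:
                alerts.append((eid, t))   # temporal condition first met at frame t
    return alerts

print(detect_loitering(FRAME_XML))  # → [('p1', 2)]
```

In an actual VERSA deployment, the spatial and temporal predicates would be expressed in the declarative logic language rather than hard-coded, and the annotations would arrive as a live stream rather than a static document.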
