Segmentation, Indexing, and Visualization of Extended Instructional Videos

Computer Science – Information Retrieval

Scientific paper

Details

8 pages, 13 figures

We present a new method for segmenting, and a new user interface for indexing and visualizing, the semantic content of extended instructional videos. Given a series of key frames from the video, we generate a condensed view of the data by clustering frames according to media type and visual similarity. Using various visual filters, each key frame is first assigned a media type (board, class, computer, illustration, podium, or sheet). Key frames of media type board and sheet are then clustered by content via an algorithm with near-linear cost. A novel user interface, the result of two user studies, displays related topics using icons linked topologically, allowing users to quickly locate semantically related portions of the video. We analyze the accuracy of the segmentation tool on 17 instructional videos, each 75 to 150 minutes in duration (40 hours in total); the classification accuracy exceeds 96%.
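The abstract only outlines the pipeline at a high level. The Python sketch below illustrates one plausible shape of it under stated assumptions: the KeyFrame structure, the classify_media_type placeholder, the histogram-intersection similarity, and the 0.8 threshold are all hypothetical and stand in for the authors' visual filters and clustering algorithm, which are not specified here.

from dataclasses import dataclass

MEDIA_TYPES = ("board", "class", "computer", "illustration", "podium", "sheet")

@dataclass
class KeyFrame:
    index: int                 # position in the key-frame sequence
    features: list             # e.g. a coarse colour/intensity histogram
    media_type: str = ""       # filled in by classify_media_type

def classify_media_type(frame):
    """Placeholder for the visual filters that assign one of MEDIA_TYPES."""
    # A real system would apply colour, edge, and layout filters here.
    return "board"

def similarity(a, b):
    """Histogram intersection as a simple content-similarity measure."""
    return sum(min(x, y) for x, y in zip(a, b))

def cluster_by_content(frames, threshold=0.8):
    """Single-pass clustering: each frame is compared only against the
    representative of the most recent cluster, so the cost grows roughly
    linearly with the number of frames."""
    clusters = []
    for frame in frames:
        if clusters and similarity(frame.features, clusters[-1][0].features) >= threshold:
            clusters[-1].append(frame)
        else:
            clusters.append([frame])
    return clusters

def segment(key_frames):
    """Assign media types, then cluster the board/sheet frames by content."""
    for f in key_frames:
        f.media_type = classify_media_type(f)
    content_frames = [f for f in key_frames if f.media_type in ("board", "sheet")]
    return cluster_by_content(content_frames)

In this reading, each returned cluster is a run of visually similar board or sheet frames, which could then be summarized as the topologically linked icons described in the interface.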


