Visual tracking for mobile robot pursuit

Physics

Scientific paper


Details

This research is part of a broader effort to develop a supervisory control system for small robot navigation. Previous research and development focused on a "one-touch, point-and-go" navigation control system using visual homing. In the current research, we have begun to investigate visual tracking methods to extend supervisory control to tasks involving tracking and pursuit of a moving object. Ground-to-ground tracking of arbitrary targets in natural and damaged environments is challenging. Automatic tracking is expected to fail due to line-of-sight obstruction, lighting gradients, rapid changes in perspective and orientation, etc. In supervisory control, the automatic tracker must be able to alert the operator when it is at risk of losing track, or when it may have already lost track, and do so with a low false alarm rate. The focus of the current research is on detecting tracking failure during pursuit. We are developing failure-detection approaches that can integrate different low-level tracking algorithms. In this paper, we demonstrate stereo vision methods for pursuit tracking and examine several indicators of track loss in field experiments with a variety of moving targets in natural environments.
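The abstract does not spell out which track-loss indicators the authors use, but the basic idea of a tracker that raises a low-confidence alarm can be sketched. The Python/OpenCV snippet below is a minimal illustration, not the paper's method: it assumes a simple normalized-cross-correlation tracker, and the confidence threshold is a hypothetical, untuned value.

    import cv2

    def track_with_loss_alarm(frame_gray, template, threshold=0.6):
        # Search for the target template by normalized cross-correlation.
        result = cv2.matchTemplate(frame_gray, template, cv2.TM_CCOEFF_NORMED)
        _, peak_score, _, peak_loc = cv2.minMaxLoc(result)
        # A low peak correlation is one plausible track-loss indicator;
        # the 0.6 threshold is an illustrative assumption, not from the paper.
        lost = peak_score < threshold
        return peak_loc, peak_score, lost

In a supervisory system of the kind described, one would likely combine several such indicators (e.g., score trends over time, stereo depth consistency) before alerting the operator, to keep the false alarm rate low.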
