Recognition based shape from shading for planetary robotics

Statistics – Computation

Scientific paper


Details


Planetary robotics often requires the creation of a digital elevation model (DEM) of the surrounding environment for navigation and the identification of science targets. Shape-from-shading approaches can construct highly detailed DEMs at long range from a single camera image, usually combined with a priori information about the rough surface shape and/or the lighting direction. A new recognition-based method for estimating surface shape from the shading information in an image is developed. Image segments in a training set are labelled with their surface normals; unknown image segments are then matched to these previously recorded segments to recover surface shape. Because the method is recognition based, it can in principle be applied to surfaces with more detailed lighting models than typical shape-from-shading algorithms can handle. A basic dimensionality reduction step reduces computation time and provides some gains in matching accuracy. Current results show a promising ability to recover shape given a similar training surface.
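Read as an algorithm, the abstract suggests a lookup pipeline: gather image segments whose surface normals are known, compress them with a simple dimensionality reduction, and give each new segment the normal of its closest training segment. The sketch below is a minimal illustration of that idea under assumed details (PCA for the reduction, brute-force nearest-neighbour matching, flattened patches standing in for image segments); the function names and parameters are illustrative and do not come from the paper.

```python
# Hypothetical sketch of a recognition-based shape-from-shading step:
# training patches carry known surface normals; query patches inherit
# the normal of their nearest training patch in a PCA-reduced space.
import numpy as np

def fit_pca(patches, n_components=16):
    """Learn a PCA basis from flattened training patches (N x D)."""
    mean = patches.mean(axis=0)
    centered = patches - mean
    # SVD of the centered data gives the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]            # (D,), (k, D)

def project(patches, mean, basis):
    """Reduce patches to low-dimensional PCA coordinates."""
    return (patches - mean) @ basis.T          # (N, k)

def recover_normals(train_patches, train_normals, query_patches, n_components=16):
    """Assign each query patch the normal of its nearest training patch."""
    mean, basis = fit_pca(train_patches, n_components)
    train_codes = project(train_patches, mean, basis)
    query_codes = project(query_patches, mean, basis)
    # Brute-force nearest neighbour in the reduced space.
    d2 = ((query_codes[:, None, :] - train_codes[None, :, :]) ** 2).sum(-1)
    nearest = d2.argmin(axis=1)
    return train_normals[nearest]              # (M, 3) unit normals

# Toy usage with random data standing in for shaded image segments.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    train_patches = rng.random((500, 64))      # 500 flattened 8x8 segments
    train_normals = rng.normal(size=(500, 3))
    train_normals /= np.linalg.norm(train_normals, axis=1, keepdims=True)
    query_patches = rng.random((20, 64))
    print(recover_normals(train_patches, train_normals, query_patches).shape)
```

In practice the recovered per-segment normals would still need to be integrated into a DEM, and the training surface would need lighting and texture similar to the target scene, as the abstract's closing sentence notes.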


