The relationship between shape under similarity transformations and shape under affine transformations

Statistics – Applications

Scientific paper

Recent progress in shape theory, including the development of object/image equations for shape matching and shape space metrics (especially object/image metrics), is now being exploited to develop new algorithms for target recognition. This theory uses advanced techniques from algebraic and differential geometry to construct generalized shape spaces for various projection and sensor models, and then uses that construction to find natural metrics that express the distance (difference) between two configurations of object features, two configurations of image features, or an object and an image pair. Such metrics produce the most robust tests for target identification, at least as far as target geometry is concerned. Moreover, they provide the basis for efficient hashing schemes to perform target identification quickly, as well as a rigorous foundation for error analysis in automatic target recognition (ATR).
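The abstract does not spell out the metrics themselves, so the following Python sketch only illustrates the two standard notions of shape that the title contrasts: a Procrustes-type distance for shape under similarity transformations, and a principal-angle (Grassmannian) distance for shape under affine transformations, where the affine shape of a point configuration is identified with the column space of its centred coordinate matrix. The function names and the example data are ours; this is not the paper's object/image construction.

# Minimal sketch (assumptions noted above) comparing similarity shape
# and affine shape for planar point configurations.
import numpy as np

def similarity_shape_distance(X, Y):
    """Procrustes-style distance: shape modulo translation, scale, rotation."""
    # centre and scale each configuration to unit Frobenius norm (its pre-shape)
    A = X - X.mean(axis=0)
    A = A / np.linalg.norm(A)
    B = Y - Y.mean(axis=0)
    B = B / np.linalg.norm(B)
    # orthogonal Procrustes: best proper rotation aligning B with A
    U, s, Vt = np.linalg.svd(A.T @ B)
    sign = np.sign(np.linalg.det(U @ Vt))         # restrict to rotations (no reflection)
    c = np.clip(s[:-1].sum() + sign * s[-1], -1.0, 1.0)
    return np.arccos(c)                            # Riemannian distance on shape space

def affine_shape_distance(X, Y):
    """Distance between affine shapes, i.e. between the column spaces of the
    centred coordinate matrices (points on a Grassmannian)."""
    A = X - X.mean(axis=0)
    B = Y - Y.mean(axis=0)
    Qa, _ = np.linalg.qr(A)                        # orthonormal basis of col(A)
    Qb, _ = np.linalg.qr(B)
    cos_angles = np.clip(np.linalg.svd(Qa.T @ Qb, compute_uv=False), -1.0, 1.0)
    return np.linalg.norm(np.arccos(cos_angles))   # principal-angle (geodesic) distance

if __name__ == "__main__":
    square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
    shear = np.array([[1.0, 0.5], [0.0, 1.0]])
    sheared = square @ shear                       # an affine, non-similarity deformation
    print(similarity_shape_distance(square, sheared))  # > 0: similarity shape changed
    print(affine_shape_distance(square, sheared))       # ~ 0: affine shape unchanged

The sheared square has the same affine shape as the original but a different similarity shape, which is exactly the distinction between the two shape spaces named in the title.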
