Classification of Log-Polar-Visual Eigenfaces using Multilayer Perceptron

Computer Science – Computer Vision and Pattern Recognition

Scientific paper

In this paper we present a simple yet novel approach to tackling the challenges of scaling and rotation of face images in face recognition. The proposed approach registers the training and testing visual face images with a log-polar transformation, which handles the complications introduced by scaling and rotation. The log-polar images are projected into an eigenspace and finally classified using an improved multilayer perceptron. In our experiments we used the ORL face database and the Object Tracking and Classification Beyond Visible Spectrum (OTCBVS) benchmark database for visual face images. Experimental results show that the proposed approach significantly improves recognition performance when moving from visual to log-polar-visual face images: on the ORL face database the recognition rate rises from 89.5% for visual face images to 97.5% for log-polar-visual face images, while on the OTCBVS database it rises from 87.84% to 96.36%.
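The abstract describes a three-stage pipeline: log-polar registration of the face images, projection onto eigenfaces, and classification with a multilayer perceptron. Below is a minimal sketch of that pipeline, assuming OpenCV's warpPolar for the log-polar transform and scikit-learn's PCA and MLPClassifier as stand-ins. The paper's "improved" multilayer perceptron and its parameter choices are not given in the abstract, so the output size, number of eigenfaces, and hidden-layer width here are illustrative assumptions, not the authors' implementation.

    # Sketch of the log-polar eigenface + MLP pipeline described in the abstract.
    # Assumptions: face images are 2-D grayscale NumPy arrays; parameter values
    # (64x64 log-polar grid, 40 eigenfaces, 64 hidden units) are illustrative only.
    import cv2
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier

    def to_log_polar(img, size=(64, 64)):
        """Register a grayscale face image with a log-polar transform,
        which maps rotation and scaling of the input into translations."""
        h, w = img.shape
        center = (w / 2.0, h / 2.0)
        max_radius = min(center)
        lp = cv2.warpPolar(img, size, center, max_radius,
                           cv2.INTER_LINEAR | cv2.WARP_POLAR_LOG)
        return lp.flatten().astype(np.float32)

    def train_and_evaluate(train_imgs, y_train, test_imgs, y_test,
                           n_eigenfaces=40):
        # Stage 1: log-polar registration of training and testing images.
        X_train = np.stack([to_log_polar(im) for im in train_imgs])
        X_test = np.stack([to_log_polar(im) for im in test_imgs])

        # Stage 2: project log-polar images into eigenspace (eigenfaces via PCA).
        pca = PCA(n_components=n_eigenfaces)
        Z_train = pca.fit_transform(X_train)
        Z_test = pca.transform(X_test)

        # Stage 3: classify eigenface coefficients with a multilayer perceptron
        # (a plain scikit-learn MLP stands in for the paper's improved MLP).
        mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000)
        mlp.fit(Z_train, y_train)
        return mlp.score(Z_test, y_test)

The design rationale for the first stage is that, under a log-polar mapping centred on the face, rotation about the centre becomes a circular shift along the angular axis and scaling becomes a shift along the radial axis, which is what makes the registered representation tolerant to the two distortions the paper targets.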
