Automatic local Gabor Features extraction for face recognition

Computer Science – Computer Vision and Pattern Recognition

Scientific paper

Details

7 pages, International Journal of Computer Science and Information Security, IJCSIS, Impact Factor 0.423

We present in this paper a biometric system for face detection and recognition in color images. The face detection technique is based on skin-color information and fuzzy classification. A new algorithm is proposed to automatically detect facial features (eyes, mouth and nose) and extract their corresponding geometric points. These fiducial points are described by sets of Gabor wavelet components, which are used for recognition. Face recognition is performed with neural networks, whose performance we study for different inputs. We compare the two types of features used for recognition, geometric distances and Gabor coefficients, which can be used either independently or jointly. This comparison shows that Gabor coefficients are more discriminative than geometric distances. Experimental results show that the high recognition rate makes our system an effective tool for automatic face detection and recognition.
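The abstract describes fiducial points (eyes, mouth, nose) encoded as sets of Gabor wavelet coefficients. As a minimal sketch of that idea, the snippet below builds a small Gabor filter bank with NumPy and computes a "jet" of responses on the local patch around one fiducial point. The scales, orientations, and patch size are illustrative assumptions; the paper does not specify its filter-bank parameters.

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lam, psi=0.0, gamma=0.5):
    """Real part of a 2-D Gabor filter: an oriented Gaussian-windowed cosine.
    size: kernel side length (odd); theta: orientation; lam: wavelength."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates to the filter orientation theta
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / lam + psi)
    return envelope * carrier

def gabor_jet(image, point, scales=(4.0, 8.0), n_orientations=4, size=21):
    """Gabor coefficients ("jet") at one fiducial point: one response per
    (scale, orientation) pair, computed on the patch centered at the point.
    Parameter values here are assumptions for illustration only."""
    r, c = point
    half = size // 2
    patch = image[r - half:r + half + 1, c - half:c + half + 1]
    jet = []
    for lam in scales:
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            kern = gabor_kernel(size, sigma=lam / 2, theta=theta, lam=lam)
            jet.append(float(np.sum(patch * kern)))
    return np.array(jet)

# Toy grayscale image with a bright blob at a hypothetical fiducial point
img = np.zeros((64, 64))
img[30:34, 30:34] = 1.0
jet = gabor_jet(img, point=(32, 32))
print(jet.shape)  # (8,) -> 2 scales x 4 orientations
```

In a full system, one such jet per fiducial point would be concatenated into the feature vector fed to the neural-network classifier, either alone or alongside the geometric distances between the points.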
