Automatic facial feature extraction and expression recognition based on neural network

Computer Science – Computer Vision and Pattern Recognition

Scientific paper

Details

6 pages, pp. 113-118, (IJACSA) International Journal of Advanced Computer Science and Applications, Vol. 2, No. 1, January 2011

This paper presents an approach to automatic facial feature extraction from a still, frontal, posed image and to the classification and recognition of facial expression, and hence of a person's emotion and mood. A feed-forward back-propagation neural network is used as a classifier to assign the supplied face to one of seven basic expression categories: surprise, neutral, sad, disgust, fear, happy, and angry. Morphological image processing operations are used to segment and localize the face region. Permanent facial features such as the eyebrows, eyes, mouth, and nose are extracted using the SUSAN edge detection operator, facial geometry, and edge projection analysis. Experiments carried out on the JAFFE facial expression database give 100% accuracy on the training set and 95.26% accuracy on the test set.
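The classifier stage described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: a one-hidden-layer feed-forward network trained with back-propagation to map a feature vector to one of the seven expression categories. The layer sizes, learning rate, and the synthetic stand-in data are all assumptions made for the demo.

```python
import numpy as np

# Hypothetical sketch of the classifier stage only: a one-hidden-layer
# feed-forward network trained with back-propagation to map a geometric
# feature vector to one of the seven expression categories from the paper.
# Layer sizes, learning rate, and the synthetic data are illustrative.
CLASSES = ["surprise", "neutral", "sad", "disgust", "fear", "happy", "angry"]
rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # numerically stable softmax
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class FeedForwardNet:
    def __init__(self, n_in, n_hidden, n_out=len(CLASSES), lr=1.0):
        self.W1 = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 1.0 / np.sqrt(n_hidden), (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)      # hidden activations
        return softmax(self.h @ self.W2 + self.b2)   # class probabilities

    def train_step(self, X, Y):
        p = self.forward(X)
        d2 = (p - Y) / len(X)                        # output-layer error
        d1 = (d2 @ self.W2.T) * (1.0 - self.h ** 2)  # back-propagated error
        self.W2 -= self.lr * self.h.T @ d2
        self.b2 -= self.lr * d2.sum(axis=0)
        self.W1 -= self.lr * X.T @ d1
        self.b1 -= self.lr * d1.sum(axis=0)

    def predict(self, X):
        return self.forward(X).argmax(axis=1)

# Synthetic stand-in for extracted facial-geometry features:
# one Gaussian cluster of 10-dimensional vectors per expression class.
n_feat, per_class = 10, 20
centers = rng.normal(0.0, 1.0, (len(CLASSES), n_feat))
X = np.vstack([c + rng.normal(0.0, 0.2, (per_class, n_feat)) for c in centers])
y = np.repeat(np.arange(len(CLASSES)), per_class)
Y = np.eye(len(CLASSES))[y]

net = FeedForwardNet(n_feat, 24)
for _ in range(1000):
    net.train_step(X, Y)
acc = (net.predict(X) == y).mean()
print(f"training accuracy: {acc:.2%}")
```

This toy demo only verifies the mechanics of the feed-forward/back-propagation loop on separable synthetic clusters; the paper's reported figures (100% training, 95.26% test) come from real JAFFE images and the feature-extraction pipeline described above.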
