Fusion of Daubechies Wavelet Coefficients for Human Face Recognition

Computer Science – Computer Vision and Pattern Recognition

Scientific paper

In this paper, fusion of visual and thermal images in the wavelet-transformed domain is presented. Daubechies wavelet (D2) coefficients computed from visual images are combined with the corresponding coefficients computed in the same manner from thermal images to obtain fused coefficients. Fusion is performed after decomposition up to the fifth level (Level 5), and the inverse Daubechies wavelet transform of the fused coefficients yields the fused face images. The main advantage of using the wavelet transform is that it is well suited to handling different image resolutions and allows an image to be decomposed into different kinds of coefficients while preserving the image information. The fused images are then passed through Principal Component Analysis (PCA) for dimensionality reduction, and the reduced representations are classified using a multi-layer perceptron. The IRIS Thermal/Visual Face Database was used for the experiments. Experimental results show that the presented approach achieves a maximum success rate of 100% in many cases.
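
For illustration, the sketch below outlines the fusion pipeline summarized in the abstract, under several assumptions not stated in the original: PyWavelets, NumPy, and scikit-learn are available; the D2 wavelet is mapped to PyWavelets' 'db2' (depending on naming convention, D2 may instead correspond to 'db1', i.e. the Haar wavelet); simple averaging is used as the fusion rule, which the abstract does not specify; and the visual/thermal image pairs are pre-registered grayscale arrays of equal size. The function names and parameters (fuse_pair, train_recognizer, n_components, hidden_layer_sizes) are hypothetical and do not reproduce the authors' implementation.

    # Minimal sketch of wavelet-domain visual/thermal fusion, PCA reduction,
    # and MLP classification. Assumptions: 'db2' stands in for the paper's D2
    # wavelet, and coefficient averaging stands in for the unspecified fusion rule.
    import numpy as np
    import pywt
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier

    def fuse_pair(visual, thermal, wavelet="db2", level=5):
        """Fuse one registered visual/thermal image pair in the wavelet domain."""
        # Decompose both images up to the fifth level.
        cv = pywt.wavedec2(visual, wavelet, level=level)
        ct = pywt.wavedec2(thermal, wavelet, level=level)
        # Combine corresponding coefficients (averaging rule assumed here).
        fused = [(cv[0] + ct[0]) / 2.0]
        for (vh, vv, vd), (th, tv, td) in zip(cv[1:], ct[1:]):
            fused.append(((vh + th) / 2.0, (vv + tv) / 2.0, (vd + td) / 2.0))
        # Inverse transform of the fused coefficients gives the fused face image.
        return pywt.waverec2(fused, wavelet)

    def train_recognizer(visual_imgs, thermal_imgs, labels, n_components=50):
        """Fuse all pairs, reduce dimensionality with PCA, train an MLP classifier."""
        fused = np.stack([fuse_pair(v, t).ravel()
                          for v, t in zip(visual_imgs, thermal_imgs)])
        pca = PCA(n_components=n_components).fit(fused)
        mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000)
        mlp.fit(pca.transform(fused), labels)
        return pca, mlp

At test time, a probe pair would be fused with fuse_pair, projected with pca.transform, and labeled with mlp.predict; the number of PCA components and the MLP architecture shown here are placeholder choices, not values reported by the paper.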


