Human emotion detector based on genetic algorithm using lip features
27 April 2010
Abstract
We predicted human emotion using a Genetic Algorithm (GA) based lip-feature extractor that classifies facial images into the seven universal emotions: fear, happiness, dislike, surprise, anger, sadness, and neutrality. First, we isolated the mouth from the input images using standard preprocessing steps: Region of Interest (ROI) acquisition, grayscaling, histogram equalization, filtering, and edge detection. Next, the GA searched for the optimal or near-optimal parameters of two ellipses that circumscribe the mouth and separate it into upper and lower lips. The ellipse parameters were scored with a fitness function and trained on a database of Japanese women's faces expressing all seven emotions. Finally, the proposed algorithm was tested on a published database containing emotions from several persons, and the results were summarized in confusion matrices. Accuracy varied from 20% to 60% across the seven emotions. The errors were due mainly to misclassification and to the varied ways a given emotion is expressed in the test database; detailed analysis of these errors points to the limitation of detecting emotion from lip features alone. Similar work [1] in the literature detected emotion for only one person; we have successfully extended our GA-based solution to several subjects.
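The GA-based ellipse search described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the algebraic-distance fitness, the operator choices (truncation selection, uniform crossover, Gaussian mutation, elitism), and the `[cx, cy, a, b]` encoding are assumptions made for the example; the paper's actual fitness calculation is not reproduced here.

```python
import numpy as np

def ellipse_error(params, pts):
    """Mean algebraic deviation of edge points from the ellipse
    (x-cx)^2/a^2 + (y-cy)^2/b^2 = 1. A stand-in fitness function;
    the paper's own fitness calculation is not reproduced here."""
    cx, cy, a, b = params
    x, y = pts[:, 0], pts[:, 1]
    return np.mean(np.abs(((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 - 1.0))

def ga_fit_ellipse(pts, rng, pop_size=40, generations=60):
    """Evolve ellipse parameters [cx, cy, a, b] to fit lip-edge points."""
    lo, hi = pts.min(axis=0), pts.max(axis=0)
    # Random initial population, bounded by the edge points' extent.
    pop = np.column_stack([
        rng.uniform(lo[0], hi[0], pop_size),             # centre x
        rng.uniform(lo[1], hi[1], pop_size),             # centre y
        rng.uniform(1.0, hi[0] - lo[0] + 1, pop_size),   # semi-axis a
        rng.uniform(1.0, hi[1] - lo[1] + 1, pop_size),   # semi-axis b
    ])
    for _ in range(generations):
        fitness = np.array([ellipse_error(p, pts) for p in pop])
        order = np.argsort(fitness)
        best = pop[order[0]].copy()
        parents = pop[order[:pop_size // 2]]             # truncation selection
        moms = parents[rng.integers(0, len(parents), pop_size)]
        dads = parents[rng.integers(0, len(parents), pop_size)]
        mask = rng.random((pop_size, 4)) < 0.5           # uniform crossover
        pop = np.where(mask, moms, dads)
        pop += rng.normal(0.0, 0.5, pop.shape)           # Gaussian mutation
        pop[:, 2:] = np.maximum(pop[:, 2:], 1.0)         # keep axes positive
        pop[0] = best                                    # elitism
    fitness = np.array([ellipse_error(p, pts) for p in pop])
    return pop[fitness.argmin()], fitness.min()
```

In the full pipeline, one such search would run on the upper-lip edge points and another on the lower-lip points, with the resulting ellipse parameters serving as the lip features passed to the emotion classifier.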
© (2010) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Terrence Brown, Gholamreza Fetanat, Abdollah Homaifar, Brian Tsou, Olga Mendoza-Schrock, "Human emotion detector based on genetic algorithm using lip features", Proc. SPIE 7704, Evolutionary and Bio-Inspired Computation: Theory and Applications IV, 77040I (27 April 2010); doi: 10.1117/12.851190; https://doi.org/10.1117/12.851190
PROCEEDINGS
8 PAGES

