Emotion Classification using Adaptive SVMs

Human-computer interaction has received growing attention in recent years. This interaction becomes more powerful when computers can perceive and respond to human nonverbal communication such as emotions. In this study, we present an image-based approach to emotion classification through lower facial expressions. We employ a set of feature points in the lower face image, defined by the particular face model used, and consider their motion across each emotive expression. The vector of displacements of all feature points is input to an Adaptive Support Vector Machine (A-SVM) classifier, which assigns it to one of seven basic emotions: neutral, angry, disgust, fear, happy, sad, and surprise. The system was tested on the Japanese Female Facial Expression (JAFFE) dataset of frontal-view facial expressions [7]. Our experiments on emotion classification through lower facial expressions demonstrate the robustness of the Adaptive SVM classifier and verify the high efficiency of our approach.
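A minimal sketch of the pipeline described above: flatten the per-point displacements between a neutral and an expressive face into one feature vector and classify it into one of the seven emotions. The paper does not give implementation details, so this sketch assumes a standard scikit-learn SVC as a stand-in for the Adaptive SVM, synthetic feature-point coordinates, and an illustrative count of 12 lower-face feature points.

import numpy as np
from sklearn.svm import SVC

EMOTIONS = ["neutral", "angry", "disgust", "fear", "happy", "sad", "surprise"]
N_POINTS = 12  # hypothetical number of lower-face feature points

def displacement_vector(neutral_pts, expressive_pts):
    """Flatten per-point (dx, dy) displacements into a single feature vector."""
    return (expressive_pts - neutral_pts).reshape(-1)

# Synthetic training data: 10 samples per emotion class (illustration only).
rng = np.random.default_rng(0)
X, y = [], []
for label in range(len(EMOTIONS)):
    for _ in range(10):
        neutral = rng.normal(size=(N_POINTS, 2))
        expressive = neutral + 0.1 * label + rng.normal(scale=0.05, size=(N_POINTS, 2))
        X.append(displacement_vector(neutral, expressive))
        y.append(label)
X, y = np.array(X), np.array(y)

# RBF-kernel SVM as a stand-in for the A-SVM classifier.
clf = SVC(kernel="rbf", gamma="scale").fit(X, y)

# Classify one new expression from its displacement vector.
neutral = rng.normal(size=(N_POINTS, 2))
expressive = neutral + rng.normal(scale=0.05, size=(N_POINTS, 2))
pred = clf.predict([displacement_vector(neutral, expressive)])[0]
print("Predicted emotion:", EMOTIONS[pred])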

Authors:



References:
[1] J. F. Cohn, A. J. Zlochower, J. J. Lien, and T. Kanade. 1999. Automated face analysis by feature-point tracking has high concurrent validity with manual FACS coding. Psychophysiology, 36: 35-43.
[2] M. N. Dailey, G. W. Cottrell, and R. Adolphs. 2000. A six-unit network is all you need to discover happiness. In Proceedings of the Twenty-Second Annual Conference of the Cognitive Science Society, Mahwah, NJ, USA.
[3] P. Ekman. 1982. Emotion in the Human Face. Cambridge University Press, Cambridge, UK.
[4] P. Ekman and W. Friesen. 1978. Facial Action Coding System (FACS): Manual. Consulting Psychologists Press, Palo Alto, CA, USA.
[5] I. Essa and A. Pentland. 1997. Coding, Analysis, Interpretation and Recognition of Facial Expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(7): 757-763.
[6] G. Littlewort, I. Fasel, M. Stewart Bartlett, and J. R. Movellan. 2002. Fully Automatic Coding of Basic Expressions from Video. Technical Report 2002.03, UCSD INC MPLab.
[7] M. J. Lyons, S. Akamatsu, M. Kamachi, and J. Gyoba. 1998. Coding Facial Expressions with Gabor Wavelets. In Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition, April 14-16, Nara, Japan. IEEE Computer Society, pp. 200-205.
[8] M. J. Lyons, J. Budynek, and S. Akamatsu. 1999. Automatic Classification of Single Facial Images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 21(12): 1357-1362.
[9] K. Matsuno, C.-W. Lee, S. Kimura, and S. Tsuji. 1995. Automatic Recognition of Human Facial Expressions. In Proceedings of the Fifth International Conference on Computer Vision (ICCV-95).
[10] P. Michel and R. El Kaliouby. 2003. Real Time Facial Expression Recognition in Video using Support Vector Machines. In Proceedings of ICMI-03, November 5-7, Vancouver, British Columbia, Canada.
[11] M. Pantic and L. Rothkrantz. 2000. Automatic Analysis of Facial Expressions: The State of the Art. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(12), December.
[12] V. Vapnik. 1995. The Nature of Statistical Learning Theory. Springer, New York.
[13] P. Viola and M. Jones. 2001. Robust Real-Time Object Detection. Technical Report 2001/01, Compaq Cambridge Research Lab.
[14] P. Visutsak. 2005. Emotion Recognition through Lower Facial Expressions Using Support Vector Machines. In Proceedings of the Fifth National Symposium on Graduate Research, October 10-11, Kasetsart University, Bangkok, Thailand.