Development System for Emotion Detection Based on Brain Signals and Facial Images

Detection of human emotions has many potential applications. One such application is to quantify audience attentiveness in order to evaluate the acoustic quality of a concert hall. Conventionally, subjective audio preference reported by the audience is used. To obtain a fairer evaluation of acoustic quality, this research proposes a system for multimodal emotion detection: one modality is based on brain signals measured by electroencephalography (EEG), and the second is sequences of facial images. In the experiment, a customized audio signal consisting of normal and disordered sounds was prepared and played in order to stimulate positive or negative emotional feedback from the volunteers. EEG signals from the temporal lobes (electrodes T3 and T4) were used to measure the brain response, and sequences of facial images were used to monitor facial expression while each volunteer listened to the audio signal. From the EEG signal, features were extracted from changes in the brain waves, particularly in the alpha and beta bands. Facial-expression features were extracted by motion analysis of the image sequence: an advanced optical flow method detects the most active facial muscles as the face moves from a neutral to an emotional expression, represented as vector flow maps. To simplify detection of the emotional state, the vector flow maps are transformed into a compass mapping that represents the major directions and velocities of facial movement. The results show that beta-band power increases when the disordered-sound stimulus is given, although each volunteer gave different emotional feedback. Based on the features derived from the facial images, the optical-flow compass mapping is promising as additional information for deciding on the emotion feedback.
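As a sketch of the EEG feature step described above, alpha- and beta-band power can be estimated from a simple periodogram. The function below is only an illustration: the T3/T4 electrode placement comes from the paper, but the sampling rate, exact band edges, and the synthetic test signal are assumptions.

```python
import numpy as np

def band_power(signal, fs, band):
    """Mean power of a 1-D `signal` within [band[0], band[1]] Hz,
    estimated from a simple periodogram."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Synthetic 2-second trace at 256 Hz: a 20 Hz (beta-band) tone plus noise.
fs = 256
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(t.size)

alpha = band_power(eeg, fs, (8, 13))   # alpha band, 8-13 Hz
beta = band_power(eeg, fs, (13, 30))   # beta band, 13-30 Hz
```

An increase in `beta` relative to `alpha` during the disordered-sound segments would mirror the trend reported in the results.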




