OPEN_EmoRec_II: A Multimodal Corpus of Human-Computer Interaction

OPEN_EmoRec_II is an open multimodal corpus with experimentally induced emotions. In the first half of the experiment, emotions were induced with standardized picture material; in the second half, they were induced during a human-computer interaction (HCI) realized with a Wizard-of-Oz design. The induced emotions are based on the dimensional theory of emotion (valence, arousal and dominance). These emotional sequences, recorded as multimodal data (facial reactions, speech, audio and physiological signals) in a naturalistic HCI environment, can be used to improve classification methods on a multimodal level. This database is the result of an HCI experiment in which a total of 30 subjects agreed to the publication of their data, including the video material, for research purposes*. The now available open corpus contains the following sensor signals: video, audio, physiology (SCL, respiration, BVP, EMG corrugator supercilii, EMG zygomaticus major) and annotations of facial reactions.
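As an illustration of how the listed channels could be organized for analysis, the following is a minimal Python sketch. The directory layout, file names and CSV column name used here are assumptions for illustration only and are not part of the published corpus specification.

```python
import csv
from dataclasses import dataclass, field
from pathlib import Path

# Physiological channels named in the corpus description: SCL, respiration, BVP,
# EMG corrugator supercilii, EMG zygomaticus major.
PHYSIO_CHANNELS = ["scl", "respiration", "bvp", "emg_corrugator", "emg_zygomaticus"]


@dataclass
class SubjectRecording:
    """One subject's multimodal recording (paths and file names are hypothetical)."""
    subject_id: int
    video_path: Path
    audio_path: Path
    physio: dict[str, list[float]] = field(default_factory=dict)
    facial_annotations: list[tuple[float, str]] = field(default_factory=list)


def load_subject(root: Path, subject_id: int) -> SubjectRecording:
    """Load one subject, assuming a per-subject folder with one CSV per physiological channel."""
    subject_dir = root / f"subject_{subject_id:02d}"   # assumed folder naming
    rec = SubjectRecording(
        subject_id=subject_id,
        video_path=subject_dir / "video.avi",          # assumed file name
        audio_path=subject_dir / "audio.wav",          # assumed file name
    )
    # Assumed layout: each channel stored as a CSV with a single 'value' column.
    for channel in PHYSIO_CHANNELS:
        csv_path = subject_dir / f"{channel}.csv"
        if csv_path.exists():
            with csv_path.open(newline="") as f:
                rec.physio[channel] = [float(row["value"]) for row in csv.DictReader(f)]
    return rec
```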



