Non-Invasive Technology on a Classroom Chair for Detection of Emotions Used for the Personalization of Learning Resources

Emotions are related to learning processes, and physiological signals can be used to detect them in order to personalize learning resources and control the pace of instruction. A model of relevant emotions has been developed in which specific combinations of emotions and cognitive processes are connected and integrated with the concept of 'flow' in order to improve learning. The cardiac pulse is a reliable signal that carries useful information about the subject's emotional condition. It is detected using a classroom chair adapted with a non-invasive EMFi sensor and an acquisition system that generates a ballistocardiogram (BCG); the signal is processed by an algorithm to obtain characteristics that match a specific emotional condition. The complete chair system is presented in this work, along with a framework for the personalization of learning resources.
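
As a rough illustration of the kind of processing the abstract describes, the sketch below extracts beat-to-beat intervals from a digitized BCG trace with band-pass filtering and peak detection; the sampling rate, filter band and peak settings are assumptions for the sketch, not the authors' actual algorithm.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def heart_rate_from_bcg(bcg, fs=500.0):
    """Estimate beat-to-beat intervals from a ballistocardiogram (BCG).

    bcg : 1-D array of raw chair-sensor samples (hypothetical input)
    fs  : sampling rate in Hz (assumed, not given in the abstract)
    """
    # Band-pass filter around typical BCG frequency content (~1-10 Hz).
    b, a = butter(4, [1.0 / (fs / 2), 10.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, bcg)

    # Detect the dominant peaks, at least 0.4 s apart (i.e. below 150 bpm).
    peaks, _ = find_peaks(filtered, distance=int(0.4 * fs),
                          prominence=filtered.std())

    # Inter-beat intervals in seconds and mean heart rate in bpm.
    ibi = np.diff(peaks) / fs
    bpm = 60.0 / ibi.mean() if len(ibi) else float("nan")
    return ibi, bpm

# Example with a synthetic pulse train at ~1.2 Hz (about 72 bpm) plus noise.
fs = 500.0
t = np.arange(0, 30, 1 / fs)
pulse = np.zeros_like(t)
pulse[::int(fs / 1.2)] = 1.0
synthetic = np.convolve(pulse, np.hanning(50), mode="same")
synthetic += 0.05 * np.random.randn(t.size)
print(heart_rate_from_bcg(synthetic, fs)[1])
```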

System Module for Student Idol

The Malaysian government has been trying hard to find the most efficient methods of learning. However, it is difficult to assess and evaluate students and then identify who is truly excellent, because in reality a student labelled as excellent often excels only academically. Such an evaluation is unbalanced, since an excellent student should excel across all areas of involvement, in both curriculum and co-curriculum. To overcome this, we designed a module for Student Idol that evaluates students through three categories: academic, co-curriculum and leadership. Each category has its own merit points. Using this method, students are evaluated more accurately than before, and teachers can easily evaluate their students without emotional factors, personal relationships or other biases. In conclusion, this system module helps make student evaluation in Student Idol more accurate and valid.
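
A minimal sketch of how the three categories and their merit points might be combined into a single Student Idol score; the category weights and point scale are invented for illustration, since the abstract does not specify the actual merit scheme.

```python
# Hypothetical merit weighting; the real module's point scheme is not given
# in the abstract, so these categories and weights are illustrative only.
WEIGHTS = {"academic": 0.5, "co_curriculum": 0.3, "leadership": 0.2}

def student_idol_score(merits: dict) -> float:
    """Combine per-category merit points (0-100 each) into one weighted score."""
    return sum(WEIGHTS[c] * merits.get(c, 0.0) for c in WEIGHTS)

print(student_idol_score({"academic": 88, "co_curriculum": 75, "leadership": 90}))
```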

Interactive Agents with Artificial Mind

This paper discusses an artificial mind model and its applications. The mind model is based on theories which assert that emotion is an important function in human decision making. An artificial mind model with emotion is built, and the model is applied to action selection in autonomous agents. In three examples, the agents interact with humans and their environments. The examples show that the proposed model works effectively in both virtual agents and real robots.

On the Quality of Internet Users' Behavioral Patterns in Using Different Sites and Its Impact on Taboos of Marriage: A Survey among Undergraduate Students in Mashhad City in Iran

Given the multimedia nature of the internet and the facilities it can provide to users, the purpose of this paper is to investigate users' behavioral patterns and the impact of the internet on taboos of marriage. For this purpose, a survey of 403 students from governmental guidance schools in the city of Mashhad, Iran, was conducted. The results showed that the way various internet environments are used depends on the degree of the users' familiarity with these sites. In order to clarify the effects of the internet on the taboos of marriage, non-internet parameters were also controlled. A t-test between internet users and non-users indicated that internet users hold weaker taboos of marriage. After controlling for the effects of non-internet parameters, the results indicate that addiction to the internet, the creation of a cordial atmosphere, emotional communication, and message attractiveness have significant effects on the family's traditional values.

Predicting Individual Investors' Intention to Invest: An Experimental Analysis of Attitude as a Mediator

The survival of publicly listed companies largely depends on their stocks being liquidly traded. This goal can be achieved when new investors are attracted to invest in companies' stocks. Among different groups of investors, individual investors are generally less able to objectively evaluate companies' risks and returns, and tend to be emotionally biased in their investing decisions. Their decisions may therefore be formed as a result of perceived risks and returns, and influenced by companies' images. This study finds that perceived risk, perceived returns and trust directly affect individual investors' trading decisions, while attitude towards the brand partially mediates these relationships. This finding suggests that, in courting individual investors, companies still need to perform financially, while building a good image can result in their stocks being accepted more quickly than the stocks of well-performing companies with less visible images.

Emotion Recognition Using Neural Network: A Comparative Study

Emotion recognition is an important research field with many applications nowadays. This work focuses on recognizing different emotions from the speech signal. The extracted features are related to statistics of pitch, formants and energy contours, as well as spectral, perceptual and temporal features, jitter, and shimmer. An artificial neural network (ANN) was chosen as the classifier. Our concern is finding a robust and fast ANN classifier suitable for different real-life applications. Several experiments were carried out on different ANNs to investigate the factors that impact the classification success rate. Using a database containing 7 different emotions, it is shown that with proper and careful adjustment of the feature format, training data sorting, number of selected features, and even the ANN type and architecture used, a success rate of 85% or more can be achieved without increasing the system complexity and the computation time.
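
A minimal sketch of the classification stage described above, using scikit-learn's MLPClassifier as a stand-in for the ANN; the feature count, network architecture and random placeholder data are assumptions, not the configuration reported by the authors.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: one row per utterance; columns would hold pitch/formant/energy statistics,
# spectral and temporal descriptors, jitter and shimmer (here random stand-ins).
rng = np.random.default_rng(0)
X = rng.normal(size=(700, 40))          # 700 utterances, 40 features (assumed)
y = rng.integers(0, 7, size=700)        # 7 emotion classes, as in the abstract

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Feature scaling plus a small feed-forward ANN; the layer sizes are a guess,
# not the architecture reported by the authors.
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(64, 32),
                                  max_iter=500, random_state=0))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```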

Predicting the Three Major Dimensions of the Learner's Emotions from Brainwaves

This paper investigates how machine learning techniques can be used to significantly predict the three major dimensions of learners' emotions (pleasure, arousal and dominance) from brainwaves. The study adopted an experiment in which participants were exposed to a set of pictures from the International Affective Picture System (IAPS) while their electrical brain activity was recorded with an electroencephalogram (EEG). The pictures had already been rated in a previous study via the Self-Assessment Manikin (SAM) affective rating system to assess the three dimensions of pleasure, arousal, and dominance. For each picture, we took the mean of these values over all subjects used in that previous study and associated them with the recorded brainwaves of the participants in our study. Correlation and regression analyses confirmed the hypothesis that brainwave measures can significantly predict emotional dimensions. This can be very useful in the case of impassive, taciturn or disabled learners. Standard classification techniques were used to assess the reliability of the automatic detection of learners' three major dimensions from the brainwaves. We discuss the results and the pertinence of such a method for assessing learners' emotions and integrating it into a brainwave-sensing Intelligent Tutoring System.
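
A minimal sketch of the regression analysis described above, predicting the three SAM dimensions from EEG band-power features with cross-validated linear regression; the feature set and the random placeholder data are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Rows: EEG trials; columns: band-power features (e.g. alpha/beta per channel).
# Targets: mean SAM ratings of pleasure, arousal and dominance for each picture.
# All values below are random placeholders standing in for the recorded data.
rng = np.random.default_rng(1)
eeg_features = rng.normal(size=(300, 20))
sam_ratings = rng.uniform(1, 9, size=(300, 3))   # pleasure, arousal, dominance

for i, dim in enumerate(["pleasure", "arousal", "dominance"]):
    r2 = cross_val_score(LinearRegression(), eeg_features,
                         sam_ratings[:, i], cv=5, scoring="r2")
    print(f"{dim}: mean cross-validated R^2 = {r2.mean():.3f}")
```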

Nonverbal Expression of Emotions in Conflict Escalation

The purpose of this study is to explore how emotions at the moment of conflict escalation are expressed nonverbally and how they can be detected by the parties involved in the conflict situation. The study consists of two parts. The first part starts with the definitions of "conflict" and "nonverbal communication" and includes an analysis of emotions and of the types of emotions that may bring about conflict escalation. Four types of emotions and emotion constructs are analyzed, namely fear, anger, guilt and frustration. The second part of the study analyses the general role of nonverbal behavior in interaction and communication, and the information it may give during communication to the person who sends or receives those signals. The study finishes with an analysis of the nonverbal expression of the analyzed emotions and of how it can be used during interaction.

Identification of Arousal and Relaxation by using SVM-Based Fusion of PPG Features

In this paper, we propose a new method to distinguish between arousal and relaxation states by using multiple features acquired from a photoplethysmogram (PPG) and a support vector machine (SVM). To induce arousal and relaxation states in subjects, two kinds of sound stimuli are used, and the corresponding biosignals are obtained using the PPG sensor. Two features, pulse-to-pulse interval (PPI) and pulse amplitude (PA), are extracted from the acquired PPG data, and a nonlinear classification between arousal and relaxation is performed using the SVM. This methodology has several advantages compared with previous similar studies. Firstly, we extracted two separate features from the PPG, i.e., PPI and PA. Secondly, in order to improve the classification accuracy, SVM-based nonlinear classification was performed. Thirdly, to solve classification problems caused by generalized features across all subjects, we defined thresholds according to individual features. Experimental results showed that the average classification accuracy was 74.67%. The proposed method also showed better identification performance than single-feature-based methods. From this result, we confirmed that arousal and relaxation can be classified using an SVM and PPG features.
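
A minimal sketch of the proposed classification step, feeding the two PPG features (PPI and PA) into an RBF-kernel SVM; the synthetic feature values and SVM hyperparameters are assumptions, not the authors' recordings or settings.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Each row: one PPG segment described by two features, mean pulse-to-pulse
# interval (PPI) and mean pulse amplitude (PA); labels 0 = relaxation, 1 = arousal.
# The numbers are synthetic; the real features come from the sensor recordings.
rng = np.random.default_rng(2)
relax = np.column_stack([rng.normal(0.9, 0.05, 100), rng.normal(1.0, 0.1, 100)])
arouse = np.column_stack([rng.normal(0.7, 0.05, 100), rng.normal(1.3, 0.1, 100)])
X = np.vstack([relax, arouse])
y = np.r_[np.zeros(100), np.ones(100)]

# An RBF-kernel SVM gives the nonlinear decision boundary mentioned in the abstract.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print("5-fold accuracy:", cross_val_score(model, X, y, cv=5).mean())
```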

Regret, Choice, and Outcome

In two studies we challenged the well-consolidated position in the regret literature according to which the necessary condition for the emergence of regret is a bad outcome ensuing from a free decision. Without free choice, and consequently personal responsibility, other emotions such as disappointment, but not regret, are supposed to be elicited. In our opinion, a main source of regret is being obliged by circumstances outside our control to choose an undesired option. We tested the hypothesis that regret resulting from a forced choice is more intense than regret derived from a free choice, and that the outcome affects the latter but not the former. In addition, we investigated whether two other variables, the perceived level of freedom of the choice and the justifiability of the choice, mediated the relationships between choice and regret, as well as the other four emotions we examined: satisfaction, anger toward oneself, disappointment, and anger towards circumstances. The two studies were based on the scenario methodology and used a 2 x 2 (choice x outcome) between-subjects design. In the first study the foreseen short-term effects of the choice were assessed; in the second study the experienced long-term effects of the choice were assessed. In each study, 160 students of the Second University of Naples participated. The results largely corroborated our hypotheses and are discussed in the light of the main theories on regret and decision making.

A Study on the Circumstances Affecting Elementary School Students in Their Family and School Lives and Their Consequential Emotions

The purpose of this study is to determine the circumstances affecting elementary school students in their family and school lives and what kinds of emotions children may feel because of these circumstances. The study was carried out according to the survey model. Four Turkish elementary schools provided 123 fourth-grade students for participation in the study. The study's data were collected using worksheets for the activity titled "Important Days in Our Lives", which is part of the Elementary School Social Sciences Course 4th Grade Education Program. Data analysis was carried out according to the content analysis technique used in qualitative research. The study found that circumstances in their family and school lives caused children to feel emotions such as happiness, sadness, anger, fear and jealousy. The circumstances and the emotions they caused were analyzed according to gender and interpreted by presenting them with their frequencies.

Using Emotional Learning in Rescue Simulation Environment

RoboCup Rescue simulation, as a large-scale multi-agent system (MAS), is a challenging environment for maintaining coordination between agents to achieve objectives despite sensing and communication limitations. The dynamics of the environment and the intensive dependency between the actions of different kinds of agents make the problem more complex. This encouraged us to use learning-based methods to adapt our decision making to different situations. Our approach utilizes reinforcement learning. Using learning in rescue simulation is a current research direction that has been the subject of several studies in recent years. In this paper we present an innovative learning method implemented for the Police Force (PF) agent. This method can cope with the main difficulties that exist in other learning approaches. Different methods used in the literature have been examined; their drawbacks and possible improvements have led us to the method proposed in this paper, which is fast and accurate. The Brain Emotional Learning Based Intelligent Controller (BELBIC) is our solution for learning in this environment. BELBIC is a physiologically motivated approach based on a computational model of the amygdala and the limbic system. The paper presents the results obtained by the proposed approach, showing the power of BELBIC as a decision-making tool in complex and dynamic situations.
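
For context, the sketch below shows a simplified version of the amygdala/orbitofrontal learning rules commonly attributed to BELBIC; the learning gains, reward signal and its connection to the Police Force agent's decisions are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

class SimpleBELBIC:
    """A simplified sketch of the BELBIC learning rules (amygdala plus
    orbitofrontal cortex) as commonly described in the literature."""

    def __init__(self, n_inputs, alpha=0.1, beta=0.05):
        self.v = np.zeros(n_inputs)   # amygdala weights (only grow)
        self.w = np.zeros(n_inputs)   # orbitofrontal weights (can inhibit)
        self.alpha, self.beta = alpha, beta

    def output(self, s):
        a = self.v * s                # amygdala responses
        o = self.w * s                # orbitofrontal (inhibitory) responses
        return a.sum() - o.sum(), a

    def update(self, s, reward):
        e, a = self.output(s)
        # Amygdala never unlearns: its weight change is clipped at zero.
        self.v += self.alpha * s * max(0.0, reward - a.sum())
        # Orbitofrontal cortex corrects the mismatch between output and reward.
        self.w += self.beta * s * (e - reward)
        return e

# Toy usage: two sensory inputs, with a hypothetical reward favouring the first.
ctrl = SimpleBELBIC(n_inputs=2)
for _ in range(200):
    s = np.random.rand(2)
    ctrl.update(s, reward=s[0])
print(ctrl.v, ctrl.w)
```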

RBF Based Face Recognition and Expression Analysis

Facial recognition and expression analysis is rapidly becoming an area of intense interest in the computer science and human-computer interaction design communities. The most expressive way humans display emotions is through facial expressions. In this paper, skin and non-skin pixels are first separated, and face regions are extracted from the detected skin regions. Facial expressions are analyzed from facial images by applying the Gabor wavelet transform (GWT) and the Discrete Cosine Transform (DCT) to the face images. A Radial Basis Function (RBF) network is used to identify the person and to classify the facial expressions. Our method works reliably even with faces that carry heavy expressions.
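
A minimal sketch of one part of this pipeline: DCT coefficients taken from a face image and fed to a generic RBF network (k-means centres plus a linear output layer). The Gabor wavelet step is omitted, and the block size, number of centres and placeholder data are assumptions, not the paper's configuration.

```python
import numpy as np
from scipy.fftpack import dct
from sklearn.cluster import KMeans
from sklearn.linear_model import RidgeClassifier

def dct_features(face, k=8):
    """Keep the top-left k x k block of 2-D DCT coefficients of a face image."""
    coeffs = dct(dct(face, axis=0, norm="ortho"), axis=1, norm="ortho")
    return coeffs[:k, :k].ravel()

class SimpleRBFNet:
    """Minimal RBF network: k-means centres, Gaussian hidden units, linear output."""
    def __init__(self, n_centers=20, gamma=0.01):
        self.n_centers, self.gamma = n_centers, gamma

    def _hidden(self, X):
        d2 = ((X[:, None, :] - self.centers_[None, :, :]) ** 2).sum(-1)
        return np.exp(-self.gamma * d2)

    def fit(self, X, y):
        self.centers_ = KMeans(self.n_centers, n_init=10,
                               random_state=0).fit(X).cluster_centers_
        self.out_ = RidgeClassifier().fit(self._hidden(X), y)
        return self

    def predict(self, X):
        return self.out_.predict(self._hidden(X))

# Toy data: 64x64 "face" images with 4 expression labels (random placeholders).
rng = np.random.default_rng(3)
faces = rng.normal(size=(200, 64, 64))
labels = rng.integers(0, 4, size=200)
X = np.array([dct_features(f) for f in faces])
model = SimpleRBFNet().fit(X, labels)
print((model.predict(X) == labels).mean())
```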

An Artificial Emotion Model For Visualizing Emotion of Characters

It is hard to convey a character's emotion through speech alone when we watch a movie or a play, because we cannot estimate the size, kind, and quantity of the emotion. This paper therefore proposes an artificial emotion model for visualizing the current emotion with a color and a location in the emotion model. The artificial emotion model is designed considering the causality of generated emotions, differences in personality, differences in continual emotional stimuli, and the correlation of various emotions. The paper proposes the Emotion Field for visualizing the current emotion by location: the current emotion is expressed by its location and color in the Emotion Field. To visualize changes in the current emotion, the artificial emotion model is applied to characters in Hamlet.
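
A minimal sketch of the visualization idea, mapping a vector of emotion intensities to a location and a colour in a 2-D field; the anchor positions and colours are invented for illustration and are not the paper's Emotion Field layout.

```python
import numpy as np

# Hypothetical 2-D "Emotion Field": each basic emotion gets an anchor position
# and a colour; the current emotional state is the intensity-weighted mixture.
ANCHORS = {
    "joy":     (np.array([ 1.0,  1.0]), np.array([255, 215,   0])),
    "anger":   (np.array([-1.0,  1.0]), np.array([220,  20,  60])),
    "sadness": (np.array([-1.0, -1.0]), np.array([ 70, 130, 180])),
    "fear":    (np.array([ 1.0, -1.0]), np.array([138,  43, 226])),
}

def emotion_field_point(intensities):
    """Map emotion intensities (0-1 each) to a location and an RGB colour."""
    total = sum(intensities.values()) or 1.0
    pos = sum(intensities[e] * ANCHORS[e][0] for e in intensities) / total
    rgb = sum(intensities[e] * ANCHORS[e][1] for e in intensities) / total
    return pos, rgb.astype(int)

# E.g. a Hamlet-like mixture dominated by anger and sadness.
print(emotion_field_point({"joy": 0.1, "anger": 0.6, "sadness": 0.5, "fear": 0.2}))
```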

Applications of Support Vector Machines on Smart Phone Systems for Emotional Speech Recognition

An emotional speech recognition system for applications on smart phones is proposed in this study, to be combined with 3G mobile communications and social networks to provide users and their groups with more interaction and care. The study developed a mechanism using support vector machines (SVM) to recognize emotions in speech such as happiness, anger, sadness and a normal state. The mechanism uses a hierarchical classifier to adjust the weights of the acoustic features and divides the parameters into energy and frequency categories for training. In this study, 28 commonly used acoustic features, including pitch and volume, were proposed for training. In addition, a time-frequency parameter obtained by continuous wavelet transforms was used to identify the accent and intonation in a sentence during the recognition process. The Berlin Database of Emotional Speech was used, with the speech divided into male and female data sets for training. According to the experimental results, the accuracies of the male and female test sets increased by 4.6% and 5.2%, respectively, after using the time-frequency parameter for classifying the happy and angry emotions. For the classification of all emotions, the average accuracy, including male and female data, was 63.5% for the test set and 90.9% for the whole data set.
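
A minimal sketch of a two-stage (hierarchical) SVM of the general kind described above, with energy-type features used at the first stage and frequency-type features at the second; the stage grouping, feature counts and placeholder data are assumptions, not the authors' hierarchy.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

LABELS = ["happy", "angry", "sad", "normal"]

def make_svm():
    return make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))

# Stage 1 splits high-arousal speech (happy/angry) from low-arousal speech
# (sad/normal) using energy-type features; stage 2 resolves each pair using
# frequency-type features. All data below are random stand-ins.
rng = np.random.default_rng(4)
energy = rng.normal(size=(400, 10))        # stand-ins for energy features
freq = rng.normal(size=(400, 18))          # stand-ins for pitch/frequency features
y = rng.integers(0, 4, size=400)

high = np.isin(y, [0, 1])                  # happy / angry
stage1 = make_svm().fit(energy, high)
stage2_high = make_svm().fit(freq[high], y[high])
stage2_low = make_svm().fit(freq[~high], y[~high])

def predict(e, f):
    is_high = stage1.predict(e.reshape(1, -1))[0]
    model = stage2_high if is_high else stage2_low
    return LABELS[model.predict(f.reshape(1, -1))[0]]

print(predict(energy[0], freq[0]))
```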

Emotion Classification for Students with Autism in Mathematics E-learning using Physiological and Facial Expression Measures

Avoiding learning failures caused by emotional problems in students with autism in mathematics e-learning environments has become an important topic in combining special education with information and communications technology. This study presents an adaptive emotional adjustment model in mathematics e-learning for students with autism, addressing the lack of emotional perception in mathematics e-learning systems. An emotion classification for students with autism was developed by inducing emotions in mathematical learning environments and recording the changes in the students' physiological signals and facial expressions. Using these methods, 58 emotional features were obtained. These features were then processed using one-way ANOVA and information gain (IG). After reducing the feature dimension, support vector machines (SVM), k-nearest neighbors (KNN), and classification and regression trees (CART) were used to classify four emotional categories: baseline, happy, angry, and anxious. After testing and comparison, without feature selection the accuracy rate of the SVM classification reached as high as 79.3%. After using IG to reduce the feature dimension, with only 28 features remaining, SVM still achieved a classification accuracy of 78.2%. The results of this research could enhance the effectiveness of e-learning in special education.
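
A minimal sketch of the feature-selection and classification step, using mutual information as the information-gain criterion to keep 28 of 58 features before an SVM; the placeholder data and hyperparameters are assumptions for illustration.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# 58 physiological and facial-expression features per sample, four emotion
# labels (baseline, happy, angry, anxious). Random placeholders stand in for
# the recorded data; mutual information approximates the information gain.
rng = np.random.default_rng(5)
X = rng.normal(size=(240, 58))
y = rng.integers(0, 4, size=240)

pipe = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=28),   # keep 28 features, as in the abstract
    SVC(kernel="rbf", gamma="scale"),
)
print("5-fold accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
```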

The Role of Classroom Management Efficacy in Predicting Teacher Burnout

The purpose of this study was to examine to what extent classroom management efficacy, marital status, gender, and teaching experience predict burnout among primary school teachers. The participants of this study were 523 teachers (345 female, 178 male) who completed the inventories. The results of the multiple regression analysis indicated that the three dimensions of teacher burnout (emotional exhaustion, depersonalization, personal accomplishment) were affected differently by the four predictor variables. The findings indicated that for emotional exhaustion, classroom management efficacy, marital status and teaching experience were significant predictors; for the depersonalization dimension, classroom management efficacy and marital status; and for the personal accomplishment dimension, classroom management efficacy, gender, and teaching experience.
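
For illustration, a minimal sketch of the form of such a multiple regression (here for the emotional exhaustion dimension only) with statsmodels; the variable coding and the synthetic data are assumptions, not the study's data set.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data: one burnout dimension regressed on classroom
# management efficacy, marital status, gender and teaching experience.
# Coefficients printed here are meaningless; only the form of the analysis
# is shown.
rng = np.random.default_rng(6)
n = 523
df = pd.DataFrame({
    "emotional_exhaustion": rng.normal(size=n),
    "efficacy": rng.normal(size=n),
    "married": rng.integers(0, 2, size=n),
    "female": rng.integers(0, 2, size=n),
    "experience": rng.integers(1, 30, size=n),
})

model = smf.ols("emotional_exhaustion ~ efficacy + married + female + experience",
                data=df).fit()
print(model.summary().tables[1])
```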

Emotion Classification using Adaptive SVMs

The study of the interaction between humans and computers has been growing during the last few years. This interaction will be more powerful if computers are able to perceive and respond to human nonverbal communication such as emotions. In this study, we present an image-based approach to emotion classification through lower facial expressions. We employ a set of feature points in the lower face image according to the particular face model used and consider their motion across each emotive expression in the images. The vector of displacements of all feature points is input to an Adaptive Support Vector Machine (A-SVM) classifier that classifies it into a seven-basic-emotion scheme, namely neutral, angry, disgust, fear, happy, sad and surprise. The system was tested on the Japanese Female Facial Expression (JAFFE) dataset of frontal-view facial expressions [7]. Our experiments on emotion classification through lower facial expressions demonstrate the robustness of the adaptive SVM classifier and verify the high efficiency of our approach.
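
A minimal sketch of the classification step, using a standard RBF-kernel SVM on flattened feature-point displacement vectors in place of the adaptive SVM (A-SVM) used in the paper; the number of feature points and the placeholder data are assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EMOTIONS = ["neutral", "angry", "disgust", "fear", "happy", "sad", "surprise"]

# Each sample: displacements (dx, dy) of lower-face feature points between the
# neutral frame and the expressive frame, flattened into one vector. The data
# below are random placeholders; a plain SVM stands in for the A-SVM.
rng = np.random.default_rng(7)
n_points = 12                                   # assumed number of feature points
X = rng.normal(size=(210, n_points * 2))        # roughly the size of JAFFE
y = rng.integers(0, len(EMOTIONS), size=210)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
print("5-fold accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```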

Development System for Emotion Detection Based on Brain Signals and Facial Images

Detection of human emotions has many potential applications. One application is to quantify the attentiveness of an audience in order to evaluate the acoustic quality of a concert hall, based on the subjective audio preferences of the audience. To obtain a fair evaluation of acoustic quality, this research proposes a system for multimodal emotion detection: one modality is based on brain signals measured with an electroencephalogram (EEG), and the second modality is sequences of facial images. In the experiment, a customized audio signal consisting of normal and disordered sounds was played to stimulate positive or negative emotional feedback in the volunteers. EEG signals from the temporal lobes, i.e. electrodes T3 and T4, were used to measure the brain response, and sequences of facial images were used to monitor facial expression while the volunteers listened to the audio signal. From the EEG signal, features were extracted from changes in the brain waves, particularly in the alpha and beta bands. Features of facial expression were extracted based on the analysis of motion in the images. We implemented an advanced optical flow method to detect the most active facial muscles as the face moves from a neutral to another emotional expression, represented as vector flow maps. To reduce the difficulty of detecting the emotional state, the vector flow maps are transformed into a compass mapping that represents the major directions and velocities of facial movement. The results showed that the power of the beta wave increases when the disordered sound stimulus is given, although each volunteer gave different emotional feedback. Based on the features derived from the facial images, the optical flow compass mapping is promising as additional information for deciding on the emotional feedback.
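
A minimal sketch of the facial-motion summary described above, using OpenCV's Farneback optical flow as a stand-in for the paper's (unspecified) advanced optical-flow method and binning the flow vectors into a simple compass map of major directions and velocities.

```python
import cv2
import numpy as np

def compass_map(prev_gray, next_gray, n_bins=8):
    """Summarize motion between two grayscale frames as n_bins compass sectors,
    each holding the mean flow magnitude in that direction."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    bins = (ang / (2 * np.pi) * n_bins).astype(int) % n_bins
    # Mean flow magnitude per direction sector (N, NE, E, ... for n_bins = 8).
    return np.array([mag[bins == b].mean() if np.any(bins == b) else 0.0
                     for b in range(n_bins)])

# Toy frames: a bright patch shifted a few pixels to the right between frames.
prev = np.zeros((120, 120), np.uint8)
prev[50:70, 40:60] = 255
next_ = np.roll(prev, 5, axis=1)
print(compass_map(prev, next_))
```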

The Relationship between Adolescent Emotional Inhibition and Depression Disorder: The Moderate Effect of Gender

The association between emotional inhibition strategies and depression has been shown to be inconsistent across studies. Mild emotional inhibition may benefit social interaction, especially for females in East Asian cultures. The present study aimed to examine whether the inhibition-depression relationship depends on the level of emotional inhibition and on gender, given the differing values placed on suppressing emotional displays. We hypothesized that the negative association between inhibition and adolescent depression would not be direct, but would be affected by the interaction between emotional inhibition and gender. To test this hypothesis, we asked 309 junior high school students aged 12 to 14 years to report on their use of emotional inhibition and their depressive symptoms. A multiple regression analysis revealed a significant interaction, with gender moderating the relationship between emotional inhibition and adolescent depression. The group with the highest level of depression was girls with high levels of emotional inhibition, whose depression scores were higher than those of boys with high levels of emotional inhibition. The results highlight the importance of context in understanding the inhibition-depression relationship.
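
For illustration, a minimal sketch of the form of the moderation analysis, regressing depression on emotional inhibition, gender and their interaction with statsmodels; the synthetic data and variable coding are assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the moderation analysis: depression regressed on
# emotion inhibition, gender, and their interaction (gender as moderator).
rng = np.random.default_rng(8)
n = 309
df = pd.DataFrame({
    "inhibition": rng.normal(size=n),
    "girl": rng.integers(0, 2, size=n),
})
df["depression"] = (0.2 * df.inhibition + 0.3 * df.girl
                    + 0.4 * df.inhibition * df.girl + rng.normal(size=n))

# The inhibition:girl term tests whether gender moderates the relationship.
model = smf.ols("depression ~ inhibition * girl", data=df).fit()
print(model.summary().tables[1])
```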