
Emotion Recognition using ECG Signals through Machine Learning

Wesly Quiballo, John Dale Gutierrez, Rocendo Astillero

ABSTRACT
The electrocardiogram (ECG) is one of the prominent biosensors that healthcare providers use to measure the heart's activity, and ECG signals are therefore widely collected for research. This research investigates the application of artificial intelligence, specifically machine learning, to predicting certain kinds of emotions from data gathered through the ECG. The study examines participants' emotions in actual situations, while they watch different video clips, in order to provide an accurate prediction using the research model.

Keywords
Artificial Intelligence, Emotion Detection, ECG

1. INTRODUCTION
Emotion is a compound psychological and physiological process that affects every aspect of our daily life. A system that can automatically classify emotions can narrow the communication gap between highly emotional humans and computers. It can improve the quality of human-computer collaboration, and has great application prospects in medicine, education, entertainment, commerce and other fields [1-5].

At present, emotion recognition approaches can be divided into two classes: emotion recognition based on human behavior patterns and emotion recognition based on human physiological signals [6]. The former classifies emotions according to people's facial expressions, voice and body activities. The latter classifies emotions according to the physiological signal changes caused by different emotions. Although recognition based on behavior patterns is intuitive, people can change their behavior intentionally and hide their real sentiment. The physiological signals of the human body, however, do not change at will, so emotion recognition based on physiological signals can capture the true expression of emotion.

The nerve endings of the autonomic nervous system (ANS) in the heart play a major regulatory role in the heart's activity because they shape the rhythm at which cardiac myocytes pump blood. The autonomic nervous system is divided into the sympathetic and parasympathetic nervous systems, which are antagonistic to each other. The nerve fibers of the cardiac sympathetic nervous system are distributed along the atria and ventricles; when activated, they stimulate the heart muscle cells and raise the heart rate. When the parasympathetic nervous system is excited, it reduces the load on the heart. In the presence of external mental stimulation, the sympathetic nervous system dominates the regulation of cardiomyocytes [7]. The ECG signal is therefore a physiological signal closely related to emotion, and it is widely used in emotion recognition systems.

In current research on emotion recognition from ECG signals, researchers mainly classify emotions based on the time-domain and frequency-domain characteristics of the ECG; the correlation of ECG signals is rarely used. ECG signals affected by emotion are non-linear and non-stationary, and correlation is a significant feature of non-stationary signals: it mirrors the relationship between the present and past behavior of a time series. The correlation characteristics of the ECG signal can therefore reflect its dynamic alterations under different emotions. In this paper, the time-domain and frequency-domain statistical features and the correlation features of ECG signals are applied to emotion recognition, and a comparative study is carried out.

2. REVIEW OF RELATED STUDY
Emotion recognition in HCI [8]
Emotion recognition is both essential and beneficial in human-computer and human-robot interaction, since emotions indicate human feelings and needs [9]. Emotion recognition in HCI enables computers and their applications to interact naturally with humans: they can analyze and interpret user emotions to deliver appropriate responses.

The following are the three main classes of techniques that have been proposed to distinguish and classify user emotions:

	Speech emotion recognition: identifies emotional states through voice analysis. Speech features such as pitch, formants, and short-term energy [10] are useful for emotion recognition, and several feature extraction methods have been proposed to extract them.

	Emotion recognition from facial expression: recognizes human emotions from facial muscle movements, the movements of the eyes, mouth or eyebrows, and facial textures. Several studies have applied computer vision systems to automatically analyze and recognize changes in facial motion from visual information [11].
	Emotion recognition using biological signals: analyzes and recognizes human emotional states from biological signals such as electroencephalography (EEG), electrocardiography (ECG), temperature, and galvanic skin response (GSR) [12].

Using facial expressions to distinguish emotions is well suited to our healthcare system because this technique recognizes emotions from a natural user interface: the face. However, recognizing emotions from facial expressions alone may fail to classify human emotions correctly, since humans occasionally hide their emotions from their appearance. Recognizing emotions from biological signals can resolve this issue, so we apply ECG alongside facial expressions to detect emotions for our emotional healthcare system.

3. METHODOLOGY
The objective of this research is to conduct data gathering through ECG signals and to process the data to produce a prediction accuracy through machine learning. The research conducts an experimental study to understand how data gathered from participants can predict certain emotions. Figure 1 shows the process flow of the proposed research.

Fig 1. Process flow of Data Gathering

3.1 AD8232 ECG Module and ARDUINO UNO
In this research the authors use the AD8232 ECG module together with an Arduino UNO. Figure 2 shows the basic configuration and connection of the two devices.

Fig 2. AD8232 and ARDUINO UNO basic configuration and connection

3.2 About the Dataset
The dataset is obtained from the Arduino software: its serial monitor produces the list of numbers coming from the ECG signals of the participants.

3.3 Data Gathering
All the data came from 5 participants of different ages who underwent ECG signal extraction.

4. RESULTS AND DISCUSSIONS
4.1 Videos
The study had 5 participants, each watching 15 videos chosen to elicit different emotions. The figure below lists the videos used in the experiment.

Fig 3. Video Clips for different emotions

4.2 Proposed Model
The study used ECG modules to gather signals from the participants. These inputs were run through the machine learning model, which provides a prediction of certain emotions. The accuracy of the training model is shown in the figures below; KNN and SVM were used to measure the accuracy of predicting emotions.

Fig 4. Accuracy of the model using SVM


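The paper does not include the training code. As an illustration only, a minimal scikit-learn sketch of the KNN-versus-SVM accuracy comparison the study describes might look like the following; the synthetic feature matrix, the number of features per ECG segment, the KNN neighbor count, and the 80/20 split are all assumptions, not details taken from the study.

```python
# Hypothetical sketch: comparing KNN and SVM accuracy on ECG-derived
# feature vectors. The data here is a random stand-in for the real
# dataset of 5 participants x 15 videos with 5 emotion classes.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(75, 10))        # 75 ECG segments, 10 features each (assumed)
y = rng.integers(0, 5, size=75)      # 5 emotion class labels (stand-in)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
svm = SVC(kernel="rbf").fit(X_train, y_train)

print("KNN accuracy:", accuracy_score(y_test, knn.predict(X_test)))
print("SVM accuracy:", accuracy_score(y_test, svm.predict(X_test)))
```

With real features extracted from the Arduino serial log, only the construction of `X` and `y` would change; the comparison itself stays the same.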
Fig 5. Accuracy of the model using KNN

The accuracies obtained with KNN and SVM were 26% and 13%, respectively. At these levels, we cannot say that the model provides reliable help in predicting emotions.

During data gathering, the authors had difficulty finding participants willing to join the study. We therefore recommend that future studies recruit at least 50 to 100 participants, which should improve the results and provide more accurate data for future work.

To validate the results, the study generated confusion matrices checking the prediction of the 5 emotions: [0] happy, [1] sad, [3] neutral, [4] anger and [5] fear.
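The paper does not show how the matrices were produced. A minimal sketch with scikit-learn, assuming the true and predicted emotion codes are available as integer arrays (the arrays below are made-up examples, not the study's data), could be:

```python
# Hypothetical sketch of generating a confusion matrix for the 5 emotions.
from sklearn.metrics import confusion_matrix

y_true = [0, 1, 3, 4, 5, 0, 1, 3, 4, 5]   # emotion codes as listed in the text
y_pred = [0, 1, 1, 4, 5, 3, 1, 3, 0, 5]   # model outputs (illustrative only)
labels = [0, 1, 3, 4, 5]                  # happy, sad, neutral, anger, fear

cm = confusion_matrix(y_true, y_pred, labels=labels)
print(cm)  # rows = true emotion, columns = predicted emotion
```

Each diagonal entry counts correct predictions for one emotion; off-diagonal entries show which emotions the model confuses with each other.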

Fig 6. Confusion Matrix of SVM

Fig 7. Confusion Matrix of KNN

5. CONCLUSION
Affective computing is a way to reach higher levels of artificial intelligence. After this study of human emotions, we have discussed the different features of affective computing and described its applications, to show its future prospects. Humans' real emotions and expressed emotions are not the same each time; for example, humans can pretend to be happy when they are not. This puts significant pressure on identifying humans' true inner emotions. In the area of expressing affect, work also needs to be done on how computers identify, handle and express emotions [1].

6. REFERENCES
[1] Rencheng Zheng, Shigeyuki et al. (2015) Biosignal Analysis to Assess Mental Stress in Automatic Driving of Trucks: Palmar Perspiration and Masseter Electromyography. Sensors, 15(3), 5136-5150.
[2] S. D'Mello, S. Craig, B. Gholson, S. Franklin, R. Picard, et al. (2015) Integrating Affect Sensors in an Intelligent Tutoring System. Intelligent User Interfaces, 7-13.
[3] Krithika L. B. and Lakshmi Priya G. G. (2016) Student Emotion Recognition System (SERS) for e-learning Improvement Based on Learner Concentration Metric. Procedia Computer Science, 85, 767-776.
[4] Mohammad Soleymani, et al. (2018) Affective Characterization of Movie Scenes Based on Multimedia Content Analysis and User's Physiological Emotional Responses. IEEE Transactions on Affective Computing, 3(1), 102-115.
[5] W. M. Wang, J. W. Wang and Z. Li (2019) Multiple affective attribute classification of online customer product reviews: A heuristic deep learning method for supporting Kansei engineering. Engineering Applications of Artificial Intelligence, 85, 33-45.
[6] Calvo R. A. and D'Mello S. K. (2010) Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications. IEEE Transactions on Affective Computing, 1(1), 18-37.
[7] Dorigo Marco, et al. (1996) Ant Colony System: A Cooperative Learning Approach to the Traveling Salesman Problem. IEEE Transactions on Evolutionary Computation, 1(1), 53-66.
[8] Kanlaya Rattanyu and Makoto Mizukawa (2011) Emotion Recognition Based on ECG Signals for Service Robots in the Intelligent Space During Daily Life. Journal of Advanced Computational Intelligence and Intelligent Informatics, 15(5), 582-591.
[9] Haq Sanaul and Philip J. B. Jackson (2010) Multimodal Emotion Recognition, 398-423.
[10] Ververidis Dimitrios and Constantine Kotropoulos (2006) Emotional speech recognition: Resources, features, and methods. Speech Communication, 48(9), 1162-1181.
[11] Hussain Ayyaz, Muhammad Shahid Khan, Muhammad Nazir and M. Amjad Iqbal (2012) Survey of various feature extraction and classification techniques for facial expression recognition. In Proceedings of the 11th WSEAS International Conference on Electronics, Hardware, Wireless and Optical Communications, and the joint WSEAS conferences on Signal Processing, Robotics and Automation, and Nanotechnology, 138-142.
[12] Murugappan Murugappan, Nagarajan Ramachandran and Yaacob Sazali (2010) Classification of human emotion from EEG using discrete wavelet transform. Journal of Biomedical Science and Engineering, 3(4), 390-396.
