
Available online at www.sciencedirect.com

ScienceDirect

Procedia Computer Science 132 (2018) 752–758

www.elsevier.com/locate/procedia

International Conference on Computational Intelligence and Data Science (ICCIDS 2018)
EEG Based Emotion Classification Mechanism in BCI
Barjinder Kaur a, Dinesh Singh a, Partha Pratim Roy b

a DCRUST, Murthal, Sonipat, 131039, India
b IIT Roorkee, 247667, India

Abstract

Psychological changes in humans are the result of emotions which occur due to activities in daily life. To understand these changes in behavioral pattern, research on affective computing has emerged. Emotions are an integral part of our daily lives; based on this, an investigation has been made in this paper to analyze the impact of positive and negative emotions using the Electroencephalogram (EEG). Three classes of emotions, namely calm, anger and happiness, have been studied. The EEG signals are recorded in real time from 10 subjects while watching video clips of different emotions of 2 minutes each. Next, the fractal dimension feature has been extracted from the raw EEG. To further detect emotional states, the extracted features have been classified using a Support Vector Machine (SVM) with radial basis function (RBF) kernel, with an average accuracy of 60%. The proposed methodology shows that emotion recognition is possible from EEG signals.
© 2018 The Authors. Published by Elsevier Ltd.
This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/3.0/)
Peer-review under responsibility of the scientific committee of the International Conference on Computational Intelligence and Data Science (ICCIDS 2018).
Keywords: Electroencephalography (EEG); Emotions; Fractal Dimension (FD); Support Vector Machine (SVM); Radial Basis Function (RBF).

1. Introduction
Emotion recognition based on facial expressions, speech, body gesture and image processing has been the focus of research in the past decades. But in the recent past, researchers have started recognizing brain signals as a way of assessing the 'inner' emotional state of individuals. Emotions, being an important affair, show how actively or passively we are involved with our surroundings or other humans. The question here arises whether emotions are associated with the autonomic, i.e., physiological, response or with the central nervous system in particular. Researchers' viewpoints bend more towards the central nervous system. They used different neuro-imaging techniques like functional Magnetic Resonance Imaging (fMRI), Magnetic Resonance Imaging (MRI), Electroencephalography (EEG), etc. to support their evidence. The research community found that the EEG technique provides the best insight into the human brain, due to which EEG has been used to detect brain abnormalities [1], mental workload prediction [2], person identification models [3, 4], etc. As it helps in detecting the slightest of modulations which occur in the brain, EEG has been found to be a useful technique to study emotion variance. This includes happiness, sadness, calm, disgust, fear, stress, surprise, etc. As every individual has its own way

Barjinder Kaur
E-mail address: kaur.barjinder@gmail.com
1877-0509 © 2018 The Authors. Published by Elsevier Ltd.
10.1016/j.procs.2018.05.087

to express the emotion based on the stimuli, one of the difficulties which an EEG based emotion recognition system has to deal with is to distinguish these emotion categories. Jatupaiboon et al. [5] proposed an emotion classification mechanism which can differentiate happy and unhappy emotions. The EEG data has been recorded from 10 subjects using pictures and classical music as stimuli. A Power Spectral Density (PSD) feature and a Support Vector Machine (SVM) classifier have been used to correctly recognize the emotions. However, their study was based on the classification of two emotions only.
Jenke et al. [6] selected pictures from the International Affective Picture System (IAPS) to be used as emotion stimuli for 16 subjects. They extracted Higher Order Crossings (HOC), Higher Order Spectra (HOS) and Hilbert-Huang Spectrum (HHS) features from 64 channels. A Quadratic Discriminant Analysis (QDA) has been used to classify happy, curious, angry, sad and quiet emotions with an accuracy of 36.8%. A subject-dependent novel algorithm has been proposed in [7] to study four emotions: pleasant, happy, frightened and angry. The EEG signals were recorded in two sessions consecutively for eight days, which shows that emotion recognition systems can be used to analyze individuals' emotional states.
The brain's response to different stimuli is usually measured by dividing the EEG signals into different frequency rhythms, namely delta (0.5-4 Hz), theta (4-8 Hz), alpha (8-16 Hz), beta (16-32 Hz) and gamma (32 Hz and above). These band waves are omnipresent in different parts of the brain [8]. Srinivas et al. [9] proposed an emotion recognition model using the DEAP database. To differentiate emotions, the authors used Discrete Fourier (DF) and Wavelet Transform (WT) techniques, applied to extract the different frequency bands. The data collected from 11 participants by using video stimuli shows that the gamma and beta wave bands give better results when classified using a radial basis function (RBF). To report the accuracy they have considered 3 channels only.
Syahril et al. [10] investigated the emotions sadness, fear, happiness and disgust in their proposed methodology. Spectral features have been used to extract the emotions of 15 subjects. The authors worked on the alpha and beta wave bands; the Fp1, Fp2, F3 and F4 channels have been considered to evaluate the system performance. Liu et al. [11] proposed a real time emotion recognition algorithm to classify eight emotions (happy, surprised, satisfied, protected, angry, frightened, unconcerned, sad). The authors used four channels to report an accuracy of 53.7% while using audio as stimuli and 87.02% with visual stimuli to incite emotions. In their work, fractal dimension (FD), statistical and Higher Order Crossings (HOC) features, which are further classified using SVM, have been used. A flow model is depicted in Fig. 1, which shows the acquisition of EEG signals while the user is watching videos of different emotions and simultaneously the EEG data is collected. Further, features are extracted from the raw signals using fractal dimension and classification is performed on the featured signals.

Fig. 1. The proposed framework used for classification of calm, anger and happiness emotions.

The rest of the paper is organized as follows. Section 2 discusses the feature extraction method and the classifier used in the proposed framework. Experimental results are presented in Section 3, and a comparative performance analysis in Section 4. Finally, the conclusion and future possibilities of the work are discussed in Section 5.

2. Feature Extraction and Classification

In this section, we present the details of the features and the classifier used to recognize the emotions.

2.1. Feature Extraction

To extract the non-stationary and non-linear EEG features, the fractal dimension (FD) technique has been applied using the Higuchi algorithm. The FD is frequently used [12] to classify emotions as it is a statistical quantity which shows how completely a curve fills the space [13]. The Higuchi FD algorithm is defined in (1).
A 1
(i 1
A jt
X ( j  it )  X ( j  (i  1) * t ) )
A j
*t
FD j t  t (1)
t

Where X(1); X(2); :::; X(N) defines finite set of time series samples, j = 1; 2::::t denotes the initial time, t
denotes is the interval time.
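A minimal numpy sketch of the Higuchi procedure in (1) might look as follows (function and parameter names are ours, and t_max, the largest interval time considered, is an assumed choice):

```python
import numpy as np

def higuchi_fd(x, t_max=10):
    """Estimate the Higuchi fractal dimension of a 1-D time series x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    lengths = []
    for t in range(1, t_max + 1):          # interval times t = 1 .. t_max
        lt = 0.0
        for j in range(t):                 # initial times (0-based offsets)
            a = (n - 1 - j) // t           # number of increments A in (1)
            if a == 0:
                continue
            idx = j + np.arange(a + 1) * t
            diff_sum = np.abs(np.diff(x[idx])).sum()
            # normalized curve length L_j(t) per equation (1)
            lt += diff_sum * (n - 1) / (a * t) / t
        lengths.append(lt / t)             # average over the t sub-series
    t_vals = np.arange(1, t_max + 1)
    # FD is the slope of ln L(t) against ln(1/t)
    slope, _ = np.polyfit(np.log(1.0 / t_vals), np.log(lengths), 1)
    return slope
```

As a sanity check, a straight line yields FD ≈ 1 while white noise yields FD ≈ 2, the two extremes of the measure.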

2.2. Emotion Classification Using SVM

The Support Vector Machine (SVM) has the ability to find decision boundaries with the help of support vectors. SVM is considered the most suitable classifier for biomedical applications where non-stationary signals are to be studied. It separates multiple classes by mapping the data into a high dimensional space with the maximum-margin decision rule defined in (2) [14]. SVM also provides different kernel functions for improving the results according to the dataset. More details about SVM can be found in [3]. The experimental results in this paper have been analyzed using the radial basis function (RBF) kernel defined in (3).

f(x) = argmax_b [(w_b · x) + t_b],   b = 1, ..., k,     (2)

where b indexes the classifier that constructs a hyperplane between class b and the k−1 other classes, w_b is the weight vector and t_b the bias.

rbf_JS(z, μ_t, σ_t) = exp( − JS(z, μ_t) / (2·σ_t²) ),     (3)

where t defines the support vector index, σ_t denotes the support vector radius and JS(z, μ_t) denotes the Jensen-Shannon (JS) divergence between a random vector z and the support vector centroid μ_t.
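As a rough sketch of how (2) and (3) could be computed, the following numpy functions implement the one-vs-rest decision rule and a JS-divergence RBF kernel. Normalizing the inputs to probability vectors is our assumption (a JS divergence is defined on distributions), and the function names are ours:

```python
import numpy as np

def rbf_js(z, mu, sigma):
    # Equation (3): JS divergence applies to probability distributions,
    # so both vectors are normalized first (our assumption); entries are
    # expected to be strictly positive.
    p, q = z / z.sum(), mu / mu.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log2(a / b))
    js = 0.5 * kl(p, m) + 0.5 * kl(q, m)
    return np.exp(-js / (2.0 * sigma ** 2))

def ovr_decision(x, W, t):
    # Equation (2): pick the class b whose hyperplane score (w_b . x) + t_b
    # is largest; W holds one weight vector per class, t the biases.
    return int(np.argmax(W @ x + t))
```

Identical vectors give rbf_js = exp(0) = 1, the kernel's maximum, and the value decays toward 0 as the divergence grows.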

3. Results

To classify the emotions, real time data has been collected from 10 users while watching videos of different emotions, namely calm, anger and happiness. The emotion data is recorded in different sessions. The videos were selected to produce emotions according to the Valence-Arousal model depicted in Fig. 2(a). The EEG signals were collected according to the internationally recognized 10-20 placement system. An Emotiv Epoc+ neuro-sensor headset has been used to collect the EEG data. The electrode positions over the scalp are shown in Fig. 2(b). The signal data has been collected from all 14 electrodes (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4) at a sampling rate of 128 Hz. To detect the emotions, the EEG data has been segmented into windows of 1 second each; in total, 3300 files from the 10 users have been created from the three emotion classes for analysis. A pictorial representation of the EEG signals corresponding to all three emotions, plotted for the same user, is depicted in Fig. 3, where a large variation among the EEG signals of different emotions can be seen. In this work, we have used LIBSVM (a library for SVM) to compute the results.
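The 1-second segmentation step described above can be sketched as follows (array shapes and the helper name are our assumptions; each 2-minute clip yields 120 one-second windows per subject and emotion):

```python
import numpy as np

FS = 128          # Emotiv Epoc+ sampling rate (Hz)
N_CHANNELS = 14   # AF3 ... AF4

def segment_1s(recording, fs=FS):
    """Split a (channels, samples) recording into non-overlapping
    1-second windows, dropping any trailing partial second."""
    ch = recording.shape[0]
    n_win = recording.shape[1] // fs
    trimmed = recording[:, :n_win * fs]
    # result shape: (n_windows, channels, fs)
    return trimmed.reshape(ch, n_win, fs).transpose(1, 0, 2)

demo = np.zeros((N_CHANNELS, FS * 120))   # one hypothetical 2-minute clip
print(segment_1s(demo).shape)             # (120, 14, 128)
```

Each resulting (14, 128) window would then be reduced to a 14-dimensional feature vector, one fractal dimension value per electrode.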

Fig. 2. Pictorial representation of: (a) Valence-Arousal model depicting emotions (b) Emotiv Epoc+ device with 14 sensors.

Fig. 3. The figure depicts the EEG signals of a single user in different emotional states: (a) Calm (b) Anger (c) Happiness.

Fig. 4. The figure represents the accuracy of anger, calm and happy emotions.

3.1. Emotion Classification results using SVM

The SVM classifier with RBF kernel has been used for classifying the emotions of the users. An average accuracy of 60% has been achieved from the 14 dimensional feature vector. It can be seen from Fig. 4 that the highest accuracy of 80% is achieved for the happy state while the lowest accuracy of 40% is recorded for the anger state.

The results are also computed for all 10 users to see how accurately the classifier can recognize the emotions of different users. The results are depicted in Fig. 5, where the highest recognition is achieved on user 6, with 100% accuracy, when videos of different emotions are shown.

Fig. 5. The figure shows that, on classifying different emotions, the EEG signals of user 6 have been best classified.

3.2. Comparative Analysis with different kernels

The results achieved by varying the kernel in SVM are discussed in this section. The experiments were conducted using linear, polynomial and RBF kernels, and show that the highest emotion recognition accuracy is achieved using RBF, at 60%, and the lowest with the polynomial kernel, at 41.6%. This shows that the RBF kernel of SVM is best for recognizing the emotions as compared to other kernels, even on raw EEG signals. Fig. 6 depicts the emotion results with different kernels.

Fig. 6. Emotion recognition performance across different SVM kernels.
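A kernel comparison of this kind can be reproduced in outline as follows; the feature vectors below are synthetic stand-ins for the paper's 14-dimensional FD features (not the authors' recordings), and scikit-learn's SVC stands in for the LIBSVM setup used in the paper:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Three hypothetical emotion classes with slightly shifted FD means,
# 60 one-second windows each, 14 electrodes per window.
X = np.vstack([rng.normal(m, 0.15, size=(60, 14)) for m in (1.3, 1.5, 1.7)])
y = np.repeat([0, 1, 2], 60)              # 0=calm, 1=anger, 2=happy

for kernel in ("linear", "poly", "rbf"):
    scores = cross_val_score(SVC(kernel=kernel, gamma="scale"), X, y, cv=5)
    print(f"{kernel:>6}: mean CV accuracy {scores.mean():.2f}")
```

On real EEG features the ranking would depend on the data; the loop simply shows how the three kernels of equation (3)'s family are swapped in and scored under cross-validation.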



4. Comparative performance analysis

In this section, a comparative analysis with previous studies has been performed, as shown in Table 1. Lan et al. [7] used four emotions (pleasant, happy, frightened and angry) to propose a real time emotion monitoring system. The EEG signals were recorded using an audio stimulus where subjects were asked to listen to music with eyes closed. A maximum accuracy of 49.63% has been reported with 5 subjects. A novel emotion recognition system has been proposed using two frontal channels (Fp1 and Fp2); with the same stimulus, an average accuracy of 75.18% is reported with 32 subjects and two emotions [15]. Comparing with these existing works, it is observed that with 10 users, fourteen channels and three emotions, the proposed methodology outperforms on raw signals.

Table 1. Comparative analysis of the proposed methodology with existing work.

Authors and Year            No. of Subjects   No. of Channels   No. of Emotions   Stimuli Used   Accuracy
Lan et al. [7], 2016        5                 5                 4                 Audio          49.63%
Wu et al. [15], 2017        32                2                 2                 Audio          75.18%
Proposed Approach, 2018     10                14                3                 Video          60%

5. Conclusion and Future Scope

In this paper, we have proposed an EEG based emotion classification methodology that successfully recognizes three emotional states, namely calm, anger and happiness. Real time data of 10 users has been used in this study, in which emotions are incited using videos of different emotions while EEG signals are simultaneously recorded. The EEG data is recorded in different sessions to study the influence of emotions. Further, the fractal dimension (FD) feature is extracted from the raw signals using the Higuchi technique. Next, the extracted feature vector is classified using SVM, where an accuracy of 60% is achieved with the RBF kernel. The results also show that the happiness emotion is best classified and dominates the brain.
In future, the study will examine more emotion classes with a larger dataset to analyze their impact on the brain. As it is difficult to find a feature that works well on emotions as the number of subjects increases, we would be interested in exploring features that work on time-frequency domain signals. To further enhance the classification performance, we will incorporate and experiment with other hierarchical classifiers and combination approaches.

References

[1] T. Liu, Y. Chen, P. Lin, J. Wang, Small-world brain functional networks in children with attention-deficit/hyperactivity disorder revealed by EEG synchrony, Clinical EEG and Neuroscience 46 (3) (2015) 183–191.
[2] H. Gauba, P. Kumar, P. P. Roy, P. Singh, D. P. Dogra, B. Raman, Prediction of advertisement preference by fusing EEG response and sentiment analysis, Neural Networks.
[3] B. Kaur, D. Singh, P. P. Roy, A novel framework of EEG-based user identification by analyzing music-listening behavior, Multimedia Tools and Applications 76 (24) (2017) 25581–25602.
[4] R. Saini, B. Kaur, P. Singh, P. Kumar, P. P. Roy, B. Raman, D. Singh, Don't just sign use brain too: A novel multimodal approach for user identification and verification, Information Sciences 430 (2018) 163–178.
[5] N. Jatupaiboon, S. Pan-ngum, P. Israsena, Real-time EEG-based happiness detection system, The Scientific World Journal 2013.
[6] R. Jenke, A. Peer, M. Buss, Feature extraction and selection for emotion recognition from EEG, IEEE Transactions on Affective Computing 5 (3) (2014) 327–339.
[7] Z. Lan, O. Sourina, L. Wang, Y. Liu, Real-time EEG-based emotion monitoring using stable features, The Visual Computer 32 (3) (2016) 347–358.
[8] C. Pagani, Violence and complexity, Open Psychology Journal 8 (2015) 11–16.
[9] M. V. Srinivas, M. V. Rama, C. R. Rao, Wavelet based emotion recognition using rbf algorithm, brain 4 (5).
[10] S. Syahril, K. S. Subari, N. N. Ahmad, EEG and emotions: α-peak frequency as a quantifier for happiness, in: Control System, Computing and Engineering (ICCSCE), 2016 6th IEEE International Conference on, 2016, pp. 217–222.
[11] Y. Liu, O. Sourina, Real-time subject-dependent EEG-based emotion recognition algorithm, in: Transactions on Computational Science XXIII, 2014, pp. 199–223.

[12] O. Sourina, Y. Liu, A fractal-based algorithm of emotion recognition from EEG using arousal-valence model, in: Biosignals, 2011, pp. 209–214.
[13] S. Kesić, S. Z. Spasić, Application of Higuchi's fractal dimension from basic to clinical neurophysiology: A review, Computer Methods and Programs in Biomedicine 133 (2016) 55–70.
[14] J. Weston, C. Watkins, et al., Support vector machines for multi-class pattern recognition., in: Esann, Vol. 99, pp. 219–224.
[15] S. Wu, X. Xu, L. Shu, B. Hu, Estimation of valence of emotion using two frontal eeg channels, in: Bioinformatics and Biomedicine (BIBM),
2017 IEEE International Conference on, 2017, pp. 1127–1130.
