Journal of Biomedical Informatics 62 (2016) 202–209

Assessing the user experience of older adults using a neural network trained to recognize emotions from brain signals

Victoria Meza-Kubo*, Alberto L. Morán, Ivan Carrillo, Gilberto Galindo, Eloisa García-Canseco

Facultad de Ciencias, Universidad Autónoma de Baja California, Km. 103 Carretera Tijuana-Ensenada, Mexico

Article info

Article history:
Received 1 March 2016
Revised 22 June 2016
Accepted 4 July 2016
Available online 5 July 2016

Keywords:
EEG signals
Emotion recognition
Neural networks
User experience
Evaluation
Older adults

Abstract

The use of Ambient Assisted Living (AAL) technologies as a means to cope with problems that arise due to an increasing and aging population is becoming usual. AAL technologies are used to prevent, cure and improve the wellness and health conditions of the elderly. However, their adoption and use by older adults is still a major challenge. User Experience (UX) evaluations aim at aiding in this task by identifying the experience that a user has while interacting with an AAL technology under particular conditions. This may help to design better products and to improve user engagement and adoption of AAL solutions. However, evaluating the UX of AAL technologies is a difficult task, due to the inherent limitations of their subjects and of the evaluation methods. In this study, we validated the feasibility of assessing the UX of older adults while they use a cognitive stimulation application, using a neural network trained to recognize pleasant and unpleasant emotions from electroencephalography (EEG) signals, by contrasting our results with those of additional self-report and qualitative analysis UX evaluations. Our study results provide evidence about the feasibility of assessing the UX of older adults using a neural network that takes the EEG signals as input; the classification accuracy of our neural network ranges from 60.87% to 82.61%. As future work we will conduct additional UX evaluation studies using the three different methods, in order to appropriately validate these results.

© 2016 Elsevier Inc. All rights reserved.

1. Introduction

Inherent problems due to a worldwide increasing and aging population require the development of solutions to cope with them. These problems include the physical and cognitive problems of the elderly [1]. Solutions include Ambient Assisted Living (AAL) technologies that can be used to prevent, cure, and improve the wellness and health conditions of older adults [2]. However, due to the decline in the elderly's physical and cognitive skills, assessing their perception regarding the use, acceptance and adoption of these technologies is of great interest. Technology designed for elderly users should not only focus on usability issues, but also seriously investigate their motivation to engage with the new technology [3]. These assessments are usually performed through user experience (UX) evaluations.

Although there is no clear consensus about the definition of UX [4], it is generally understood as inherently dynamic, given the ever-changing internal and emotional state of a person and the differences in the circumstances during and after an interaction with a technological product [5,6]. The main effect of a good UX is user engagement. This is characterized by attributes of challenge, positive affect, endurability, aesthetic and sensory appeal, attention, feedback, variety/novelty, interactivity, and perceived user control [7]. All these attributes are measured through UX evaluations, where understanding the user experiences that a product evokes, either pleasant or unpleasant, is crucial [8].

In the literature, several UX evaluations have been reported that assess the perception of the elderly concerning the use of technology [9]; nevertheless, this kind of assessment may be unreliable due to the inherent limitations of the subjects, and also to the limitations of the evaluation methods. For example, Arhippainen concluded that "in self-report based techniques, participants tend to respond what the researcher wants to hear, or tend to be insincere and improve their perception of the results because they felt assessed, or because they have forgotten the details of their experience" [10]. On the other hand, qualitative analysis techniques involve coding the gestures, verbal and non-verbal expressions and other subject data as indicators of human experience; however, this technique requires a lengthy and rigorous process [11].

* Corresponding author.
E-mail addresses: mmeza@uabc.edu.mx (V. Meza-Kubo), alberto.moran@uabc.edu.mx (A.L. Morán), ivan.carrillo@uabc.edu.mx (I. Carrillo), gilberto.galindo.aldana@uabc.edu.mx (G. Galindo), eloisa.garcia@uabc.edu.mx (E. García-Canseco).

http://dx.doi.org/10.1016/j.jbi.2016.07.004
1532-0464/© 2016 Elsevier Inc. All rights reserved.

An alternative to these methods is the neurophysiological method, which infers emotions during the use of a product by measuring and interpreting physiological signals [11]. These signals include respiration and heart rate (electrocardiography), galvanic skin response, electric muscle activity (electromyography) and brain signals (electroencephalography). In particular, the use of electroencephalography (EEG) signals is more reliable because of its high accuracy and objective evaluation in comparison with other external appearance clues like facial expressions and gestures [12]. Various psychophysiological studies have demonstrated the correlations between human emotions and EEG signals [13–15]. Notwithstanding advances in the creation of EEG algorithms, the simplification of devices and real-life applications, many features of emotional states still need to be improved in order to obtain accurate interpretations of emotions from EEG signals in the human brain.

In this study, we aim at validating the feasibility of assessing the UX of older adults while they use a cognitive stimulation (CS) application using a neurophysiological technique, namely a neural network trained to recognize pleasant and unpleasant emotions from EEG signals.

In the first part of the study, we constructed, trained and validated a Patternnet neural network to identify pleasant and unpleasant emotions. The neural network was trained using EEG signals recorded during the presentation of visual stimuli that induced emotions known a priori. The accuracy obtained was 93.3% for 600 characteristics of the preprocessed signals [16].

In the second part of the study, we validated the use of the trained neural network to assess the UX of the older adults while they use a CS application. A summary of the main results of the first study, along with the details of the second study and its main results, are presented here.

The remainder of the paper is organized as follows: Section 2 presents a summary of related work concerning emotion recognition and UX evaluation using EEG signals. Sections 3 and 4 present the process of recognition of emotions from EEG signals. Section 3 summarizes the construction, training and validation of the neural network used to differentiate between pleasant and unpleasant emotions, while Section 4 uses the previously trained neural network to classify the brain signals recorded during the use of a CS application. Section 5 presents the evaluation results, while Section 6 discusses these results. Finally, concluding remarks and future work are presented in Section 7.

2. Emotions and EEG signals

Regardless of the context of the situation, people experience a range of emotions, whether positive (e.g. joy, gratefulness, sympathy, happiness, love, etc.) or negative (e.g. displeasure, irritability, disgust, anger, sadness, etc.) [17]. Positive emotions are associated with the activation of regions in the left hemisphere of the brain, while negative emotions are related to the activation of regions in the right hemisphere [18,19].

In the frequency domain, the spectral power in various frequency bands has been implicated in the emotional state. The asymmetry analysis of the alpha power is recognized as a useful procedure for the study of emotional reactivity [20]. Furthermore, it is common to find asymmetries in the frontal region of the brain, which may be perceived in a subject since childhood [21]. In a study conducted in [22], a spectral analysis of the electrical activity obtained through an EEG demonstrated that the alpha power varies depending on the emotion present (positive or negative).

In recent years, a growing number of efforts to recognize a person's emotion in real time using EEG have been reported in the literature. For instance, Ishino & Hagiwara [23] proposed a system that estimated subjective feeling using neural networks to categorize emotional states based on EEG features, with 54–67% accuracy for four emotional states. Petrantonakis [24] used neural networks to classify EEG signals into six emotions based on emotional valence and arousal, with a 61% success rate. Zhang and Lee [25] proposed an emotion understanding system that classified users' status into two emotional states with an accuracy of 73% during image viewing. Chai [26] evaluated three mobile phone applications using a self-report technique and recording brain activity using EEG signals from participants, in order to determine the positive and negative states in the UX of the applications used. Gürkök [27] evaluated the UX of a brain-computer interface game based on the expectation of users; they used an evoked visual response as stimuli and an EEG to register physiological data. [28] used machine-learning algorithms to categorize EEG dynamics according to subjects' self-reported emotional states during music listening, identifying four emotional states with 82.9% accuracy.

Although some of the aforementioned studies suggest that EEG signals can be used to identify the users' emotional states, there are issues that remain open and should still be explored. In this work, we present the results of using a neural network, trained to recognize emotional states evoked by visual stimuli from EEG signals (Stage 1), to infer the UX of older adults while they conduct a cognitive stimulation (CS) activity (Stage 2); that is, we will try to recognize emotional states evoked by visual, auditory and cognitive processing stimuli.

3. Stage 1: Assessing pleasant and unpleasant emotions from visual stimuli via EEG signals

This section summarizes the procedure performed in [16] to design, train and validate the neural network used to recognize pleasant and unpleasant emotions.

3.1. Introduction

The EEG signals were recorded using the EMOTIV EPOC+ Headset¹ (Fig. 1), a wireless neuro-signal acquisition and processing neuroheadset with 14 saline sensors that is able not only to detect brain signals but also users' facial expressions. The EMOTIV EPOC+ Headset provides software access to dense-array, high-quality, raw EEG data.

Fig. 1. The EMOTIV EPOC+ Headset is a low-cost 14-channel wireless EEG.

In order to evoke emotions from participants, we used a set of the International Affective Picture System (IAPS),² a free-access database of pictures designed to provide a standardized set of pictures for studying emotion and attention that has been widely used in psychological research. During the first stage of the study, we presented a set of the selected IAPS pictures to stimulate known a priori emotions while recording the EEG response from participants with the EMOTIV EPOC+ headset. Then, the EEG signals were filtered and processed to design and train a neural network that we used to identify two basic emotional states, namely, pleasant or unpleasant.

Participants were eight older adults, 2 male and 6 female, aged 60–83 years old (AVG 72.3 years, SD 8.46 years). Inclusion criteria were: older adults aged over 60 years, not having suffered a head trauma, absence of moderate or severe cognitive problems, and absence of visual problems.

3.2. Procedure

Phase 1. First, participants were introduced to the experiment, and the EMOTIV EPOC+ Headset device was calibrated prior to use in order to set baseline signals. The calibration was realized through the EMOTIV Control Panel Application,³ where the participant's facial expressions were imitated by a virtual robot, and participants practiced mental command control to move a floating cube.

Second, a set of IAPS pictures was presented to each participant as follows: pleasant, fear, unpleasant and neutral, for 6 s each; immediately after each picture, the participant was asked to indicate his/her impression upon seeing it by selecting one of the following categories: pleasant, unpleasant, neutral or fear, respectively (see Fig. 2). In this manner, the collected brain signals were automatically labeled with the corresponding participant's answers.

Fig. 2. Presentation scheme of IAPS pictures.

Phase 2. In this phase the EEG signals were processed. First, the signals were preprocessed and their feature signals were extracted using the Fast Fourier Transform. Second, the signal powers were classified into brain waves: Alpha, Beta and Theta. Finally, these records were used as input for a Patternnet neural network with a 151-neuron hidden layer. The neural network was trained using the brain signals from pleasant and unpleasant emotions and the participants' verbal responses. The network's classification accuracy of the brain signals into the corresponding emotions of pleasant or unpleasant was 93.3% for 600 characteristics of the preprocessed signals during the validation process (refer to [16] for a more detailed explanation).

4. Stage 2: Assessing the UX of older adults conducting CS activities by means of a neural network that recognizes emotions from EEG signals

In this section we report on a study to evaluate the feasibility of using the neural network obtained in the previous stage to assess the UX of older adults conducting CS activities. It should be highlighted that the neural network was trained to recognize pleasant and unpleasant emotional states from EEG signals evoked by visual stimuli using IAPS pictures. However, in this stage, the target emotional states will be evoked by the UX of older adults while using a CS application that generates not only visual stimuli, but also auditory and cognitive processing stimuli.

In order to validate the recognition accuracy of the neural network, we conducted additional self-report and qualitative-analysis-by-observation UX evaluations of the CS activity to contrast the obtained results.

For the self-report evaluation, participants were asked to answer a questionnaire immediately after concluding the CS activity. For the qualitative analysis evaluation, the CS activity sessions were video recorded and analyzed to identify verbal and non-verbal expressions of the older adults while they conducted the activity.

4.1. Participants

Participants were 23 older adults, 10 male and 13 female, aged 60–83 years old (AVG 65.5 years, SD 6.227 years). Inclusion criteria were: aged over 60 years, not having suffered a head trauma, absence of moderate or severe cognitive problems, not having motor impairments and absence of visual problems (i.e., being able to see without glasses at a distance of 30–50 cm).

4.2. Materials

To carry out the study we used:

• CS application. We used Abueparty [29], a cognitive wellness system that uses a game board metaphor similar to the traditional "Snakes and Ladders" game (Fig. 3). The main game (up to 4 players) integrates several mini-games that implement cognitive challenges that users have to complete in order to advance through the board to the finish line. The application uses a tactile screen and a custom tangible control to ease interaction for the elderly.
• EMOTIV EPOC+ Headset to register EEG signals through 14 electrodes (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4) distributed according to the International 10–20 System [30].
• EMOTIV Control Panel for calibrating the headset and training.
• Video camcorder for registering the activities carried out during the study for future qualitative analysis.
• Questionnaires (Likert TAM-like) for assessing user experience [31].

4.3. Procedure

Fig. 4 depicts the different evaluation phases. A brief description of these follows.

4.3.1. Introduction

Firstly, participants were individually introduced to the experiment and to the main features of the EMOTIV EPOC+ Headset so that they could get familiar with its use. Participants were also asked to sign a consent form, and the Mini Mental State Examination (MMSE) was applied [32]. The MMSE is the most commonly used test for complaints of problems with memory or other mental abilities. It consists of a series of questions and tests, each of which scores points if answered correctly. After the introduction, the device was calibrated for each participant by using the EMOTIV Control Panel Application. Finally, participants were introduced to Abueparty and to the use of and interaction with its mini-games. The instructions regarding the use of the tactile screen and the tangible control were also provided.

¹ http://emotiv.com/.
² http://csea.phhp.ufl.edu/Media.html.
³ http://emotiv.com/product/xavier-control-panel/.
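The Stage 1 classifier summarized in Section 3.2 (a Patternnet, i.e. a feed-forward pattern-recognition network with a 151-neuron hidden layer, fed with 600 preprocessed signal characteristics) was built in MATLAB. As a rough illustration of the setup, the sketch below uses scikit-learn's `MLPClassifier` as a stand-in and synthetic feature vectors, since the recorded EEG features are not available; the sample counts, the logistic activation and the synthetic labeling rule are illustrative assumptions, not details from the paper.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the 600 band-power characteristics per recording.
# Labels: 1 = pleasant, 0 = unpleasant (separable by construction here).
rng = np.random.default_rng(42)
n_samples, n_features = 200, 600
X = rng.standard_normal((n_samples, n_features))
w = rng.standard_normal(n_features)
y = (X @ w > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# One hidden layer of 151 neurons, mirroring the Patternnet topology.
net = MLPClassifier(hidden_layer_sizes=(151,), activation="logistic",
                    max_iter=500, random_state=0)
net.fit(X_train, y_train)
accuracy = net.score(X_test, y_test)   # validation accuracy on held-out data
```

With the real EEG features and labels from the participants' verbal responses, the same train/validate split is what produced the 93.3% validation accuracy reported in [16]; with synthetic data the number is meaningless and only the workflow is shown.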

Fig. 3. Abueparty’s main screen.

Fig. 4. Emotion recognition process using EEG signals.
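The per-channel processing chain of Fig. 4, detailed in Section 4.3.3 (average elimination, removal of the best linear fit, a Hamming window, a 1–30 Hz FIR bandpass filter, and FFT-derived band powers), can be sketched with NumPy/SciPy as follows. The sampling rate (the EPOC+ nominally samples at 128 Hz) and the filter length are assumptions, since the paper does not report them.

```python
import numpy as np
from scipy.signal import firwin, lfilter

FS = 128          # sampling rate in Hz (assumed)
NUM_TAPS = 129    # FIR filter length (illustrative choice)

# Brain-wave bands used in the study (Hz).
BANDS = {"theta": (4.0, 8.0), "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def band_powers(raw, fs=FS):
    """Sketch of the chain: mean/trend removal, Hamming window,
    1-30 Hz FIR bandpass, FFT, and spectral power per band."""
    x = raw - raw.mean()                          # average elimination
    t = np.arange(len(x))
    x = x - np.polyval(np.polyfit(t, x, 1), t)    # best-linear-fit removal
    x = x * np.hamming(len(x))                    # Hamming window
    taps = firwin(NUM_TAPS, [1.0, 30.0], pass_zero=False, fs=fs)
    x = lfilter(taps, 1.0, x)                     # FIR bandpass (1-30 Hz)
    spectrum = np.abs(np.fft.rfft(x)) ** 2        # power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return {name: spectrum[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}

# Example: 6 s of synthetic signal with a strong 10 Hz (alpha) component,
# matching the 6 s stimulus windows used in Stage 1.
rng = np.random.default_rng(0)
t = np.arange(6 * FS) / FS
signal = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))
powers = band_powers(signal)
```

On this synthetic segment the alpha-band power dominates, as expected for a 10 Hz component; the per-band powers are the kind of features that were concatenated into the characteristic vectors fed to the network.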

4.3.2. Data collection

The activity was performed in pairs, the participants competing with each other to reach the goal. At this stage, each participant wore the EMOTIV EPOC+ headset, so that the brain activity could be recorded using the TestBench software. During the game, an observer inserted a mini-game label that served for future signal segmentation. The complete interaction of the participants was video recorded for later analysis.

4.3.3. Data processing

The recorded EEG signal included some artifacts (e.g., electrical signals due to facial movements, eye blinking, etc.), which must be removed in order to enhance the EEG signal. To remove these artifacts we applied a pre-processing procedure consisting of average elimination and removal of the best linear fit of the signal mean; a Hamming window and a Finite Impulse Response (FIR) bandpass filter (1–30 Hz) were used. After that, a Fast Fourier Transform (FFT) was applied to each brain wave (Alpha, Beta and Theta) in order to extract signal features such as magnitude, frequency and spectral power. Further detail about the data processing procedure is provided elsewhere [16].

4.3.4. Data classification

Prior to classification, the processed EEG signals must be segmented in order to fit as input data to the neural network. Finally, we used the Patternnet neural network previously trained for signal classification. The neural network provides an output in the form (P, U), where P is the percentage of inputs classified as "pleasant" and U is the percentage of inputs classified as "unpleasant".

At the end of the game, participants were asked to answer an on-exit survey to obtain their perception of their intention of use and of their UX.

5. Results

In this section we present the results from the self-report, qualitative assessment, and neurophysiological evaluations in order to compare and validate them.

5.1. Self-reporting UX evaluation

To obtain the older adults' perception of the use and of the UX of the application by self-report, we used the TAM-like 5-point Likert scale questionnaire used in [31]. Although the questionnaire comprises several UX aspects, for this study we focused on the perception of enjoyment (pleasant) and anxiety (unpleasant).

Regarding the "Felt anxiety while using the system" (unpleasant) question, participants reported being only slightly anxious during the game (see Fig. 5; data were normalized from 0 to 100). On average, participants reported being a little anxious (9.78/100). In particular, participants P3, P4 and P7 indicated that they had been a little more anxious during the activity (37.5/100, 50/100, and 50/100, respectively).

Regarding the "I enjoyed playing Abueparty" (pleasant) question, all participants reported having enjoyed the activity and the application (AVG 99.63/100) (see Fig. 5).

Fig. 5. Results reported by participants in the self-report UX evaluation.

This result would usually be interpreted and reported as "all participants (23/23, 100%) had a pleasant experience conducting the CS activities, although some of them (3, 13.04%) also reported being a little anxious".

5.2. Qualitative analysis UX evaluation

As described in the previous section, the activity of the participants was videotaped during the study for later analysis. The videos were analyzed to assess the UX using the qualitative techniques described in [33]. This method suggests observing verbal and non-verbal expressions that may indicate the emotional states of participants. Through a quantification of the frequencies of these expressions and a qualitative analysis of them, a UX expert determined the overall user experience, classifying it as pleasant, neutral or unpleasant (see Fig. 6).

Fig. 6. Results reported by the expert observer in the qualitative analysis UX evaluation.

The qualitative evaluation concluded that most older adults mainly showed expressions and attitudes of pleasure during the activity (15/23, 65.21%). Further, some participants (P4, P7, P9 and P17) mainly showed expressions and attitudes of displeasure (4/23, 17.39%). In addition, one participant showed both types of expressions almost equally (P2, 1/23, 4.34%) and some participants were mainly inexpressive (P1, P10 and P11); all of them were classified as Neutral (4/23, 17.39%).

Participant P1, although focused, remained serious throughout the activity and showed no expression indicating either pleasant (enjoyment) or unpleasant (anxiety) emotions; he did not even interact much with his activity partner. The same was the case for participants P10 and P11, who conducted the activity together.

Participants P2 and P17 played together. They remained focused and participated in the activity, although both of them looked nervous at the beginning. There was very little interaction between them. P17 looked very serious throughout the activity; for this reason, he was considered as having unpleasant emotions. P2 was considered as being neutral.

Participants P9 and P14 played together. During the activity participant P14 was more active than participant P9. However, participant P14 criticized P9's plays, causing him to get bothered and make comments such as "It can't be, the same mini-game again!", "I do not want to play anymore", "It can't be, another wrong answer", and "Better luck next time, I better leave". Thus, although P14 enjoyed the game, P9 did not.

Participants P4 and P7 played together, and were evaluated as having an unpleasant UX during their participation. The reason for this was that they looked very serious during their participation, and it was observed that on one occasion during the "Puzzle" mini-game, participant P7 gave instructions to P4 on how to move the puzzle piece on the screen, which apparently annoyed P4, who replied in an annoyed tone "Do not tell me what to do!".

This result would usually be interpreted and reported as "most participants (15/23, 65.21%) had a pleasant experience conducting the CS activities; others (4/23, 17.39%) had an unpleasant experience, and the rest (4/23, 17.39%) were considered as having a neutral experience".

5.3. Neurophysiological UX evaluation

The results concerning the neural network classification of the EEG signals into pleasant or unpleasant emotional states indicate that most of the participants (17/23, 73.91%) showed EEG signals classified with pleasant as the predominant emotion. The remaining participants (P1, P2, P9, P11, P17, P19; 6/23, 26.09%) were classified as predominantly showing unpleasant emotions. Participants P1, P11 and P17 showed little difference between the signals classified as pleasant and unpleasant, with a slight predominance of the latter. Further, participants P2, P9 and P19 showed a higher percentage of unpleasant emotions (greater than 67%). Details of the results of the classification of EEG signals into pleasant and unpleasant emotions by the neural network are summarized in Fig. 7.

Fig. 7. Results obtained by the neural network in the neurophysiological evaluation.

This result would usually be interpreted and reported as "most participants (17/23, 73.91%) had a pleasant experience conducting the CS activities, while the remaining participants (6/23, 26.09%) had an unpleasant experience".

6. Discussion

6.1. UX of the older adults while conducting the CS activity

Regarding the meaning of these results from the point of view of the UX of older adults, as can be seen in Table 1, their user experience while conducting the CS activity using the Abueparty application could be considered pleasant (from 65.22% to 100%). Only 4–6 participants (from 17.39% to 26.09%) considered that they had an unpleasant experience. It should be noted that this value could increase up to 34.78% if those participants identified as having a neutral experience turned out to have an unpleasant one.

Table 1
Summary of results of the UX assessment studies, by emotion.

Emotion      Self-report   Qualitative analysis   EEG signal classification
Pleasant     23 (100%)     15 (65.22%)            17 (73.91%)
Unpleasant   0             4 (17.39%)             6 (26.09%)
Neutral      0             4 (17.39%)             0

Regarding the meaning of these results from the point of view of the accuracy of the proposed EEG-signal, neural-network-based classification method, it is necessary to compare and take a more detailed look at the particular results from each of the methods used. To do so, we will group the obtained emotions and analyze them as ordered triplets (SR, QA, EC), with the results from the Self-report (SR), Qualitative Analysis (QA), and EEG signal Classification (EC), as shown in the first column of Table 2.

Firstly, we could consider the results from the EEG signal classification method valid when the three methods arrive at the same result. The only case where this happens is when the three methods coincide in (P, P, P: Pleasant, Pleasant, Pleasant) emotions. Secondly, we could also consider the results from the EEG signal classification method valid when two methods arrive at the same result. On the one hand, this occurs when the result of the EEG signal classification coincides with that of the qualitative analysis (either (? P P) or (? U U)); the ? symbol in the triplet is a placeholder that stands for any value. On the other hand, it occurs when the result of the EEG signal classification coincides with that of the self-report (P ? P).

Additionally, we will also consider the results from the EEG signal classification method valid when they coincide with the results of any of the other methods. That is, this occurs when the result of the EEG signal classification coincides either with that of the qualitative analysis ((? P P) or (? U U)) or with that of the self-report (P ? P).

Accuracy results for the EEG signal classification method, considering the previous analysis, are presented next.

Table 2
Detailed summary of results of the UX assessment studies, as triplets of emotions (X, Y, Z). P = Pleasant, U = Unpleasant, N = Neutral.

Recognized emotions   N       %
P P P                 14/23   60.87
P U P                 2/23    8.70
P N P                 1/23    4.35
P U U                 2/23    8.70
P P U                 1/23    4.35
P N U                 3/23    13.04

6.1.1. Coincidence among the three methods (P P P)

As can be seen in Table 2, there are 14 cases out of 23 where the results are (P, P, P). This means that (i) participants expressed that they had a pleasant experience, (ii) the expert observer noticed that participants presented verbal expressions and gestures that denoted a pleasant experience, and (iii) the neural network classified the EEG signals as a pleasant emotional state for those participants. This represents an accuracy of 60.87% for the neural network used in the neurophysiological UX evaluation.

6.1.2. Coincidence between qualitative analysis and EEG signal classification (? P P) or (? U U)

As can be seen in Table 2, there are 16 cases where the results from the neural network coincide with those of the qualitative analysis of the expert observer (14 in (? P P) and 2 in (? U U)). In this case, it could be considered that the accuracy of the neural network for the recognition of pleasant or unpleasant emotional states is 69.56%.

6.1.3. Coincidence between self-report and EEG signal classification (P ? P)

As can be seen in Table 2, there are 17 cases where the results from the neural network coincide with those of the self-report evaluation (P ? P). In this case, it could be considered that the accuracy of the neural network for the recognition of pleasant or unpleasant emotional states is 73.91%.

6.2. Coincidence between EEG signal classification and either self-report or qualitative analysis (? P P) or (? U U) or (P ? P)

As can be seen in Table 2, there are 19 cases where the results from the neural network coincide with those of either additional method (17 in (P ? P) or (? P P), plus 2 more in (? U U)). In this case, it could be considered that the accuracy of the neural network for the recognition of pleasant or unpleasant emotional states increases to 82.61%.

6.2.1. Difference between EEG signal classification and either self-report or qualitative analysis results (P P U) and (P N U)

The remaining 4 cases, where the EEG signal classification result does not coincide with any of the other methods, represent 17.39%.

From these, there is 1 case (P P U) where the results of self-report coincide with those of qualitative analysis but not with that of EEG signal classification, and where from the qualitative analysis we know that the EEG signal classification result is wrong. In the video recording of the session of P19, s/he seems focused, active and enjoying the activity (Pleasant). From here, we are certain that in this case the neural network did not correctly recognize the emotions. A possible explanation for this could be that the EEG sig- […]

[…] between methods, as in self-report the participant stated that it was a Pleasant experience, in EEG signal classification it was classified as Unpleasant, and the observer classified their emotions as neutral (that is, the participants remained inexpressive while conducting the CS activity).

6.3. Validity of UX evaluation results vs. cost of achieving the results

As a final point of the discussion, and considering the cost and benefit of conducting a UX evaluation with any of the three proposed methods, our results suggest that it would be advisable to combine the results of the three methods in order to increase their validity. The rationale behind our suggestion includes:

i. Although a qualitative analysis by an expert observer could be the method that provides the most informed results about the UX of a participant while conducting a certain activity, it is clearly also the most effort-expensive method, and the observer may introduce a bias due to his/her own beliefs or previous experiences [10];
ii. Although self-report could be the least effort-expensive method for conducting a UX evaluation, we cannot blindly believe its results, as it is well known that participants tend to ameliorate the reported perception of the results because they feel evaluated, or because they say what they think the researcher wants to hear [10];
iii. Although a neurophysiological UX evaluation could be the most objective method for conducting a UX evaluation, we cannot blindly believe its results (as there could be cases where the neurophysiological data is erroneously captured) and completely ignore what the users could say in a self-report or during an observation [12].

Further, based on the evidence obtained from this study, we suggest conducting both the self-report and EEG signal classification evaluations first and contrasting their results (considering them the two least expensive methods). Our data suggest that the accuracy of these results would be around 70%.

Also, it would be advisable to consider the cases where the results of self-report and EEG signal classification do not coincide, and then conduct a qualitative analysis by an expert observer on these cases, not only to validate the data, but also to learn about the causes of those differences. Based on the results of this study, the evidence suggests that with an effort of 30% during the qualitative analysis (on the results that do not coincide with the other methods), the accuracy of the data could be increased up to 80%.

Finally, it should be clear that it is necessary to conduct further complementary UX studies using the three different methods, in order to appropriately validate these suggestions, no matter how promising the evidence could be.

7. Conclusions and future work

In this work we report the results of a study to evaluate the feasibility of using a previously obtained neural network to assess pleasant and unpleasant emotions of older adults. The neural network was trained to recognize known a priori emotional states from EEG signals evoked by visual stimuli using IAPS pictures.
nal was incorrectly captured, for instance, due to a bad placement The challenge in this study was that the target emotional states
of the EEG headset in the head of the participant at capture time. will be evoked by the UX of older adults while using a CS applica-
Finally, in the remaining 3 cases (P N U), we do not have enough tion, that is considering not only visual stimuli, but also auditory
information as to assert whether or not the results coincide and cognitive processing stimuli.
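The coincidence percentages reported in Sections 6.1.2–6.2.1 are simple proportions over the 23 sessions summarized in Table 2, and the suggested complementing approach sends only the disagreeing cases to the expert observer. The following Python sketch illustrates both computations; the per-session label sequences are hypothetical placeholders chosen solely to reproduce the reported match counts, not the actual data:

```python
def agreement_rate(a, b):
    """Percentage of sessions on which two evaluation outcomes coincide."""
    return 100.0 * sum(x == y for x, y in zip(a, b)) / len(a)

# Hypothetical labels ('P' = pleasant, 'U' = unpleasant), chosen only so the
# match counts mirror Table 2: 23 sessions, 17 self-report/EEG matches,
# 16 expert/EEG matches, and 19 sessions matching at least one other method.
eeg         = ['P'] * 21 + ['U'] * 2              # neural-network classification
self_report = ['P'] * 17 + ['U'] * 4 + ['P'] * 2  # coincides with EEG in 17 cases
expert      = ['P'] * 14 + ['U'] * 9              # coincides with EEG in 16 cases

qual_vs_eeg = agreement_rate(expert, eeg)         # 16/23, ~69.56% (Section 6.1.2)
self_vs_eeg = agreement_rate(self_report, eeg)    # 17/23, ~73.91% (Section 6.1.3)

# Sessions where EEG coincides with at least one other method (Section 6.2).
either = sum((s == e) or (x == e)
             for s, x, e in zip(self_report, expert, eeg))
either_rate = 100.0 * either / len(eeg)           # 19/23, ~82.61%

# Triage step of the suggested complementing approach: the expert observer
# reviews only the sessions where self-report and EEG classification disagree.
to_review = [i for i, (s, e) in enumerate(zip(self_report, eeg)) if s != e]
review_effort = 100.0 * len(to_review) / len(eeg) # 6/23, ~26% of the sessions
```

With these counts, the triage step sends roughly a quarter of the sessions to the expert observer, consistent with the approximately 30% qualitative-analysis effort quoted above.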
The results from our study suggest that it is feasible to assess the UX of older adults while they use a CS application using the neural network trained to recognize pleasant and unpleasant emotions from EEG signals. The recognition accuracy of the proposed method ranges from 60.87% to 82.61%, which is in the same range as the results of previous related studies [23–26,28].

Also, our study results suggest that it is feasible to conduct UX evaluations following a complementing approach among the 3 presented methods, in order to increase the accuracy of the results while reducing the costs associated with actually conducting the three studies. To do so, researchers may conduct complete (least cost-demanding) self-report and neurophysiological UX evaluations, contrast their results, and conduct the (most cost-demanding) qualitative analysis by an expert observer only on the results that do not coincide between the two former evaluations (a possible reduction of 70% in the effort required to conduct the latter evaluation).

In both cases, it is necessary to conduct further, and larger, complementary UX studies using the three different methods, in order to appropriately validate these results.

For this reason, as future work, we propose to retrain the neural network using not only EEG signals evoked by visual stimuli, but also signals evoked by auditory and cognitive processing stimuli. Then, we will use the retrained neural network to recognize pleasant and unpleasant emotional states in order to evaluate specific segments of interaction (for instance, by single mini-game). Furthermore, we propose to reuse these techniques on other CS and rehabilitation applications that we have developed, to validate the proposed complementing approach.

Conflict of interest

None declared.

Acknowledgements

We acknowledge the support of UABC, in the form of Programa de Servicio Social 212 and internal project number 231, and of CONACYT, in the form of scholarship number 538130 for the third author. We also acknowledge the elderly participants from Ensenada and Mexicali, B.C., Mexico for their support and participation in the study.

References

[1] R.C. Petersen, R. Doody, A. Kurz, R.C. Mohs, J.C. Morris, P.V. Rabins, K. Ritchie, M. Rossor, L. Thal, B. Winblad, Current concepts in mild cognitive impairment, Arch. Neurol. 58 (12) (2001) 1985–1992, http://dx.doi.org/10.1001/archneur.58.12.1985.
[2] P. Rashidi, A. Mihailidis, A survey on ambient-assisted living tools for older adults, IEEE J. Biomed. Health Inform. 17 (3) (2013) 579–590, http://dx.doi.org/10.1109/JBHI.2012.2234129.
[3] W. Ijsselsteijn, H.H. Nap, Y. de Kort, K. Poels, Digital game design for elderly users, in: Proceedings of the 2007 Conference on Future Play, ACM, 2007, pp. 17–22, http://dx.doi.org/10.1145/1328202.1328206.
[4] C. Lallemand, G. Gronier, V. Koenig, User experience: a concept without consensus? Exploring practitioners' perspectives through an international survey, Comput. Human Behav. 43 (2015) 35–48, http://dx.doi.org/10.1016/j.chb.2014.10.048.
[5] M. Hassenzahl, User experience (UX): towards an experiential perspective on product quality, in: Proceedings of the 20th International Conference of the Association Francophone d'Interaction Homme-Machine, ACM, 2008, pp. 11–15, http://dx.doi.org/10.1145/1512714.1512717.
[6] E.L.-C. Law, V. Roto, M. Hassenzahl, A.P. Vermeeren, J. Kort, Understanding, scoping and defining user experience: a survey approach, in: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, 2009, pp. 719–728, http://dx.doi.org/10.1145/1518701.1518813.
[7] H.L. O'Brien, E.G. Toms, What is user engagement? A conceptual framework for defining user engagement with technology, J. Am. Soc. Inform. Sci. Technol. 59 (6) (2008) 938–955, http://dx.doi.org/10.1002/asi.20801.
[8] M. Hassenzahl, N. Tractinsky, User experience – a research agenda, Behav. Inform. Technol. 25 (2) (2006) 91–97, http://dx.doi.org/10.1080/01449290500330331.
[9] A.P. Vermeeren, E.L.-C. Law, V. Roto, M. Obrist, J. Hoonhout, K. Väänänen-Vainio-Mattila, User experience evaluation methods: current state and development needs, in: Proceedings of the 6th Nordic Conference on Human–Computer Interaction: Extending Boundaries, ACM, 2010, pp. 521–530, http://dx.doi.org/10.1145/1868914.1868973.
[10] L. Arhippainen, M. Tähti, Empirical evaluation of user experience in two adaptive mobile application prototypes, in: Proceedings of the 2nd International Conference on Mobile and Ubiquitous Multimedia, 2003, pp. 27–34.
[11] R.L. Mandryk, K.M. Inkpen, T.W. Calvert, Using psychophysiological techniques to measure user experience with entertainment technologies, Behav. Inform. Technol. 25 (2) (2006) 141–158, http://dx.doi.org/10.1080/01449290500331156.
[12] G.L. Ahern, G.E. Schwartz, Differential lateralization for positive and negative emotion in the human brain: EEG spectral analysis, Neuropsychologia 23 (6) (1985) 745–755, http://dx.doi.org/10.1016/0028-3932(85)90081-8.
[13] D. Sammler, M. Grigutsch, T. Fritz, S. Koelsch, Music and emotion: electrophysiological correlates of the processing of pleasant and unpleasant music, Psychophysiology 44 (2) (2007) 293–304, http://dx.doi.org/10.1111/j.1469-8986.2007.00497.x.
[14] G.G. Knyazev, J.Y. Slobodskoj-Plusnin, A.V. Bocharov, Gender differences in implicit and explicit processing of emotional facial expressions as revealed by event-related theta synchronization, Emotion 10 (5) (2010) 678–687, http://dx.doi.org/10.1037/a0019175.
[15] D. Mathersul, L.M. Williams, P.J. Hopkinson, A.H. Kemp, Investigating models of affect: relationships among EEG alpha asymmetry, depression, and anxiety, Emotion 8 (4) (2008) 560–572, http://dx.doi.org/10.1037/a0012811.
[16] I. Carrillo, V. Meza-Kubo, A.L. Morán, G. Galindo, E. García-Canseco, Processing EEG signals towards the construction of a user experience assessment method, in: Ambient Intelligence for Health, Lecture Notes in Computer Science, vol. 9456, Springer, 2015, pp. 281–292, http://dx.doi.org/10.1007/978-3-319-26508-7_28.
[17] B.L. Fredrickson, M.F. Losada, Positive affect and the complex dynamics of human flourishing, Am. Psychologist 60 (7) (2005) 678–686, http://dx.doi.org/10.1037/0003-066X.60.7.678.
[18] E. Harmon-Jones, P.A. Gable, C.K. Peterson, The role of asymmetric frontal cortical activity in emotion-related phenomena: a review and update, Biol. Psychol. 84 (3) (2010) 451–462, http://dx.doi.org/10.1016/j.biopsycho.2009.08.010.
[19] I. Winkler, M. Jäger, V. Mihajlovic, T. Tsoneva, Frontal EEG asymmetry based classification of emotional valence using common spatial patterns, World Acad. Sci. Eng. Technol. 45 (2010) 373–378.
[20] R.J. Davidson, P. Ekman, C.D. Saron, J.A. Senulis, W.V. Friesen, Approach-withdrawal and cerebral asymmetry: emotional expression and brain physiology: I, J. Personality Social Psychol. 58 (2) (1990) 330–341.
[21] R.J. Davidson, N.A. Fox, Frontal brain asymmetry predicts infants' response to maternal separation, J. Abnormal Psychol. 98 (2) (1989) 127.
[22] M. Kostyunina, M. Kulikov, Frequency characteristics of EEG spectra in the emotions, Neurosci. Behav. Physiol. 26 (4) (1996) 340–343, http://dx.doi.org/10.1007/BF02359037.
[23] K. Ishino, M. Hagiwara, A feeling estimation system using a simple electroencephalograph, in: IEEE International Conference on Systems, Man and Cybernetics, 2003, vol. 5, IEEE, 2003, pp. 4204–4209, http://dx.doi.org/10.1109/ICSMC.2003.1245645.
[24] P.C. Petrantonakis, L.J. Hadjileontiadis, Emotion recognition from EEG using higher order crossings, IEEE Trans. Inform. Technol. Biomed. 14 (2) (2010) 186–197, http://dx.doi.org/10.1109/TITB.2009.2034649.
[25] Q. Zhang, M. Lee, Analysis of positive and negative emotions in natural scene using brain activity and GIST, Neurocomputing 72 (4) (2009) 1302–1306, http://dx.doi.org/10.1016/j.neucom.2008.11.007.
[26] J. Chai, Y. Ge, Y. Liu, W. Li, L. Zhou, L. Yao, X. Sun, Application of frontal EEG asymmetry to user experience research, in: Engineering Psychology and Cognitive Ergonomics, Lecture Notes in Computer Science, vol. 8532, Springer, 2014, pp. 234–243, http://dx.doi.org/10.1007/978-3-319-07515-0_24.
[27] H. Gürkök, G. Hakvoort, M. Poel, Evaluating user experience with respect to user expectations in brain-computer interface games, in: Proceedings of the 5th International Brain-Computer Interface Conference, BCI 2011, Verlag der Technischen Universität Graz, 2011, pp. 348–351.
[28] Y.-P. Lin, C.-H. Wang, T.-P. Jung, T.-L. Wu, S.-K. Jeng, J.-R. Duann, J.-H. Chen, EEG-based emotion recognition in music listening, IEEE Trans. Biomed. Eng. 57 (7) (2010) 1798–1806, http://dx.doi.org/10.1109/TBME.2010.2048568.
[29] V. Meza-Kubo, A. Morán, Abueparty: an everyday entertainment system for the cognitive wellness of the worried-well, in: 5th International Symposium of Ubiquitous Computing and Ambient Intelligence, UCAMI, 2011.
[30] H. Jasper, Report of the committee on methods of clinical examination in electroencephalography, Electroencephalogr. Clin. Neurophysiol. 10 (2) (1958) 370–375.
[31] V. Meza-Kubo, A.L. Morán, UCSA: a design framework for usable cognitive systems for the worried-well, Personal Ubiquitous Comput. 17 (6) (2013) 1135–1145, http://dx.doi.org/10.1007/s00779-012-0554-x.
[32] J. Cockrell, M. Folstein, Mini-mental state examination (MMSE), Psychopharmacol. Bull. 24 (4) (1988) 689–692.
[33] A.L. Morán, V. Meza-Kubo, Evaluating the user experience of a cognitive stimulation tool through elders' interactions, in: Intelligent Environments (Workshops), IOS Press, 2012, pp. 66–77, http://dx.doi.org/10.3233/978-1-61499-080-2-66.
