
11th IFAC Symposium on Robot Control
August 26-28, 2015. Salvador, BA, Brazil

Available online at www.sciencedirect.com

ScienceDirect
IFAC-PapersOnLine 48-19 (2015) 136-141
Development of a Human Machine Interface for Control of Robotic Wheelchair and Smart Environment

Richard J. M. G. Tello, Alexandre L. C. Bissoli, Flavio Ferrara, Sandra Muller, Andre Ferreira, Teodiano F. Bastos-Filho

Post-Graduate Program in Electrical Engineering (PPGEE), Federal University of Espirito Santo (UFES). Av. Fernando Ferrari 514, Vitoria, Brazil (e-mail: richard@ele.ufes.br; alexandre-bissoli@hotmail.com; andrefer@ele.ufes.br; teodiano.bastos@ufes.br)

Politecnico di Milano: Piazza Leonardo da Vinci, 20133, Milano, Italy (e-mail: femferrara@gmail)

Electrical Engineering Department, Federal Institute of Espirito Santo (IFES). Av. Vitoria, 1729, 29040-780, Vitoria, Brazil (e-mail: sandra.muller@ifes.edu.br)
Abstract: In this work, we address the problem of integrating a robotic wheelchair into a smart environment. This approach allows people with disabilities to control home appliances of the environment using a Human Computer Interface (HCI) based on different biological signals. The home appliances include TV, radio, lights/lamp and fan. Three control paradigms using surface Electromyography (sEMG), Electrooculography (EOG) and Electroencephalography (EEG) signals were used. These signals are captured through a biosignal acquisition system. Three sub-paradigms for the sEMG/EOG analyses were defined: moving the eyes horizontally (left/right), raising the brow and prolonged clenching. The navigation of the wheelchair, in turn, is executed through a Steady-State Visually Evoked Potentials (SSVEP)-BCI. Each stage of our proposed system showed good performance for most subjects. Volunteers were recruited to participate in the study and were distributed in two groups (subjects for home appliances and subjects for SSVEP-BCI). The average accuracy was 95% for the prolonged clench approach, 85% for raising the brow, and 93% for moving the eyes. The Multivariate Synchronization Index (MSI) was used for feature extraction from the EEG signals. The flickering frequencies were 8.0 Hz (top), 11.0 Hz (right), 13.0 Hz (bottom) and 15.0 Hz (left). Results from this approach showed that classification accuracy varies in the range of 45-77% among subjects using a window length of 1 s.

© 2015, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.

Keywords: SSVEP-BCI, Robotic Wheelchair, EEG, sEMG, EOG, Smart Environment.
1. INTRODUCTION

A Human Machine Interface (HMI) is a platform that allows interaction between a user and an automatized system. A Brain-Computer Interface (BCI), in turn, is a technology that provides humans with direct communication between the user's brain signals and a computer, generating an alternative channel of communication that does not involve the traditional pathways of muscles and nerves (Wolpaw et al. (2000)). Among current BCIs, a noninvasive brain imaging method commonly employed is EEG, which has the advantages of lower risk and of being inexpensive and easily measurable (Chen et al. (2014) and Kelly et al. (2005)). Further, EEG provides electrical signals of high temporal resolution generated by neuronal dynamics, recorded from the scalp. A BCI therefore records brain signals, and EEG signal features are then translated into artificial outputs or commands that act in the real world. BCI is a potential alternative and augmentative communication (AAC) and control solution for people with severe motor disabilities (Wolpaw et al. (2000), Kelly et al. (2005) and Gao et al. (2003)).

One kind of BCI, named SSVEP-BCI, uses the excitation of the retina of the eye by a stimulus at a certain frequency, making the brain generate electrical activity at the same frequency and its multiples or harmonics. This stimulus produces a stable Visual Evoked Potential (VEP) of small amplitude termed the Steady-State Visually Evoked Potential (SSVEP) of the human visual system. To produce such potentials, the user gazes at one flickering stimulus oscillating at a certain frequency (He (2013)). In a typical SSVEP-BCI system, several stimuli flickering at different frequencies are presented to the user. The subject

⋆ The authors thank FAPES (a foundation of the Secretary of Science and Technology of the State of Espirito Santo, Brazil), CAPES (a foundation of the Brazilian Ministry of Education) and CNPq (The Brazilian National Council for Scientific and Technological Development) for the support given to this work.
2405-8963 © 2015, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.
Peer review under responsibility of International Federation of Automatic Control.
10.1016/j.ifacol.2015.12.023

overtly directs attention to one of the stimuli by changing his/her gaze attention (Zhang et al. (2010)). This kind of SSVEP-BCI was evaluated in this study and is commonly called dependent, since muscle activities, such as gaze shifting, are necessary.

One of the first studies related to the control of smart home applications using biological signals, such as EEG, was reported in (Holzner et al. (2009)). In that work, a BCI based on the P300 approach is used for TV channel switching, for opening and closing doors and windows, and for navigation and conversation, but all in a controlled environment of a virtual reality (VR) system. Twelve subjects were evaluated and an average of 67.51% in the classification for all subjects and all decisions was achieved. Another study (Ou et al. (2012)) also used VR to create a controlled environment. In that work, the term Brain computer interface-based Smart Environmental Control System (BSECS) was introduced and a BCI technique with Universal Plug and Play (UPnP) home networking for smart house applications was proposed. Also, an architecture was designed where the air conditioner and lights/lamp can be successfully and automatically adjusted in real time based on changes in the cognitive state of users.

A hybrid BCI for improving the usability of a smart home control was reported in (Edlinger and Guger (2012)). In that study, P300 and SSVEP approaches were used. Results indicated that P300 is very suitable for applications with several controllable devices, where a discrete control command is desired. However, that study also reports that SSVEP is more suitable if a continuous control signal is needed and the number of commands is rather limited. A simple threshold criterion was used to determine if the user is looking at the flickering light. All the different commands were summarized in 7 control masks: a light mask, a music mask, a phone mask, a temperature mask, a TV mask, a move mask and a go-to mask. That study was also tested in VR. A similar approach using a hybrid BCI paradigm based on P300 and SSVEP is reported in (Wang et al. (2014)), where a Canonical Correlation Analysis (CCA) technique was applied for the SSVEP detection. Applications involving robotic wheelchairs and SSVEP signals were also reported in (Muller et al. (2011); Xu et al. (2012); Diez et al. (2013); Singla et al. (2014)).

A recent study using SSVEP and P300 approaches for wheelchair control was reported in (Li et al. (2013)). On the other hand, a hybrid BCI based on SSVEP and a visual motion stimulus was applied to a robotic wheelchair in (Punsawad and Wongsawat (2013)). Finally, studies that combine motor imagery (MI) and SSVEPs to control a real wheelchair were reported in (Bastos et al. (2011) and Cao et al. (2014)).

In this work, we address the problem of integrating a wheelchair into a smart environment. Due to the variety of disabilities that benefit from assistive technologies, an optimal approach could allow the user to choose the preferred control paradigm according to the degree of his/her disability. The system allows the handling of various devices in a real environment, e.g. a room, by means of biological signals controlled from a robotic wheelchair. We present this system with three kinds of assistive control paradigms using, respectively, muscle (sEMG), EOG and brain signals (EEG), as shown in Fig. 1.

Fig. 1. Levels of capacity of our proposed system. (User interface and levels of capacity: sEMG, signal 1: face muscles, clenching and raising brow; EOG, signal 2: ocular globe, left and right; EEG, signal 3: SSVEP.)

2. METHODS

Different stages were addressed as follows:

2.1 Assistive System

In the context of smart environments applied to assistive technologies, this paper proposes an input interface allowing people with disabilities to turn appliances on and off without help, from the wheelchair. Part of this work was to design and build a smart box that allows controlling up to four appliances in an environment, including a TV set, radio, lamp/lights and fan. sEMG, EOG and EEG signals were recorded using a biological signal acquisition device. The user can issue commands from the wheelchair; the signal is then transmitted through RF to the smart box, where the corresponding equipment is finally operated. We used RF communication to turn the appliances on and off remotely. The RF transmitter and receiver work at a frequency of 433 MHz, controlled by an Arduino Mega microcontroller. The communication is unidirectional, that is, only the transmitter sends data to the receiver.

On the other hand, an SSVEP-BCI was used to control the navigation of the wheelchair. The wheelchair is equipped with a small 10.1″ display that exhibits the Control Interface (CI). In addition, four small boxes, each containing four white Light-Emitting Diodes (LEDs), were placed on the four sides of this display as visual stimuli for the generation of evoked potentials.

The CI uses the display to visualize a menu, through which the user can navigate or operate the desired device. It is worth noting that this menu is dynamic, as device options can change according to the current room, or be customized by the user before running the system. Moreover, for some devices we provide additional operations that can be performed using a sub-menu. For example, after turning on the TV, the display shows a sub-menu with options such as Channel Up, Channel Down, Volume Up and Volume Down. It is always possible to go back to the main menu and turn off the system.

The CI was presented in (Ferrara et al. (2015)). It offers an interface of procedures that can be accessed through Remote Procedure Call (RPC), that is, one for turning the interface on, one for turning it off, and one for transmitting

logical commands encoded as integers. In (Ferrara et al. (2015)), it has been demonstrated that choosing these three kinds of paradigms is optimal to operate the menu. In this work, we used a novel method, named Utility (Dal Seno et al. (2010)), to assess the overall performance of the system while using the preferred control paradigm.

2.2 sEMG and EOG control

For individuals with disabilities that do not affect voluntary control of facial muscles, we propose a system based on sEMG and EOG signals. With the aim of ensuring high reliability, we defined the three paradigms aforementioned: moving the eyes horizontally (left/right), raising the brow and prolonged clenching. For processing the sEMG/EOG signals, the signal amplitude is compared with a predefined threshold value. Aspects involving the width and duration of the signal components were taken into account in the classification decision. The signals corresponding to the left-right eye movements were dominant in the horizontal axes; for those signals, we found very prolonged amplitude and duration, opposite signals and high amplitude. Thus, the user is able to control the options by moving the eyes, whereas raising the brow is used to confirm the highlighted option. Finally, the prolonged clench is used for activation and deactivation of the SSVEP-BCI. Fig. 2 shows a summary of the three paradigms allocated for the sEMG/EOG approaches.

Fig. 2. Biological signal transducer module for sEMG/EOG signals.

Each individual performed the tasks regarding the sEMG/EOG in a different, personal fashion. Thus, it is quite optimistic to expect that a new user is able to achieve optimal performance from the first trial. However, our tests revealed that, for most users, just a very brief adaptation period was required to figure out the best way to perform the gesture. This adaptation period consisted of 2 or 3 minutes during which the user tries to execute a command and observes visual feedback on an LCD screen. Although this step is not required, it is recommended, considering that it can accelerate the development of the user's control skills. Online experiments with eight healthy subjects were performed. The subjects were seated on the wheelchair, in front of the display where the program was running, and asked to perform thirty repetitions per command, resulting in 90 trials per subject. In a control panel, the levels of clenching and raising the brow are expressed as continuous values between 0 and 1, so it is sufficient to set predefined thresholds to detect the two movements. We opted to trigger an action when the corresponding value is greater than 0.8 and the other one is less than 0.2. These thresholds were obtained empirically; we found them to provide good results with most subjects. For detecting eye movements, our system provides a discrete number with values 0 and 1. We analyzed the classification accuracy over 720 total trials. Hence, the average accuracy and trial duration using Utility were computed.

2.3 Estimation of MSI for SSVEP-BCI

This stage is in charge of controlling the navigation of the robotic wheelchair. Five subjects (three males and two females), with ages from 21 to 27 years old, were recruited to participate in this study (average age: 25.6; Standard Deviation (STD): 2.61). The research was carried out in compliance with the Helsinki declaration, and the experiments were performed according to the rules of the ethics committee of UFES/Brazil, under registration number CEP-048/08.

For the development of the SSVEP-BCI, 12 channels of EEG with the reference at the left ear lobe were recorded at 600 samples/s, with a 1 to 100 Hz pass-band. The ground electrode was placed on the forehead. The EEG electrode placements were based on the International 10-20 System. The electrodes used were: P7, PO7, PO5, PO3, POz, PO4, PO6, PO8, P8, O1, O2 and Oz. The equipment used for EEG signal recording was a BrainNet-36. The timing of the four flickering LEDs was precisely controlled by a microcontroller (PIC18F4550, Microchip Technology Inc., USA) with 50/50% on-off duty cycles, and frequencies of 8.0 Hz (top), 11.0 Hz (right), 13.0 Hz (bottom) and 15.0 Hz (left). To send commands to the wheelchair, the user has to fix his/her attention on one of the flickering frequencies.

The EEG data are segmented and windowed in window lengths (WL) of 1 s with an overlap of 50%. Then, spatial filtering is applied using a Common Average Reference (CAR) filter and a band-pass filter between 3-60 Hz for the twelve channels. Several studies (Vialatte et al. (2010); Pastor et al. (2003)) confirm that visual evoked potentials are generated with greater intensity over the occipital area of the cortex. Thus, the twelve electrodes were used in the initial stage only for the application of the CAR spatial filter. According to our observations, applying this spatial filter to the twelve electrodes improves the classification performance when selecting the O1, O2 and Oz electrodes. Based on that fact, we have evaluated the detection of SSVEPs using these three channels as the input vector for the feature extractor after the filtering process. The Multivariate Synchronization Index (MSI) was used for feature extraction. A brief description of this technique is given below.

MSI is a novel method that estimates the synchronization between the actual mixed signals and reference signals as an index for recognizing the stimulus frequency. (Zhang et al. (2014)) proposed the use of an S-estimator as the index, which is based on the entropy of the normalized eigenvalues of the correlation matrix of multivariate signals. The autocorrelation matrices C_{11} and C_{22} for X and Y_i, respectively, and the cross-correlation matrices C_{12} and C_{21} for X and Y_i can be obtained as follows (Tello et al. (2014)), where i refers to the number of targets:

C_{11} = (1/N) X X^T    (1)
C_{22} = (1/N) Y_i Y_i^T    (2)
C_{12} = (1/N) X Y_i^T    (3)
C_{21} = (1/N) Y_i X^T    (4)

A correlation matrix C_i can be constructed as

C_i = \begin{bmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{bmatrix}    (5)

The internal correlation structure of X and Y_i, contained in the matrices C_{11} and C_{22}, respectively, is irrelevant for the detection of the stimulus frequency (Carmeli et al. (2005)). It can be removed by constructing a linear transformation matrix

U = \begin{bmatrix} C_{11}^{-1/2} & 0 \\ 0 & C_{22}^{-1/2} \end{bmatrix}    (6)

where C_{11}^{1/2} C_{11}^{1/2} = C_{11} and C_{22}^{1/2} C_{22}^{1/2} = C_{22}. Applying the transformation \tilde{C}_i = U C_i U results in a transformed correlation matrix of size P \times P, where P = M + 2H (Carmeli et al. (2005)). The eigenvalues \lambda_{i1}, \lambda_{i2}, ..., \lambda_{iP} of \tilde{C}_i, normalized as \lambda'_{im} = \lambda_{im} / \sum_{m=1}^{P} \lambda_{im} for m = 1, 2, ..., P, can be used to evaluate the synchronization index S_i for matrix Y_i as

S_i = 1 + \frac{\sum_{m=1}^{P} \lambda'_{im} \log(\lambda'_{im})}{\log(P)}    (7)

see (Zhang et al. (2014)). Using S_1, S_2, ..., S_K computed for the stimulus frequencies f_1, f_2, ..., f_K, the MSI output can be estimated as

S = \max_{1 \le i \le K} S_i    (8)

In this case, the performance of the SSVEP-BCI system was assessed with Shannon's Information Transfer Rate (ITR); see details in (Vialatte et al. (2010)).

3. SYSTEM ARCHITECTURE

sEMG, EOG and EEG signals are captured by the signal acquisition equipment, which has inputs for electromyographic and electroencephalographic signals. Through a computational sniffer, the biological signals are read from the equipment; these signals are then transmitted to and processed in an embedded computer by algorithms developed in Matlab. This embedded computer in the wheelchair has the following specifications: Mini ITX motherboard, 3.40 GHz Intel Core i5 processor, and 4 GB RAM. The data are analyzed in the main routine, and the prolonged clench signal works as a switch that determines the operation of either the navigation or the smart environment control. Raising the brow or moving the eyes are used to control the devices inside the house. The navigation of the wheelchair is executed through the SSVEP-BCI approach, which is based on commands used for directional control of the wheelchair. The LED on the top side (8 Hz) indicates forward, the LED on the right indicates the movement to the right turn (11 Hz), the LED on the left

in the lateral canthus of the eye (F7 and F8 positions according to the 10-20 standard) in order to monitor the frontal lobe. In addition, Fig. 4 shows the general block diagram of the proposed system.

Fig. 3. (a) Electrode placement location using the 10-20 system for our system; (b) a user using the wheelchair.

4. EXPERIMENTAL RESULTS

Table 1 summarizes the classification outcome for the muscle movement tasks. Each movement is treated independently because a subject may be able to execute some movements much better (or worse) than others. It can be noticed that the average accuracy is remarkably higher than random guessing and often expresses a very good performance of the classifier (between 85% and 95%).

Nevertheless, as is often the case in assistive technologies, some users have trouble realizing a certain movement, which precludes them from giving the corresponding command. For example, subject 1 was not able to succeed while raising the brow. Since the computation of Utility depends on which menu option the user wants to operate, we consider the mean Utility as the arithmetic mean of the Utility in a menu with four options and express the result in bits per minute to facilitate the comparison with the ITR.

A critical advantage of control by means of muscle movement is the speed of recognition. It can be seen that an operation is triggered after a very small amount of time. Fig. 5 represents the results of the online tests using facial expressions. Each point represents an expression made by a user. Points in the top-left corner indicate better performance. The size of the dots represents the value of Utility. This could lead to a fast and efficient control paradigm, especially after a period of training and self-improvement.

For the evaluation of the EEG control, each volunteer fixes his/her attention on each stimulus (during 30 seconds), and the results are shown in Table 2. The results from the SSVEP-BCI were acceptable considering that subjects never used
side (15 Hz) indicates the movement to the left turn and, a BCI and neither had previous training. The highest value
finally, the LED on the bottom (13 Hz) indicates stopping. of accuracy was for the subject 3 with 77% and 51.27
Fig. 3 shows all electrode placement locations and a user bits/min of ITR achieved from the average value of the
on the wheelchair. The sEMG/EOG electrodes were placed accuracy.
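The frequency-detection pipeline of Eqs. (2)-(8) can be sketched in NumPy as follows. This is our own illustrative sketch, not the implementation used in the paper; function names are ours, and the EEG window X (M channels x N samples) and references Y_i (2H sine/cosine rows for H harmonics) are assumed to be zero-mean.

```python
import numpy as np

def _inv_sqrt(A):
    """Inverse matrix square root of a symmetric positive-definite matrix."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def sync_index(X, Y):
    """Synchronization index S_i (Eq. (7)) between EEG X (M x N samples)
    and the reference set Y (2H x N samples) of one stimulus frequency."""
    N = X.shape[1]
    C11, C22 = X @ X.T / N, Y @ Y.T / N        # auto-correlation blocks, cf. Eq. (2)
    C12, C21 = X @ Y.T / N, Y @ X.T / N        # cross-correlation blocks, Eqs. (3)-(4)
    C = np.block([[C11, C12], [C21, C22]])     # Eq. (5)
    U = np.block([[_inv_sqrt(C11), np.zeros_like(C12)],
                  [np.zeros_like(C21), _inv_sqrt(C22)]])   # Eq. (6)
    lam = np.clip(np.linalg.eigvalsh(U @ C @ U.T), 1e-12, None)
    lam /= lam.sum()                           # normalized eigenvalues of the P x P matrix
    return 1.0 + np.sum(lam * np.log(lam)) / np.log(lam.size)  # Eq. (7)

def detect_frequency(X, freqs, fs, harmonics=2):
    """Eq. (8): pick the stimulus frequency whose references
    synchronize best with the EEG window X."""
    t = np.arange(X.shape[1]) / fs
    refs = [np.vstack([g(2 * np.pi * h * fr * t)
                       for h in range(1, harmonics + 1)
                       for g in (np.sin, np.cos)])
            for fr in freqs]
    scores = [sync_index(X, Y) for Y in refs]
    return freqs[int(np.argmax(scores))]
```

For a 1 s window sampled at, say, 256 Hz, `detect_frequency(X, [8, 11, 13, 15], fs=256)` returns the LED frequency with the highest S_i; S_i is close to 0 for uncorrelated signals and approaches 1 with perfect synchronization.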


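The ITR figures quoted above are consistent with the standard Wolpaw formula for N equiprobable commands and one selection per analysis window. The snippet below is our own illustration of that computation (the paper itself refers the reader to Vialatte et al. (2010) for details); small deviations from the published values are rounding effects.

```python
import math

def itr_bits_per_min(p, n=4, window_s=1.0):
    """Wolpaw information transfer rate, scaled to bits per minute.
    p: classification accuracy, n: number of commands,
    window_s: time needed for one selection, in seconds."""
    if p <= 1.0 / n:
        return 0.0          # at or below chance level carries no information
    bits = math.log2(n) + p * math.log2(p)
    if p < 1.0:
        bits += (1.0 - p) * math.log2((1.0 - p) / (n - 1))
    return bits * 60.0 / window_s
```

For example, `itr_bits_per_min(0.77)` with four commands and a 1 s window gives about 51.4 bits/min, in line with the 51.27 bits/min reported for subject 3.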
Fig. 4. The system paradigm of our multimodal system.
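The mode-switching logic described above (a prolonged clench toggling between wheelchair navigation and smart-environment control, SSVEP commands for steering, and brow/eye movements for home devices) can be mirrored by a small state machine. This is purely our illustrative sketch; the event and command names are invented and are not the authors' software interface.

```python
from enum import Enum

class Mode(Enum):
    NAVIGATION = "navigation"          # wheelchair driven by SSVEP commands
    SMART_ENV = "smart_environment"    # home devices driven by sEMG/EOG

# Frequencies of the four flickering LEDs and their directional commands.
LED_COMMANDS = {8: "forward", 11: "turn_right", 13: "stop", 15: "turn_left"}

class Controller:
    """Toy dispatcher mirroring the switching logic of the proposed system."""

    def __init__(self):
        self.mode = Mode.NAVIGATION

    def on_event(self, event, value=None):
        if event == "prolonged_clench":      # sEMG switch: toggle operation mode
            self.mode = (Mode.SMART_ENV if self.mode is Mode.NAVIGATION
                         else Mode.NAVIGATION)
            return f"mode:{self.mode.value}"
        if self.mode is Mode.NAVIGATION and event == "ssvep":
            # value is the detected stimulus frequency in Hz
            return LED_COMMANDS.get(value, "none")
        if self.mode is Mode.SMART_ENV and event in ("raising_brow", "moving_eyes"):
            return f"device_command:{event}"  # EOG/sEMG controls home appliances
        return "ignored"
```

In navigation mode SSVEP frequencies map directly to wheelchair commands, while brow and eye events are ignored; after a prolonged clench the mapping inverts, so a misdetected SSVEP cannot actuate the chair while the user is operating home devices.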

Fig. 5. Comparison between control of the system using different muscle movements and the speed of recognition.

Table 2. Accuracy results for EEG control using a WL of 1 s.

SSVEP Frequency | s1 | s2 | s3 | s4 | s5
8 Hz | 0.88 | 0.82 | 0.90 | 0.68 | 0.58
11 Hz | 0.80 | 0.70 | 0.78 | 0.92 | 0.52
13 Hz | 0.55 | 0.73 | 0.82 | 0.67 | 0.42
15 Hz | 0.54 | 0.42 | 0.58 | 0.56 | 0.27
Mean Acc. | 0.70 | 0.67 | 0.77 | 0.71 | 0.45
ITR [bits/min] | 38.29 | 33.49 | 51.27 | 39.71 | 7.90

5. CONCLUSION

In this paper we presented a multimodal system capable of employing different biological signals to control several home appliances and the navigation of a robotic wheelchair. Three control paradigms using sEMG, EOG and EEG signals were introduced, and each stage showed good performance for most subjects. Our strategy provides reliability in terms of classification results and safety of wheelchair control. To evaluate the sEMG/EOG system, eight subjects participated in the experiments. The average accuracy was 95% for the prolonged clench approach, 85% for raising the brow, and 93% for moving the eyes. On the other hand, five subjects participated in the study of the SSVEP-BCI. The total classification values vary in the range of 45-77% among subjects, considering a WL of 1 s. Our results could improve even further, since it is widely known that accuracy increases with longer time windows (Tello et al. (2014)); this suggests that as more information is processed, the feature extractor can detect the visual evoked potentials with more precision.

ACKNOWLEDGEMENTS

The authors wish to thank all the volunteers for their participation in the experiments.

REFERENCES

Bastos, T., Muller, S., Benevides, A., and Sarcinelli-Filho, M. (2011). Robotic wheelchair commanded by SSVEP, motor imagery and word generation. In IEEE Engineering in Medicine and Biology Society (EMBC).
Cao, L., Li, J., Ji, H., and Jiang, C. (2014). A hybrid brain computer interface system based on the neurophysiological protocol and brain-actuated switch for wheelchair control. Journal of Neuroscience Methods, 229, 33-43.
Carmeli, C., Knyazeva, M.G., Innocenti, G.M., and Feo, O.D. (2005). Assessment of EEG synchronization based on state-space analysis. NeuroImage, 25, 339-354.


Chen, C.H., Ho, M.S., Shyu, K.K., Hsu, K.C., Wang, K.W., and Lee, P.L. (2014). A noninvasive brain computer interface using visually-induced near-infrared spectroscopy responses. Neur. Letters, 580, 22-26.
Dal Seno, B., Matteucci, M., and Mainardi, L. (2010). The Utility Metric: A Novel Method to Assess the Overall Performance of Discrete Brain-Computer Interfaces. Neural Systems and Rehabilitation Engineering, IEEE Transactions on, 18(1), 20-28.
Diez, P.F., Muller, S.M.T., Mut, V.A., Laciar, E., Avila, E., Bastos-Filho, T.F., and Sarcinelli-Filho, M. (2013). Commanding a robotic wheelchair with a high-frequency steady-state visual evoked potential based brain-computer interface. Med Eng Phys.
Edlinger, G. and Guger, C. (2012). A hybrid Brain-Computer Interface for improving the usability of a smart home control. In Complex Medical Engineering (CME), 2012 ICME, 182-185.
Ferrara, F., Bissoli, A., and Bastos-Filho, T. (2015). Designing an Assistive Control Interface based on Utility. Proceedings of the 1st International Workshop on Assistive Technology (IWAT 2015), Vitoria, Brazil, 142-145.
Gao, X., Xu, D., Cheng, M., and Gao, S. (2003). A BCI-based environmental controller for the motion-disabled. Neural Systems and Rehabilitation Engineering, IEEE Transactions on, 11(2), 137-140.
He, B. (2013). Neural Engineering. Springer, 2nd ed.
Holzner, C., Guger, C., Edlinger, G., Gronegress, C., and Slater, M. (2009). Virtual Smart Home Controlled by Thoughts. In Enabling Technologies: Infrastructures for Collaborative Enterprises, WETICE '09, 18th IEEE International Workshops on, 236-239.
Kelly, S., Lalor, E., Reilly, R., and Foxe, J. (2005). Visual spatial attention tracking using high-density SSVEP data for independent brain-computer communication. Neural Systems and Rehabilitation Engineering, IEEE Transactions on, 13(2), 172-178.
Li, Y., Pan, J., Wang, F., and Yu, Z. (2013). A Hybrid BCI System Combining P300 and SSVEP and Its Application to Wheelchair Control. Biomedical Engineering, IEEE Transactions on, 60(11), 3156-3166.
Muller, S.M.T., de Sá, A.M.F.L.M., Bastos-Filho, T.F., and Sarcinelli-Filho, M. (2011). Spectral techniques for incremental SSVEP analysis applied to a BCI implementation. V CLAIB, La Habana, 1090-1093.
Ou, C.Z., Lin, B.S., Chang, C.J., and Lin, C.T. (2012). Brain Computer Interface-based Smart Environmental Control System. In Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP), 281-284.
Pastor, M., Artieda, J., Arbizu, J., Valencia, M., and Masdeu, J. (2003). Human cerebral activation during steady-state visual evoked response. J. Neurosci.
Punsawad, Y. and Wongsawat, Y. (2013). Hybrid SSVEP-motion visual stimulus based BCI system for intelligent wheelchair. In Engineering in Medicine and Biology Society (EMBC), 7416-7419.
Singla, R., Khosla, A., and Jha, R. (2014). Influence of stimuli colour in SSVEP-based BCI wheelchair control using support vector machines. Journal of Medical Engineering & Technology, 38, 125-134.
Tello, R., Muller, S., Bastos-Filho, T., and Ferreira, A. (2014). A comparison of techniques and technologies for SSVEP classification. In Biosignals and Biorobotics Conference (BRC), 5th ISSNIP-IEEE, 1-6.
Vialatte, F.B., Maurice, M., Dauwels, J., and Cichocki, A. (2010). Steady-state visually evoked potentials: Focus on essential paradigms and future perspectives. Progress in Neurobiology, 90, 418-438.
Wang, M., Daly, I., Allison, B.Z., Jin, J., Zhang, Y., Chen, L., and Wang, X. (2014). A new hybrid BCI paradigm based on P300 and SSVEP. Journal of Neuroscience Methods.
Wolpaw, J., Birbaumer, N., Heetderks, W., McFarland, D., Peckham, P., Schalk, G., Donchin, E., Quatrano, L., Robinson, C., and Vaughan, T. (2000). Brain-computer interface technology: a review of the first international meeting. Rehabilitation Engineering, IEEE Transactions on, 8(2), 164-173.
Xu, Z., Li, J., Gu, R., and Xia, B. (2012). Steady-State Visually Evoked Potential (SSVEP)-Based Brain-Computer Interface (BCI): A Low-Delayed Asynchronous Wheelchair Control System. Neural Information Processing, 19th ICONIP, Springer, 305-314.
Zhang, D., Maye, A., Gao, X., Hong, B., Engel, A.K., and Gao, S. (2010). An independent brain-computer interface using covert non-spatial visual selective attention. J. Neural Eng.
Zhang, Y., Xu, P., Cheng, K., and Yao, D. (2014). Multivariate synchronization index for frequency recognition of SSVEP-based brain computer interface. Journal of Neuroscience Methods, 221, 32-40.

Table 1. Summary results for sEMG/EOG control.

Prolonged clench
Subject | Acc. | Time [s] | Utility [bits/min]
1 | 1.00 | 1.06 | 58.80
2 | 1.00 | 0.86 | 72.60
3 | 0.97 | 1.29 | 45.18
4 | 1.00 | 2.08 | 30.00
5 | 0.93 | 3.09 | 17.52
6 | 0.83 | 2.07 | 20.13
7 | 0.87 | 3.86 | 11.87
8 | 1.00 | 2.20 | 28.38
Average | 0.95 | 2.06 | 35.56

Raising brow
Subject | Acc. | Time [s] | Utility [bits/min]
1 | - | - | -
2 | 0.90 | 1.35 | 37.20
3 | 1.00 | 1.12 | 55.80
4 | 0.93 | 1.87 | 28.92
5 | 0.73 | 2.41 | 12.00
6 | 0.80 | 1.55 | 24.18
7 | 0.97 | 3.41 | 17.10
8 | 0.63 | 2.81 | 5.88
Average | 0.85 | 2.07 | 25.87

Moving eyes
Subject | Acc. | Time [s] | Utility [bits/min]
1 | 1.00 | 1.60 | 39.00
2 | 1.00 | 1.58 | 39.60
3 | 1.00 | 1.87 | 33.60
4 | 0.73 | 6.32 | 4.56
5 | 1.00 | 2.39 | 26.15
6 | 0.97 | 1.43 | 40.79
7 | 0.73 | 4.07 | 7.16
8 | 0.97 | 3.67 | 15.89
Average | 0.93 | 2.87 | 25.85
