I. INTRODUCTION

A brain-computer interface (BCI) is a non-muscular communication channel for transmitting signals from the brain to the outside world. A BCI processes mental decision control signals through the analysis of brain bioelectric activity, allowing the user to interact with the interface itself [1]. A BCI operates based on different paradigms of electroencephalographic (EEG) signals, such as motor imagery, mu rhythm variation, or evoked potentials. Among the evoked potentials are Steady-State Visual Evoked Potentials (SSVEP), which are associated with the frequencies of the visual stimuli observed by the user. A BCI based on these potentials is called an SSVEP-BCI [2]. In addition, the process of commanding an autonomous car requires generating relevant information about the destination or route to follow. The use of brain signals provides a new communication channel through which it is possible to generate actions to command such an autonomous car. The car used in this work is an unmanned autonomous car with a high level of autonomy, shown in Fig. 1. Its main features are the ability to locate itself, orient itself, and plan paths using laser, radar, GPS, and computer vision. Advanced control systems interpret this information to identify the appropriate path, obstacles, and relevant signals. However, this navigation

II. METHODS

This work adapted and optimized the SSVEP-BCI previously used by people with disabilities to navigate a robotic wheelchair [2]. To validate the system developed here, our proposal is to take a car tour of the campus of the Federal University of Espírito Santo (UFES), with the vehicle completely commanded by a person with disabilities using his/her brain signals.

In this study, an EEG signal database of 19 people aged 28 ± 7 years was used, 6 of whom had motor disabilities [2]. An FPGA was used for stimulus generation, with an accuracy of 0.01 Hz [2]. The signal processing algorithms were implemented on a 2.4 GHz PC with 2 GB of RAM, using MATLAB version 2012b. The signal recording protocol consisted of presenting the visual stimuli for 2 minutes, with intervals of 30 s, at frequencies of 5.6, 6.4, 6.9 and 8.0 Hz. The stimulus was presented to the user at a distance of 70 cm. The following methods were evaluated in this work:
$$S_k = 10 \log_{10}\!\left( \frac{n\,P(f_k)}{\sum_{m=1}^{n/2} P(f_m)} \right) \quad (1)$$

$$\mathrm{error} = |f_k - f_h|, \qquad \mathrm{ampl} = P(f_h) \quad (2)$$

[Figure: classification scheme — the EEG trials and the reference signals $Y_1, \ldots, Y_p$ produced by the reference-signal generator are compared by CCA and Euclidean distance.]

$$\rho_k = \max_{W_x, W_y} \rho(x, y) = \frac{E[x^T y]}{\sqrt{E[x^T x]\,E[y^T y]}}$$

$$Y = \begin{bmatrix} \sin(2\pi f_k t) \\ \cos(2\pi f_k t) \\ \vdots \\ \sin(2\pi h f_k t) \\ \cos(2\pi h f_k t) \end{bmatrix} \quad (3)$$
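As an illustration of the PSDA statistic of Equation 1, the computation can be sketched in a few lines of Python. This is a hypothetical re-implementation, not the authors' MATLAB code; the helper name `psda_snr` and the synthetic 4 s window are assumptions made for the sketch.

```python
import numpy as np

def psda_snr(eeg, fs, f_k):
    """SNR-style PSDA statistic of Eq. (1): 10*log10(n*P(f_k) / sum_m P(f_m))."""
    n = len(eeg)
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2          # one-sided power spectrum
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    p_fk = spectrum[np.argmin(np.abs(freqs - f_k))]   # power at the nearest bin to f_k
    return 10 * np.log10(len(spectrum) * p_fk / spectrum.sum())

# A 5.6 Hz sinusoid buried in noise should score highest at 5.6 Hz.
fs = 256
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 5.6 * t) + 0.5 * rng.standard_normal(t.size)
scores = {f: psda_snr(eeg, fs, f) for f in (5.6, 6.4, 6.9, 8.0)}
best = max(scores, key=scores.get)
```

The stimulus frequency with the largest score is taken as the attended one.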
where k = 1, 2, …, N, N is the number of stimuli, and h is the number of harmonics of the target frequency.
Considering W_x and W_y the matrices that represent a linear combination of the vectors X and Y, the CCA method seeks the linear combination of the frequency-domain matrices W_x and W_y that presents the highest correlation, subject to the constraints expressed by Equations 4 and 5.
$$E[x x^T] = E[x^T x] = E[W_x^T X X^T W_x] = 1 \quad (4)$$

$$E[y y^T] = E[y^T y] = E[W_y^T Y Y^T W_y] = 1 \quad (5)$$

$$\rho = \frac{E[W_x^T X Y^T W_y]}{\sqrt{E[W_x^T X X^T W_x]\,E[W_y^T Y Y^T W_y]}} \quad (6)$$
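The CCA detector of Equations 3–6 can be sketched as follows, assuming plain NumPy. Rather than solving explicitly for W_x and W_y, this sketch obtains the canonical correlations from a QR/SVD factorization, which yields the same maximal ρ; the helper names (`max_canonical_corr`, `references`, `detect`) are illustrative, not from the paper.

```python
import numpy as np

def max_canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y (Eq. 6)."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False).max()

def references(f_k, t, harmonics=2):
    """Harmonic reference matrix Y of Eq. (3): sin/cos pairs up to h harmonics."""
    cols = []
    for h in range(1, harmonics + 1):
        cols += [np.sin(2 * np.pi * h * f_k * t), np.cos(2 * np.pi * h * f_k * t)]
    return np.column_stack(cols)

def detect(eeg, t, stimulus_freqs):
    """Return the stimulus frequency whose references correlate most with the EEG."""
    X = eeg if eeg.ndim == 2 else eeg[:, None]
    return max(stimulus_freqs, key=lambda f: max_canonical_corr(X, references(f, t)))

t = np.arange(0, 4, 1 / 256)
rng = np.random.default_rng(1)
eeg = np.sin(2 * np.pi * 6.4 * t + 0.3) + rng.standard_normal(t.size)
detected = detect(eeg, t, (5.6, 6.4, 6.9, 8.0))
```

With multichannel EEG, `X` would simply carry one column per electrode; the QR route handles that case unchanged.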
$$\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{n} \left( y_i - \sum_{j} \beta_j x_{ij} \right)^2, \quad \text{subject to} \quad \sum_j |\beta_j| \le t \quad (7)$$
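Equation 7 can be illustrated with scikit-learn's `Lasso`, which solves the penalized rather than the constrained form of [6]; the design matrix, the choice `alpha=0.01`, and the per-frequency coefficient-energy decision rule are assumptions made for this sketch.

```python
import numpy as np
from sklearn.linear_model import Lasso

t = np.arange(0, 4, 1 / 256)
freqs = (5.6, 6.4, 6.9, 8.0)

# Design matrix: sin/cos columns (fundamental + 2nd harmonic) for every stimulus.
cols = []
for f in freqs:
    for h in (1, 2):
        cols += [np.sin(2 * np.pi * h * f * t), np.cos(2 * np.pi * h * f * t)]
X = np.column_stack(cols)                    # shape (n_samples, 4 freqs * 4 columns)

rng = np.random.default_rng(2)
eeg = np.sin(2 * np.pi * 8.0 * t) + 0.5 * rng.standard_normal(t.size)

beta = Lasso(alpha=0.01).fit(X, eeg).coef_   # sparse coefficient vector of Eq. (7)
energy = np.abs(beta).reshape(len(freqs), 4).sum(axis=1)  # |beta| mass per frequency
detected = freqs[int(np.argmax(energy))]
```

Sparsity concentrates the nonzero coefficients on the reference columns of the attended frequency, which is then read off as the largest coefficient group.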
[Figure: BCI-driven navigation studies grouped by command level (Planning, Steering, Control): Bell et al. (2008); Valbuena et al. (2007); Ferreira et al. (2009); Millán et al. (2004); Leeb et al. (2006); Pfurtscheller et al. (2006); Trejo et al. (2006); Ma et al. (2007); Martinez et al. (2007); Pires et al. (2008); McFarland et al. (2008); Galán et al. (2008); Brouwer et al. (2008); Ron-Angevin et al. (2009); Müller et al. (2013).]

[Figure: EEG magnitude spectrum, Mag vs. f (Hz).]
F. Autonomous car
The autonomous car used in this work was developed to emulate the behavior and decision-making of a common driver, using the sensors necessary for this task. From this perspective, an overview of a system for commanding an autonomous vehicle is shown in Fig. 4.
[Fig. 4: Overview of the command system — the User interacts through a Synchronous BCI (Stimuli, Menu) and an Asynchronous BCI (ERD); Perception, Decision, Planning, Steering and Control blocks drive the Autonomous Car's Actions on the Environment, with Feed-Back to the user.]
A. Synchronous BCI
Table II presents the results of data classification for the different users; the first 6 users have some type of disability. For each classification technique, the accuracy and the Information Transfer Rate (ITR) were calculated. ITR is a standard measure of communication systems which quantifies the amount of information communicated per unit of time. ITR depends on both speed and accuracy, and is defined by Equation 8 [10].
$$B = (1 - P_u)\left[\log_2 N + P \log_2 P + (1 - P) \log_2\!\left(\frac{1 - P}{N - 1}\right)\right] \quad (8)$$

where N is the number of classes, P the classification accuracy, and P_u the percentage of unclassified segments.

[Figure: flow diagram of the stimuli control — the asynchronous BCI checks for closed eyes to turn the visual stimuli on or off; while the stimuli are on, the classified command is sent.]

[Figure: (b) Success rate of the different classifiers (TradPSDA, RatioPSDA, CCA, LASSO) vs. accuracy [%].]
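Equation 8 is simple to evaluate; the sketch below (hypothetical helper `itr_bits`, in bits per selection, before scaling by the number of selections per minute) reduces to the classical Wolpaw formula when no segments are rejected (P_u = 0) and to log2 N at perfect accuracy.

```python
import math

def itr_bits(N, P, Pu=0.0):
    """Bits per selection from Eq. (8); the (1 - Pu) factor discounts rejected segments."""
    if P >= 1.0:
        return (1 - Pu) * math.log2(N)       # the P*log2(P) terms vanish at P = 1
    return (1 - Pu) * (math.log2(N) + P * math.log2(P)
                       + (1 - P) * math.log2((1 - P) / (N - 1)))

b_perfect = itr_bits(4, 1.0)          # 4 classes, perfect accuracy: 2 bits/selection
b_typical = itr_bits(4, 0.9, Pu=0.05)
```

For a 4-class interface at 90% accuracy with 5% of the segments rejected, this gives about 1.30 bits per selection.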
TABLE II: Results for 19 users, out of which 6 (*) have motor disabilities. Each cell gives Accuracy [%] / Pu [%] / ITR [bit/min].

Subject | Traditional PSDA       | Ratio PSDA             | CCA                    | LASSO
1*      | 90.16 /  4.38 /  79.19 | 85.06 /  6.88 /  64.55 | 96.81 /  3.13 / 101.48 | 96.81 /  3.13 / 101.48
2*      | 52.61 / 27.50 /  10.92 | 74.24 / 40.63 /  27.39 | 75.97 / 26.25 /  36.45 | 79.38 / 27.50 /  40.86
3*      | 91.47 /  5.00 /  82.33 | 92.59 /  6.25 /  84.47 | 96.22 /  3.13 /  99.28 | 96.22 /  3.13 /  99.28
4*      | 86.89 /  7.50 /  68.39 | 95.19 / 15.00 /  83.92 | 95.39 /  4.38 /  95.11 | 95.39 /  4.38 /  95.11
5*      | 76.25 / 13.13 /  43.42 | 97.08 / 28.13 /  76.07 | 87.14 / 15.00 /  63.39 | 87.03 / 15.00 /  63.15
6*      | 84.25 /  8.13 /  61.86 | 46.16 / 40.63 /   5.39 | 93.63 / 50.63 /  46.14 | 94.37 / 16.25 /  80.32
7       | 93.05 / 15.63 /  77.26 | 90.69 / 28.75 /  60.12 | 91.58 / 11.88 /  76.67 | 93.03 / 12.50 /  80.07
8       | 76.19 / 14.38 /  42.69 | 88.18 / 31.88 /  52.68 | 88.47 / 16.25 /  65.41 | 88.47 / 16.25 /  65.41
9       | 90.21 /  7.50 /  76.75 | 97.43 /  2.50 / 104.57 | 96.22 /  0.63 / 101.84 | 96.22 /  0.63 / 101.84
10      | 73.77 /  6.25 /  42.44 | 96.71 /  5.00 /  99.13 | 95.59 /  1.25 /  98.93 | 95.59 /  1.25 /  98.93
11      | 99.38 /  3.75 / 111.78 | 98.06 /  3.75 / 105.76 | 99.38 /  1.25 / 114.69 | 99.38 /  1.25 / 114.69
12      | 70.15 / 30.63 /  26.96 | 91.70 / 20.00 /  69.90 | 95.73 / 11.25 / 106.18 | 95.73 / 11.25 / 106.18
13      | 90.74 /  5.00 /  80.29 | 94.86 /  2.50 /  95.17 | 96.18 /  2.50 /  99.80 | 96.18 /  2.50 /  99.80
14      | 79.83 / 10.00 /  51.58 | 93.31 / 26.88 /  67.57 | 96.81 / 10.00 /  94.26 | 96.81 / 10.00 /  94.26
15      | 59.72 / 26.88 /  17.08 | 91.25 / 21.88 /  67.19 | 92.65 / 11.25 /  80.13 | 92.65 / 11.25 /  80.13
16      | 96.22 /  0.63 / 101.84 | 96.18 /  1.25 / 101.08 | 97.50 /  0.63 / 106.85 | 97.50 /  0.63 / 106.85
17      | 96.84 /  1.25 / 103.57 | 96.84 /  3.13 / 101.60 | 96.25 /  0.63 / 101.97 | 96.25 /  0.63 / 101.97
18      | 93.49 /  2.50 /  90.65 | 96.25 /  1.25 / 101.33 | 96.11 /  3.75 /  98.26 | 96.11 /  3.75 /  98.26
19      | 97.36 /  2.50 / 104.26 | 96.11 /  3.13 /  98.90 | 98.13 /  0.63 / 109.48 | 98.13 /  0.63 / 109.48
Average | 84.13 / 10.13 /  67.01 | 90.42 / 15.23 /  77.20 | 93.99 /  9.18 /  89.28 | 94.28 /  7.47 /  91.48
ACKNOWLEDGMENT
The authors thank CAPES and CNPq for funding (process 133707/2013-0), and the Graduate Program of the School of Electrical and Electronic Engineering, University of Valle (Colombia).
REFERENCES
[1] J. Wolpaw, N. Birbaumer, M. Dennis, G. Pfurtscheller, and T. Vaughan, Brain-computer interfaces for communication and control. Clinical Neurophysiology: official journal of the International Federation of Clinical Neurophysiology, vol. 113, no. 6, pp. 767–791, 2002. [Online]. Available: http://dx.doi.org/10.1016/S1388-2457(02)00057-3
[2] S. Muller, T. Bastos, and M. Sarcinelli, Proposal of a SSVEP-BCI to command a robotic wheelchair. Journal of Control, Automation and Electrical Systems, vol. 24, pp. 97–105, 2013.
[3] F. Lobo, Sistemas e veículos autónomos - aplicações na defesa, Master's thesis, Faculdade de Engenharia da Universidade do Porto, 2005, revised 4 Sep. 2013. [Online]. Available: http://paginas.fe.up.pt/flp/papers/SVA-AD flp cdn05.pdf
[4] Q. Wei, M. Xiao, and Z. Lu, A comparative study of canonical correlation analysis and power spectral density analysis for SSVEP detection. Third International Conference on Intelligent Human-Machine Systems and Cybernetics, pp. 7–10, 2011.
[5] T. Tanaka, C. Zhang, and H. Higashi, SSVEP frequency detection methods considering background EEG, IEEE SCIS-ISIS, pp. 20–24, 2012.
[6] R. Tibshirani, Regression shrinkage and selection via the LASSO, Journal of the Royal Statistical Society, Series B (Methodological), vol. 58, pp. 267–288, 1996.
[7] A. Ferreira, T. Bastos, M. Sarcinelli, J. Martín, J. García, and M. Mazo, Improvements of a brain-computer interface applied to a robotic wheelchair, in A. Fred, J. Filipe, and H. Gamboa (Eds.), Biomedical Engineering Systems and Technologies. Berlin: Springer Berlin Heidelberg, vol. 52, pp. 64–73, 2009.
[8] L. DeCarlo, On the meaning and use of kurtosis. Psychological Methods, vol. 2, pp. 292–307, 1997.
[9] M. Thurling, J. Van, A. Brouwer, and P. Werkhoven, Brain-Computer Interfaces: Applying our Minds to Human-Computer Interaction. Springer, 2010, ch. EEG-Based Navigation from a Human Factors Perspective, pp. 99–105.
[10] J. Millan, F. Renkens, J. Mourino, and W. Gerstner, Noninvasive brain-actuated control of a mobile robot by human EEG. IEEE Transactions on Bio-Medical Engineering, vol. 51, no. 6, pp. 1026–1033, 2004. [Online]. Available: http://dx.doi.org/10.1109/TBME.2004.827086
[24] E. Sutter, The brain response interface: communication through visually-induced electrical brain responses, J Microcomput Appl, 1992.
[25] L. Trejo, R. Rosipal, and B. Matthews, Brain-computer interfaces for 1-D and 2-D cursor control: designs using volitional control of the EEG spectrum or steady-state visual evoked potentials. IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 14, no. 2, pp. 225–229, 2006. [Online]. Available: http://dx.doi.org/10.1109/TNSRE.2006.875578
[26] D. Valbuena, M. Cyriacks, O. Friman, I. Volosyak, and A. Gräser, Brain-computer interface for high-level control of rehabilitation robotic systems, IEEE 10th International Conference on Rehabilitation Robotics, ICORR '07, art. no. 4428489, pp. 619–625, 2007.
[27] J. Wolpaw and M. Dennis, Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans. Proceedings of the National Academy of Sciences of the United States of America, vol. 101, no. 51, pp. 17849–17854, 2004. [Online]. Available: http://dx.doi.org/10.1073/pnas.0403504101