
Cognitive Robotics 1 (2021) 12–24


Cognitive Robotics
journal homepage: http://www.keaipublishing.com/en/journals/cognitive-robotics/

A survey on robots controlled by motor imagery brain-computer interfaces

Jincai Zhang, Mei Wang∗
Xi'an University of Science and Technology, Xi'an 710054, China

Keywords: Robot; Brain-computer interface; Motor imagery; Signal processing

Abstract: A brain-computer interface (BCI) provides a communication channel that conveys brain information to the outside world. In particular, BCIs based on motor imagery play an important role in brain-controlled robots such as rehabilitation robots, wheelchair robots, nursing-bed robots and unmanned aerial vehicles. In this paper, developments in robots based on motor imagery BCIs are reviewed from three aspects: electroencephalogram (EEG) evocation paradigms, signal processing algorithms and applications. First, the different types of brain-controlled robots are reviewed and classified from the perspective of the evocation paradigms. Second, the relevant algorithms for EEG signal processing are introduced, including feature extraction methods and classification algorithms. Third, the applications of motor imagery brain-controlled robots are summarized. Finally, the current challenges and future research directions of robots controlled by motor imagery BCIs are discussed.

1. Introduction

Brain-computer interfaces (BCIs) are special information exchange systems that enable the brain to interact with the external environment directly, without relying on the peripheral nervous system or the body's motor system [1,2]. A BCI decodes and interprets the activity of human neurons, establishes a connection between the human brain and external devices, and can also enhance or restore the normal output of the central nervous system [3,4].
With the rapid development of brain-computer interfaces, patients who think normally but suffer from neurological diseases or severe disabilities can use BCIs to regain the ability to move or communicate with their environment and to improve their quality of life [5]. For healthy people, BCIs can provide an unprecedented sensory experience, help improve attention, and offer additional channels for controlling external devices.
Recently, brain-computer interface technology has matured considerably and gained attention and recognition in fields such as medical rehabilitation, entertainment, education and the military. Brain-computer interface systems are divided into exogenous and endogenous systems. Exogenous BCIs use external stimuli to evoke specific responses in the brain; their main electroencephalogram (EEG) evoked patterns are the event-related potential P300 [6] and steady-state visual evoked potentials (SSVEP) [7].
Endogenous BCIs are based on self-regulation of brain rhythms and do not require any external stimulation. They are closely related to human motion intention and better reflect the subject's autonomous intention, so they are also known as active BCIs. Motor imagery (MI) is the most commonly used paradigm for endogenous BCIs [8]. The advantage of exogenous BCIs is that the


Corresponding author.
E-mail address: wangm@xust.edu.cn (M. Wang).

https://doi.org/10.1016/j.cogr.2021.02.001
Received 30 December 2020; Received in revised form 25 February 2021; Accepted 26 February 2021
Available online 10 March 2021
2667-2413/© 2021 The Authors. Publishing Services by Elsevier B.V. on behalf of KeAi Communications Co. Ltd. This is an open access article
under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Fig. 1. Evocation paradigms, signal processing and applications overview diagram of robots controlled by the motor imagery BCIs.

Fig. 2. Overall framework of the MI-BCI control system.

signal is stable, little time is spent on special training, and control signals are easy to set up, so the system is applicable to a wide range of users. However, exogenous BCIs are not directly modulated by the user: they depend on external stimuli and demand sustained attention, which easily causes user fatigue.
Since active BCIs need no external stimulus, they better reflect the user's autonomous intention and realize genuine mind control, which has attracted wide attention from researchers. MI is a subjective motor intention of the brain. Without external stimulus or overt movement, it induces event-related desynchronization and synchronization (ERD/ERS) of the mu and beta rhythms in the primary motor area [9], that is, the attenuation or enhancement of the energy of certain frequency components. By analyzing which MI task corresponds to which feature change, the user's real motion intention can be recovered. Therefore, MI can serve as a control strategy for BCI systems. In this paper, we review the development of robots based on motor imagery BCIs from three aspects: evocation paradigms, signal processing algorithms and application scenarios. Fig. 1 shows the research content covered by this survey.

2. Evocation paradigms

2.1. The single MI paradigm

MI-BCI systems realize the user's control purpose by detecting and quantifying the brain signals of the user's movement intention and converting them into output control instructions [10]. The system usually consists of three parts: signal acquisition, signal processing and output equipment. The overall framework of the system is shown in Fig. 2. The single MI-BCI control of a

Fig. 3. Typical robot control examples based on MI-BCI. (a) From [11]. (b) From [18]. (c) From [24]. (d) From [25]. (e) From [26]. (f) From [30].

robot first appeared in 2005. Tanaka et al. [11] translated the EEG signals generated when the user imagined moving the left or right limb into commands that directly controlled the left or right turn of an electric wheelchair. This study provided a valuable reference for later work. Fig. 3(a) shows their test of the system in a real environment.
Choi et al. [12,14] also designed a brain-controlled wheelchair robot based on a motor imagery BCI, which executes not only left and right movement commands but also a forward movement command, and verified the robot in the real world. Akce and his colleagues [16] used pilot EEG signals to remotely control an unmanned aircraft flying at a fixed altitude. Chae et al. [18] studied a humanoid robot controlled by an MI-BCI system: they extracted the amplitude characteristics of the EEG by power spectrum analysis and selected the informative feature components according to Fisher's ratio. Subjects controlled the humanoid robot to navigate to a target in an indoor maze using EEG signals, guided by real-time images from the robot's head camera. Fig. 3(b) shows their experimental process.


Table 1
Summary of robot control examples based on MI-BCI.

| Publication | Year | Classifier | Control object | Output commands |
|---|---|---|---|---|
| Tanaka et al. [11] | 2005 | Nearest neighbor | Electric wheelchair | Turning left and right |
| Choi et al. [12,14] | 2008, 2011 | SVM | Electric wheelchair | Turning left and right, going forward |
| Hema et al. [13] | 2009 | Artificial neural networks | Electric wheelchair | Relax, going forward, turning left and right |
| Royer et al. [15] | 2010 | Linear classifier | Virtual helicopter | Right and left, up and down |
| Akce et al. [16] | 2010 | Binary classifier | Fixed-wing aircraft | Choosing trajectory |
| Bhattacharyya et al. [17] | 2012 | K-nearest neighbor | Mobile robot | Speed and position |
| Chae et al. [18] | 2012 | Autoregressive spectral analysis | Humanoid robot | Turning left and right, going forward |
| Lafleur et al. [19] | 2013 | Linear classifier | Quadcopter | Up and down, left and right, forward and backward |
| Kosmyna et al. [20] | 2014, 2015 | Nearest neighbor | Quadcopter | Up and down, forward |
| Shi et al. [21] | 2015 | Logistic regression | Hexacopter | Left and right, forward |
| Bhattacharyya et al. [22] | 2015 | OVO-IT2FLF-ANFIS | Robotic manipulator with a 3-finger hand | Relax, left and right, forward and backward |
| Krishna et al. [23] | 2016 | SVM | Mobile robot | Turning left and right, going forward and backward |
| Plechawska et al. [24] | 2016 | Artificial neural networks | Robotic arm | Left and right, start and stop |
| Vijayendra et al. [26] | 2018 | Artificial neural networks | Quadcopter | Left and right |
| Cantillo et al. [27] | 2018 | LDA | Robotic hand orthosis | Flexion and extension of the fingers |
| Bousseta et al. [28] | 2018 | RBF-SVM | Robotic arm | Base moves right and left, elbow points up |
| Aljalal et al. [29] | 2019 | LDA | Mobile robot | Turning left and right, going forward, stopping |
| Liu et al. [30] | 2019 | SVM | Dual-arm robot | Lift and drop |
| Xu et al. [31] | 2019 | LDA | Robotic arm | Left-front and right-front movement |

Lafleur et al. [19] designed a brain-controlled quadcopter that classifies tasks with a linear classifier, realizing lifting, landing, left, right, forward and backward movements. The real-time environment during flight is provided by a forward-facing camera on the shell, and visual feedback allows subjects to adjust their sensorimotor rhythm and precisely control the flight path of the unmanned aerial vehicle. The experimental results showed a success rate of 90.5%.
As shown in Fig. 3(c), Plechawska et al. [24] constructed a direct brain-computer interface that could be used to control the movements of a robotic arm. However, the classification accuracy of the EEG signals was low, so the performance of the system needs to be improved. In Fig. 3(d), Norman et al. [25] designed a finger robotic exoskeleton for rehabilitation therapy. Twelve unimpaired subjects used the robot to play a computer game designed for rehabilitation therapy with their fingers. The study provides a new idea for rehabilitation treatment. Vijayendra et al. [26] used an artificial neural network to classify the EEG data and realize translational and angular velocity control of a four-axis unmanned aerial vehicle, as shown in Fig. 3(e).
Liu et al. [30] proposed a remotely controlled dual-arm robot based on a brain-machine interface. Exploiting the ERD/ERS phenomenon, a support vector machine (SVM)-based classifier identifies the operator's specific intention and controls both arms of the robot. Five volunteers took part in the experiment, and all successfully used their brain signals to direct a robotic arm to lift and lower a box. Fig. 3(f) illustrates the feasibility and universality of this framework.
EEG signals induced by the single MI paradigm control an external robot system with fewer channels and lower structural complexity, so the paradigm is easier for users to accept. However, its disadvantage is that it yields few effective control instructions for external devices, and the overall performance of the system is relatively low. Table 1 summarizes some examples of controlling robots with a single MI pattern.

2.2. Hybrid paradigms

Hybrid paradigms combine EEG signals with other signals in a BCI system, such as electromyography, functional near-infrared spectroscopy or electrooculography, or mix two features of the EEG signal itself. The main purpose is to increase the number of control instructions and improve the classification accuracy of the BCI. Fazli et al. proposed the first hybrid paradigm combining EEG and near-infrared spectroscopy. Functional near-infrared spectroscopy and EEG signals carry different, complementary information, and the experiments showed that this hybrid pattern improves the classification and recognition accuracy of MI tasks [33]. The same study also combined EEG with electromyography and electrooculography.
Liu et al. [34] proposed a new framework integrating sensorimotor rhythm and movement-related cortical potential asynchronous signal modes to control lower-limb exoskeleton gait training, offering new ideas for the development of mixed-mode brain-computer interfaces. The fusion of different features of the same signal has also been studied extensively; current research focuses mainly on the fusion between MI and visually evoked brain responses, namely MI-SSVEP and MI-P300. Horki et al. [35] combined MI and SSVEP to design an artificial prosthesis: elbow bending and extension are controlled through an SSVEP-BCI, while hand opening/closing is controlled through an MI-BCI. This approach is novel for both types of

BCI and for the hybrid approach. The hybrid design allows the user to operate two BCIs continuously to achieve the common goal of controlling an artificial arm.
Cao et al. [41] presented a self-paced hybrid BCI wheelchair control system that provides more control commands. When the decoder detects an idle state with no mental activity, the electric wheelchair moves in a straight line, and left/right movement imagination adjusts its direction. Meanwhile, the steady-state visual evoked potential induced by gazing at a specific blinking button accelerates or decelerates the wheelchair as required. Likewise using the MI-SSVEP mixed paradigm, Pfurtscheller [37] effectively reduced the false positive rate of manipulator control, and Zhang et al. [44] achieved accurate control of the movement of a quadcopter.
Although only a few studies on MI-SSVEP have appeared, they have demonstrated the great advantages of this hybrid paradigm in control applications and rehabilitation training. In rehabilitation especially, it imposes almost no additional cost on the user compared with pure MI-BCI. Despite encouraging results, practical issues such as channel reduction and selection must be addressed before such systems can be accepted clinically [45].
The hybrid MI-P300 paradigm has also been widely designed for real-life use. P300 refers to a positive peak appearing around 300 ms after a specific mental activity or stimulus, which can be captured at the Pz, Fz, Cz, Oz and other electrodes in the centro-parietal region [39].
Bhattacharyya et al. [40] designed a unique asynchronous position control scheme by decoding imagined left, right, forward and static motion of the manipulator. When the robot arm reaches the target position, the subject stops its movement by generating a P300 signal; success rate, steady-state error and other indicators were adopted to study the real-time performance of the proposed EEG-driven position control scheme.
The hybrid paradigm not only improves control accuracy but also increases the number of control instructions for external devices. On the basis of the medical assistance wheelchair designed in [36], Yu et al. [46] added control instructions such as turning left 45°, turning right 45°, accelerating and decelerating, bringing the total to 11 and realizing finer control of the wheelchair. These studies show that MI-SSVEP and MI-P300 significantly improve recognition accuracy and expand the instruction set. However, the visual stimulation introduced by the mixed paradigm easily causes visual fatigue, competes for attention resources, and is unsuitable for patients with eye movement disorders, so researchers are working on better solutions to these problems.
In recent years, steady-state somatosensory evoked potentials (SSSEP) have been proposed. They are evoked by somatosensory stimulation on the skin surface, and attention shifts modulate the potentials, which makes it possible to build a BCI system on them [51]. Recent studies have shown that SSSEP-BCIs, with or without body stimulation, achieve performance comparable to MI-BCIs [52]. Therefore, Yao et al. [53] proposed a new mixed MI-SSSEP paradigm. Based on this hybrid paradigm, Lee et al. [54] improved the accuracy of decoding left and right movement intentions, providing a basis for EEG control of the movement direction of an exoskeleton robot. Table 2 summarizes some typical examples of controlling robots with hybrid BCIs.
In general, the innovative design of mixed paradigms is of great significance for improving the overall performance of current MI-BCIs. However, a hybrid BCI may be more complex than a single BCI and harder for all users to accept. The paradigm design of a hybrid brain-computer interface therefore plays a very important role in overall system performance, and further study is needed to optimize mixed-paradigm systems and improve their stability and practicability.

3. Signal processing algorithms

3.1. Feature extraction methods

Feature extraction is an important part of signal processing in an MI-BCI system. Its purpose is to transform pre-processed signals into feature vectors, remove redundant data, and highlight the important features of the pre-processed signals. Feature extraction for motor imagery tasks is mainly aimed at frequency-domain and spatial information. Common methods include the Fourier transform [57], auto-regressive models [58], the wavelet transform [12] and the common spatial pattern [59]. Among the Fourier methods, the fast Fourier transform (FFT) computes the discrete Fourier transform with far less calculation in far less time, so it has been widely used.
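As a minimal sketch of FFT-based feature extraction (the 250 Hz sampling rate, 2 s epoch length and the 8–12 Hz mu / 13–30 Hz beta band limits are illustrative assumptions, not values taken from the surveyed papers), band power can be computed directly from the magnitude spectrum:

```python
import numpy as np

def fft_band_power(epoch, fs, band):
    """Mean power of a 1-D epoch inside `band` (lo, hi) in Hz,
    computed from the FFT magnitude spectrum."""
    n = len(epoch)
    spectrum = np.abs(np.fft.rfft(epoch)) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    lo, hi = band
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

# Simulated 2-s epoch at fs = 250 Hz: a 10 Hz (mu-band) oscillation plus noise.
fs = 250
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(0)
epoch = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

mu = fft_band_power(epoch, fs, (8, 12))     # mu rhythm band
beta = fft_band_power(epoch, fs, (13, 30))  # beta rhythm band
feature_vector = np.array([mu, beta])
```

In a real MI-BCI pipeline, one such feature pair would be computed per channel and the pairs concatenated into the feature vector passed to the classifier.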
Shedeed et al. [60] designed a robotic arm and used the FFT to extract features of EEG signals from three tasks: arm closed, arm extended and hand held. The wavelet transform offers flexible time-frequency resolution: its variable time and frequency windows progressively refine the signal, and the signal's energy intensity or density can be expressed in the time and frequency domains simultaneously, so it is frequently used in EEG signal analysis [62]. The auto-regressive model is more widely used for short data segments. Its advantages are low computational cost, high speed and high efficiency, so it is also suitable for feature processing of EEG signals [63].
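The auto-regressive coefficients themselves can serve as the feature vector. A minimal Yule-Walker sketch follows (model order 6 is an illustrative choice, not taken from the surveyed papers; real systems tune the order per subject):

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def ar_features(signal, order=6):
    """Yule-Walker AR coefficients of a zero-mean 1-D signal,
    usable directly as a feature vector."""
    x = signal - signal.mean()
    n = len(x)
    # Biased autocorrelation estimates r[0..order].
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    # Solve the symmetric Toeplitz system R a = r[1:] for the AR coefficients.
    return solve_toeplitz(r[:order], r[1:order + 1])

fs = 250
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(1)
sig = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)
coeffs = ar_features(sig, order=6)   # 6-element feature vector
```

The solve step is cheap (O(order²)), which is why AR features suit short EEG segments.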
Independent component analysis is a blind source separation method that can not only separate interfering components from the effective features in the original brain signals, but also extract the energy distribution information of EEG signals [64]. The essence of the principal component analysis (PCA) algorithm is dimensionality reduction. It converts a set of correlated features into a set of uncorrelated composite features through orthogonal vectors; the new features are linear combinations of the original features, arranged in order of decreasing variance. The algorithm reduces the feature dimensionality, and even after noise and redundancy are removed, the main information of the original features is still retained [65].
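The PCA reduction just described can be sketched in a few lines of numpy (`pca_reduce` and the data sizes are illustrative, not from the surveyed papers):

```python
import numpy as np

def pca_reduce(X, k):
    """Project rows of X (n_samples x n_features) onto the top-k
    principal components, ordered by decreasing variance."""
    Xc = X - X.mean(axis=0)
    # Eigen-decomposition of the covariance matrix; eigh returns
    # ascending eigenvalues, so reverse and take the first k columns.
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    top = vecs[:, ::-1][:, :k]
    return Xc @ top

rng = np.random.default_rng(2)
# 100 samples of 8 correlated "features" (e.g. band powers per channel)
# generated from 2 latent factors, so 2 components capture almost everything.
latent = rng.standard_normal((100, 2))
X = latent @ rng.standard_normal((2, 8)) + 0.05 * rng.standard_normal((100, 8))
Z = pca_reduce(X, k=2)   # compressed to 2 dimensions
```

The projected columns inherit the eigenvalue ordering, so the first output dimension always carries the most variance.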


Table 2
Summary of the robot examples based on hybrid BCIs.

| Publication | Year | Paradigm | Classifier | Control object | Control commands |
|---|---|---|---|---|---|
| Horki et al. [35] | 2010 | ERD/ERS & SSVEP | LDA | Artificial arm | Open and close, elbow flexion and extension |
| Rebsamen et al. [36] | 2010 | ERD/ERS & P300 | SVM and linear classifier | Electric wheelchair | Moving and stopping |
| Pfurtscheller et al. [37] | 2010 | ERD/ERS & SSVEP | LDA | Orthosis | Opening and closing |
| Alomari et al. [38] | 2013 | ERD/ERS & MRCP | ANN and SVM | Wheelchair | Turning left and right |
| Bhattacharyya et al. [40] | 2014 | ERD/ERS & P300 | SVM | Robotic arm | Left, right, forward, stop movement, continue |
| Cao et al. [41] | 2014 | ERD/ERS & SSVEP | RBF-SVM | Wheelchair | Turning left and right, forward, accelerate and decelerate, constant driving |
| Rupp et al. [42] | 2015 | FES & ERD/ERS | SVM | Orthosis | Hand fully open, hand fully closed |
| Khan et al. [43] | 2015 | ERD/ERS & SSVEP | LDA | Quadcopter | Up and down, forward |
| Zhang et al. [44] | 2015 | ERD/ERS & SSVEP | RBF-SVM | Quadcopter | Up and down, left and right, forward and backward |
| Yu et al. [46] | 2017 | ERD/ERS & P300 | LDA | Wheelchair | Moving forward/backward, moving left/right, turning left/right 45°, accelerate/decelerate, turning left/right, stopping |
| Zhang et al. [47] | 2017 | ERD/ERS & SSVEP | LR-LDS | Robotic hand | Up and down, left and right |
| Gao et al. [48] | 2017 | ERD/ERS & SSVEP & EMG | Linear classifier | Robotic arm | Upward and downward, left and right, forward and backward |
| Bhattacharyya et al. [49] | 2017 | ERD/ERS & ErrP | SVM | Robotic arm | Turn the link clockwise or counterclockwise, move the link forward |
| Yan et al. [50] | 2019 | ERD/ERS & SSVEP | LDA | Quadcopter | Left and right, forward and backward |
| Lee et al. [54] | 2019 | ERD/ERS & SSSEP | LDA and SVM | Exoskeleton robot | Moving left and right |
| Choi et al. [55] | 2020 | ERD/ERS & SSVEP | SVM | Quadcopter | Activating, turning left and right |
| Rakshit et al. [56] | 2020 | ERD/ERS & SSVEP & P300 | LSVM, RBF-SVM | Robot arm | Link selection, motion initiation, automatic reversal, oscillation continues, object position |

Table 3
Methods of EEG feature extraction and their application in robots.

| Feature extraction method | References |
|---|---|
| Common spatial pattern (CSP) | [12,15,46,62,69,71,93] |
| Discrete wavelet transform (DWT) | [35,43,62,72] |
| Principal component analysis (PCA) | [60] |
| Fast Fourier transform (FFT) | [60,61] |
| Common spatial frequency subspace decomposition (CSFSD) | [14] |
| Auto-regressive model (AR) | [17,18,73,93] |
| Adaptive auto-regressive (AAR) | [40] |
| Continuous wavelet transform (CWT) | [66] |
| Power spectral density (PSD) | [32,37,61,80,81,94] |
| Linear discriminant analysis (LDA) | [73,82] |
| Independent component analysis (ICA) | [20,38,67,95] |
| Others | [22,23,54,61,83,91] |

The CSP algorithm has been widely used to improve MI classification performance. It provides a set of spatial filters that simultaneously diagonalize the covariance matrices of the two classes of data, extracting separable features that represent most of the information in the EEG signals [68,69]. Zhang et al. [70] introduced a new method that adds a sparse Bayesian learning algorithm on top of the common spatial pattern to select effective features; the experimental results showed the effectiveness of the method, which is expected to yield classifiers with better MI classification performance. Choi and Jo [55] used the filter-bank common spatial pattern feature extraction method, which filters the EEG into different frequency bands and then applies common spatial pattern processing to each band, improving the classification accuracy of the two classes of EEG signals.
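The CSP computation described above amounts to a generalized eigendecomposition of the two class covariance matrices. The following is a simplified two-channel sketch (the toy data and helper names are illustrative, not any surveyed paper's implementation):

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=1):
    """CSP spatial filters from two lists of (n_channels x n_samples)
    trials. Returns 2*n_pairs filters that maximize variance for one
    class while minimizing it for the other."""
    mean_cov = lambda trials: np.mean([np.cov(t) for t in trials], axis=0)
    Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized eigenproblem Ca w = lambda (Ca + Cb) w: the eigenvectors
    # jointly diagonalize both class covariance matrices.
    vals, vecs = eigh(Ca, Ca + Cb)
    order = np.argsort(vals)
    # Filters at both ends of the spectrum are the most discriminative.
    picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return vecs[:, picks].T

def log_variance_features(trial, W):
    """Standard CSP feature: normalized log-variance of filtered signals."""
    var = (W @ trial).var(axis=1)
    return np.log(var / var.sum())

rng = np.random.default_rng(3)
# Toy data: class A has high variance on channel 0, class B on channel 1.
trials_a = [np.diag([3.0, 1.0]) @ rng.standard_normal((2, 200)) for _ in range(20)]
trials_b = [np.diag([1.0, 3.0]) @ rng.standard_normal((2, 200)) for _ in range(20)]
W = csp_filters(trials_a, trials_b)
fa = log_variance_features(trials_a[0], W)
fb = log_variance_features(trials_b[0], W)
```

For a class-A trial the second filter's log-variance dominates, and for a class-B trial the first does, which is exactly the separability the classifier exploits.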
Among the feature extraction methods summarized above, the CSP algorithm and its improved variants maximize the signal difference between different MI tasks and have therefore been widely used to improve MI classification performance. This filtering method has good separation characteristics and can effectively extract the task-related components of the EEG signals. Table 3 summarizes methods for feature extraction of EEG signals.


3.2. Classification methods

Accurately extracting effective features of EEG signals and improving classification and recognition rates are central problems in MI-BCI research. Common classification algorithms in MI-BCI systems include linear discriminant analysis, the support vector machine, k-nearest neighbors, artificial neural networks and fuzzy systems [74]. LDA is one of the traditional pattern recognition algorithms. In essence, it is a supervised dimensionality reduction method: it maps sample data to a low-dimensional space in which the classes form well-separated clusters, thereby compressing the feature dimension and extracting classification information. Samples are classified after projection so as to minimize the within-class variance and maximize the between-class variance. The linear discriminant analysis algorithm is relatively easy to implement, and its computation is simple and fast [75].
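With scikit-learn, the LDA classification step reduces to a few lines. The synthetic 4-D features below stand in for, e.g., CSP log-variances; the class separation is exaggerated for illustration:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(4)
# Two synthetic classes of 4-D "EEG features", separated along dimension 0.
X0 = rng.standard_normal((60, 4)) + np.array([2.0, 0, 0, 0])
X1 = rng.standard_normal((60, 4)) - np.array([2.0, 0, 0, 0])
X = np.vstack([X0, X1])
y = np.array([0] * 60 + [1] * 60)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
acc = lda.score(X, y)   # training accuracy on this easy toy problem
```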
Yu et al. [46] applied the LDA algorithm to classify 11 types of tasks for a wheelchair controlled by an MI-P300 hybrid brain-computer interface system; the classification accuracy reached 87.2%, although the large number of classes made computation relatively slow. The spectral regression discriminant analysis classifier improves on the LDA algorithm by combining spectral analysis and linear regression [76]. It effectively avoids the eigendecomposition required by LDA and saves a great deal of classification time and storage space.
SVM is a machine learning method based on statistical learning theory. It maps the input vector to a high-dimensional feature space through an appropriate nonlinear mapping, so that the data can be separated by an optimal hyperplane. The optimal classification surface must correctly separate the two classes of data while maximizing the classification margin [77]. Bhattacharyya successfully controlled the motion of a manipulator by decoding four types of motor imagery EEG signals with a support vector machine classifier [40]. k-nearest neighbors (KNN) is a common machine learning classification algorithm. Its basic idea is to extract the features of the data to be classified, compute the distances to the features of the existing samples, sort them from smallest to largest, select the closest samples, and assign the data the label that occurs most often among those samples [78].
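A side-by-side sketch of the SVM and KNN classifiers just described, on synthetic features (the class means, sizes and RBF kernel choice are illustrative assumptions):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
# Two well-separated synthetic classes of 4-D features.
X0 = rng.standard_normal((80, 4)) + 1.5
X1 = rng.standard_normal((80, 4)) - 1.5
X = np.vstack([X0, X1])
y = np.array([0] * 80 + [1] * 80)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

svm = SVC(kernel="rbf").fit(X_tr, y_tr)              # max-margin classifier
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)  # distance voting
svm_acc = svm.score(X_te, y_te)
knn_acc = knn.score(X_te, y_te)
```

On real MI features the two behave quite differently: KNN carries no training cost but pays at prediction time, while the SVM trades a more expensive fit for fast, margin-based decisions.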
In addition to the above algorithms, the random forest (RF) algorithm is also commonly used for classification. It is a pattern recognition algorithm based on statistical theory proposed by Breiman in 2001 [79]. Essentially, it is a classifier containing multiple decision trees: the RF algorithm draws sample data by the bootstrap method, builds a decision tree for each sampled data set, and obtains the final classification result by voting over the trees. Lee et al. used an RF classifier to effectively control an exoskeleton in three directions: go forward, turn left and turn right [80].
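The bootstrap-and-vote procedure described above is what `RandomForestClassifier` implements; a minimal sketch on synthetic features (tree count and data are illustrative):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(6)
X0 = rng.standard_normal((80, 4)) + 1.5
X1 = rng.standard_normal((80, 4)) - 1.5
X = np.vstack([X0, X1])
y = np.array([0] * 80 + [1] * 80)

# 100 decision trees, each fit on a bootstrap sample;
# the final label is a majority vote over the trees.
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
acc = rf.score(X, y)
```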
In recent years, with the continuous development of artificial neural networks (ANNs), many scholars have begun to use them as classifiers of EEG signals. As an operational model composed of a large number of interconnected nodes, the artificial neural network is widely used in regression analysis and classification. It can approximate arbitrary nonlinear decision functions by minimizing the classification error on training data, making it a nonlinear modeling method [84]. He et al. proposed a general Bayesian network to distinguish multiple types of EEG signals and shorten the classification response time; however, whether this method can be combined with a robot system to improve overall system performance is still unknown [85].
Among the many ANNs, the most commonly used is the multi-layer perceptron [86]. However, when EEG signal noise is relatively high, over-fitting may occur, decreasing classification accuracy [87]. Hettiarachchi et al. [88] studied the functional-link neural network as a classifier in MI-BCI applications. The results showed that this method not only compensates for the defects of the multi-layer perceptron but also performed best among the classifiers compared, so it can be used for real-time MI-BCI control. In the past few years, following the great success of deep neural networks (DNNs) in image and speech recognition [89] and other fields, more and more researchers have turned to deep learning and tried to use it to classify EEG signals. With increased computing power and better learning algorithms, DNNs can perform very complex nonlinear transformations and achieve unprecedented classification results.
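To see why a multi-layer perceptron can approximate nonlinear decision functions that defeat linear classifiers, consider a toy problem with a curved class boundary (purely illustrative, unrelated to any surveyed EEG data):

```python
from sklearn.datasets import make_moons
from sklearn.neural_network import MLPClassifier

# Two interleaved half-moons: not linearly separable, but a small
# multi-layer perceptron learns the curved decision boundary.
X, y = make_moons(n_samples=400, noise=0.1, random_state=0)
mlp = MLPClassifier(hidden_layer_sizes=(16,), solver="lbfgs",
                    max_iter=2000, random_state=0)
mlp.fit(X, y)
acc = mlp.score(X, y)
```

The same over-fitting caveat raised above applies: with noisy EEG features, the hidden-layer size and regularization must be chosen carefully.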
Yang et al. [90] constructed an end-to-end scheme using multi-layer convolutional neural networks to accurately represent multi-channel MI-EEG signals and extract useful information from them. Lu et al. [32] proposed a tractor-assisted driving control method based on a human-machine interface: a low-cost brain-machine interface collects EEG signals from tractor drivers, wavelet packet denoising is performed, the spectral features of the EEG are computed and fed into a recurrent neural network, and the trained EEG-assisted recurrent-neural-network driving model controls the tractor to go straight, brake, turn left and turn right, with a control accuracy of 94.5% and a time cost of 0.61 ms.
Jeong et al. [91] proposed a two-layer bidirectional long short-term memory deep neural network to decode motor imagery EEG signals and precisely control a mechanical arm in a real-time setting. Although the above methods have achieved good results in classifying motor imagery EEG signals, limitations remain: manually extracting features before classification takes a great deal of time. It is therefore important to propose more efficient MI-EEG classification methods to promote the development of BCI technology.
In 2018, Chiarelli and his colleagues [92] combined a hybrid paradigm with recent deep learning algorithms to evaluate the accuracy of deep neural network classification on multi-modal recordings. The results showed that combining the hybrid paradigm with an advanced nonlinear deep neural network classifier can significantly improve brain-computer interface performance. In addition to the methods discussed above, fuzzy systems can also be used to distinguish between different patterns: they can detect patterns in the data that are otherwise often difficult to detect, and their tolerance of imprecision and uncertainty yields simple and robust classification solutions.


Table 4
Methods of EEG classification and their application in robots.

| Classification method | Author(s) | Feature extraction method | Robot type | Performance |
|---|---|---|---|---|
| SVM | Choi et al. [12] | CSP | Electric wheelchair | Success rate 90–95% |
| SVM | Liu et al. [30] | CSP | Dual-arm robot | Success rate 69.14% |
| SVM | Krishna et al. [23] | Cross-correlation methods | Mobile robot | Success rate 59–68% |
| SVM | Choi et al. [55] | FBCSP | Quadcopter | Accuracy 75% |
| SVM | Rupp et al. [42] | CSP | Orthosis | NA |
| SVM | Bhattacharyya et al. [40] | AAR | Robot arm | Accuracy 95% |
| SVM | Choi et al. [14] | CSFSD | Electric wheelchair | Success rate above 90% |
| SVM | Sakamaki et al. [94] | PSD | Mobile robot | Accuracy above 70% |
| LDA | Pfurtscheller et al. [37] | PSD | Orthosis | Accuracy 85% |
| LDA | Lee et al. [54] | FBCSP | Exoskeleton robot | Accuracy 94.9% |
| LDA | Khan et al. [43] | DWT | Quadcopter | Accuracy 70–90% |
| LDA | Horki et al. [35] | DWT | Artificial arm | Accuracy 78–92% |
| LDA | Yan et al. [50] | CSP | Quadcopter | Accuracy 85–89% |
| LDA | Yu et al. [46] | CSP | Wheelchair | Accuracy 87.2% |
| ANN | Plechawska et al. [24] | – | Robot arm | Accuracy above 60% |
| ANN | Vijayendra et al. [26] | DWT | Quadcopter | Accuracy 84.5% and 98.8% |
| ANN | Lu et al. [32] | DWT & PSD | Tractor driving robot | Success rate 93.1% |
| ANN | Alomari et al. [38] | AAR & ICA | Wheelchair | Accuracy 86.5% |
| ANN | Wenjia et al. [95] | ICA | 7-DOF robotic arm | Training time increases with the number of commands |
| ANN | Jeong et al. [91] | – | Robot arm | Success rate above 75% |
| KNN | Kosmyna et al. [20] | ICA | Quadcopter | Success rate 75% |
| KNN | Bhattacharyya et al. [17] | AAR | Mobile robot | Accuracy above 83.3% |
| RF | Lee et al. [80] | PSD | Lower-limb exoskeleton | Average 10.2% decrease in overall task completion time compared to the baseline protocol |
| ANFN | Bhattacharyya et al. [22] | MFDFA | Robot arm | Average success rates of 65% and 70% for reaching a target (two classes) |
| ANFN | Jafarifarmand et al. [93] | AR-CSP | Radio-control (RC) car | Success rate 93.29% |

Neural networks can be combined with fuzzy systems to obtain new classifiers such as the ANFN [22]. Traditional signal processing algorithms need a large amount of data to train classifiers, so an MI-BCI requires long recording sessions to collect enough data, which lowers efficiency and fatigues the user. Hossain et al. designed an informative-instance transfer learning framework that significantly reduces the amount of training data needed from new users [96] and improves the efficiency of the BCI system, especially when training data are scarce [90]. Table 4 shows some frequently used classification algorithms for EEG signals.
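The intuition behind such instance transfer can be sketched as follows. This is not the actual algorithm of [96]; the trial shapes and the Frobenius-distance selection criterion are simplifying assumptions for illustration only.

```python
import numpy as np

def select_transfer_trials(source_trials, target_trials, k):
    """Rank source-subject trials by how close their spatial covariance is
    (Frobenius norm) to the new user's mean covariance, and keep the k best,
    so a classifier can be trained with only a small target calibration set.
    trials: arrays of shape (n_trials, n_channels, n_samples)."""
    target_cov = np.mean([np.cov(t) for t in target_trials], axis=0)
    dists = [np.linalg.norm(np.cov(t) - target_cov) for t in source_trials]
    return np.argsort(dists)[:k]

# Demo: half of the source trials share the target's statistics, half do not
rng = np.random.default_rng(1)
target = rng.normal(size=(8, 3, 100))                          # small calibration set
source = np.concatenate([rng.normal(size=(10, 3, 100)),        # similar statistics
                         5 * rng.normal(size=(10, 3, 100))])   # very different
selected = select_transfer_trials(source, target, k=5)
```

Only the statistically compatible source trials are reused, which is how transfer methods cut the calibration time of new users.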
In general, we believe that classification algorithms based on deep learning can not only shorten user training time but also improve classification accuracy to some extent compared with other algorithms, providing a feasible solution for more efficient recognition of more complex motion intentions in MI-BCI-controlled robot systems.
In the future, with the development of computer hardware and the growth of computing power, research combining deep learning algorithms with MI-BCI systems will become a hot spot.

4. Applications

MI-BCI systems have been widely studied for their potential applications in motor control, neurological rehabilitation training, and intelligent operation in other special environments. Meanwhile, the neural mechanism involved in an MI-BCI system is closely related to motor function, so it is expected to play an important role in improving the information processing efficiency of users' brain regions. According to the application scenario, robots controlled by MI-BCI systems can be divided into motion control types and medical rehabilitation assistance types. Their applications are briefly described below.
At present, a growing number of people suffer from dyspraxia or reduced mobility because of traffic accidents, disease or aging. To address this problem, Choi et al. [12] used an MI-BCI system to translate motor imagery EEG signals into control instructions for wheelchair movement, providing a way for disabled people to move independently and improving patients' quality of daily life. Other relevant studies can be found in [11,13,14,36,38,41,46].
An exoskeleton is a wearable robot in close physical interaction with its human user [97]. Exoskeleton robots have attracted wide attention for helping patients with high paraplegia or advanced amyotrophic lateral sclerosis to conduct rehabilitation training or to provide walking assistance [99]. In particular, powered upper- and lower-extremity exoskeletons have been emphasized as a potential technique to assist people with paraplegia or quadriplegia.
Some studies have combined BCIs with exoskeleton robot systems, which can not only improve the quality of life of patients with neurological deficits such as stroke and cerebral palsy but, more importantly, promote the activation of damaged neurons and help these patients recover injured motor function [100]. Maamari et al. [101] designed a brain-controlled exoskeleton for patients with motor neuron disease and successfully helped them grasp and release a light ball.


Table 5
Summary of the MI-BCI controlled robot applications.

Category | Authors | Device to be controlled | Output commands
Medical rehabilitation | Tanaka et al. [11] | Electric wheelchair | Turning left and right
Medical rehabilitation | Choi et al. [12] | Electric wheelchair | Turning left and right, going forward
Medical rehabilitation | Hema et al. [13] | Electric wheelchair | Relax, going forward, turning left/right
Medical rehabilitation | Rebsamen et al. [36] | Electric wheelchair | Moving and stopping
Medical rehabilitation | Pfurtscheller et al. [37] | Orthosis | Opening and closing
Medical rehabilitation | Alomari et al. [38] | Wheelchair | Turning left and right
Medical rehabilitation | Lee et al. [54] | Exoskeleton robot | Moving left and right
Medical rehabilitation | Rupp et al. [42] | Orthosis | Hand fully open, hand fully closed
Medical rehabilitation | Horki et al. [35] | Artificial arm | Open and close, elbow flexion and extension
Medical rehabilitation | Lee et al. [80] | Lower-limb exoskeleton | Walk forward, turn left, turn right
Medical rehabilitation | Rakshit et al. [56] | Robot arm | Link selection, motion initiation, automatic reversal, oscillation continuation, object position
Medical rehabilitation | Bhattacharyya et al. [98] | Multi-DOF robot | Up and down, left and right
Other application scenarios | Royer et al. [15] | Virtual helicopter | Right and left, up and down
Other application scenarios | Akce et al. [16] | Fixed-wing aircraft | Choosing trajectory
Other application scenarios | Bhattacharyya et al. [17] | Mobile robot | Speed and position
Other application scenarios | Chae et al. [18] | Humanoid robot | Stop, turn the head left, turn the head right, turn the body, walk forward
Other application scenarios | LaFleur et al. [19] | Quadcopter | Up and down, left and right, forward and backward
Other application scenarios | Kosmyna et al. [20] | Quadcopter | Up and down, forward
Other application scenarios | Shi et al. [21] | Hexacopter | Left and right, forward
Other application scenarios | Bhattacharyya et al. [22] | Robotic manipulator with a 3-fingered hand | Relax, left and right, forward and backward
Other application scenarios | Krishna et al. [23] | Mobile robot | Turning left and right, going forward and backward
Other application scenarios | Liu et al. [30] | Dual-arm robot | Lift and drop
Other application scenarios | Lu et al. [32] | Tractor driving robot | Go straight, turn left, turn right, brake
Other application scenarios | Bhattacharyya et al. [40] | Robot arm | Left, right, forward, stop, continue
Other application scenarios | Khan et al. [43] | Quadcopter | Up and down, forward
Other application scenarios | Yan et al. [50] | Quadcopter | Left and right, forward and backward
Other application scenarios | Choi et al. [55] | Quadcopter | Activating, turning left and right

Hortal et al. [102] used an MI-BCI system to assist in controlling a hybrid upper limb exoskeleton for rehabilitation training of patients with nerve injury. However, the above studies ignored the strong modulation of ERD/ERS by somatosensory feedback during EEG tasks. Sensory feedback can increase EEG modulation, and additional cortical regions involved in processing sensory information may also be recruited by the task [103]. This may have important implications for the clinical design of rehabilitation tasks.
Ang et al. also believe that MI-BCI technology has the potential to restore motor function by inducing activity-dependent brain plasticity. They investigated the efficacy of an EEG-based MI-BCI system combined with MIT-Manus shoulder and elbow robotic feedback in patients with chronic stroke and upper limb hemiplegia. In a subsequent experiment, 26 hemiplegic subjects received post-stroke motor recovery treatment, and 60% of the subjects safely achieved significant improvement in motor function. This demonstrated that EEG-based adjuvant therapy is safe and effective for the rehabilitation of patients with severe post-stroke hemiplegia [104].
Cho et al. [105] provided a closed-loop sensorimotor-integrated motor rehabilitation system. Rehabilitation training for a stroke patient using an MI-BCI system combined with functional electrical stimulation showed that the accuracy of the BCI improved to over 80%. Marialuisa et al. [106] suggested that a dual-focused robot-assisted training program could improve upper limb motor function in subacute and chronic stroke patients and induce specific changes in sensorimotor networks. These studies indicate that MI-BCI training is effective and important for the rehabilitation of stroke patients. However, studies with larger sample sizes are needed to improve the reliability of these results and to explore the influence of intelligent motion-assisted rehabilitation on the cerebral cortex of subjects [107]. Beyond medical rehabilitation, MI-BCI systems have also been widely used in other fields.
Chae et al. [82] introduced a new brain-driven humanoid robot navigation system based on motor imagery, which realizes asynchronous direct control of humanoid robot movement using an active brain-computer interface. Users can explore the environment by controlling the direction of the robot's head and body and move the robot in any direction they want. To verify the effectiveness of the system, they conducted a humanoid robot navigation experiment in a maze.
To achieve indoor target search with low-speed unmanned aerial vehicles that are easy to use and stable to control, Shi et al. [21] proposed a non-invasive BCI system consisting of two main subsystems responsible for decision-making and semi-autonomous navigation. The decision subsystem is built on EEG analysis of motor imagery, and the average classification accuracy of the BCI system reached 94.36%. Table 5 shows examples of MI-BCI-controlled robots applied in different scenarios.

5. Summary and future directions

In this paper, we comprehensively review the development of robots controlled by motor imagery EEG signals. Refs. [108,109] only covered the development of wheelchairs and unmanned aerial vehicles controlled by BCI systems, whereas this paper surveys robots in a wider range of scenarios under the control of motor imagery EEG signals. The development of and trends in hybrid paradigms are introduced with emphasis. We believe that the control of external devices in MI-BCI systems has gradually evolved from single MI evocation paradigms to hybrid paradigms, which increase the number of control commands and make robot functions more diversified. Like Ref. [110], this paper also reviews the key techniques of EEG feature extraction and classification; however, for the classification problem we examine more examples of deep learning algorithms combined with BCIs. With the development of computer hardware and the growth of computing power, this approach has great potential to advance BCI technology.
At present, MI-BCI technology is still developing rapidly, and many problems and challenges remain in signal processing and asynchronous control. In future research, innovation in hybrid paradigms and the growing maturity of deep learning will greatly improve the overall performance of robot systems controlled by MI-BCIs. Rehabilitation robots based on MI training for the recovery of upper limb motor function in stroke patients will be a hot research area. Innovative hybrid paradigms can enhance patients' participation, stimulate their intention to move, and improve the efficiency of robot-assisted rehabilitation. Rehabilitation training robots with good robustness, the ability to correct certain classification errors, and strong adaptive learning ability will provide medical institutions with tools to assist the functional recovery of stroke patients, improving on traditional treatment methods and more effectively helping patients recover injured motor function.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to
influence the work reported in this paper.

Acknowledgements

This work was supported by Scientific and Technological Projects of Shaanxi Province under Grant 2016GY-040 and 2020JM-525.

References

[1] M. Clerc, Brain computer interfaces, principles and practise, Biomed. Eng. Online 12 (1) (2013) 1–4.
[2] H. Wang, G. Xu, Y. Wu, et al., A new brain-computer interface paradigm based on steady-state visual evoked potential of illusory pattern motion perception,
in: Proceedings of the 16th International Conference on Ubiquitous Robots (UR), 2019.
[3] R.S. Leow, F. Ibrahim, M. Moghavvemi, Development of a steady state visual evoked potential (SSVEP)-based brain computer interface (BCI) system, in: Proceedings of the International Conference on Intelligent and Advanced Systems, 2007, pp. 321–324.
[4] W. Zheng, Q. Liu, K. Chen, et al., Brain-robot shared control based on motor imagery and improved bayes filter, in: Proceedings of the IEEE/ASME International
Conference on Advanced Intelligent Mechatronics (AIM), IEEE, 2019.
[5] G. Prasad, P. Herman, D. Coyle, S. McDonough, et al., Applying a brain-computer interface to support motor imagery practice in people with stroke for upper
limb recovery: a feasibility study, J. Neuroeng. Rehabil. 7 (1) (2010) 60.
[6] J. Park, K. Kim, A POMDP Approach to Optimizing P300 Speller BCI Paradigm, IEEE Trans. Neural Syst. Rehabil. Eng. 20 (4) (2012) 584–594.
[7] Y. Zhang, P. Xu, K. Cheng, et al., Multivariate synchronization index for frequency recognition of SSVEP-based brain–computer interface, J. Neurosci. Methods
221 (Complete) (2014) 32–40.
[8] H. Yuan, B. He, Brain-computer interfaces using sensorimotor rhythms: current state and future perspectives, IEEE Trans. Biomed. Eng. 61 (5) (2014) 1425–1435.
[9] A. Bamdadian, C. Guan, K.K. Ang, J. Xu, Towards improvement of MI-BCI performance of subjects with BCI deficiency, in: Proceedings of the 7th International
IEEE/EMBS Conference on Neural Engineering (NER), Montpellier, 2015, pp. 17–20.
[10] A. Nijholt, D. Tan, Brain-computer interfacing for intelligent systems, IEEE Intell. Syst. 23 (3) (2008) 72–79.
[11] K. Tanaka, K. Matsunaga, H.O. Wang, Electroencephalogram based control of an electric wheelchair, IEEE Trans. Robot. 21 (4) (2005) 762–766.
[12] K. Choi, A. Cichocki, Control of a wheelchair by motor imagery in real time, in: Proceedings of the 9th International Conference on Intelligent Data Engineering and Automated Learning, 2008, pp. 330–337.
[13] C.R. Hema, M.P. Paulraj, S. Yaacob, et al., Mu and beta EEG rhythm based design of a four state brain machine interface for a wheelchair, in: Proceedings of
the International conference Control, Automation, Communication Energy, 2009, pp. 401–404.
[14] K. Choi, Control of a vehicle with EEG signals in real-time and system evaluation, Arbtsphysiologie 112 (2) (2011) 755–766.
[15] A.S. Royer, A.J. Doud, M.L. Rose, et al., EEG control of a virtual helicopter in 3-dimensional space using intelligent control strategies, IEEE Trans. Neural Syst.
Rehabil. Eng. 18 (6) (2010) 581–589.
[16] A. Akce, M. Johnson, T. Bretl, Remote teleoperation of an unmanned aircraft with a brain-machine interface: theory and preliminary results, in: Proceedings of the IEEE International Conference on Robotics & Automation, IEEE, 2010, pp. 5322–5327.
[17] S. Bhattacharyya, A. Sengupta, T. Chakraborti, et al., EEG controlled remote robotic system from motor imagery classification, in: Proceedings of the Third
International Conference on Computing, Communication and Networking Technologies (ICCCNT’12), 2012, pp. 1–8. Coimbatore.
[18] Y. Chae, J. Jeong, S. Jo, Toward brain-actuated humanoid robots: asynchronous direct control using an EEG-based BCI, IEEE Trans. Robot. 28 (5) (2012)
1131–1144.
[19] K. LaFleur, K. Cassady, A. Doud, et al., Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain–computer interface, J
Neural Eng 10 (4) (2013) 046003.


[20] N. Kosmyna, F. Tarpin-Bernard, B. Rivet, Drone, your brain, ring course: accept the challenge and prevail, in: Proceedings of the 2014 ACM International Joint
Conference on Pervasive and Ubiquitous Computing Adjunct Publication - UbiComp, 2014, pp. 243–246.
[21] T. Shi, H. Wang, C. Zhang, Brain Computer Interface system based on indoor semi-autonomous navigation and motor imagery for Unmanned Aerial Vehicle
control, Expert Syst. Appl. 42 (9) (2015) 4196–4206.
[22] S. Bhattacharyya, D. Basu, A. Konar, D.N. Tibarewala, Interval type-2 fuzzy logic based multiclass ANFIS algorithm for real-time EEG based movement control
of a robot arm, Rob. Auton. Syst. 68 (2015) 104–115.
[23] D.H. Krishna, I.A. Pasha, T.S. Savithri, Autonomuos robot control based on EEG and cross-correlation, in: Proceedings of the 10th International Conference on
Intelligent Systems and Control (ISCO), 2016, pp. 1–4. Coimbatore.
[24] M. Plechawska-Wojcik, P. Wolszczak, R. Cechowicz, et al., Construction of neural nets in brain-computer interface for robot arm steering, in: Proceedings of
the International Conference on Human System Interactions, IEEE, 2016.
[25] S. Norman, M. Dennison, E. Wolbrecht, et al., Movement anticipation and EEG: implications for BCI-contingent robot therapy, IEEE Trans. Neural Syst. Rehabil.
Eng. 24 (8) (2016) 1.
[26] A. Vijayendra, S.K. Saksena, R.M. Vishwanath, S.N. Omkar, A performance study of 14-channel and 5-channel EEG systems for real-time control of unmanned
aerial vehicles (UAVs), in: Proceedings of the 2018 IEEE International Conference on Robotic Computing (IRC), IEEE, 2018, pp. 183–188.
[27] J. Cantillo-Negrete, R.I. Carino-Escobar, P. Carrillo-Mora, et al., Motor imagery-based brain-computer interface coupled to a robotic hand orthosis aimed for
neurorehabilitation of stroke patients, J. Healthc. Eng. (2018).
[28] R. Bousseta, El.I. Ouakouak, M. Gharbi, et al., EEG based brain computer interface for controlling a robot arm movement through thought, Innov. Res. Biomed.
Eng. 39 (2) (2018) 129–135.
[29] M. Aljalal, R. Djemal, S. Ibrahim, Robot navigation using a brain computer interface based on motor imagery, J. Med. Biol. Eng. 39 (4) (2019) 508–522.
[30] Y. Liu, W. Su, Z. Li, et al., Motor-imagery-based teleoperation of a dual-arm robot performing manipulation tasks, IEEE Trans. Cognit. Dev. Syst. 11 (3) (2019)
414–424.
[31] Y. Xu, C. Ding, X. Shu, et al., Shared control of a robotic arm using non-invasive brain–computer interface and computer vision guidance, Rob. Auton. Syst.
115 (2019) 121–129.
[32] W. Lu, Y. Wei, J. Yuan, Y. Deng, et al., Tractor assistant driving control method based on EEG combined with RNN-TL deep learning algorithm, IEEE Access 8 (2020) 163269–163279.
[33] S. Fazli, J. Mehnert, J. Steinbrink, et al., Enhanced performance by a hybrid NIRS–EEG brain computer interface, Neuroimage 59 (1) (2012) 519–529.
[34] D. Liu, W. Chen, Z. Pei, et al., A brain-controlled lower-limb exoskeleton for human gait training, Rev. Sci. Instrum. 88 (10) (2017) 104302.
[35] P. Horki, T. Solis-Escalante, C. Neuper, et al., Combined motor imagery and SSVEP based BCI control of a 2 DoF artificial upper limb, Med. Biol. Eng. Comput. 49 (5) (2011) 567–577.
[36] B. Rebsamen, C. Guan, H. Zhang, et al., A brain controlled wheelchair to navigate in familiar environments, IEEE Trans. Neural Syst. Rehabil. Eng. 18 (6) (2010)
590–598.
[37] G. Pfurtscheller, T. Solis-Escalante, R. Ortner, et al., Self-paced operation of an SSVEP-based orthosis with and without an imagery-based “brain switch:” a
feasibility study towards a hybrid BCI, IEEE Trans. Neural Syst. Rehabil. Eng. 18 (4) (2010) 409–414.
[38] M.H. Alomari, A. Samaha, K. AlKamha, Automated classification of L/R Hand Movement EEG Signals using advanced feature extraction and machine learning,
Int. J. Adv. Comput. Sci. Appl. 4 (6) (2013) 207–212.
[39] J. Long, Y. Li, T. Yu, Z. Gu, Target selection with hybrid feature for BCI-based 2-D cursor control, IEEE Trans. Biomed. Eng. 59 (1) (2012) 132–140.
[40] S. Bhattacharyya, A. Konar, D.N. Tibarewala, Motor imagery, P300 and error-related EEG-based robot arm movement control for rehabilitation purpose, Med.
Biol. Eng. Comput. 52 (12) (2014) 1007–1017.
[41] L. Cao, J. Li, H. Ji, et al., A hybrid brain computer interface system based on the neurophysiological protocol and brain-actuated switch for wheelchair control,
J. Neurosci. Methods 229 (2014) 33–43.
[42] R. Rupp, M. Rohm, M. Schneiders, et al., Functional rehabilitation of the paralyzed upper extremity after spinal cord injury by noninvasive hybrid neuropros-
theses, Proc. IEEE 103 (6) (2015) 954–968.
[43] M.J. Khan, K.S. Hong, N. Naseer, M.R. Bhutta, Hybrid EEG-NIRS based BCI for quadcopter control, in: Proceedings of the Society of Instrument & Control
Engineers of Japan, IEEE, 2015, pp. 1177–1182.
[44] D. Zhang, M.M. Jackson, Quadcopter Navigation Using Google Glass and Brain–Computer Interface, Georgia Inst. Technol., Atlanta, GA, USA, 2015 Master
Project.
[45] C. McGeady, A. Vuckovic, S. Puthusserypady, A hybrid MI-SSVEP based brain computer interface for potential upper limb neurorehabilitation: a pilot study, in: Proceedings of the 7th International Winter Conference on Brain-Computer Interface (BCI), 2019.
[46] Y. Yu, Z. Zhou, Y. Liu, et al., Self-paced operation of a wheelchair based on a hybrid brain-computer interface combining motor imagery and P300 potential,
IEEE Trans. Neural Syst. Rehabil. Eng. 25 (12) (2017) 2516–2526.
[47] W. Zhang, F. Sun, C. Liu, et al., A hybrid EEG-based BCI for robot grasp controlling, in: Proceedings of the IEEE International Conference on Systems, Man, and
Cybernetics (SMC), 2017, pp. 3278–3283. Banff, AB.
[48] Q. Gao, L. Dou, A.N. Belkacem, et al., Noninvasive electroencephalogram based control of a robotic arm for writing task using hybrid BCI system, J. Biomed.
Biotechnol. (2017) 1–8.
[49] S. Bhattacharyya, A. Konar, D.N. Tibarewala, Motor imagery and error related potential induced position control of a robotic arm, IEEE/CAA J. Automatica
Sinica 4 (4) (2017) 639–650.
[50] N. Yan, C. Wang, Y. Tao, et al., Quadcopter control system using a hybrid BCI based on off-line optimization and enhanced human-machine interaction, IEEE Access 8 (2019) 1160–1172.
[51] G.R. Muller-Putz, R. Scherer, C. Neuper, et al., Steady state somatosensory evoked potentials: suitable brain signals for brain-computer interfaces, IEEE Trans.
Neural Syst. Rehabil. Eng. 14 (1) (2006) 30–37.
[52] Y. Lin, M. Jianjun, Z. Dingguo, et al., Selective sensation based brain-computer interface via mechanical vibrotactile stimulation, PLoS ONE 8 (6) (2013) e64784.
[53] L. Yao, X. Sheng, D. Zhang, et al., A stimulus-independent hybrid BCI based on motor imagery and somatosensory attentional orientation, IEEE Trans. Neural Syst. Rehabil. Eng. 25 (9) (2017) 1674–1682.
[54] J. Lee, K. Cha, H. Kim, et al., Hybrid MI-SSSEP Paradigm for classifying left and right movement toward BCI for exoskeleton control, in: Proceedings of the 7th
International Winter Conference on Brain-Computer Interface (BCI), 2019, pp. 1–3.
[55] J. Choi, S. Jo, Application of hybrid brain-computer interface with augmented reality on quadcopter control, in: Proceedings of the 8th International Winter
Conference on Brain-Computer Interface (BCI), 2020, pp. 1–5.
[56] A. Rakshit, A. Konar, A.K. Nagar, A hybrid brain-computer interface for closed-loop position control of a robot arm, IEEE/CAA J. Automatica Sinica 7 (5) (2020)
1344–1360.
[57] K. Samiee, P. Kovács, M. Gabbouj, Epileptic seizure classification of EEG time-series using rational discrete short-time Fourier transform, IEEE Trans. Biomed.
Eng. 62 (2) (2015) 541–552.
[58] D. D’Croz-Baron, J.M. Ramirez, M. Baker, et al., A BCI motor imagery experiment based on parametric feature extraction and Fisher Criterion, in: Proceedings
of the International Conference on Electrical Communications & Computers, IEEE, 2012, pp. 257–261.
[59] H. Xie, B. Xia, D. Xiao, et al., The research for the correlation between ERD/ERS and CSP, in: Proceedings of the Seventh International Conference on Natural
Computation, IEEE, 2011.
[60] H.A. Shedeed, M.F. Issa, S.M. El-Sayed, Brain EEG signal processing for controlling a robotic arm, in: Proceedings of the 8th International Conference on
Computer Engineering & Systems (ICCES) IEEE, 2014.
[61] S. Varonamoya, F. Velascoalvarez, S. Sancharos, et al., Wheelchair Navigation With an Audio-Cued, Two-Class Motor Imagery-Based Brain-Computer Interface
System, 2015.


[62] W. Song, X. Wang, S. Zheng, et al., Mobile Robot Control by BCI Based on Motor Imagery, in: Proceedings of the Sixth International Conference on Intelligent
Human-machine Systems & Cybernetics, IEEE, 2014.
[63] M. Han, L. Sun, EEG signal classification for epilepsy diagnosis based on AR model and RVM, in: Proceedings of the International Conference on Intelligent
Control & Information Processing, IEEE, 2010, pp. 134–139.
[64] M.B. Pontifex, K.L. Gwizdala, A.C. Parks, et al., Variability of ICA decomposition may impact EEG signals when used to remove eyeblink artifacts, Psychophys-
iology 54 (3) (2016) 386–398.
[65] M. Piorecký, J. Štrobl, V. Krajča, P03-Dimension reduction of EEG feature space by using PCA, Clin. Neurophysiol. 129 (4) (2018) e14–e15.
[66] R. Bousseta, S. Tayeb, I.E. Ouakouak, et al., EEG efficient classification of imagined right and left hand movement using RBF kernel SVM and the joint CWT_PCA,
AI Soc. 33 (4) (2018) 621–629.
[67] C.S. Kim, J. Sun, D. Liu, et al., Removal of ocular artifacts using ICA and adaptive filter for motor imagery-based BCI, IEEE/CAA J. Automatica Sinica (2017)
1–8.
[68] A. Liu, K. Chen, Q. Liu, et al., Feature selection for motor imagery EEG classification based on firefly algorithm and learning automata, Sensors 17 (11) (2017)
2576.
[69] A. Majid, D. Ridha, I. Sutrisno, Robot navigation using a brain computer interface based on motor imagery, J. Med. Biol. Eng. 39 (4) (2019) 508–522.
[70] Y. Zhang, Y. Wang, J. Jin, et al., Sparse bayesian learning for obtaining sparsity of EEG frequency bands based feature vectors in motor imagery classification,
Int J Neural Syst 27 (02) (2017) 537–552.
[71] S. Ding, H. Zhang, J. Li, Study of a brain-controlled switch during motor imagery, in: Proceedings of the 3rd International Conference on Advanced Robotics
and Mechatronics (ICARM), 2018.
[72] M. Spüler, Spatial filtering of EEG as a regression problem, in: Proceedings of the 7th Graz Brain-Computer Interface Conference, 2017.
[73] Y. Chae, J. Jeong, S. Jo, Noninvasive Brain-Computer Interface-based control of humanoid navigation, in: Proceedings of the IEEE/RSJ International Conference
on Intelligent Robots and Systems, IEEE, 2011.
[74] M. Aljalal, S. Ibrahim, R. Djemal, et al., Comprehensive review on brain-controlled mobile robots and robotic arms based on electroencephalography signals,
Intell. Serv. Robot. 13 (3) (2020).
[75] A. Malki, C. Yang, N. Wang, Z. Li, Mind guided motion control of robot manipulator using EEG signals, in: Proceedings of the 5th International Conference on
Information Science & Technology, IEEE, 2015, pp. 553–558.
[76] Q. Ai, A. Chen, K. Chen, et al., Feature extraction of four-class motor imagery EEG signals based on functional brain network, J. Neural Eng. (2019).
[77] C.J.C Burges, A tutorial on support vector machines for pattern recognition, Data Min. Knowl. Discov. 2 (1998) 121–167.
[78] S.K. Bashar, A.R. Hassan, M.I.H. Bhuiyan, Identification of motor imagery movements from EEG signals using dual tree complex wavelet transform, in: Proceedings of the International Conference on Advances in Computing, Communications and Informatics, IEEE, 2015, pp. 290–296.
[79] L. Breiman, Random forests, Mach. Learn. 45 (1) (2001) 5–32.
[80] K. Lee, D. Liu, L. Perroud, et al., A brain-controlled exoskeleton with cascaded event-related desynchronization classifiers, Robot. Autonom. Syst. 90 (C) (2016) 15–23.
[81] R. Ron-Angevin, F. Velasco-Álvarez, Á. Fernández-Rodríguez, et al., Brain-computer interface application: auditory serial interface to control a two-class motor imagery-based wheelchair, J. Neuroeng. Rehabil. 14 (49) (2017).
[82] Y. Chae, S. Jo, J. Jeong, Brain-actuated humanoid robot navigation control using asynchronous Brain-Computer Interface, in: Proceedings of the International
IEEE/EMBS Conference on Neural Engineering, 2011.
[83] S.Y. Cho, A.P. Vinod, K.W.E. Cheng, Towards a Brain-Computer Interface based control for next generation electric wheelchairs, in: Proceedings of the Inter-
national Conference on Power Electronics Systems & Applications, IEEE, 2009.
[84] L. Bi, X. Fan, Y. Liu, EEG-based brain-controlled mobile robots: a survey, IEEE Trans. Hum. Mach. Syst. 43 (2) (2013) 161–176.
[85] L. He, D. Hu, M. Wan, et al., Common Bayesian network for classification of EEG-based multiclass motor imagery BCI, IEEE Trans. Syst. Man Cybern. Syst. 46
(6) (2017) 843–854.
[86] N. Korovesis, D. Kandris, G. Koulouras, et al., Robot motion control via an EEG-based brain–computer interface by using neural networks and alpha brainwaves, Electronics (Basel) 8 (12) (2019) 1387.
[87] S. Mohamed, S. Haggag, S. Nahavandi, et al., Towards automated quality assessment measure for EEG signals, Neurocomputing 237 (2017) 281–290.
[88] I.T. Hettiarachchi, T. Babaei, T. Nguyen, et al., A fresh look at functional link neural network for motor imagery-based brain-computer interface, J. Neurosci. Methods 305 (2018) 28–35.
[89] Y. Lecun, Y. Bengio, G. Hinton, Deep learning, Nature 521 (7553) (2015) 436–444.
[90] J. Yang, Z. Ma, J. Wang, et al., A novel deep learning scheme for motor imagery EEG decoding based on spatial representation fusion, IEEE Access 8 (2020)
202100–202110.
[91] J.H. Jeong, K.H. Shim, D.J. Kim, et al., Trajectory decoding of arm reaching movement imageries for brain-controlled robot arm system, in: Proceedings of the
41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2019, pp. 5544–5547.
[92] A.M. Chiarelli, P. Croce, A. Merla, et al., Deep learning for hybrid EEG-fNIRS brain-computer interface: application to motor imagery classification, J. Neural
Eng. 15 (3) (2018) 036028.
[93] A. Jafarifarmand, M.A. Badamchizadeh, EEG artifacts handling in a real practical brain–computer interface controlled vehicle, IEEE Trans. Neural Syst. Rehabil.
Eng. 27 (6) (2019) 1200–1208.
[94] I. Sakamaki, E. Camilo, P.D. Campo, et al., Assistive technology design and preliminary testing of a robot platform based on movement intention using low-cost
brain computer interface, in: Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, IEEE, 2017.
[95] W. Ouyang, K. Cashion, V.K. Asari, Electroencephalograph based brain machine interface for controlling a robotic arm, in: Proceedings of the Applied Imagery Pattern Recognition Workshop, IEEE, 2014.
[96] I. Hossain, A. Khosravi, I. Hettiarachchi, et al., Multiclass informative instance transfer learning framework for motor imagery-based brain-computer interface, Comput. Intell. Neurosci. 2018 (2018) 1–12.
[97] R. Lu, Z. Li, C.Y. Su, et al., Development and learning control of a human limb with a rehabilitation exoskeleton, IEEE Trans. Ind. Electronics 61 (7) (2014) 3776–3785.
[98] S. Bhattacharyya, S. Shimoda, M. Hayashibe, A synergetic brain-machine interfacing paradigm for multi-DOF robot control, IEEE Trans. Syst. Man Cybern. Syst.
46 (7) (2016) 957–968.
[99] S. Qiu, Z. Li, W. He, L. Zhang, et al., Brain–machine interface and visual compressive sensing-based teleoperation control of an exoskeleton robot, IEEE Trans.
Fuzzy Syst. 25 (1) (2017) 58–69.
[100] Y.S. Lee, S.H. Bae, S.H. Lee, et al., Neurofeedback training improves the dual-task performance ability in stroke patients, Tohoku J. Exp. Med. 236 (1) (2015)
81–88.
[101] M.S.A. Maamari, S.S.A. Badi, A. Saleem, et al., Design of a brain controlled hand exoskeleton for patients with motor neuron diseases, in: Proceedings of the
International Symposium on Mechatronics and its Applications, IEEE, 2016.
[102] E. Hortal, D. Planelles, F. Resquin, et al., Using a brain-machine interface to control a hybrid upper limb exoskeleton during rehabilitation of patients with
neurological conditions, J. Neuroeng. Rehabil. 12 (92) (2015) 46–53.
[103] J.A. Barios, S. Ezquerro, A. Bertomeu-Motos, et al., Sensory feedback with a hand exoskeleton increases EEG modulation in a brain-machine interface system,
in: Proceedings of the International Conference on NeuroRehabilitation, Springer, Cham, 2018.
[104] K.K. Ang, K.S.G. Chua, K.S. Phua, et al., A randomized controlled trial of EEG-based motor imagery brain computer interface robotic rehabilitation for stroke,
Clin. EEG Neurosci. 46 (4) (2015) 310–320.
[105] W. Cho, N. Sabathiel, R. Ortner, et al., Paired associative stimulation using brain-computer interfaces for stroke rehabilitation: a pilot study, Eur J Transl Myol
26 (3) (2016) 219–222.


[106] G. Marialuisa, F. Emanuela, G. Christian, et al., Quantification of upper limb motor recovery and EEG power changes after robot-assisted bilateral arm training
in chronic stroke patients: a prospective pilot study, Neural Plast. (2018) 1–15.
[107] T. Jia, C. Li, X. Guan, et al., Enhancing engagement during robot-assisted rehabilitation integrated with motor imagery task, in: Proceedings of the International
Conference on Intelligent Medicine and Health (ICIMH 2019), 2019, pp. 12–16.
[108] T. Zaydoon, B.B. Zaidan, A.A. Zaidan, et al., A review of disability EEG based wheelchair control system: coherent taxonomy, open challenges and recommen-
dations, Comput. Methods Programs Biomed. 164 (2018) S0169260718304620.
[109] A. Nourmohammadi, M. Jafari, T.O. Zander, A survey on unmanned aerial vehicle remote control using brain–computer interface, IEEE Trans. Hum. Mach.
Syst. 48 (4) (2018) 1–12.
[110] X. Mao, M. Li, L. Wei, et al., Progress in EEG-based brain robot interaction systems, Comput. Intell. Neurosci. (2) (2017) 1–25.
