1932-4545 © 2018 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.
This article has been accepted for inclusion in a future issue of this journal. Content is final as presented, with the exception of pagination.
FALL et al.: MULTIMODAL ADAPTIVE WIRELESS CONTROL INTERFACE FOR PEOPLE WITH UPPER-BODY DISABILITIES 3
Fig. 2. Methodology: (a) Overview of the proposed alternate control strategy based on a WBSN and (b) default upper-body motions used with the proposed controller interface to emulate the functionalities of the 3D joystick control depicted in Fig. 1.
Fig. 3. (a) Illustration of the proposed multimodal interface strategy that can read the user’s head and shoulders motion, as well as his residual myoelectric
activity, and translate them into control commands. (b) The proposed controller’s architecture diagram describing sensor nodes (IMU and sEMG-IMU types), the
Safety-key connected by USB to the Host platform and connected wirelessly to the network and to JACO, the controlled assistive device used as a testbed.
tool enabling or disabling the control of JACO. To operate it, any compatible adaptive switch that has a male jack connector type output can be used. Since body motion and head position are used for control, the Safety-key provides a necessary safety feature in case of emergency or uncontrolled movements. Also, the assistive device under control should be disabled using the Safety-key prior to performing free body motions, in order to avoid undesired behavior. Finally, as discussed in Section III-B4, several sensors can be used at once depending on the type of assistive device, the number of DoFs, and the user's preferences. To guarantee robust and reliable control, the whole system is designed to address the requirements previously mentioned in Section II-A.

III. SYSTEM DESIGN & IMPLEMENTATION

The proposed BoMI and wearable sensor nodes must use a miniature architecture for comfort, and optimize power consumption to extend battery lifetime. The next subsections describe its hardware and software elements.

A. Hardware Implementation

To provide the controller both with relevant inertial data and sEMG signals, enabling different smart and adaptive control strategies, two types of sensors have been designed, namely an IMU sensor node and a combined sEMG-IMU sensor node. As depicted in Fig. 3, the proposed interface also uses a Safety-key node for enabling or disabling the whole system, while the raw data coming from all nodes are transmitted to a base station and processed on the host platform.

1) IMU Sensor Nodes: IMU-type nodes (see Fig. 3(b)) integrate IMU sensing features using the LSM9DS0 inertial sensor from STMicroelectronics, Switzerland, which provides a serial peripheral interface (SPI). The MSP430F5528 microcontroller unit (MCU) from Texas Instruments, USA, is used for its low-power performance, whereas data are sent wirelessly using the nRF24L01+ 2.4-GHz radio-frequency (RF) chip from Nordic Semiconductor, Norway, which employs a proprietary protocol designed to allow up to 6 pipelines (TX and RX). The sensor printed circuit board (PCB) is designed to fit inside a dedicated small head-mounted housing for convenience and comfort (see Section IV-A).

2) sEMG-IMU Sensor Nodes: sEMG-IMU-type sensors (Fig. 3(b)) use the same components as the IMU-type nodes for motion sensing, and also feature the ADS1291 as an analog front-end (AFE), a low-power 1-channel electrophysiological front-end integrated circuit (IC) from Texas Instruments, for sEMG measurement. The chip provides an SPI interface. The device has the shape of a tiny patch sensor that uses three 1.8-cm by 0.5-cm, 1-cm-spaced gold-plated dry electrodes at the bottom. All sensor nodes are powered using 3.7-V, 100-mAh, 1.5-cm × 1.2-cm rechargeable Li-ion batteries.
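As described later in the firmware section (Section III-B1), the node MCU compresses the ADS1291's 24-bit samples into 16-bit values before transmission. The exact compression scheme is not specified in the text, so the following Python sketch shows only one plausible approach (sign-extending the 24-bit two's-complement word, then truncating the 8 least-significant bits); the function names are illustrative.

```python
def signed24(raw: int) -> int:
    """Interpret a raw 24-bit word (e.g., from a 24-bit AFE) as two's complement."""
    return raw - (1 << 24) if raw & 0x800000 else raw

def compress_to_16bit(raw: int) -> int:
    """Drop the 8 least-significant bits, keeping a signed 16-bit value.
    This is only one plausible compression scheme; the paper does not
    detail the node's actual algorithm."""
    return signed24(raw) >> 8

# Example: full-scale negative 24-bit sample
print(compress_to_16bit(0x800000))  # -> -32768
```

The arithmetic right shift preserves the sign, so the full 24-bit input range maps onto the signed 16-bit range at the cost of amplitude resolution.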
3) Safety-Key Node: As mentioned in Section II, the proposed controller features a Safety-key which can be connected to the host by USB and/or wirelessly to the network, since it features the nRF24L01+ transceiver as well. It is built using a commercial device, the Switch Click USB by Ablenet, USA. The switch is activated when the button is pressed and can emulate mouse or keyboard functions. It allows external switch accommodation via a jack connector, which is a crucial feature when flexibility is important. In fact, most adaptive switch devices have jack outputs, thus AT users can adopt the technology that best fits their residual functionalities. The Safety-key node is used as an additional master node inside the sensor network (see Section III-B2), to handle safety and smart power management functions. An MSP430F5529 MCU (Texas Instruments) reads and broadcasts the switch state (PRESSED or RELEASED) within a status packet (STATUS) sent to every sensor node every 4-ms (250-Hz).

4) USB Base Station: The wearable sensors are connected with the base station through a body area network (BAN). The nodes are all independent from each other for flexibility and modularity, and the communication relies on a star topology. The base station features the TM4C123GH6PM Cortex-M4F MCU from Texas Instruments to gather the data from the multiple sensor nodes, handle the network communications and signaling, and transfer the data to the Host platform. It is also used to program the sensors (i.e., download the firmware into the embedded MSP430 MCU) and for battery charging.

5) Host Platform: The control algorithm has been designed to run on both PC (Windows, Linux) and RPi (Raspbian, ARM Cortex-A53) platforms. When calibrating the system for a specific user at the beginning, it is recommended to run the PC version, which allows for more extensive feedback through complete graphical features and data visualization. Then, after calibration, control profiles are stored and can later be downloaded on the RPi platform. When the AT under control is JACO, the RPi is attached to the wheelchair and directly connected to the robotic arm through the USB connection.

B. Software Architecture

Usability is the main gauge of a control system's success. In addition to the low-power, convenience and flexibility concerns that have prevailed in the hardware architecture design phase, ease of use, intuitiveness and efficiency are among the key elements that were considered when designing the architecture and processing units of this BoMI. For safety reasons, latency is very important as well, and should not exceed 300-ms for real-time operation as recommended in [33]. The next subsections describe the software elements of the proposed controller, from the sensor firmware to the data fusion and control algorithms.

1) Sensor Nodes Firmware Architecture: Sensor nodes must minimize power consumption while guaranteeing an adequate motion sampling rate for supporting robust and precise control schemes. IMU data from the LSM9DS0 is provided in 16-bit resolution and sampled at a frequency of 100-Hz, which provides a suitable time resolution since body motion frequency bandwidth is often below 10-Hz [34]. A 9D vector comprising the raw inertial data measurement, [ACC_xyz, GYRO_xyz, MAG_xyz], is sent from each sensor to the base station every 16-ms (62-Hz). The sEMG signal is sampled at a frequency of 1-kHz [35]. The ADS1291 is set to the maximum gain (12-V/V) and provides high-precision 24-bit samples, later compressed into 16-bit values in-situ inside each node by the MCU, for reducing the amount of data to be transmitted. Thirteen samples are sent at a time, which corresponds to a 77-Hz transmitting frequency.

Fig. 4. WBSN based controller: power management and control safety.

2) Wireless Sensor Network & Power Management: As mentioned in Section III-A, the nRF24L01+ is used for the wireless data link of the proposed controller. It is a low-power RF transceiver, consuming less than 13-mA and 12.1-mA in RX and TX modes, respectively, and 26 μA in standby mode, and it operates in the 2.4-GHz ISM (Industrial, Scientific and Medical) frequency band. Its architecture allows for up to 6 receiving channels. A star topology is used and the base station receives the stream of measurement data sent by all the network's peripheral nodes (sensors + Safety-key). The payload size is 32 bytes and the data rate is set to 250-kbps (≈210-kbps effective due to the RF chip's embedded baseband protocol engine headers (Enhanced ShockBurst)). Cyclic Redundancy Check (CRC) and ACK (acknowledgment) features are implemented in the network communications. The ACK feature is used for network bandwidth multiplexing. Indeed, when transmitting packets, nodes can retransmit for a predetermined number of times (reTX) until new measurements become available. Thus, if no ACK packet is received from the base station, a delay shift of Δ_reTX is locally introduced by the embedded processor unit until the measurement packet is acknowledged. In IMU mode, the number of retransmissions (reTX) is set to 2 with a 3-ms delay (Δ_reTX), while in sEMG mode, those parameters are set to 2 and 1.5-ms, for reTX and Δ_reTX respectively. The Safety-key is used to enable the proposed WBSN based controller when control occurs and to disable it otherwise (see Fig. 4).

The MCU remains in low-power mode by default until the next measurement sample needs to be retrieved through SPI from the sensor units (IMU or sEMG) or until a packet transmission/listening is required. In the latter case, the RF module transmits or receives data, or remains in standby mode (stdby) otherwise. In sEMG-IMU nodes, the sensor unit that is not
being used is put in power-down mode (pdwn), and can be later woken up when necessary. As described in Table I and depicted in Fig. 4, the sensor network operates in two different modes: 1) a Sleep mode (the sensors' default mode at power-up), in which the sensor nodes remain idle and periodically (2-Hz with a 25% duty cycle) listen for incoming packets from the Safety-key to enter in Active mode and then start transmitting, and 2) an Active mode, used for calibration purposes, when the user performs a single click (OneClick) using the Switch Click USB or any jack connector-compatible adaptive switch device (see Section III-A), or to activate control by holding the Safety-key for more than 500-ms (HoldDown), once all calibration parameters are set (see Section III-B4). In Active mode, all sensor nodes are active and the host receives the stream of data until the next switch event (OneClick when in calibration and HoldDown release when control is active), then the WBSN goes back to the Sleep mode. Finally, a sensor in Active mode which does not receive an ACK after 500-ms turns back into Sleep mode, since the communication link might be lost or the control disabled.

TABLE I
SENSORS OPERATING MODES AND CONFIGURATIONS

Modes           MCU           RF    IMU     AFE⁺
Sleep           CPU off       RX*   stdby   pdwn
Active  IMU     CPU off/on**  TX    on      stdby
        sEMG    CPU off/on    TX    stdby   on

⁺ Only in sEMG-IMU node sensors. * 2-Hz, 25% duty cycle incoming packet listening. ** CPU alternately on to read input samples and transmit data, and off otherwise. CPU = Core Processing Unit.

3) Data Processing: All the data processing is done on the host platform. Voluntary muscle contractions are read from raw muscle activity signals when sEMG features are used, whereas motion is sensed from IMU data.

i) sEMG Processing: In order to use sEMG contractions to emulate the user buttons, sEMG-IMU nodes are placed over the target muscles with RFC and undergo robust onset detection. This is done using the Teager-Kaiser Energy Operator (TKE), which has previously been found very robust to noise and low signal-to-noise ratio (SNR) in several works [36], [37]. Muscular activity is sampled at a frequency of 1-kHz and band-pass filtered to remove noise [38] and motion artefacts below 20-Hz [39], using a 20-Hz to 450-Hz 4th-order Butterworth digital filter. The power line interference is removed using a 4th-order 50-Hz to 55-Hz Butterworth band-stop digital filter [40]. The TKE is defined as follows:

Ψ[n] = x²[n] − x[n + 1] · x[n − 1],    (1)

where x is the filtered sEMG signal.
Ψ_av is the real-time average TKE value, computed from a 120-sample window (N_win) with an overlap of 75 samples (N_ovrlp), and the threshold is determined by Ψ_th = μ₀ + hσ₀, where μ₀ is the measured mean TKE value when no muscle activity is performed, σ₀ is the corresponding standard deviation and h is a preset variable [41]. A double-threshold trigger is used to provide robust detection. By default, Ψ_th^high and Ψ_th^low, the upper and lower levels of the double-threshold, are set to 1.2 and 0.8 times Ψ_th, respectively, and can be adjusted and finely tuned, while h is set to 4.

ii) IMU Data Fusion: IMU data fusion has been widely explored and different filter structures have been used to robustly process sensor orientation angles free from inherent accelerometer noise and gyroscope drift effects [42]. As reported in [26], Kalman filtering has been previously implemented and compared with the Complementary filter. The latter technique was found to provide lower processing requirements and similar dynamic response performance compared to the Kalman filter, when properly designed. This design thus uses the Complementary filter. A parametric approach, described in Section III-B4.i, is used to select the Complementary filter coefficients. The headset sensor is worn as depicted in Fig. 10. Pitch (P), Roll (R) and Yaw (Y) angles are computed with the Complementary filter, according to (2), (3) and (4). Filter parameters, as well as the control algorithm, are set by adopting a user-centered design approach to guarantee usability in a typical control environment and everyday life.

P[n] = α_gyro · (P[n − 1] − ω_z · Δt) + α_acc · ϕ[n]
R[n] = α_gyro · (R[n − 1] + ω_y · Δt) + α_acc · ν[n]
Y[n] = (1 − α_mag) · (Y[n − 1] + ω_x · Δt) + α_mag · γ[n]    (2)

ϕ[n] = atan2(−a_y[n], a_x[n])
ν[n] = atan2(−a_z[n], a_x[n])
γ[n] = atan2(−m_z^h[n], m_y^h[n])    (3)

m_y^h = sin(P[n]) · cos(R[n]) · m_x − sin(P[n]) · m_y + cos(P[n]) · sin(R[n]) · m_z
m_z^h = −sin(R[n]) · m_x + cos(R[n]) · m_z    (4)

where P = Pitch, R = Roll and Y = Yaw; ω_i, a_i and m_i are the angular velocity, acceleration and magnetic component measured along the i axis (x, y, z); and α_gyro, α_acc and α_mag are the complementary filter coefficients.

4) User Oriented Approach & Control Algorithm:
i) Design Approach: Since the control of external devices providing several DoFs is involved, robustness, safety and accuracy are crucial when building BoMIs based on head orientation measurement. The time response of IMU motion sensors needs to be optimized. In other words, a maximum latency should be guaranteed between the moment the motion is performed and the time the corresponding output angle is provided by the sensor. The right data fusion filter should guarantee a low static baseline noise, when no motion is performed, to avoid unsafe behaviour of the AT being controlled. Finally, the fusion algorithm should be robust enough to ensure a minimum stability when additional acceleration components other than gravity are measured. Ideally, angle variations induced by linear translation (translation noise) should not transpire through Pitch, Roll and Yaw. A trade-off between these criteria is reached by
properly setting α_gyro, α_acc and α_mag, the coefficients of the complementary filter, according to a parametric approach described below.
The adopted parametric approach is used to systematically measure the time constant τ_c (63% of the step time response) introduced by the Complementary filter, the baseline noise (υ_B) and the translation noise (υ_T) for different values of the filter parameters (α_gyro, α_acc and α_mag). The pertinence of adding low-pass filtering (LPF) at 10-Hz and 25-Hz for noise reduction using a first-order finite impulse response (FIR) filter is also evaluated. Table II summarizes the performance requirements that must be met for this application. The time constant of 130-ms is empirically determined, and noise should not exceed 10⁻³ °/m/s² for υ_T and 1° Pitch and Roll standard deviation for υ_B. Results show that a trade-off meeting the required performance (Table II) is achieved when setting α_gyro, α_acc and α_mag to 92%, 8% and 10% respectively, with a 10-Hz LPF.

TABLE II
DESIGN APPLICATION REQUIREMENTS FOR THE PROPOSED CONTROL STRATEGY BASED ON HEAD MOTION
Parameters    Requirements

ii) Control Algorithm: The proposed controller allows for both 2-dimensional (2D) and 3D control. Due to its modularity, several control scenarios and sensor combination approaches can be implemented, using head and shoulder motions, muscular activities and RFCs in general. Examples of such scenarios are described in Fig. 5. By default, head inclination (Pitch and Roll) and rotation (Yaw) angles are mapped to the x, y and z axes of the replicated joystick control scheme (see Fig. 1).

Fig. 5. Examples of 3D control schemes which can be implemented with the proposed multimodal WBSN based BoMI, depending on users' RFCs. Head control, shoulder motion, residual muscular activity and mechanical switches are combined in 4 Control Scenarios (A, B, C and D) that are proposed with the multimodal control interface, using IMU and sEMG measurement.

During the calibration phase, the default orientation of the headset sensor is captured (P_o, R_o, Y_o) to retrieve the user's neutral position and apply the proper rotation. Then, for each joystick direction D along x (Right (R) and Left (L)) and y (Forward (F) and Backward (B)), only 4 parameters need to be calibrated: φ_D, the head inclination angle in the unit circle; φ_D-margin, the desired angle margin on both sides of φ_D; and ρ_D-max and ρ_D-th, the maximum and the threshold head inclination magnitude along direction D.
To provide a robust 2D control, P[n] and R[n] are fused to provide polar coordinates (ρ[n], φ[n]), according to (5) below, within a unit circle whose origin is set at the top center of the user's head:

ρ[n] = √(P²[n] + R²[n])
φ[n] = atan2(P[n], R[n]).    (5)

This polar fusion step is crucial as it merges the two angles into a single parameter, the components of which are uncorrelated and provide the head inclination magnitude (ρ ∈ [0; 1]) and an equivalent orientation angle φ ∈ [0°; 360°]. From a design point of view, this allows for a higher flexibility and robustness. A 2D calibration scenario is depicted in Fig. 6 with the corresponding transfer functions translating polar coordinates into the corresponding output vector V = [V_x; V_y].
The third DoF provided by the proposed system can be derived from Y[n]. From the user's neutral position Y_o, a threshold head rotation magnitude Y_th is defined, and the two maximum rotations from the left and the right sides of Y_o, Y_L-max and Y_R-max respectively, are defined (see Fig. 7). Even though the 3rd DoF provided by the Yaw angle is intuitive and complete, as it involves all head motion angles, it introduces a few limitations that must be solved to guarantee safety, comfort and usability, by performing additional data fusion.
For the Yaw angle, the headset sensor worn by the user reads the heading direction. Variations are generated by heading
TABLE III
AVERAGE CURRENT CONSUMPTION OF THE SENSOR NODES IN THE DIFFERENT OPERATING MODES OF THE WBSN
⁺ only present on sEMG-IMU nodes
Fig. 8. Wearable sensor implementations. (a) Internal view of the IMU sensor node plastic housing, (b) top and bottom views of the sEMG-IMU sensor node, and (c) sEMG detection algorithm output, N_win = 120 and N_ovrlp = 75: c-1) EMG signal from the right thumb flexor (see Fig. 10), c-2) corresponding TKE signal Ψ, c-3) corresponding average TKE signal Ψ_av with calibrated hysteresis thresholds (Ψ_th^high, Ψ_th^low), and c-4) detection signal output. Short and long contractions are identified for τ_long = 700-ms. PMU = Power Module Unit. *The LSM9DS0 is oriented as specified by the provided sensor frame.
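The detection chain summarized in the Fig. 8(c) caption (TKE per Eq. (1), windowed averaging, and a double-threshold hysteresis trigger) can be sketched in Python. The window sizes (N_win = 120, N_ovrlp = 75), h = 4 and the 1.2/0.8 threshold factors follow the values given in Section III-B3; the input signal here is synthetic, and the burst position is illustrative.

```python
import numpy as np

def tke(x):
    """Teager-Kaiser energy: psi[n] = x[n]^2 - x[n+1]*x[n-1] (Eq. (1))."""
    psi = np.zeros_like(x)
    psi[1:-1] = x[1:-1] ** 2 - x[2:] * x[:-2]
    return psi

def windowed_avg(psi, n_win=120, n_ovrlp=75):
    """Average TKE over 120-sample windows with a 75-sample overlap."""
    step = n_win - n_ovrlp  # 45-sample hop
    return np.array([psi[i:i + n_win].mean()
                     for i in range(0, len(psi) - n_win + 1, step)])

def detect(psi_avg, mu0, sigma0, h=4, k_hi=1.2, k_lo=0.8):
    """Double-threshold (hysteresis) onset detection around Psi_th = mu0 + h*sigma0."""
    th = mu0 + h * sigma0
    hi, lo = k_hi * th, k_lo * th
    active, out = False, []
    for v in psi_avg:
        if not active and v > hi:
            active = True
        elif active and v < lo:
            active = False
        out.append(active)
    return np.array(out)

# Synthetic example: quiet baseline with one burst of "muscle activity"
rng = np.random.default_rng(0)
x = rng.normal(0.0, 0.01, 2000)
x[800:1400] += rng.normal(0.0, 0.5, 600)        # contraction
psi_avg = windowed_avg(tke(x))
baseline = windowed_avg(tke(x[:600]))           # rest-only segment
flags = detect(psi_avg, baseline.mean(), baseline.std())
```

With these parameters the detector latches on windows whose average TKE exceeds 1.2·Ψ_th and releases below 0.8·Ψ_th, which is what makes the trigger robust to threshold-grazing noise.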
TABLE IV
MEASURED SENSOR NODES PERFORMANCE
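One step of the Pitch branch of the Complementary filter of Eqs. (2)-(3) can be written directly in Python. The coefficients α_gyro = 0.92 and α_acc = 0.08 are the values reported in Section III-B4.i; the sensor readings below are synthetic, and the 100-Hz step (dt = 0.01) follows the IMU sampling rate.

```python
import math

def pitch_update(p_prev, omega_z, ax, ay, dt, a_gyro=0.92, a_acc=0.08):
    """One complementary-filter step for Pitch (Eqs. (2)-(3)):
    P[n] = a_gyro * (P[n-1] - omega_z * dt) + a_acc * phi[n],
    with phi[n] = atan2(-ay, ax) from the accelerometer."""
    phi = math.atan2(-ay, ax)
    return a_gyro * (p_prev - omega_z * dt) + a_acc * phi

# Static head tilted by 0.2 rad: the gyro reads zero while the
# accelerometer reads gravity components consistent with that tilt.
true_pitch = 0.2
ax, ay = 9.81 * math.cos(true_pitch), -9.81 * math.sin(true_pitch)
p = 0.0
for _ in range(200):  # 2 s at 100-Hz
    p = pitch_update(p, omega_z=0.0, ax=ax, ay=ay, dt=0.01)
# p has converged to the accelerometer-derived angle (0.2 rad)
```

The small α_acc weight is what filters accelerometer noise, while the gyroscope integral dominates the short-term dynamic response.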
Fig. 10. (a) Overview of the experimental test showing the user executing Task 1, (b) TEMPA setup, (c) sEMG patch sensor on target muscles used to implement
and test the Control Scenario B, (d) headset sensor worn by the user and internal view of the IMU node inside the housing and (e) RPi host with the USB Base
station connected.
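The polar fusion of Eq. (5) and the per-direction calibration window (φ_D ± φ_D-margin, ρ ≥ ρ_D-th) described in Section III-B4.ii can be sketched as follows; the calibration values passed in the example are purely illustrative, not measured ones.

```python
import math

def polar_fusion(p, r):
    """Eq. (5): fuse Pitch and Roll into an inclination magnitude rho
    and an orientation angle phi (degrees, in [0, 360))."""
    rho = math.hypot(p, r)
    phi = math.degrees(math.atan2(p, r)) % 360.0
    return rho, phi

def matches_direction(rho, phi, phi_d, phi_margin, rho_th):
    """True when the head lies inside the calibrated window of a joystick
    direction D: circular distance |phi - phi_d| <= phi_margin and
    rho >= rho_th. All parameters are per-user calibration values."""
    diff = min((phi - phi_d) % 360.0, (phi_d - phi) % 360.0)
    return rho >= rho_th and diff <= phi_margin

# Forward pitch only: phi = 90 deg in this convention (illustrative values)
rho, phi = polar_fusion(0.5, 0.0)
print(matches_direction(rho, phi, phi_d=90.0, phi_margin=20.0, rho_th=0.3))
```

The circular distance keeps the angle test correct across the 0°/360° wrap-around, which matters for directions calibrated near the boundary.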
Fig. 11. Average execution time (mean + standard deviation) for Task 1 (a), Task 2 (b) and Task 3-2 (c), for the 5 participants, using the joystick device depicted
in Fig. 1 (red), and the Control Scenario B of the proposed BoMI (blue). Overall average execution times are plotted using dotted lines.
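The per-task statistics plotted in Fig. 11 (mean plus standard deviation per controller, and the relative overhead of the BoMI over the joystick) can be reproduced with a few lines of Python; the trial times below are synthetic placeholders, not the measured data.

```python
from statistics import mean, stdev

# Hypothetical per-trial completion times (seconds) for one task,
# three attempts per controller, as in the experimental protocol.
joystick = [41.0, 35.5, 33.0]
bomi = [55.0, 48.0, 44.5]

print(f"joystick: {mean(joystick):.1f} +/- {stdev(joystick):.1f} s")
print(f"BoMI:     {mean(bomi):.1f} +/- {stdev(bomi):.1f} s")

# Relative overhead of the BoMI with respect to the joystick
# (the paper reports roughly 30% on average across tasks).
overhead = mean(bomi) / mean(joystick) - 1.0
print(f"overhead: {overhead:.0%}")
```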
There was no timeout delay defined, and a task is never considered accomplished until the user manages to fully complete it. Task 1 to Task 3 are successively realized using one control interface (either the joystick or the BoMI) at a time. Control Scenario A of the proposed BoMI has been tested by ten able-bodied subjects (5 males and 5 females). Half of the participants (Group A: S1 to S5) first performed the tasks using the joystick controller and then the proposed BoMI, while the others (Group B: S6 to S10) did the opposite. Participants were given a 15-min practice before using each controller type. Control Scenario A (see Fig. 5) is the scenario performed by the 10 able-bodied subjects (Si, i = {1, ..., 10}). S1 and S2 also performed Control Scenario B, right after using Control Scenario A, to evaluate the use of sEMG as a navigation mode. To perform mode navigation in Control Scenario A, 2 buttons from a USB game controller are used. For Control Scenario B, mode navigation is performed using sEMG from the right finger flexor muscles of S1 and from the right arm wrist flexor muscle (flexor carpi radialis) of S2, by mapping sEMG contractions (short and long) to user button functions (see Fig. 10). Thus, the obtained performance helped to evaluate usability by people who cannot control their arm motion but have remaining abilities in their fingers instead, and by individuals living with finger dexterity issues preventing them from precisely interacting with user buttons.

Table V provides the average execution times, for each task, for the 10 participants who performed the test using Control Scenario A of the proposed BoMI, and the joystick. Results measured using Control Scenario B, performed by S1 and S2, are reported as well. All participants were able to complete the experimentation. Fig. 11 provides a graphical comparative view of the average and the standard deviation of the individual performance of all participants for Scenario A. As can be seen, the proposed system outperformed the joystick controller in some cases. On average, the proposed BoMI allowed the users to perform the tasks almost as fast as with the joystick, with only 30% of overhead completion time compared to the joystick. Group A (S1 to S5), who first familiarized themselves with JACO using the joystick controller, performed the tasks 5% faster than Group B (S6 to S10) on average while using the proposed BoMI. Results also show that within only three attempts, participants were able to improve their performance by 17% using the proposed BoMI, vs 27% with the joystick controller.

Fig. 12 provides the individual average performance results of S1 and S2, using the joystick controller and Control Scenarios A and B. It is summarized in Table VI. It is shown that
they performed 80% as fast as with the joystick controller for Task 2, and performed 62% as fast as the joystick on average.
Participants were able to perform all the different tasks using the proposed BoMI. Results demonstrate its learnability and usability, two characteristics that have been correlated with the intuitiveness of a gesture-based interface [43]. All the control

Fig. 12. Average execution time for each task, for Subject 1 (blue) and Subject 2 (red), using the joystick controller and Control Scenarios A and B of the proposed controller.

TABLE VI
EXPERIMENTAL RESULTS BY CONTROL INPUT

Ctrl Input    Task 1         Task 2        Task 3-1     Task 3-2
Joystick      38.2 ± 15.9    24.8 ± 9.3    26.5 ± 3.0   88.1 ± 23.9
Scenario A    63.7 ± 20.3    54.2 ± 24.3   41.2 ± 6.7   108.8 ± 3.1
Scenario B    57.0 ± 18.7    29.9 ± 0.28   36.6 ± 4.2   120 ± 1.4

⁺ Better performance obtained with the proposed BoMI compared to the joystick.

ACKNOWLEDGMENT

The authors would like to thank Kinova Robotics, Canada, for their collaboration, C. Bérubé for her assistance in using the TEMPA test setup to evaluate the proposed BoMI, and M. Kritskaya-Eboua for her precious help in illustrating the body motions that were tested with the proposed BoMI.

REFERENCES

[1] L. Dark, Assistive Technologies for People With Diverse Abilities. New York, NY, USA: Taylor & Francis, 2014.
[2] V. Hanson and K. Odame, "Real-time embedded implementation of the binary mask algorithm for hearing prosthetics," IEEE Trans. Biomed. Circuits Syst., vol. 8, no. 4, pp. 465–473, Aug. 2014.
[3] A. Bhowmick and S. M. Hazarika, "An insight into assistive technology for the visually impaired and blind people: State-of-the-art and future trends," J. Multimodal User Interfaces, vol. 11, no. 2, pp. 149–172, 2017.
[4] D. T. Pawluk, R. J. Adams, and R. Kitada, "Designing haptic assistive technology for individuals who are blind or visually impaired," IEEE Trans. Haptics, vol. 8, no. 3, pp. 258–278, Jul./Sep. 2015.
[5] I. P. Ktistakis and N. G. Bourbakis, "Assistive intelligent robotic wheelchairs," IEEE Potentials, vol. 36, no. 1, pp. 10–13, Jan./Feb. 2017.
[6] D.-S. Vu, U. C. Allard, C. Gosselin, F. Routhier, B. Gosselin, and A. Campeau-Lecours, "Intuitive adaptive orientation control of assistive robots for people living with upper limb disabilities," in Proc. Int. Conf. Rehab. Robot., 2017, pp. 795–800.
[7] D. A. Bennett, S. A. Dalley, D. Truex, and M. Goldfarb, "A multigrasp hand prosthesis for providing precision and conformal grasps," IEEE/ASME Trans. Mechatronics, vol. 20, no. 4, pp. 1697–1704, Aug. 2015.
[8] H. Park et al., "A wireless magnetoresistive sensing system for an intraoral tongue–computer interface," IEEE Trans. Biomed. Circuits Syst., vol. 6, no. 6, pp. 571–585, Dec. 2012.
[9] A. M. Cook and J. M. Polgar, Assistive Technologies: Principles and Practice. Amsterdam, Netherlands: Elsevier Health Sciences, 2014.
[10] Kinova Robotics Canada. Picture: JACO arm mounted on a powered wheelchair. [Online]. Available: http://www.kinovarobotics.com/assistive-robotics/products/robot-arms/. Accessed on: Mar. 19, 2018.
[11] M. A. Jose and R. de Deus Lopes, "Human–computer interface controlled by the lip," IEEE J. Biomed. Health Inf., vol. 19, no. 1, pp. 302–308, Jan. 2015.
[12] H. Jiang et al., "A machine vision-based gestural interface for people with upper extremity physical impairments," IEEE Trans. Syst., Man, Cybern., Syst., vol. 44, no. 5, pp. 630–641, May 2014.
[13] H. Jiang, T. Zhang, J. P. Wachs, and B. S. Duerstock, "Enhanced control of a wheelchair-mounted robotic manipulator using 3-D vision and multimodal interaction," Comput. Vis. Image Understanding, vol. 149, pp. 21–31, 2016.
[14] S. Chandra et al., "Eye tracking based human computer interaction: Applications and their uses," in Proc. IEEE Int. Conf. Man Mach. Interfacing, 2015, pp. 1–5.
[15] A. S. Royer, A. J. Doud, M. L. Rose, and B. He, "EEG control of a virtual helicopter in 3-dimensional space using intelligent control strategies," IEEE Trans. Neural Syst. Rehab. Eng., vol. 18, no. 6, pp. 581–589, Dec. 2010.
[16] A. Rouse et al., "Spatial co-adaptation of cortical control columns in a micro-ECoG brain–computer interface," J. Neural Eng., vol. 13, no. 5, 2016, Art. no. 056018.
[17] C. E. Bouton et al., "Restoring cortical control of functional movement in a human with quadriplegia," Nature, vol. 533, no. 7602, pp. 247–250, 2016.
[18] A. Ubeda, E. Ianez, and J. M. Azorin, "Wireless and portable EOG-based interface for assisting disabled people," IEEE/ASME Trans. Mechatronics, vol. 16, no. 5, pp. 870–873, Oct. 2011.
[19] Q. Huang et al., "An EOG-based human–machine interface for wheelchair control," IEEE Trans. Biomed. Eng., 2017.
[20] C. L. Fall et al., "Wireless sEMG-based body–machine interface for assistive technology devices," IEEE J. Biomed. Health Informat., vol. 21, no. 4, pp. 967–977, Jul. 2017.
[21] U. Côté-Allard, C. L. Fall, A. Campeau-Lecours, C. Gosselin, F. Laviolette, and B. Gosselin, "Transfer learning for sEMG hand gestures recognition using convolutional neural networks," in Proc. 2017 IEEE Int. Conf. Syst., Man, Cybern., 2017, pp. 1663–1668.
[22] H. Park et al., "A wireless magnetoresistive sensing system for an intraoral tongue–computer interface," IEEE Trans. Biomed. Circuits Syst., vol. 6, no. 6, pp. 571–585, Dec. 2012.
[23] L. N. A. Struijk, L. L. Egsgaard, R. Lontis, M. Gaihede, and B. Bentsen, "Wireless intraoral tongue control of an assistive robotic arm for individuals with tetraplegia," J. NeuroEng. Rehab., vol. 14, no. 1, p. 110, 2017.
[24] D. Johansen, C. Cipriani, D. B. Popović, and L. N. Struijk, "Control of a robotic hand using a tongue control system—A prosthesis application," IEEE Trans. Biomed. Eng., vol. 63, no. 7, pp. 1368–1376, Jul. 2016.
[25] D. Kupetz, S. Wentzell, and B. BuSha, "Head motion controlled power wheelchair," in Proc. IEEE 36th Annu. Northeast Bioeng. Conf., 2010, pp. 1–2.
[26] C. L. Fall et al., "Intuitive wireless control of a robotic arm for people living with an upper body disability," in Proc. IEEE 37th Annu. Int. Conf. Eng. Med. Biol. Soc., 2015, pp. 4399–4402.
[27] J. Musić, M. Cecić, and M. Bonković, "Testing inertial sensor performance as hands-free human-computer interface," WSEAS Trans. Comput., vol. 8, no. 4, pp. 715–724, 2009.
[28] M. N. Sahadat, A. Alreja, and M. Ghovanloo, "Simultaneous multimodal PC access for people with disabilities by integrating head tracking, speech recognition, and tongue motion," IEEE Trans. Biomed. Circuits Syst.,
[34] H. Zeng and Y. Zhao, "Sensing movement: Microsensors for body motion measurement," Sensors, vol. 11, no. 1, pp. 638–660, 2011.
[35] A. Yousefian, S. Roy, and B. Gosselin, "A low-power wireless multichannel surface EMG sensor with simplified ADPCM data compression," in Proc. 2013 IEEE Int. Symp. Circuits Syst., 2013, pp. 2287–2290.
[36] S. Solnik, P. DeVita, P. Rider, B. Long, and T. Hortobágyi, "Teager–Kaiser operator improves the accuracy of EMG onset detection independent of signal-to-noise ratio," Acta Bioeng. Biomech., vol. 10, no. 2, pp. 65–68, 2008.
[37] B. Gosselin and M. Sawan, "An ultra low-power CMOS automatic action potential detector," IEEE Trans. Neural Syst. Rehab. Eng., vol. 17, no. 4, pp. 346–353, Aug. 2009.
[38] R. Merletti and P. Di Torino, "Standards for reporting EMG data," J. Electromyograph. Kinesiol., vol. 9, no. 1, pp. 3–4, 1999.
[39] E. A. Clancy, S. Bouchard, and D. Rancourt, "Estimation and application of EMG amplitude during dynamic contractions," IEEE Eng. Med. Biol. Mag., vol. 20, no. 6, pp. 47–54, Nov./Dec. 2001.
[40] C. J. De Luca, L. D. Gilmore, M. Kuznetsov, and S. H. Roy, "Filtering the surface EMG signal: Movement artifact and baseline noise contamination," J. Biomechanics, vol. 43, no. 8, pp. 1573–1579, 2010.
[41] D. Yang, H. Zhang, Y. Gu, and H. Liu, "Accurate EMG onset detection in pathological, weak and noisy myoelectric signals," Biomed. Signal Process. Control, vol. 33, pp. 306–315, 2017.
[42] W. T. Higgins, "A comparison of complementary and Kalman filtering," IEEE Trans. Aerosp. Electron. Syst., vol. AES-11, no. 3, pp. 321–325, May 1975.
[43] M. Nielsen, M. Störring, T. B. Moeslund, and E. Granum, "A procedure for developing intuitive and ergonomic gesture interfaces for HCI," in Proc. 5th Int. Workshop Gesture Sign Lang. Based Human-Comput. Interaction, 2003, pp. 409–420.

Cheikh Latyr Fall received the Master's degree in automation and electrical engineering from the Institut National des Sciences Appliquées, Toulouse, France, in 2013. He is currently working toward the Ph.D. degree in electrical engineering at the Biomedical Microsystems Laboratory, Laval University, QC, Canada. His main research interests include assistive technologies, rehabilitation robotics, human–machine interfaces, wireless technologies, body sensor networks, and biomedical instrumentation.

Francis Quevillon will receive the B.Ing. degree in electrical engineering from Laval University, Quebec, Canada, in 2018. He is currently in his final undergraduate year of study.
vol. 12, no. 1, pp. 192–201, Feb. 2018.
[29] K. D. Katyal et al., “Harmonie: A multimodal control framework for
human assistive robotics,” in Proc. 2013 6th Int. IEEE/EMBS Conf. Neural
Eng., 2013, pp. 1274–1278.
[30] W. Wahlster, SmartKom: Foundations of Multimodal Dialogue Systems. Martine Blouin received the B.Eng. degree
New York, NY, USA: Springer, 2006, vol. 12. in biomedical engineering from Polytechnique
[31] C. L. Fall et al., “A multimodal adaptive wireless control interface for peo- Montreal, Montreal, QC, Canada, in 2012 and the
ple with upper-body disabilities,” in Proc. 2017 IEEE Int. Symp. Circuits M.Sc.A. degree in engineering with a specialisa-
Syst., 2017, pp. 1–4. tion in health technology from the École de Tech-
[32] A. Campeau-Lecours et al., “JACO assistive robotic device: Empowering nologie Supérieure, Montreal, Canada, in 2014. She
people with disabilities through innovative algorithms,” in Proc. Rehab. is currently a Robotic Control Engineer at Kinova,
Eng. Assistive Technol. Soc. North Amer. Conf., 2016. [Online]. Available: Boisbriand, Canada. She provided a technical ex-
https://pdfs.semanticscholar.org/4e8e/872c5df70b961545aa0e9301ddd89 pertise and support on programming and interfac-
c30b94b.pdf ing with the Kinova JACO robot during this project.
[33] B. Hudgins, P. Parker, and R. N. Scott, “A new strategy for multifunction Her research interests include control and servoing,
myoelectric control,” IEEE Trans. Biomed. Eng., vol. 40, no. 1, pp. 82–94, rehabilitation robotics, mathematical modeling, and
Jan. 1993. biomedical instrumentation.
Simon Latour was born in Canada in 1981. He received the Bachelor's degree in electrical engineering, with a specialization in medical technologies, in 2009. From 2005 to 2010, he worked for Telemedic, where he researched the integration of pulse oximetry sensors for portable devices and worked on the implementation of communication interfaces for medical devices. In 2010, he joined Kinova, where he developed a robot control interface for the assistive market and was subsequently involved in the development of the next generations of robots. Over the years, he has developed strong expertise in robotics for the assistive market. He is currently a Product Manager for Assistive Robotics at Kinova Robotics, where he is in charge of the robotic products applied to the assistive market.

Alexandre Campeau-Lecours received the B.Eng. degree in mechanical engineering (mechatronics) from the École Polytechnique de Montréal, Montréal, QC, Canada, in 2008, and the Ph.D. degree from Université Laval, QC, Canada, in 2012. From 2012 to 2015, he was with Kinova as a Research and Development Project Manager in control and robotic algorithms. He is currently an Assistant Professor with the Department of Mechanical Engineering, Université Laval. His research interests include physical human–robot interaction, the development of assistive technologies for people living with disabilities and the elderly, medical applications, and intelligent robotic algorithms.

Benoit Gosselin received the Ph.D. degree in electrical engineering from École Polytechnique de Montréal, Montreal, QC, Canada, in 2009. He was an NSERC Postdoctoral Fellow at the Georgia Institute of Technology in 2010. He is currently an Associate Professor with the Department of ECE, Université Laval, where he is heading the Smart Biomedical Microsystems Lab. His research interests include wireless microsystems for brain–computer interfaces, analog/mixed-mode and RF integrated circuits for neural engineering, interface circuits of implantable sensors/actuators, and point-of-care diagnostic microsystems for personalized healthcare.

Dr. Gosselin is an Associate Editor of the IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS, and he is the Chair and a Founder of the Quebec IEEE CAS/EMB Chapter (2015 Best New Chapter Award). He has served on the committees of several international conferences, such as IEEE ISCAS, IEEE NEWCAS, BIOCAS, and IEEE EMBC/NER. His research focuses on developing innovative microelectronic platforms to collect and study brain activity through advanced multimodal bioinstrumentation and actuation technology. In addition to earning several awards, such as the 2017 IEEE ISCAS Best Live Demonstration Award, his contributions led to the commercialization of the first wireless bioelectronic implant to combine optogenetics and large-scale brain monitoring capabilities within a single device, with his partner Doric Lenses Inc.