
This article has been accepted for inclusion in a future issue of this journal.

Content is final as presented, with the exception of pagination.

IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS 1

A Multimodal Adaptive Wireless Control Interface
for People With Upper-Body Disabilities

Cheikh Latyr Fall, Student Member, IEEE, Francis Quevillon, Martine Blouin, Simon Latour,
Alexandre Campeau-Lecours, Member, IEEE, Clément Gosselin, Fellow, IEEE,
and Benoit Gosselin, Member, IEEE

Abstract—This paper describes a multimodal body-machine interface (BoMI) to help individuals with upper-limb disabilities use advanced assistive technologies, such as robotic arms. The proposed system uses a wearable and wireless body sensor network (WBSN) supporting up to six sensor nodes to measure the natural upper-body gestures of the users and translate them into control commands. Natural gestures of the head and upper-body parts, as well as muscular activity, are measured using inertial measurement units (IMUs) and surface electromyography (sEMG) through custom-designed multimodal wireless sensor nodes. An IMU sensing node is attached to a headset worn by the user. It has a size of 2.9 cm × 2.9 cm, a maximum power consumption of 31 mW, and provides an angular precision of 1°. Multimodal patch sensor nodes, including both IMU and sEMG sensing modalities, are placed over the user's able body parts to measure motion and muscular activity. These nodes have a size of 2.5 cm × 4.0 cm and a maximum power consumption of 11 mW. The proposed BoMI runs on a Raspberry Pi. It can adapt to several types of users through different control scenarios using head and shoulder motion, as well as muscular activity, and provides a power autonomy of up to 24 h. JACO, a 6-DoF assistive robotic arm, is used as a testbed to evaluate the performance of the proposed BoMI. Ten able-bodied subjects performed ADLs while operating the AT device, using the Test d'Évaluation des Membres Supérieurs de Personnes Âgées to evaluate and compare the proposed BoMI with the conventional joystick controller. It is shown that the users can perform all tasks with the proposed BoMI almost as fast as with the joystick controller, with only 30% time overhead on average, while being potentially more accessible to upper-body disabled users who cannot use the conventional joystick controller. Tests show that control performance with the proposed BoMI improved by up to 17% on average after three trials.

Index Terms—Assistive technologies, body-machine interface, electromyography, inertial sensor, low-power, motion control, multimodal, wireless body sensor network.

Manuscript received October 1, 2017; revised January 9, 2018; accepted February 1, 2018. This paper was recommended by Associate Editor D. Hall. (Corresponding author: Cheikh Latyr Fall.)
C. L. Fall, F. Quevillon, and B. Gosselin are with the Department of Electrical Engineering, Laval University, Quebec City, QC G1V 0A6, Canada (e-mail: cheikh-latyr.fall.1@ulaval.ca; francis.quevillon.2@ulaval.ca; clement.gosselin@gmc.ulaval.ca).
M. Blouin and S. Latour are with Kinova Robotics, QC J7H 1M7, Canada (e-mail: mblouin@kinova.ca; slatour@kinova.ca).
A. Campeau-Lecours and C. Gosselin are with the Department of Mechanical Engineering, Laval University, Quebec City, QC G1V 0A6, Canada (e-mail: alexandre.campeau-lecours@gmc.ulaval.ca; clement.gosselin@gmc.ulaval.ca).
Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/TBCAS.2018.2810256
1932-4545 © 2018 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.

I. INTRODUCTION

Functional capacities of people living with disabilities have improved considerably over recent years due to significant advances in assistive technology and rehabilitation engineering. A fleet of assistive devices has arisen worldwide to increase and/or maintain the residual capacities of impaired users in their activities of daily life (ADLs) [1]. While technologies such as hearing aids, retinal implants, haptic feedback systems, etc., aim at providing or restoring apprehension abilities [2]-[4], mobility aids (intelligent wheelchairs, prosthetic devices and other domestic appliances such as robotic manipulators) require a voluntary user interaction based on the users' residual functional capacities (RFCs) [5]-[8]. Even though extrinsic enablers among the latter category are game changers and contribute to improving users' quality of life (QoL), the lack of readily available and affordable adaptive controllers limits their efficiency and their adoption rate by people living with severe disabilities [9].

Tools such as switch devices, dedicated keypads, mice, trackballs, joysticks, head and hand pointers, sip-and-puff tools, mouthsticks or lip control systems [11] require a mechanical intervention from the users, which can be obtrusive and drain the physical resources of severely impaired users. Smart and unobtrusive body-machine interfaces (BoMIs) have been designed based on the latest technologies to ease interaction with assistive devices by exploiting users' RFCs. Computer vision with 3-dimensional (3D) cameras has been explored to detect and reach/grab objects [12], [13]. Similarly, in [14], eye tracking is used to infer human intents, detect objects and control an AT device in 3D. Brain-computer interfaces (BCIs) using EEG [15] and/or electrocorticography (ECoG) [16] have also been designed and tested, but although they achieved promising results, this category of BoMI can remain difficult to operate without extensive training, or be very invasive when requiring a physical connection to the actuating organs, through electrodes, for instance [17]. Noninvasive and wearable controllers based on electrooculography (EOG) have successfully been implemented to operate robotic arms [18] and drive motorised wheelchairs [19]. In [20] and [21], surface electromyography (sEMG) was used to translate users' residual muscular activity into commands, using robust detection and pattern recognition techniques.

In addition to systems that are based on electrophysiological signal measurement, different types of BoMI that rely on residual natural body motion sensing have been designed.


Tongue motion has been used to control a computer [22], a
robotic arm [23] and a prosthetic hand [24]. Head motion, mea-
sured with a camera and a motion tracking algorithm [25], or
with inertial measurement units (IMU) [26], [27], has been used
to build intuitive BoMIs for wheelchair driving, robotic arm op-
eration and computer control. In spite of these promising results,
the number of DoFs provided by these systems is still limited.
Some recent systems aim to address the numerous control and system integration challenges by combining information from different modalities to precisely decode human intents and/or gestures. In [28], a multimodal interface combines head motion, speech recognition and tongue motion to control a computer. The combination of sEMG, eye-tracking and EEG is proposed in [29] to manipulate a prosthesis with high dexterity. Such approaches are deemed to provide many more degrees of freedom (DoF) than unimodal control systems, opening up the possibility of harnessing more potential control strategies better suited to each individual user [30].

Fig. 1. (a) The JACO robotic arm mounted on a powered wheelchair and (b) example of a conventional 3D joystick device that can be used to control the robotic arm and its functionalities. F, B, L and R designate Forward, Backward, Left and Right inclinations of the stick; Lr and Rr correspond to its Left (anticlockwise) and Right (clockwise) rotations. B1 and B2 are user buttons used to navigate the arm's control modes [10].

A multimodal BoMI based on a wireless body sensor network (WBSN) is developed for people living with upper-body disabilities. A proof-of-concept 2D BoMI prototype running on a PC was previously presented in [31]. In this paper, a complete 3D BoMI running on a Raspberry Pi (RPi) is presented. The system takes advantage of efficient data fusion and power management approaches to integrate body motion and muscular activity from the WBSN, while improving power efficiency, safety and usability. The Test d'Évaluation des Membres Supérieurs de Personnes Âgées (TEMPA), a standardized test to assess autonomy of the upper extremities, is used to evaluate the performance of the proposed control system while manipulating JACO, a 6-DoF assistive robotic arm manufactured by Kinova Robotics, Montreal, Canada [32]. Ten participants performed an experimental test comprising ADLs requiring precision and advanced control skills to properly address JACO's DoFs. We show that the flexibility and modularity of the WBSN allow for diverse control scenarios to suit a wide range of disabilities. The requirements of the proposed BoMI and an overview of the system are provided in Section II, while Section III describes the controller's design and implementation based on a dedicated WBSN. Finally, performance and experimental results with participants are described in Section IV.

II. SYSTEM OVERVIEW

A. Requirements

People living with a spinal cord injury (SCI) at the C5-C8 level often partly or completely lose the ability to control their arms and fingers with precision and dexterity. Some forms of Cerebral Palsy (CP) can lead to similar limitations. These individuals, and all others with severe upper-body disabilities (individuals with arm amputations, congenital absence of upper limbs, etc.), rely on mechanical controllers such as joysticks, trackballs and classical switches to operate assistive devices. Operating AT with complex DoFs, such as the JACO robotic arm, in their ADLs is often impossible and physically demanding. When their physical abilities allow for distinct upper-body part motions such as head and shoulder control, these RFCs can be leveraged to provide DoFs to interact with the environment, such as in the prototype introduced in [26]. The JACO robotic arm (Fig. 1), which is used as the AT under control in this work, provides daily life assistance to people with upper-body disabilities, some of whom carry the arm, mounted on their powered wheelchairs, in various environments. Thus, the sought controller must be easily embeddable with the AT under control, be low-cost, and be robust enough to perform well in dynamic environments (see Section III-B4). The last requirement, but not the least, is safety, which is a key concern when the user relies on the operation of an external device.

B. Functional Description

The proposed control interface exploits the user's gestures, measured with a dedicated WBSN sensing different body motion modalities. The sensor network realizes IMU and sEMG sensing through wearable sensor nodes that can be inserted in clothes and accessories, or directly put in contact with the skin (see Fig. 2(a)). A headset worn by the user measures head motion angles (Pitch (P), Roll (R) and Yaw (Y)), later used to replicate a joystick behaviour (see Fig. 2(b)). Before using the proposed BoMI with any AT, a short calibration phase (<3 min) is required to capture the user's motion amplitudes and set the user's min-max control parameters (see Section III-B4) in order to provide proportional control. JACO is set in the 3D mode, in which 3 control modes need to be accessed to reach all the functionalities of the arm (translation, rotation and finger control modes). In this application, the user is provided with 2 user buttons, B1 and B2, to switch between these 3 control modes. Mode navigation can be performed by moving a shoulder or any other relevant available able body part that can be read using one or several IMU/sEMG sensor nodes. When motion amplitude is too weak to derive control commands, sEMG can be used to sense motion and/or motion intents when the user is able to produce voluntary electrical muscle activity on target body areas. For safety, the user is provided with a Safety-key which acts as a switch tool enabling or disabling the control of JACO.
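To illustrate how a voluntary contraction can substitute for a button press, the following minimal sketch implements a Teager-Kaiser-energy onset detector with double-threshold hysteresis, in the spirit of the scheme detailed in Section III-B3. The window length and overlap follow the text; the function names, the threshold-calibration procedure and the synthetic signal are illustrative assumptions, not the authors' firmware.

```python
# Minimal sketch (assumed implementation, not the authors' code) of
# TKE-based sEMG onset detection with a double-threshold trigger.
import numpy as np

def tke(x):
    """Teager-Kaiser energy: psi[n] = x[n]^2 - x[n+1]*x[n-1] (Eq. (1))."""
    psi = np.zeros_like(x)
    psi[1:-1] = x[1:-1] ** 2 - x[2:] * x[:-2]
    return psi

def detect_onsets(x, mu0, sigma0, h=4.0, win=120, hop=45):
    """One boolean per analysis window (120 samples, 75-sample overlap).
    Activation requires crossing the upper threshold; release requires
    falling below the lower one (hysteresis avoids chattering)."""
    th = mu0 + h * sigma0                 # psi_th = mu_o + h*sigma_o
    th_hi, th_lo = 1.2 * th, 0.8 * th     # default double-threshold levels
    psi = tke(x)
    active, states = False, []
    for start in range(0, len(psi) - win + 1, hop):
        psi_av = psi[start:start + win].mean()
        if not active and psi_av > th_hi:
            active = True
        elif active and psi_av < th_lo:
            active = False
        states.append(active)
    return states
```

In practice, mu0 and sigma0 would be measured during a short rest period with no muscle activity, as described in Section III-B3.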


Fig. 2. Methodology: (a) overview of the proposed alternate control strategy based on a WBSN and (b) default upper-body motions used with the proposed controller interface to emulate the functionalities of the 3D joystick control depicted in Fig. 1.

Fig. 3. (a) Illustration of the proposed multimodal interface strategy that can read the user's head and shoulder motion, as well as residual myoelectric activity, and translate them into control commands. (b) The proposed controller's architecture diagram describing the sensor nodes (IMU and sEMG-IMU types), the Safety-key connected by USB to the host platform and wirelessly to the network, and JACO, the controlled assistive device used as a testbed.

To operate it, any compatible adaptive switch that has a male jack connector output can be used. Since body motion and head position are used for control, the Safety-key provides a necessary safety feature in case of emergency or uncontrolled movements. Also, the assistive device under control should be disabled using the Safety-key prior to performing free body motions, in order to avoid undesired behavior. Finally, as discussed in Section III-B4, several sensors can be used at once depending on the type of assistive device, the number of DoFs, and the user's preferences. To guarantee robust and reliable control, the whole system is designed to address the requirements mentioned in Section II-A.

III. SYSTEM DESIGN & IMPLEMENTATION

The proposed BoMI and wearable sensor nodes must use a miniature architecture for comfort, and optimize power consumption to extend their lifetime. The next subsections describe the hardware and software elements.

A. Hardware Implementation

To provide the controller with both relevant inertial data and sEMG signals, enabling different smart and adaptive control strategies, two types of sensors have been designed, namely an IMU sensor node and a combined sEMG-IMU sensor node. As depicted in Fig. 3, the proposed interface also uses a Safety-key node for enabling or disabling the whole system, while the raw data coming from all nodes are transmitted to a base station and processed on the host platform.

1) IMU Sensor Nodes: IMU-type nodes (see Fig. 3(b)) integrate IMU sensing using the LSM9DS0 inertial sensor from STMicroelectronics, Switzerland, which provides a serial peripheral interface (SPI). The MSP430F5528 microcontroller unit (MCU) from Texas Instruments, USA, is used for its low-power performance, whereas data are sent wirelessly using the nRF24L01+ 2.4-GHz radio-frequency (RF) chip from Nordic Semiconductor, Norway, which employs a proprietary protocol designed to allow up to 6 pipelines (TX and RX). The sensor printed circuit board (PCB) is designed to fit inside a dedicated small head-mounted housing for convenience and comfort (see Section IV-A).

2) sEMG-IMU Sensor Nodes: sEMG-IMU-type sensors (Fig. 3(b)) use the same components as the IMU-type nodes for motion sensing, and also feature the ADS1291 as an analog front-end (AFE), a low-power 1-channel electrophysiological front-end integrated circuit (IC) from Texas Instruments, for sEMG measurement. The chip provides an SPI interface.
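For illustration, decoding one ADS1291 output word can be sketched as below. The sign extension follows the 24-bit two's-complement output format; the 2.42-V value is an assumption (the chip's typical internal reference), and the 12-V/V gain is the setting reported later in Section III-B1.

```python
# Sketch (assumed scaling, not from the paper): decoding a raw 24-bit
# two's-complement ADS1291 sample into an input-referred voltage.
def to_signed24(raw: int) -> int:
    """Sign-extend a raw 24-bit word (0..0xFFFFFF) to a signed integer."""
    return raw - (1 << 24) if raw & (1 << 23) else raw

def to_volts(raw: int, vref: float = 2.42, gain: float = 12.0) -> float:
    """Assumes full-scale code (2**23 - 1) corresponds to vref/gain at
    the input; vref = 2.42 V is the chip's typical internal reference."""
    return to_signed24(raw) * vref / (gain * ((1 << 23) - 1))
```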


The device has the shape of a tiny patch sensor that uses three 1.8-cm by 0.5-cm, 1-cm-spaced gold-plated dry electrodes at the bottom. All sensor nodes are powered using 3.7-V, 100-mAh, 1.5-cm × 1.2-cm rechargeable Li-ion batteries.
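As a back-of-envelope check (not the authors' measurement), the autonomy implied by this cell capacity can be estimated from the average current draws later reported in Table III; the figures below ignore radio current peaks, regulator efficiency and battery derating.

```python
# Rough autonomy estimate for a sensor node from battery capacity and
# the average mode currents of Table III (an approximation only).
def autonomy_hours(capacity_mah: float, avg_current_ma: float) -> float:
    return capacity_mah / avg_current_ma

sleep_h = autonomy_hours(100, 3.1)   # Sleep mode draws ~3.1 mA
imu_h = autonomy_hours(100, 9.4)     # Active IMU mode draws ~9.4 mA
```

This suggests roughly 32 h in Sleep mode and about 10.6 h of continuous IMU streaming, consistent with the up-to-24-h autonomy quoted in the abstract for duty-cycled use.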
Fig. 4. WBSN-based controller: power management and control safety.

3) Safety-Key Node: As mentioned in Section II, the proposed controller features a Safety-key which can be connected to the host by USB and/or connect wirelessly to the network, since it also features the nRF24L01+ transceiver. It is built using a commercial device, the Switch Click USB by AbleNet, USA. The switch is activated when the button is pressed and can emulate mouse or keyboard functions. It allows external switch accommodation via a jack connector, which is a crucial feature when flexibility is important. In fact, most adaptive switch devices have jack outputs, so AT users can adopt the technology that best fits their residual functionalities. The Safety-key node is used as an additional master node inside the sensor network (see Section III-B2), to handle safety and smart power management functions. An MSP430F5529 MCU (Texas Instruments) reads and broadcasts the switch state (PRESSED or RELEASED) within a status packet (STATUS) sent to every sensor node every 4 ms (250 Hz).

4) USB Base Station: The wearable sensors are connected with the base station through a body area network (BAN). The nodes are all independent from each other for flexibility and modularity, and the communication relies on a star topology. The base station features the TM4C123GH6PM Cortex-M4F MCU from Texas Instruments to gather the data from the multiple sensor nodes, handle the network communications and signaling, and transfer the data to the host platform. It is also used to program the sensors (i.e., download the firmware into the embedded MSP430 MCU) and for battery charging.

5) Host Platform: The control algorithm has been designed to run on both PC (Windows, Linux) and RPi (Raspbian, ARM Cortex-A53) platforms. When calibrating the system for a specific user, it is recommended to run the PC version, which allows for more extensive feedback through complete graphical features and data visualization. After calibration, control profiles are stored and can later be downloaded to the RPi platform. When the AT under control is JACO, the RPi is attached to the wheelchair and directly connected to the robotic arm through USB.

B. Software Architecture

Usability is the main gauge of a control system's success. In addition to the low-power, convenience and flexibility concerns that prevailed in the hardware architecture design phase, ease of use, intuitiveness and efficiency are among the key elements considered when designing the architecture and processing units of this BoMI. For safety, latency is very important as well, and should not exceed 300 ms for real-time operation, as recommended in [33]. The next subsections describe the software elements of the proposed controller, from the sensor firmware to the data fusion and control algorithms.

1) Sensor Nodes Firmware Architecture: Sensor nodes must minimize power consumption while guaranteeing an adequate motion sampling rate to support robust and precise control schemes. IMU data from the LSM9DS0 are provided in 16-bit resolution and sampled at a frequency of 100 Hz, which provides a suitable time resolution since the body motion frequency bandwidth is often below 10 Hz [34]. A 9D vector comprising the raw inertial data measurements, [ACC_xyz, GYRO_xyz, MAG_xyz], is sent from each sensor to the base station every 16 ms (62 Hz). The sEMG signal is sampled at a frequency of 1 kHz [35]. The ADS1291 is set to the maximum gain (12 V/V) and provides high-precision 24-bit samples, later compressed into 16-bit values in situ inside each node by the MCU, to reduce the amount of data to be transmitted. Thirteen samples are sent at a time, which corresponds to a 77-Hz transmission frequency.

2) Wireless Sensor Network & Power Management: As mentioned in Section III-A, the nRF24L01+ is used for the wireless data link of the proposed controller. It is a low-power RF transceiver operating in the 2.4-GHz ISM (Industrial, Scientific and Medical) frequency band, consuming less than 13 mA and 12.1 mA in RX and TX modes, respectively, and 26 μA in standby mode. Its architecture allows for up to 6 receiving channels. A star topology is used, and the base station receives the stream of measurement data sent by all the network's peripheral nodes (sensors + Safety-key). The payload size is 32 bytes and the data rate is set to 250 kbps (≈210 kbps effective, due to the headers of the RF chip's embedded baseband protocol engine (Enhanced ShockBurst)). Cyclic Redundancy Check (CRC) and ACK (acknowledgment) features are implemented in the network communications. The ACK feature is used for network bandwidth multiplexing. Indeed, when transmitting packets, nodes can retransmit a predetermined number of times (reTX) until new measurements become available. Thus, if no ACK packet is received from the base station, a delay shift of Δ_reTX is locally introduced by the embedded processor unit until the measurement packet is acknowledged. In IMU mode, the number of retransmissions (reTX) is set to 2, with a 3-ms delay (Δ_reTX), while in sEMG mode, those parameters are set to 2 and 1.5 ms for reTX and Δ_reTX, respectively. The Safety-key is used to enable the proposed WBSN-based controller when control occurs and to disable it otherwise (see Fig. 4).

The MCU remains in low-power mode by default until the next measurement sample needs to be retrieved through SPI from the sensor units (IMU or sEMG) or until a packet transmission/listening is required. In the latter case, the RF module transmits or receives data, or remains in standby mode (stdby) otherwise.
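The in-node sample compression and packetization described above can be sketched as follows. The paper states only that 24-bit samples are compressed to 16 bits and sent thirteen at a time; the specific scheme below (dropping the 8 least-significant bits) and the little-endian packing are illustrative assumptions.

```python
# Sketch (assumed scheme, not the authors' firmware) of the in-node
# 24-bit -> 16-bit sEMG sample compression and 32-byte payload packing.
import struct

def compress24to16(samples24):
    """Arithmetic right shift drops the 8 LSBs, keeping the 16 MSBs of
    each signed 24-bit sample (one plausible compression scheme)."""
    return [s >> 8 for s in samples24]

def build_payload(samples24):
    """Pack 13 compressed samples into one radio payload (<= 32 bytes)."""
    return struct.pack("<13h", *compress24to16(samples24))

payload = build_payload([1 << 12] * 13)   # 13 samples of value 4096
packet_rate_hz = 1000 / 13                # 1-kHz sampling, 13 samples/packet
```

Thirteen 16-bit samples occupy 26 bytes, fitting the 32-byte nRF24L01+ payload, and sending one packet per 13 samples at 1 kHz gives the ≈77-Hz packet rate stated in the text.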


TABLE I
SENSORS OPERATING MODES AND CONFIGURATIONS

Modes        | MCU          | RF  | IMU   | AFE+
Sleep        | CPU off      | RX* | stdby | pdwn
Active: IMU  | CPU off/on** | TX  | on    | stdby
Active: sEMG | CPU off/on   | TX  | stdby | on

+ Only in sEMG-IMU sensor nodes. * 2-Hz, 25% duty-cycle incoming packet listening. ** CPU alternately on to read input samples and transmit data, and off otherwise. CPU = Central Processing Unit.

In sEMG-IMU nodes, the sensor unit that is not being used is put in power-down mode (pdwn), and can later be woken up when necessary. As described in Table I and depicted in Fig. 4, the sensor network operates in two different modes: 1) a Sleep mode (the sensors' default mode at power-up), in which the sensor nodes remain idle and periodically (2 Hz with a 25% duty cycle) listen for incoming packets from the Safety-key to enter Active mode and start transmitting; and 2) an Active mode, entered for calibration purposes when the user performs a single click (OneClick) using the Switch Click USB or any jack-connector-compatible adaptive switch device (see Section III-A), or to activate control by holding the Safety-key down for more than 500 ms (HoldDown) once all calibration parameters are set (see Section III-B4). In Active mode, all sensor nodes are active and the host receives the stream of data until the next switch event (OneClick when in calibration, and HoldDown release when control is active); then the WBSN goes back to Sleep mode. Finally, a sensor in Active mode which does not receive an ACK within 500 ms turns back into Sleep mode, since the communication link might be lost or the control disabled.

3) Data Processing: All the data processing is done on the host platform. Voluntary muscle contractions are read from raw muscle activity signals when sEMG features are used, whereas motion is sensed from IMU data.

i) sEMG Processing: In order to use sEMG contractions to emulate the user buttons, sEMG-IMU nodes are placed over the target muscles with RFC and a robust onset detection is performed. This is done using the Teager-Kaiser Energy operator (TKE), which has previously been found very robust to noise and low signal-to-noise ratio (SNR) in several works [36], [37]. Muscular activity is sampled at a frequency of 1 kHz and band-pass filtered to remove noise [38] and motion artefacts below 20 Hz [39], using a 20-Hz to 450-Hz 4th-order Butterworth digital filter. The power-line interference is removed using a 4th-order 50-Hz to 55-Hz Butterworth band-stop digital filter [40]. The TKE is defined as follows:

Ψ[n] = x²[n] − x[n+1]·x[n−1],   (1)

where x is the filtered sEMG signal. Ψ_av is the real-time average TKE value, computed from a 120-sample window (N_win) with an overlap of 75 samples (N_ovrlp), and the threshold is determined by Ψ_th = μ_o + h·σ_o, where μ_o is the measured mean TKE value when no muscle activity is performed, σ_o is the corresponding standard deviation, and h is a preset variable [41]. A double-threshold trigger is used to provide robust detection. By default, Ψ_th^high and Ψ_th^low, the upper and lower levels of the double threshold, are set to 1.2 and 0.8 times Ψ_th, respectively, and can be adjusted and finely tuned, while h is set to 4.

ii) IMU Data Fusion: IMU data fusion has been widely explored, and different filter structures have been used to robustly compute sensor orientation angles free from inherent accelerometer noise and gyroscope drift effects [42]. As reported in [26], Kalman filtering has previously been implemented and compared with the complementary filter. The latter technique was found to provide lower processing requirements and similar dynamic response performance compared to the Kalman filter, when properly designed. This design thus uses the complementary filter. A parametric approach, described in Section III-B4.i, is used to select the complementary filter coefficients. The headset sensor is worn as depicted in Fig. 10. Pitch (P), Roll (R) and Yaw (Y) angles are computed with the complementary filter, according to (2), (3) and (4). Filter parameters, as well as the control algorithm, are set by adopting a user-centered design approach to guarantee usability in a typical control environment and everyday life:

P[n] = α_gyro · (P[n−1] − ω_z·Δt) + α_acc · ϕ[n]
R[n] = α_gyro · (R[n−1] + ω_y·Δt) + α_acc · ν[n]          (2)
Y[n] = (1 − α_mag) · (Y[n−1] + ω_x·Δt) + α_mag · γ[n]

ϕ[n] = atan2(−a_y[n], a_x[n])
ν[n] = atan2(−a_z[n], a_x[n])                              (3)
γ[n] = atan2(−m_z^h[n], m_y^h[n])

m_y^h = sin(P[n])·cos(R[n])·m_x − sin(P[n])·m_y + cos(P[n])·sin(R[n])·m_z
m_z^h = −sin(R[n])·m_x + cos(R[n])·m_z                     (4)

where P = Pitch, R = Roll and Y = Yaw; ω_i, a_i and m_i are the angular velocity, acceleration and magnetic field component measured along axis i (x, y, z); and α_gyro, α_acc and α_mag are the complementary filter coefficients.

4) User-Oriented Approach & Control Algorithm:
i) Design Approach: Since the control of external devices providing several DoFs is involved, robustness, safety and accuracy are crucial when building BoMIs based on head orientation measurement. The time response of the IMU motion sensors needs to be optimized; in other words, a maximum latency should be guaranteed between the moment a motion is performed and the time the corresponding output angle is provided by the sensor. The data fusion filter should guarantee a low static baseline noise, when no motion is performed, to avoid unsafe behaviour of the AT being controlled. Finally, the fusion algorithm should be robust enough to ensure a minimum stability when acceleration components other than gravity are measured; ideally, angle variations induced by linear translation (translation noise) should not transpire through Pitch, Roll and Yaw. A trade-off between these criteria is reached by properly setting α_gyro, α_acc and α_mag, the coefficients of the complementary filter, according to the parametric approach described below.
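As an illustration, the Pitch/Roll updates of (2)-(3) can be sketched as follows, using the coefficient values reported in Section III-B4.i (α_gyro = 92%, α_acc = 8%) and assuming a 10-ms step matching the 100-Hz IMU sampling rate; the function and variable names are illustrative, not the authors' implementation.

```python
# Sketch of the complementary-filter updates of Eqs. (2)-(3) for Pitch
# and Roll (assumed code; coefficients from Section III-B4.i, dt from
# the 100-Hz IMU sampling rate). Angles in radians, rates in rad/s.
import math

ALPHA_GYRO, ALPHA_ACC, DT = 0.92, 0.08, 0.01

def update_pitch(p_prev, omega_z, ax, ay):
    phi = math.atan2(-ay, ax)                          # accelerometer estimate
    return ALPHA_GYRO * (p_prev - omega_z * DT) + ALPHA_ACC * phi

def update_roll(r_prev, omega_y, ax, az):
    nu = math.atan2(-az, ax)
    return ALPHA_GYRO * (r_prev + omega_y * DT) + ALPHA_ACC * nu
```

With zero angular rate and a static accelerometer reading, repeated updates converge geometrically (ratio 0.92) toward the accelerometer angle, which is the drift-correction behaviour the filter is chosen for.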


TABLE II
DESIGN APPLICATION REQUIREMENTS FOR THE PROPOSED CONTROL STRATEGY BASED ON HEAD MOTION

Parameters                        | Requirements
Time constant                     | τ_c ≤ 130 ms
Baseline angular noise            | υ_B ≤ 1°
Robustness to linear acceleration | υ_T ≤ 10E-3 °/m/s²

Fig. 5. Examples of 3D control schemes which can be implemented with the proposed multimodal WBSN-based BoMI, depending on the user's RFCs. Head control, shoulder motion, residual muscular activity and mechanical switches are combined in 4 control scenarios (A, B, C and D) that are proposed with the multimodal control interface, using IMU and sEMG measurements.

Fig. 6. User calibration scenario and the corresponding transfer function. φ_R = 0°, φ_F = 90°, φ_L = 180°, φ_B = 270°, φ_D-margin = 20°, ρ_D-max = 1 and ρ_D-th = 0.3, except ρ_B-th = 0.2. The areas with blue outlines are the corresponding zones; V_x and V_y are the transfer functions along x and y, respectively.

The adopted parametric approach is used to systematically measure the time constant τ_c (63% of the step time response) introduced by the complementary filter, the baseline noise (υ_B) and the translation noise (υ_T) for different values of the filter parameters (α_gyro, α_acc and α_mag). The pertinence of adding low-pass filtering (LPF) at 10 Hz and 25 Hz for noise reduction, using a first-order finite impulse response (FIR) filter, is also evaluated. Table II summarizes the performance requirements that must be met for this application. The time constant of 130 ms is empirically determined, and noise should not exceed 10E-3 °/m/s² for υ_T and 1° of Pitch and Roll standard deviation for υ_B. Results show that a trade-off meeting the required performance (Table II) is achieved when setting α_gyro, α_acc and α_mag to 92%, 8% and 10%, respectively, with a 10-Hz LPF.

ii) Control Algorithm: The proposed controller allows for both 2-dimensional (2D) and 3D control. Due to its modularity, several control scenarios and sensor combination approaches can be implemented, using head and shoulder motions, muscular activities and RFCs in general. Examples of such scenarios are described in Fig. 5. By default, head inclination (Pitch and Roll) and rotation (Yaw) angles are mapped to the x, y and z axes of the replicated joystick control scheme (see Fig. 1).

During the calibration phase, the default orientation of the headset sensor is captured (P_o, R_o, Y_o) to retrieve the user's neutral position and apply the proper rotation. Then, for each joystick direction D along x (Right (R) and Left (L)) and y (Forward (F) and Backward (B)), only 4 parameters need to be calibrated: φ_D, the head inclination angle in the unit circle; φ_D-margin, the desired angle margin on both sides of φ_D; and ρ_D-max and ρ_D-th, the maximum and threshold head inclination magnitudes along direction D. To provide robust 2D control, P[n] and R[n] are fused into polar coordinates (ρ[n], φ[n]), according to (5) below, within a unit circle whose origin is set at the top center of the user's head:

ρ[n] = √(P²[n] + R²[n])
φ[n] = atan2(P[n], R[n]).                                  (5)

This polar fusion step is crucial, as it merges the two pieces of information into a single parameter whose components are uncorrelated and provide the head inclination magnitude (ρ ∈ [0; 1]) and the equivalent orientation angle (φ ∈ [0; 360°]). From a design point of view, this allows for higher flexibility and robustness. A 2D calibration scenario is depicted in Fig. 6 with the corresponding transfer functions translating polar coordinates into the corresponding output vector V = [V_x; V_y].

The third DoF provided by the proposed system can be derived from Y[n]. From the user's neutral position Y_o, a threshold head rotation magnitude Y_th is defined, and the two maximum rotations from the left and right sides of Y_o, Y_L-max and Y_R-max respectively, are defined (see Fig. 7). Even though the 3rd DoF provided by the Yaw angle is intuitive and complete, as it involves all head motion angles, it introduces a few limitations that must be solved, by performing additional data fusion, to guarantee safety, comfort and usability.

For the Yaw angle, the headset sensor worn by the user reads the heading direction. Variations are generated by heading rotation motions, which are mapped to control outputs.
Fig. 7. Vz transfer function along z. Calibration parameters Yo, Yth, YL-max and YR-max described in Section III-B4-ii) are depicted, and the corresponding Vz is plotted.

rotation motions which are mapped to control outputs. However, wheelchair rotation motions can transpire into the Yaw angle measurements and introduce undesired variations (ΔYw[n]). Thus, the Yaw angle becomes Y′[n] as described in (6):

    Y′[n] = Y[n] + ΔYw[n].        (6)

To address this error, an additional inertial sensor node, IMUc, attached to the user's wheelchair to measure a reference heading angle Yc[n], is used 1) to detect wheelchair rotation and prevent unsafe behaviour of the controlled assistive device, and 2) to measure the wheelchair rotation angle (ΔYw[n]) to retrieve Y[n] from Y′[n]. When ωxc, the angular velocity of the wheelchair returned by IMUc, is higher than a threshold noted ωthc, control using the Yaw angle is inhibited until ωxc becomes 0.

For JACO, in order to provide an intuitive control scheme independent of the user's position around the arm, the mapping of the 2D output vector (V = [Vx; Vy]) is made dynamic and adaptive by using an additional IMU sensor node attached to the robotic arm (IMUJ). The sensor provides a reference heading angle, YJ[n], which is compared to the Yaw angle issued by the headset sensor, Y[n], and the 2D output vector becomes V′ as described in (7):

    V′ = R(−ΔYJ) · V        (7)

where

    ΔYJ = Y[n] − YJ[n]        (8)

and

    R(−ΔYJ) = [  cos(ΔYJ)   sin(ΔYJ) ]
              [ −sin(ΔYJ)   cos(ΔYJ) ].        (9)

IV. MEASURED PERFORMANCE AND EXPERIMENTAL RESULTS

This section presents the performance of the proposed multimodal WBSN-based adaptive controller in terms of precision and autonomy. Usability is evaluated as well, by means of a standardized comparison with the joystick controller depicted in Fig. 1, involving up to 10 participants performing different control scenarios (see Section III-B3).

TABLE III
AVERAGE CURRENT CONSUMPTION OF THE SENSOR NODES IN THE DIFFERENT OPERATING MODES OF THE WBSN

Mode         | MCU    | RF     | IMU    | AFE+   | Total
Sleep        | 70 μA  | 3.1 mA | –      | –      | ≈3.1 mA
Active (IMU) | 0.7 mA | 1.3 mA | 7.4 mA | –      | ≈9.4 mA
Active (EMG) | 1.0 mA | 1.3 mA | –      | 1.2 mA | ≈3.5 mA

+ only present on sEMG-IMU nodes.

A. Measured Sensors Performance

The sensor node prototypes that have been implemented are depicted in Fig. 8. The IMU node lies on a 2.9-cm × 2.9-cm PCB designed to fit inside the headset accessory housing depicted in Fig. 10. The node that integrates both IMU and EMG measurements (sEMG-IMU node) is mounted on a 4.0-cm × 2.5-cm patch PCB. Gold-plated electrodes are located on the bottom side of the PCB. The network's RF link allows for up to a 20-m transmitting distance. Up to 6 sensor nodes can be used at once to implement different scenarios and provide multimodal data to meet the precision requirements of a robust control scheme. As described in Section III-B3, signal onset detection is performed on the collected sEMG signals and Euler angles are extracted from the IMU data. One rechargeable 3.7-V 100-mAh Li-ion battery is used to power up each sensor. In Sleep mode, the sensor node consumption drops down to ≈3.1 mA. The measured current consumption of the sensor nodes in both Sleep mode and Active mode is depicted in Fig. 9, while Table III provides a breakdown of the current consumption of each component. Considering a 6-h cumulative daily control duration using Control Scenario A described in Section III-B3 (1 IMU headset for 3D head motion control + 1 EMG for mode change + Safety Key), the proposed system confers a 24-h autonomy. Table IV summarizes the measured sensor nodes performance.

1) sEMG Detection: sEMG-IMU nodes require 3.1-mA on average to perform sEMG measurement. The measured input-referred noise is 6-μVrms. Fig. 8 shows a 7-second sEMG signal recorded from the flexor pollicis longus (see Fig. 10). The corresponding TKE signals (Ψ and Ψav), the calibrated thresholds (Ψth^high, Ψth^low) and the detection signal are depicted as well. By default, depending on the duration of the detected contraction, noted τon, the proposed controller can classify a detected sEMG contraction into two categories: 1) short contractions when τon ≤ τlong, and 2) long contractions when τon > τlong, where τlong equals 700-ms by default and can be set according to the user's preferences and RFCs (see Fig. 8). In this application, the detected sEMG contraction lengths are used to replicate the user buttons B1 (short contraction) and B2 (long contraction) respectively, to navigate into the robotic arm's control modes (see Section II).

2) IMU Measurement: When operating in IMU mode, sensors consume 9-mA on average. The measured time constant τc

Fig. 8. Wearable sensor implementations. (a) Internal view of the IMU sensor node plastic housing, (b) top and bottom views of the sEMG-IMU sensor node, and (c) sEMG detection algorithm output, Nwin = 120 and Novrlp = 75. (c-1) EMG signal from the right thumb flexor (see Fig. 10), (c-2) corresponding TKE signal Ψ, (c-3) corresponding average TKE signal Ψav with calibrated hysteresis thresholds (Ψth^high, Ψth^low), and (c-4) detection signal output. Short and long contractions are identified for τlong = 700-ms. PMU = Power Module Unit. *The LSM9DS0 is oriented as specified by the provided sensor frame.
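The detection chain illustrated in Fig. 8 — Teager–Kaiser energy, a windowed average, hysteresis thresholding, and the short/long split at τlong — can be sketched as follows. This is a minimal illustration, not the embedded implementation: the window parameters match the caption (Nwin = 120, Novrlp = 75), but the threshold values and the signal used below are assumptions, since the paper calibrates the thresholds per user.

```python
import numpy as np

def tke(x):
    """Teager-Kaiser energy: psi[n] = x[n]^2 - x[n-1] * x[n+1]."""
    psi = np.zeros_like(x, dtype=float)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

def detect_contractions(x, fs=1000, n_win=120, n_ovrlp=75,
                        th_high=2.0, th_low=1.0, tau_long=0.700):
    """Hysteresis onset/offset detection on the windowed-average TKE.

    Returns a list of (t_onset_s, duration_s, label) tuples, where the
    label is 'short' or 'long' depending on tau_long (700 ms by default).
    th_high/th_low are placeholder values; the paper calibrates them.
    """
    psi = tke(np.asarray(x, dtype=float))
    hop = n_win - n_ovrlp                       # 45-sample hop for 120/75
    starts = range(0, len(psi) - n_win + 1, hop)
    psi_av = np.array([psi[s:s + n_win].mean() for s in starts])

    events, active, onset = [], False, 0
    for i, v in enumerate(psi_av):
        if not active and v > th_high:          # rising edge of detection
            active, onset = True, i
        elif active and v < th_low:             # falling edge of detection
            dur = (i - onset) * hop / fs
            events.append((onset * hop / fs, dur,
                           'short' if dur <= tau_long else 'long'))
            active = False
    return events
```

On a synthetic trace with one ≈0.3-s and one ≈0.9-s burst, the two events come out labeled 'short' and 'long' respectively, replicating the B1/B2 button behaviour described in the text.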

equals 122-ms. The standard deviation of the static noise measured for Pitch, Roll and Yaw is 0.53°. The average standard deviation of the control angles, measured using the testbench characteristics described in Section III-B4, is 6E-3 °/(m/s²). The proposed architecture provides a precision of 1°.

Fig. 9. Measured current consumption, (a) in sleep mode for both IMU and sEMG-IMU nodes, (b) in active mode when operating in IMU mode, and (c) when operating in sEMG mode.

TABLE IV
MEASURED SENSOR NODES PERFORMANCE

PARAMETERS               | IMU mode         | EMG mode
Size                     | 2.9 × 2.9 [cm]*  | 4.0 × 2.5 [cm]**
RF
  Reach                  | 20-m             | 20-m
  Output power           | 0-dBm            | 0-dBm
  Data rate              | ≈210-kbps        | ≈210-kbps
  Packet size            | 32-bytes         | 32-bytes
  Transmission frequency | 62-Hz            | 77-Hz
Measurement
  Sampling frequency     | 62-Hz            | 1-kHz
  Precision              | ≤1°              | 16-bits
Power consumption
  Supply voltage         | 3.3-V            | 3.3-V
  RX (Sleep mode)        | 3.1-mA           | 3.1-mA
  TX (Active mode)       | 9.4-mA           | 3.5-mA
Autonomy
  Continuous             | 11 hours         | 33 hours
  6-h / 24-h (average)   | 24 hours         | ≥48 hours
Noise
  Input referred         | n.a.             | 6-μVrms
  Static noise (std)     | 0.53°            | n.a.
  Translation noise      | 6E-3 °/(m/s²)    | n.a.
  Time constant          | τc = 122-ms      | τsEMG = 45-ms

* provided for the IMU node.
** provided for the sEMG-IMU node.
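The head-orientation-to-command chain whose precision is measured above — the polar fusion of (5) and the heading-compensated remapping of (7)–(9) — can be sketched numerically. This is a minimal sketch, not the authors' firmware; it assumes Pitch and Roll have already been normalized to the calibrated unit circle and works in degrees:

```python
import math

def polar_fusion(p, r):
    """Eq. (5): fuse normalized Pitch/Roll into (rho, phi).

    rho is the head-inclination magnitude, clipped to [0, 1];
    phi is the equivalent orientation angle in [0, 360) degrees.
    """
    rho = min(1.0, math.hypot(p, r))
    phi = math.degrees(math.atan2(p, r)) % 360.0
    return rho, phi

def remap_2d(v, y_head_deg, y_arm_deg):
    """Eqs. (7)-(9): rotate the 2D output vector V = (vx, vy) by
    R(-dY), with dY = Y - Y_J, so the command frame follows the
    heading of the robotic arm rather than the user's seat."""
    d = math.radians(y_head_deg - y_arm_deg)       # eq. (8)
    vx, vy = v
    return (math.cos(d) * vx + math.sin(d) * vy,   # eq. (9), row 1
            -math.sin(d) * vx + math.cos(d) * vy)  # eq. (9), row 2
```

When the headset and arm headings coincide, the remapping is the identity; a 90° heading mismatch swaps and flips the command axes, which is exactly the adaptation that keeps the control intuitive around the arm.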
B. Validation Results With Participants

The proposed BoMI is compared with the joystick device depicted in Fig. 1. The 6 DoFs and finger control of the robotic arm are accessed through a 3D control by navigating into 3 control modes. The Test d'Évaluation des Membres Supérieurs de Personnes Âgées (TEMPA) setup is used to evaluate the performance of the proposed control interface in terms of task completion time, while performing daily life tasks with JACO. The test setup is depicted in Fig. 10.

Participants had to 1) move Object A from Position A0 to Position A1, inside the container cup (Task 1), 2) seize the bottle (Object B) and have a drink using a straw (Task 2), and 3) move the coffee container (Object C) from Position C0 to Position C1, prior to unscrewing and lifting its cap (Task 3). Since the last task required advanced control skills, it has been divided into two "sub-tasks", referred to as Task 3-1, which designates the step in which Object C is moved from Position C0 to Position C1, and Task 3-2, which refers to the complete task. JACO is in home position, its initial position (depicted in Fig. 11 of [20]), at the beginning of each task.
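The per-task scores reported below (mean ± standard deviation, and the relative completion-time overhead of the BoMI with respect to the joystick) reduce to simple statistics over the trial logs. A minimal sketch under hypothetical trial times — not the measured data of Tables V and VI:

```python
import statistics as st

# Hypothetical per-trial completion times in seconds (illustration only),
# keyed by (interface, task); three trials per condition, as in the protocol.
trials = {
    ("joystick", "Task 1"): [24.1, 22.8, 20.0],
    ("bomi",     "Task 1"): [31.0, 27.5, 25.2],
}

def mean_std(times):
    """Mean and sample standard deviation, the format used in Table V."""
    return st.mean(times), st.stdev(times)

def overhead(bomi_times, joystick_times):
    """Relative completion-time overhead of the BoMI vs. the joystick
    (e.g., 0.30 would correspond to the 30% average reported)."""
    return st.mean(bomi_times) / st.mean(joystick_times) - 1.0
```

With the illustrative numbers above, the overhead evaluates to roughly 25%, the same kind of figure as the 30% average overhead reported for Control Scenario A.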

Fig. 10. (a) Overview of the experimental test showing the user executing Task 1, (b) TEMPA setup, (c) sEMG patch sensor on the target muscles used to implement and test Control Scenario B, (d) headset sensor worn by the user and internal view of the IMU node inside the housing, and (e) RPi host with the USB base station connected.
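As a rough illustration of the host side shown in (e), the RPi can validate and parse the fixed-size 32-byte packets arriving from the USB base station. The paper only specifies the packet size and rates; the field layout, checksum scheme, and helper names below are entirely hypothetical:

```python
import struct

# Hypothetical 32-byte packet layout (assumption, not the paper's format):
# node id (u8), mode (u8: 0 = IMU, 1 = EMG), sequence number (u16 LE),
# payload (27 bytes), checksum (u8, two's-complement sum of bytes 0..30).
PKT = struct.Struct("<BBH27sB")  # total size: 1 + 1 + 2 + 27 + 1 = 32

def parse_packet(raw):
    """Validate length and checksum, then unpack the assumed fields."""
    if len(raw) != 32:
        raise ValueError("WBSN packets are 32 bytes")
    node, mode, seq, payload, chk = PKT.unpack(raw)
    if (sum(raw[:31]) + chk) & 0xFF != 0:
        raise ValueError("checksum mismatch")
    return {"node": node, "mode": "IMU" if mode == 0 else "EMG",
            "seq": seq, "payload": payload}

def make_packet(node, mode, seq, payload=b"\x00" * 27):
    """Build a packet with a valid checksum (for loopback testing)."""
    body = PKT.pack(node, mode, seq, payload, 0)[:31]
    chk = (-sum(body)) & 0xFF
    return body + bytes([chk])
```

At the reported 62-Hz (IMU) and 77-Hz (EMG) packet rates, such a parser runs comfortably on an RPi-class host for the up-to-six nodes the network supports.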

Fig. 11. Average execution time (mean + standard deviation) for Task 1 (a), Task 2 (b) and Task 3-2 (c), for the 5 participants, using the joystick device depicted in Fig. 1 (red) and Control Scenario B of the proposed BoMI (blue). Overall average execution times are plotted using dotted lines.

There was no timeout delay defined, and a task is never considered accomplished until the user manages to fully complete it. Task 1 to Task 3 are successively realized using one control interface (either the joystick or the BoMI) at a time. Control Scenario A of the proposed BoMI has been tested by ten able-bodied subjects (5 males and 5 females). Half of the participants (Group A - S1 to S5) first performed the tasks using the joystick controller and then the proposed BoMI, while the others (Group B - S6 to S10) did the opposite. Participants were given a 15-min practice before using each controller type. Control Scenario A (see Fig. 5) is the scenario performed by the 10 able-bodied subjects (Si, i = {1, ..., 10}). S1 and S2 also performed Control Scenario B, right after using Control Scenario A, to evaluate the use of sEMG as a navigation mode. To perform mode navigation in Control Scenario A, 2 buttons from a USB game controller are used. For Control Scenario B, mode navigation is performed using sEMG from the right finger flexor muscles of S1 and from the right wrist flexor muscle (flexor carpi radialis) of S2, by mapping sEMG contractions (short and long) to the user button functions (see Fig. 10). Thus, the obtained performance helped to evaluate usability by people who cannot control their arm motion but have remaining abilities in their fingers instead, and by individuals living with finger dexterity issues preventing them from precisely interacting with user buttons.

Table V provides the average execution times, for each task, for the 10 participants who performed the test using Control Scenario A of the proposed BoMI, and the joystick. Results measured using Control Scenario B, performed by S1 and S2, are reported as well. All participants were able to complete the experimentation. Fig. 11 provides a graphical comparative view of the average and the standard deviation of the individual performance of all participants for Scenario A. As can be seen, the proposed system outperformed the joystick controller in some cases. On average, the proposed BoMI allowed the users to perform the tasks almost as fast as with the joystick, with only a 30% overhead in completion time compared to the joystick. Group A (S1 to S5), who first familiarized themselves with JACO using the joystick controller, performed the tasks 5% faster than Group B (S6 to S10) on average while using the proposed BoMI. Results also show that, within only three attempts, participants were able to improve their performance by 17% using the proposed BoMI, vs 27% with the joystick controller.

Fig. 12 provides the individual average performance results of S1 and S2, using the joystick controller and Control Scenarios A and B. It is summarized in Table VI. It is shown that
TABLE V
EXPERIMENTAL RESULTS BY TASKS FOR ALL SUBJECTS

Si  | Task 1       | Task 2      | Task 3-1    | Task 3-2

Joystick Controller Device - (sec) mean ± std
S1  | 49.5 ± 9.6   | 31.4 ± 4.8  | 27.6 ± 1.4  | 104.7 ± 53.4
S2  | 27.0 ± 5.6   | 18.2 ± 1.1  | 23.6 ± 4.8  | 71.2 ± 18.7
S3  | 22.3 ± 1.1   | 25.9 ± 8.2  | 26.3 ± 3.6  | 92.3 ± 48.7
S4  | 47.1 ± 14.0  | 23.0 ± 1.1  | 30.0 ± 4.0  | 106.4 ± 33.8
S5  | 37.4 ± 7.3   | 17.5 ± 3.2  | 25.5 ± 5.7  | 103.3 ± 29.6
S6  | 92.9 ± 22.0  | 27.2 ± 3.8  | 29.1 ± 2.8  | 133.6 ± 10.7
S7  | 46.9 ± 17.3  | 23.8 ± 3.9  | 30.3 ± 1.8  | 104.1 ± 11.8
S8  | 59.7 ± 15.5  | 24.8 ± 1.4  | 28.5 ± 1.2  | 108.7 ± 23.4
S9  | 84.5 ± 7.9   | 29.8 ± 5.0  | 28.7 ± 5.3  | 116.6 ± 32.6
S10 | 64.9 ± 22.2  | 36.4 ± 4.5  | 36.9 ± 3.3  | 110.0 ± 9.9

Proposed BoMI (Scenario A) - (sec) mean ± std
S1  | 78.1 ± 9.8   | 71.4 ± 25.3 | 33.3 ± 10.5 | 106.6 ± 41.3
S2  | 49.4 ± 5.0   | 36.9 ± 5.2  | 39.3 ± 4.3  | 110.8 ± 15.2
S3  | 64.2 ± 7.1   | 60.9 ± 20.9 | 32.7 ± 7.0  | 104.5 ± 5.8
S4  | 55.0 ± 7.1   | 45.6 ± 8.0  | 56.0 ± 8.1  | 189.5 ± 39.3
S5  | 60.9 ± 9.4   | 40.2 ± 13.4 | 38.0 ± 4.9  | 107.1 ± 38.4
S6  | 103.0 ± 20.7 | 51.4 ± 15.0 | 38.7 ± 6.0  | 174.6 ± 9.5
S7  | 87.8 ± 29.3  | 40.7 ± 17.2 | 38.7 ± 4.2  | 125.4 ± 29.6
S8  | 74.4 ± 9.5   | 42.9 ± 10.1 | 30.1 ± 3.2  | 100.8 ± 18.7+
S9  | 103.2 ± 10.3 | 37.4 ± 12.2 | 31.0 ± 2.9  | 123.0 ± 15.1
S10 | 82.4 ± 32.4  | 40.2 ± 13.1 | 33.9 ± 7.0  | 100.6 ± 18.8

Proposed BoMI (Scenario B) - (sec) mean ± std
S1  | 70.2 ± 21.4  | 30.1 ± 1.3  | 36.5 ± 4.6  | 118.7 ± 20.6
S2  | 43.8 ± 4.8   | 29.6 ± 6.5  | 46.0 ± 3.9  | 121.0 ± 27.0

+ Better performance obtained with the proposed BoMI compared to the joystick.

Fig. 12. Average execution time for each task, for Subject 1 (blue) and Subject 2 (red), using the joystick controller and Control Scenarios A and B of the proposed controller.

TABLE VI
EXPERIMENTAL RESULTS BY CONTROL INPUT

Ctrl Input | Task 1      | Task 2      | Task 3-1   | Task 3-2
Joystick   | 38.2 ± 15.9 | 24.8 ± 9.3  | 26.5 ± 3.0 | 88.1 ± 23.9
Scenario A | 63.7 ± 20.3 | 54.2 ± 24.3 | 41.2 ± 6.7 | 108.8 ± 3.1
Scenario B | 57.0 ± 18.7 | 29.9 ± 0.28 | 36.6 ± 4.2 | 120 ± 1.4

they performed 80% as fast as with the joystick controller for Task 2, and performed 62% as fast as the joystick on average. Participants were able to realize all the different tasks using the proposed BoMI. Results demonstrate its learnability and usability, two characteristics that have been correlated with the intuitiveness of a gesture-based interface [43]. All the control scenarios tested demonstrated the suitable usability of the proposed BoMI and show its significance for people living with an upper-body disability who need to use a robot arm, but cannot manipulate a joystick.

V. CONCLUSION

A multimodal wireless BoMI with body and head motion control is proposed, for people living with upper-body disabilities, as an alternative to conventional control interfaces. The system uses a WBSN to read the users' RFCs, making it unobtrusive and flexible. The JACO robotic arm, a 6-DoF robotic arm manufactured by Kinova Robotics, Canada, is used to demonstrate usability, and specific requirements have been considered during the design process to allow a comfortable and safe utilization by people in powered wheelchairs. The proposed system is low-power and uses an embedded host based on a RPi, which makes it suitable for utilization in dynamic contexts. A comparison test with a joystick controller device has involved 10 able-bodied subjects who successfully performed ADLs using the proposed BoMI. Results show that the proposed prototype controller can 1) suit different disabilities by adapting to different control schemes, and 2) potentially contribute to increasing the accessibility of assistive devices, like robotic arms, to severely impaired people.

ACKNOWLEDGMENT

The authors would like to thank Kinova Robotics, Canada for their collaboration, C. Bérubé for her assistance in using the TEMPA test setup to evaluate the proposed BoMI, and M. Kritskaya-Eboua for her precious help in illustrating the body motions that were tested with the proposed BoMI.

REFERENCES

[1] L. Dark, Assistive Technologies for People With Diverse Abilities. New York, NY, USA: Taylor & Francis, 2014.
[2] V. Hanson and K. Odame, "Real-time embedded implementation of the binary mask algorithm for hearing prosthetics," IEEE Trans. Biomed. Circuits Syst., vol. 8, no. 4, pp. 465–473, Aug. 2014.
[3] A. Bhowmick and S. M. Hazarika, "An insight into assistive technology for the visually impaired and blind people: State-of-the-art and future trends," J. Multimodal User Interfaces, vol. 11, no. 2, pp. 149–172, 2017.
[4] D. T. Pawluk, R. J. Adams, and R. Kitada, "Designing haptic assistive technology for individuals who are blind or visually impaired," IEEE Trans. Haptics, vol. 8, no. 3, pp. 258–278, Jul./Sep. 2015.
[5] I. P. Ktistakis and N. G. Bourbakis, "Assistive intelligent robotic wheelchairs," IEEE Potentials, vol. 36, no. 1, pp. 10–13, Jan./Feb. 2017.
[6] D.-S. Vu, U. C. Allard, C. Gosselin, F. Routhier, B. Gosselin, and A. Campeau-Lecours, "Intuitive adaptive orientation control of assistive robots for people living with upper limb disabilities," in Proc. Int. Conf. Rehab. Robot., 2017, pp. 795–800.
[7] D. A. Bennett, S. A. Dalley, D. Truex, and M. Goldfarb, "A multigrasp hand prosthesis for providing precision and conformal grasps," IEEE/ASME Trans. Mechatronics, vol. 20, no. 4, pp. 1697–1704, Aug. 2015.
[8] H. Park et al., "A wireless magnetoresistive sensing system for an intraoral tongue–computer interface," IEEE Trans. Biomed. Circuits Syst., vol. 6, no. 6, pp. 571–585, Dec. 2012.
[9] A. M. Cook and J. M. Polgar, Assistive Technologies: Principles and Practice. Amsterdam, The Netherlands: Elsevier Health Sciences, 2014.
[10] Kinova Robotics, Canada. Picture: JACO arm mounted on a powered wheelchair. [Online]. Available: http://www.kinovarobotics.com/assistive-robotics/products/robot-arms/. Accessed on: Mar. 19, 2018.
[11] M. A. Jose and R. de Deus Lopes, "Human–computer interface controlled by the lip," IEEE J. Biomed. Health Inf., vol. 19, no. 1, pp. 302–308, Jan. 2015.
[12] H. Jiang et al., "A machine vision-based gestural interface for people with upper extremity physical impairments," IEEE Trans. Syst., Man, Cybern., Syst., vol. 44, no. 5, pp. 630–641, May 2014.
[13] H. Jiang, T. Zhang, J. P. Wachs, and B. S. Duerstock, "Enhanced control of a wheelchair-mounted robotic manipulator using 3-D vision and multimodal interaction," Comput. Vis. Image Understanding, vol. 149, pp. 21–31, 2016.
[14] S. Chandra et al., "Eye tracking based human computer interaction: Applications and their uses," in Proc. IEEE Int. Conf. Man Mach. Interfacing, 2015, pp. 1–5.
[15] A. S. Royer, A. J. Doud, M. L. Rose, and B. He, "EEG control of a virtual helicopter in 3-dimensional space using intelligent control strategies," IEEE Trans. Neural Syst. Rehab. Eng., vol. 18, no. 6, pp. 581–589, Dec. 2010.
[16] A. Rouse et al., "Spatial co-adaptation of cortical control columns in a micro-ECoG brain–computer interface," J. Neural Eng., vol. 13, no. 5, 2016, Art. no. 056018.
[17] C. E. Bouton et al., "Restoring cortical control of functional movement in a human with quadriplegia," Nature, vol. 533, no. 7602, pp. 247–250, 2016.
[18] A. Ubeda, E. Ianez, and J. M. Azorin, "Wireless and portable EOG-based interface for assisting disabled people," IEEE/ASME Trans. Mechatronics, vol. 16, no. 5, pp. 870–873, Oct. 2011.
[19] Q. Huang et al., "An EOG-based human–machine interface for wheelchair control," IEEE Trans. Biomed. Eng., 2017.
[20] C. L. Fall et al., "Wireless sEMG-based body–machine interface for assistive technology devices," IEEE J. Biomed. Health Informat., vol. 21, no. 4, pp. 967–977, Jul. 2017.
[21] U. Côté-Allard, C. L. Fall, A. Campeau-Lecours, C. Gosselin, F. Laviolette, and B. Gosselin, "Transfer learning for sEMG hand gestures recognition using convolutional neural networks," in Proc. 2017 IEEE Int. Conf. Syst., Man, Cybern., 2017, pp. 1663–1668.
[22] H. Park et al., "A wireless magnetoresistive sensing system for an intraoral tongue–computer interface," IEEE Trans. Biomed. Circuits Syst., vol. 6, no. 6, pp. 571–585, Dec. 2012.
[23] L. N. A. Struijk, L. L. Egsgaard, R. Lontis, M. Gaihede, and B. Bentsen, "Wireless intraoral tongue control of an assistive robotic arm for individuals with tetraplegia," J. NeuroEng. Rehab., vol. 14, no. 1, p. 110, 2017.
[24] D. Johansen, C. Cipriani, D. B. Popović, and L. N. Struijk, "Control of a robotic hand using a tongue control system—A prosthesis application," IEEE Trans. Biomed. Eng., vol. 63, no. 7, pp. 1368–1376, Jul. 2016.
[25] D. Kupetz, S. Wentzell, and B. BuSha, "Head motion controlled power wheelchair," in Proc. IEEE 36th Annu. Northeast Bioeng. Conf., 2010, pp. 1–2.
[26] C. L. Fall et al., "Intuitive wireless control of a robotic arm for people living with an upper body disability," in Proc. IEEE 37th Annu. Int. Conf. Eng. Med. Biol. Soc., 2015, pp. 4399–4402.
[27] J. Musić, M. Cecić, and M. Bonković, "Testing inertial sensor performance as hands-free human-computer interface," WSEAS Trans. Comput., vol. 8, no. 4, pp. 715–724, 2009.
[28] M. N. Sahadat, A. Alreja, and M. Ghovanloo, "Simultaneous multimodal PC access for people with disabilities by integrating head tracking, speech recognition, and tongue motion," IEEE Trans. Biomed. Circuits Syst., vol. 12, no. 1, pp. 192–201, Feb. 2018.
[29] K. D. Katyal et al., "HARMONIE: A multimodal control framework for human assistive robotics," in Proc. 2013 6th Int. IEEE/EMBS Conf. Neural Eng., 2013, pp. 1274–1278.
[30] W. Wahlster, SmartKom: Foundations of Multimodal Dialogue Systems, vol. 12. New York, NY, USA: Springer, 2006.
[31] C. L. Fall et al., "A multimodal adaptive wireless control interface for people with upper-body disabilities," in Proc. 2017 IEEE Int. Symp. Circuits Syst., 2017, pp. 1–4.
[32] A. Campeau-Lecours et al., "JACO assistive robotic device: Empowering people with disabilities through innovative algorithms," in Proc. Rehab. Eng. Assistive Technol. Soc. North Amer. Conf., 2016. [Online]. Available: https://pdfs.semanticscholar.org/4e8e/872c5df70b961545aa0e9301ddd89c30b94b.pdf
[33] B. Hudgins, P. Parker, and R. N. Scott, "A new strategy for multifunction myoelectric control," IEEE Trans. Biomed. Eng., vol. 40, no. 1, pp. 82–94, Jan. 1993.
[34] H. Zeng and Y. Zhao, "Sensing movement: Microsensors for body motion measurement," Sensors, vol. 11, no. 1, pp. 638–660, 2011.
[35] A. Yousefian, S. Roy, and B. Gosselin, "A low-power wireless multichannel surface EMG sensor with simplified ADPCM data compression," in Proc. 2013 IEEE Int. Symp. Circuits Syst., 2013, pp. 2287–2290.
[36] S. Solnik, P. DeVita, P. Rider, B. Long, and T. Hortobágyi, "Teager–Kaiser operator improves the accuracy of EMG onset detection independent of signal-to-noise ratio," Acta Bioeng. Biomech., vol. 10, no. 2, pp. 65–68, 2008.
[37] B. Gosselin and M. Sawan, "An ultra low-power CMOS automatic action potential detector," IEEE Trans. Neural Syst. Rehab. Eng., vol. 17, no. 4, pp. 346–353, Aug. 2009.
[38] R. Merletti and P. Di Torino, "Standards for reporting EMG data," J. Electromyograph. Kinesiol., vol. 9, no. 1, pp. 3–4, 1999.
[39] E. A. Clancy, S. Bouchard, and D. Rancourt, "Estimation and application of EMG amplitude during dynamic contractions," IEEE Eng. Med. Biol. Mag., vol. 20, no. 6, pp. 47–54, Nov./Dec. 2001.
[40] C. J. De Luca, L. D. Gilmore, M. Kuznetsov, and S. H. Roy, "Filtering the surface EMG signal: Movement artifact and baseline noise contamination," J. Biomechanics, vol. 43, no. 8, pp. 1573–1579, 2010.
[41] D. Yang, H. Zhang, Y. Gu, and H. Liu, "Accurate EMG onset detection in pathological, weak and noisy myoelectric signals," Biomed. Signal Process. Control, vol. 33, pp. 306–315, 2017.
[42] W. T. Higgins, "A comparison of complementary and Kalman filtering," IEEE Trans. Aerosp. Electron. Syst., vol. AES-11, no. 3, pp. 321–325, May 1975.
[43] M. Nielsen, M. Störring, T. B. Moeslund, and E. Granum, "A procedure for developing intuitive and ergonomic gesture interfaces for HCI," in Proc. 5th Int. Workshop Gesture Sign Lang. Based Human-Comput. Interaction, 2003, pp. 409–420.

Cheikh Latyr Fall received the Master's degree in automation and electrical engineering from the Institut National des Sciences Appliquées, Toulouse, France, in 2013. He is currently working toward the Ph.D. degree in electrical engineering at the Biomedical Microsystems Laboratory, Laval University, QC, Canada. His main research interests include assistive technologies, rehabilitation robotics, human–machine interfaces, wireless technologies, body sensor networks, and biomedical instrumentation.

Francis Quevillon will receive the B.Ing. degree in electrical engineering from Laval University, Quebec, Canada, in 2018. He is currently a student in his final undergraduate year of study.

Martine Blouin received the B.Eng. degree in biomedical engineering from Polytechnique Montreal, Montreal, QC, Canada, in 2012 and the M.Sc.A. degree in engineering with a specialisation in health technology from the École de Technologie Supérieure, Montreal, Canada, in 2014. She is currently a Robotic Control Engineer at Kinova, Boisbriand, Canada. She provided technical expertise and support on programming and interfacing with the Kinova JACO robot during this project. Her research interests include control and servoing, rehabilitation robotics, mathematical modeling, and biomedical instrumentation.
Simon Latour was born in Canada in 1981. He received the Bachelor's degree in electrical engineering, with a specialization in medical technologies, in 2009. From 2005 to 2010, he worked for Telemedic, where he did research on the integration of pulse oximetry sensors for portable devices and worked on the implementation of communication interfaces for medical devices. In 2010, he joined Kinova, where he developed a robot control interface for the assistive market. Following that, he was involved in the development of the next generations of robots. Over the years, he has developed a strong expertise in robotics for the assistive market. He is currently a Product Manager for Assistive Robotics at Kinova Robotics, where he is in charge of the robotic products applied to the assistive market.

Alexandre Campeau-Lecours received the B.Eng. degree in mechanical engineering (mechatronics) from the École Polytechnique de Montréal, Montréal, QC, Canada, in 2008, and the Ph.D. degree from Université Laval, QC, Canada, in 2012. From 2012 to 2015, he was with Kinova as a Research and Development Project Manager in control and robotic algorithms. He is currently an Assistant Professor with the Department of Mechanical Engineering, Université Laval. His research interests include physical human–robot interaction, development of assistive technologies for people living with disabilities and the elderly, medical applications, and intelligent robotic algorithms.

Benoit Gosselin received the Ph.D. degree in electrical engineering from École Polytechnique de Montréal, Montreal, QC, Canada, in 2009. He was an NSERC Postdoctoral Fellow at the Georgia Institute of Technology in 2010. He is currently an Associate Professor with the Department of ECE, Université Laval, where he is heading the Smart Biomedical Microsystems Lab. His research interests include wireless microsystems for brain–computer interfaces, analog/mixed-mode and RF integrated circuits for neural engineering, interface circuits of implantable sensors/actuators, and point-of-care diagnostic microsystems for personalized healthcare.
Dr. Gosselin is an Associate Editor of the IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS and he is the Chair and a Founder of the Quebec IEEE CAS/EMB Chapter (2015 Best New Chapter Award). He served on the committees of several international conferences such as IEEE ISCAS, IEEE NEWCAS, BioCAS, and IEEE EMBC/NER. His research interests also include developing innovative microelectronic platforms to collect and study brain activity through advanced multimodal bioinstrumentation and actuation technology. In addition to earning several awards, such as the 2017 IEEE ISCAS Best Live Demonstration Award, his contributions led to the commercialization of the first wireless bioelectronic implant to combine optogenetics and large-scale brain monitoring capabilities within a single device, with his partner Doric Lenses Inc.

Clément Gosselin received the B.Eng. degree in


mechanical engineering from the Université de
Sherbrooke, QC, Canada, in 1985, and the Ph.D.
degree from McGill University, Montréal, QC, in
1988. He was then a Postdoctoral Fellow at INRIA
in Sophia-Antipolis, France, in 1988–1989. In 1989,
he was appointed by the Department of Mechanical Engineering, Université Laval, QC, where he has been a Full Professor since 1997. He has held a Canada Research Chair in Robotics and Mechatronics since January 2001. He was a Visiting Researcher
with the RWTH in Aachen, Germany, in 1995, at the University of Victoria,
Canada, in 1996 and at the IRCCyN in Nantes, France, in 1999.
His research interests include kinematics, dynamics, and control of robotic
mechanical systems with a particular emphasis on the mechanics of grasping,
the kinematics, and dynamics of parallel manipulators and the development
of human-friendly robots. His work in the aforementioned areas has been the
subject of numerous publications in international journals and conferences as
well as of several patents and two books. He has been directing many research
initiatives, including collaborations with several Canadian and foreign high-
technology companies and he has trained more than 100 graduate students. He
is an Associate Editor of the IEEE ROBOTICS AND AUTOMATION LETTERS and
of the ASME Journal of Mechanisms and Robotics.
Dr. Gosselin received several awards including the ASME DED Mechanisms
and Robotics Committee Award in 2008 and the ASME Machine Design Award
in 2013. He was appointed Officer of the Order of Canada in 2010 for contri-
butions to research in parallel mechanisms and underactuated systems. He is a
fellow of the ASME and the Royal Society of Canada.
