

Human–Computer Interface Controlled by the Lip


Marcelo Archanjo Jose, Member, IEEE, and Roseli de Deus Lopes

Abstract: The lip control system is an innovative human–computer interface specially designed for people with tetraplegia. This paper presents an evaluation of the lower lip's potential to control an input device, according to Fitts' law (ISO/TS 9241-411:2012 standard). The results show that the lower lip throughput is comparable with the thumb throughput using the same input device under the same conditions. These results establish the baseline for future research studies on the lower lip's capacity to operate a computer input device.

Index Terms: Assistive technologies (ATs), Fitts' law, human–computer interaction, pointing devices, severe disabilities.

I. INTRODUCTION
We propose a new form of interaction, a human–computer interface controlled by the lower lip, for users with tetraplegia, and validate it with a prototype.
While restorative treatments for spinal cord injury (SCI) [1], [2] and invasive brain–machine interfaces [3], [4] are not available outside research labs, some technologies [5], [6] can be used by people with tetraplegia to interact with the world. Interaction must focus on the brain functions and muscles that users can still control. The selection of an assistive human–computer interface requires maximizing the flow of information and minimizing the effort (physical and mental) needed to use it [6], [7].
Current alternatives include noninvasive brain–computer interfaces (BCIs), eye tracking, electromyography (EMG), sip-and-puff, voice commands, chin control, head control, the mouth joystick, and tongue control.
Noninvasive BCI using electroencephalography (EEG) has two main types [6]: synchronous and asynchronous. A synchronous system [8], using the P300 to control a power wheelchair, requires 5 s on average to produce a highly reliable command, which is too slow for the continuous control required of a computer pointing device (more information about EEG transfer rates can be found in [9] and [10]). On the other hand, an asynchronous system using sensorimotor rhythms shows the possibility of controlling a computer cursor [11], [12], but more research will be needed to overcome its strong performance variability [6].
In general, eye tracking can be based on image processing [13], [14] or on electro-oculography [6], [15], which uses
electrodes to monitor eye movement. Any eye tracking system demands much user attention, and errors can occur because the eye may already have moved away from a command by the time it is selected. For a pointing device, this could be acceptable, but to control a power wheelchair, holding the eye position can be very tiring, and false commands are not acceptable.
Another possibility is EMG, which uses electrodes to monitor facial muscles to control a computer pointing device [6], [14]. An analysis of an EMG interface according to Fitts' law is given in [16]; that interface provides discrete directional movements (horizontal and vertical separately), not diagonal ones.
Sip-and-puff is an option for people with tetraplegia, mainly to control power wheelchairs [15], but it is usually difficult to operate and typically works only with four discrete directions.
Voice commands can be useful to access some computer and smartphone applications, but they are not adequate for directly controlling pointing devices. Some research studies use voice commands to control power wheelchairs [15], [17]. One way to improve the precision of a voice recognition system in noisy environments is lip-reading (speech reading), an image processing technology that identifies speech from lip images [18].
Chin control is one of the best options currently available for people with tetraplegia; it consists of a power wheelchair joystick adapted to be controlled by the chin [19]. Some systems provide a connection to a computer as a pointing device [20], [21]. A great advantage of the joystick is the possibility of soft and free movements in any direction. Chin control (also applicable to head control [15], [22] and the mouth joystick¹ [5]) depends on neck movements; the body must be fixed and the head must be able to move freely. Power wheelchairs provide this condition, but vibration while driving and body spasms (common in spastic tetraplegia) can generate false commands. Outside the wheelchair, the user has no control, because the apparatus depends on the wheelchair structure.
The advantage of using tongue control [23]–[25] is that the tongue is not controlled by the spinal cord; instead, it is controlled by the hypoglossal nerve, which is directly connected to the brain. One inconvenience of this type of interface is hygiene, because people with tetraplegia depend on someone to place a tongue piercing; in [25], an intraoral system is also necessary. Another restriction is that it allows only a few discrete directions. The tongue drive system (TDS) [23], [24] allows four directions (and two selection commands); the inductive tongue control system [25] allows eight directions (and 10 sensors for other commands).
The tongue and the mouth occupy a significant amount of motor cortex, comparable with the hands and the fingers [26], [27].

¹ A mouth joystick is a joystick with the stick adapted to be bitten.



Lip muscles are controlled by the facial nerve, which is directly connected to the brain. This is an important characteristic for people with SCI in the neck region, so an innovative human–computer interface using the lips, such as the one proposed in this paper, has excellent potential. The major contributions of this paper are the analysis, under the rigor of Fitts' law [28], [29], of using the lips to control a pointing device, and the comparison of the results with a common way (the thumb) of controlling the same device.
II. SCOPE
The proposed lip control system (LCS) is a human–computer interface with a headset and a joystick positioned in front of the lower lip. The studies carried out to develop the prototype showed that the lip control must be head mounted in order to capture the movements of the lower lip muscles. The joystick was chosen as the interaction method because it is easy to use, provides intuitive control, is compatible with the lip's movement, and is widely known and adopted in assistive technologies (ATs) [5]. Some other important characteristics of the LCS are as follows:
1) it is controlled by the lower lip (a dry, external body part), so it poses fewer hygiene issues;
2) it allows soft, free movement in any direction, as it is based on a joystick;
3) it is a personal system that can stay with the user in the wheelchair, chair, bed, etc.; and
4) it avoids false commands deriving from wheelchair vibration or body spasms, because it is head mounted.
An efficient human–computer interface is very important to improve the autonomy of people with tetraplegia, allowing the control of power wheelchairs, computers, smartphones, or other computerized appliances.
To evaluate the LCS as a computer input device, it was configured as a standard Bluetooth mouse (compatible with computers and smartphones), but with the purpose of also controlling power wheelchairs.
Computer input devices have been deeply studied [30]–[32], and there are effective methods to evaluate their interface efficiency, such as the widely used Fitts' law [28] (standardized in ISO/TS 9241-411:2012 [29], which revises ISO 9241-9:2000).
The main measure for comparing computer input devices is the throughput TP, in bits/s, from a human to a computer [30], calculated as

$$TP = \frac{ID_e}{MT} \qquad (1)$$

where ID_e is the effective index of difficulty of the task [23], [30], [32] and MT is the average movement time needed to execute it. ID_e is based on Shannon's formulation [33]:

$$ID_e = \log_2\left(\frac{D_e}{W_e} + 1\right) \qquad (2)$$

where D_e is the average effective distance between the point where the participant selects one target and the point where he selects the next target, and W_e is the effective width, defined as

$$W_e = 4.133 \times SD \qquad (3)$$

Fig. 1. (a) LCS architecture. (b) Head support. (c) Joystick support. (d) Calibration holes.

where SD is the standard deviation of the distance between the target center and the point at which the participant selects the target (4.133 is a constant). More detailed information can be found in [30]–[32].
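To make Eqs. (1)–(3) concrete, the following is a minimal sketch that computes De, We, IDe, and finally TP from the logged selection points of one block. It follows the definitions above (De from consecutive selection points; SD over the target-center-to-selection distances); the function and variable names are ours, and this is an illustration rather than the authors' test software.

```python
import math
import statistics

def throughput(selections, targets, movement_times_s):
    """Fitts' law throughput (bits/s) for one block, per Eqs. (1)-(3).

    selections: (x, y) points where the participant selected each target.
    targets: (x, y) centers of the corresponding targets.
    movement_times_s: movement time, in seconds, for each selection.
    """
    # De: average effective distance between consecutive selection points.
    de = statistics.mean(
        math.dist(a, b) for a, b in zip(selections, selections[1:]))

    # We = 4.133 * SD, where SD is the standard deviation of the
    # distances between each target center and the actual selection.
    offsets = [math.dist(s, t) for s, t in zip(selections, targets)]
    we = 4.133 * statistics.stdev(offsets)

    ide = math.log2(de / we + 1)                    # Eq. (2)
    return ide / statistics.mean(movement_times_s)  # Eq. (1)
```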
The LCS throughput, controlled by the lip, was measured to establish the baseline capacity of the lower lip to control a human–computer interface, but it is also important to understand whether this throughput is limited by the device. The LCS throughput controlled by the thumb (as in a gamepad) was therefore also measured, because this can be considered one of the best use conditions, near the device's throughput limit. Comparing these two throughputs shows how good the lower lip can be considered, relative to the thumb, at controlling the LCS. This is the reason why all the participants chosen for the tests are able-bodied.
III. LCS ARCHITECTURE AND IMPLEMENTATION
The LCS hardware consists of an Arduino Mega ADK development board, a Bluetooth module (Roving Networks RN-42 HID), and a thumb joystick [Fig. 1(a)]. The system was configured as a standard Bluetooth mouse with a human interface device (HID) profile; all communication occurs as with a standard Bluetooth mouse.
The LCS was designed specifically to be controlled by the lower lip; the current version is the ninth. The head support [Fig. 1(b)] evolved to provide the necessary stability during operation; the joystick support [Fig. 1(c)] evolved to be double and to provide calibration [see Fig. 1(d)] of length and angle, in order to set the joystick in the correct operating position (just touching the skin). Fitts' law multidirectional tests were done to choose the joystick response with the best throughput.
The full headset prototype has a mass of 158.8 g, including the joystick and the cable used to connect it. A USB cable was connected to the computer just to provide power during the tests (this prototype does not have batteries).
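The paper selects the joystick response by testing but does not give the transfer function; below is a minimal sketch of one plausible relative mapping from raw analog axis readings to the (dx, dy) deltas of an HID mouse report. The 10-bit ADC range, dead zone, and gain are hypothetical parameters, not values measured on the LCS.

```python
def joystick_to_mouse_delta(raw_x, raw_y, center=512, dead_zone=30, gain=0.02):
    """Map raw joystick readings (0..1023, hypothetical) to mouse deltas."""
    def axis(raw):
        offset = raw - center
        if abs(offset) <= dead_zone:
            return 0                  # ignore noise around the rest position
        # Shift past the dead zone so motion ramps up smoothly from zero.
        offset -= dead_zone if offset > 0 else -dead_zone
        return int(offset * gain)     # pixels to move in this report
    return axis(raw_x), axis(raw_y)   # (dx, dy) for one HID mouse report
```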
IV. METHODS AND MATERIALS
The tests' main objectives are to obtain the lip-controlled LCS throughput and to validate the lower lip's potential as a body part capable of controlling a human–computer interface. An important measure is the LCS throughput controlled by the thumb, compared with the lower lip throughput. To evaluate the LCS, tapping (point-and-select) tests were performed according to Fitts' law (ISO/TS 9241-411:2012 [29]).
Fig. 2. (a) Mouse. (b) Thumb-controlled LCS. (c) Lip-controlled LCS.

Fig. 3. (a) One-directional horizontal task screen. (b) One-directional vertical
task screen. (c) Multidirectional task screen.

The tests were previously approved by the Research Ethics Committee of the University of São Paulo, approval number 219927.
A. Participants
Twelve able-bodied volunteers (eight male, four female) took the tests; they were recruited among students of undergraduate and graduate programs. Participants ranged from 20 to 37 years of age (average = 28), weighed from 43 to 145 kg (average = 78 kg), and were 1.50 to 1.89 m tall (average = 1.75 m). All of them used computers for more than 6 h a day, but had no prior experience with the LCS. The joystick and the LCS headset were cleaned in front of each participant before the tests.
B. Apparatus
The test apparatus consisted of an HP Pavilion Dv7 notebook (AMD Turion II Dual-Core Mobile M600, 2.40 GHz, 4 GB of RAM, 17.3-in LCD, Windows 7 64 bits, 1600 × 900 resolution), a standard optical USB mouse (Bright model 0106, PAN3511 chip, used without a mouse pad), and the LCS. The software used to conduct the tests was written in Java (tasks) and Python (task launcher). During the tests, just these programs were running in the foreground and no network access was allowed, in order to avoid any background activity that could interfere with the processing time and produce unexpected results in the collected data.
C. Procedure
The tests occurred in a quiet room with just the researcher and one participant at a time. Three pointing devices were used: the mouse, the thumb-controlled LCS, and the lip-controlled LCS (see Fig. 2).
The mouse was selected to ensure that the apparatus could achieve the well-known throughput for this device, between 3.7 and 4.9 bits/s [23], [30], [34].
All twelve participants, in all sessions, followed the same test sequence: first the mouse, next the thumb-controlled LCS, and finally the lip-controlled LCS. This order runs from the most familiar device and use (the mouse), to an unfamiliar device (the LCS) controlled in a known way (the thumb), and, at the end, to the LCS controlled in an unfamiliar way (the lip). This exploits the learning effect [31] to make the test more reliable, since the lip-controlled LCS is a very different way of interacting for the participants.

For each device, one-directional horizontal, one-directional vertical, and multidirectional tapping tasks were performed, in that order (Fig. 3).
D. Dwell Time
The participants were asked to click with the mouse, but for the LCS (thumb or lip), instead of clicking, they were required to dwell within the target area [29]. The use of dwell time reduces the throughput [13] but isolates the cursor movement process. A 500-ms dwell time was chosen for the tests to make this work comparable with other similar Fitts' law research studies: [35] used 500 ms, [13] used 500 and 750 ms, and [23] used 560 ms; finding the best dwell time for the lip-controlled LCS will require additional research. The use of dwell time is important to isolate the muscular behavior of the lower lip in controlling a joystick, without introducing click interference.² An additional comparison was made between the LCS controlled by the thumb and by the lip, both under the same conditions, including the same dwell time.
E. Movement Time
For the mouse, the movement time is defined as the period from the button release on one target to the button release on the next target.
Since the LCS (both thumb-controlled and lip-controlled) uses dwell time, the movement time cannot include the dwell [23], [30]. Movement time was therefore defined as the period from the activation of the next target (at the end of the previous dwell) to the moment when the cursor enters the new target area (starting a new dwell), counted only if the cursor then stays inside the target area for the whole dwell time; the movement time continues if the cursor leaves the target before the dwell time ends. The x, y position is registered at both the start and the end of the movement time, in order to calculate the effective distance De and the effective width We (Fig. 4).
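A minimal sketch of this timing rule, assuming a stream of time-stamped cursor samples that starts at the activation of the target; the 500-ms constant matches the dwell time above, but the structure is ours, not the authors' Java task code.

```python
DWELL_S = 0.5  # 500-ms dwell used in the LCS tests

def measure_trial(samples, inside_target):
    """Return (movement_time_s, completed) for one LCS trial.

    samples: (t, x, y) cursor samples from the target's activation
             (the end of the previous dwell) onward.
    inside_target: (x, y) -> bool membership test for the target area.
    """
    t_start = samples[0][0]  # previous dwell ended: movement time starts
    entered = None           # time the cursor last entered the target
    for t, x, y in samples:
        if inside_target(x, y):
            if entered is None:
                entered = t  # candidate end of the movement time
            if t - entered >= DWELL_S:
                return entered - t_start, True  # full dwell: selection made
        else:
            entered = None   # left too early: movement time keeps running
    return None, False       # trial not completed within these samples
```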
F. Tasks
The procedure for the one-directional tapping tasks is to point at and select³ the rectangle on the left (or top, for the vertical task) indicated by a red + sign; after that, to point at and select the rectangle on the right (or bottom, for the vertical task); then to go back to the left (or top) one, and so on. This procedure is repeated 25 times [29] for each block.

² Using a finger to press a key would introduce the time needed to shift attention from the LCS to the keyboard, would use a very different muscular group, and is not an option for a person with tetraplegia.
³ A click for the mouse, or 500 ms of dwell time for the LCS.


Fig. 4. Dwell time and movement time representation for the LCS tests.


TABLE I
ONE-DIRECTIONAL D–W COMBINATION

TABLE II
MULTIDIRECTIONAL D–W COMBINATION


A block is a combination of the rectangle width W (in pixels) and the distance between targets D (in pixels), which defines the index of difficulty ID (in bits) [32]:

$$ID = \log_2\left(\frac{D}{W} + 1\right) \qquad (4)$$
We used four blocks for each task. Table I shows the combinations of D and W, as well as the resulting ID; during the tests, they appeared in random order. These values are the same as those used for the one-directional task in [23], but four D–W combinations with clearly distinct ID values were chosen.
The procedure for the multidirectional circular tapping task is to point at and select the circle indicated in red; after that, a circle diagonally opposite is indicated, and so on until all 15 circles of the block have been gone through [Fig. 3(c)]. Here, the data registration considers the 14 steps among the 15 circles.
Table II shows the combinations of D and W, as well as the ID; during the tests, they appeared in random order. These values are the same as those for the multidirectional task in [23], which used just three D–W combinations; the combination D = 305, W = 76 (ID = 2.33) was not used there. We prefer to keep this combination because it has a very distinct ID.
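As a quick arithmetic check of Eq. (4) for the combination retained here:

$$ID = \log_2\left(\frac{305}{76} + 1\right) = \log_2(5.01) \approx 2.33 \text{ bits}$$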
Many works use multidirectional circular tapping tasks; others use one-directional horizontal tapping tasks [30]. Here, we include a one-directional vertical task to verify whether a different muscular response is observed when the lower lip controls the LCS.
The participants were asked to balance speed and precision, moving the cursor as fast and as precisely as possible in all tasks.

G. Design

TABLE III
TEST DESIGN OVERVIEW
Each of the 12 participants performed the three tasks with each of the three devices (each one-directional task has 25 trials per block and the multidirectional task has 14 trials per block). Four blocks of D–W combinations were used for each task. All of this constitutes one session, and each participant was assigned five sessions (two on one day and three on a different day). Table III shows the whole test design. The independent variables and levels are as follows.
1) Tasks: {one-directional horizontal, one-directional vertical, multidirectional}.
2) Trials: {25 for the one-directional tasks, 14 for the multidirectional task}.
3) Blocks: {1, 2, 3, 4}.
4) Sessions: {1, 2, 3, 4, 5}.
5) Devices: {mouse, LCS thumb, LCS lip}.
The dependent variables were the effective width We, the effective distance De (both based on x and y screen positions), and the movement time MT (in milliseconds), all of which are necessary to obtain the throughput TP (bits/s). For the mouse there was, additionally, the error rate: when the participant clicked outside the target, an error was computed. When the participant used the LCS, there were no errors [13], [35], due to the dwell time.
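Multiplying out this design gives the volume of recorded data; the small sketch below uses only the counts stated in the text (25 trials per one-directional block, 14 per multidirectional block, four blocks, three devices, five sessions, twelve participants).

```python
# Trials recorded per session for ONE device, per the design above.
blocks = 4
per_device = 2 * blocks * 25 + blocks * 14       # 200 + 56 = 256 trials

devices, sessions, participants = 3, 5, 12
per_session = per_device * devices               # 768 trials per session
total = per_session * sessions * participants    # 46,080 trials overall
print(per_device, per_session, total)            # -> 256 768 46080
```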
We applied the comfort assessment defined in ISO/TS 9241-411 [29] to all the participants, for the lip-controlled LCS only, with the questions adapted to the body parts used to control the input device.
Before the first session, the participants had a 30-min training period with all the devices, executing all the tasks without data recording; during this time, they received all the instructions. This training used just 10 trials for the one-directional tasks instead of the 25 trials of the real test; the multidirectional training task had 14 trials, the same number as in the real test. On the second day, before starting the third session, the participants trained with the lip-controlled LCS for about 5 min, just to recall its operation.


TABLE IV
SESSIONS THROUGHPUT, AVERAGE, AND STANDARD DEVIATION (BITS/S)

Fig. 5. Average throughput and standard deviation for all participants.

Each test session took about 30 min; on the first day, each participant spent 30 min training and 1 h on the first two sessions, and on the second day they took 1 h 30 min for the last three sessions. This was done to avoid excessive mental or physical fatigue, reducing fatigue effects [31]. All the tests were conducted over four weeks.
V. RESULTS AND DISCUSSION
To compare the throughput measures of the three devices, the mean of means [30] was used. The general average throughput and standard deviation are shown in Fig. 5. The mouse had 4.70 bits/s for the one-directional horizontal task, 4.78 bits/s for the one-directional vertical task, and 4.95 bits/s for the multidirectional task; these values agree with many research works [23], [30], [34], which indicates that the apparatus is adequate.
The thumb-controlled LCS achieved 4.19 bits/s for the one-directional horizontal task, 3.97 bits/s for the vertical task, and 1.80 bits/s for the multidirectional task.
The lip-controlled LCS provides the main measure of this test; it had 2.59 bits/s for the one-directional horizontal task, 2.60 bits/s for the one-directional vertical task, and 1.06 bits/s for the multidirectional task. The one-directional tapping tasks show a similar throughput for horizontal and vertical movements, despite the different muscular behavior needed to move the joystick in these directions. An ANOVA confirms that the difference was not statistically significant (F(1,11) = 0.015, ns). This could be because there is no significant difference in human psychomotor behavior between horizontal and vertical movements of the lower lip, or because the hardware reaches its limits very quickly, reducing the importance of fine muscular movements.
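With one factor at two repeated-measures levels, the reported F(1,11) statistic is equivalent to the square of a paired t statistic, so the horizontal-versus-vertical comparison can be sketched as below. The throughput arrays are hypothetical placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Per-participant mean lip-controlled throughputs (bits/s); hypothetical.
horizontal = np.array([2.4, 2.7, 2.5, 2.6, 2.8, 2.3, 2.6, 2.5, 2.7, 2.6, 2.4, 2.9])
vertical   = np.array([2.5, 2.6, 2.6, 2.5, 2.8, 2.4, 2.6, 2.6, 2.6, 2.7, 2.4, 2.8])

# Paired t-test across the 12 participants; F(1, n-1) = t**2.
t, p = stats.ttest_rel(horizontal, vertical)
print(f"F(1,{len(horizontal) - 1}) = {t**2:.3f}, p = {p:.3f}")
```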
Table IV shows the average throughput of all the participants for each session, together with the general average and standard deviation. The lip-controlled LCS achieved a throughput comparable with those of other assistive technology research studies: the TDS presented in [23] achieved from 2.1 to 2.5 bits/s for the one-directional horizontal task, from 2.2 to 2.7 bits/s for the vertical task, and from 0.4 to 1.0 bits/s for the multidirectional task.
An important objective of this work is to understand how good an option the lower lip is for controlling an input device. That is the reason for controlling the LCS with the thumb and with the lip under the same conditions. The lip-controlled LCS reached a throughput of 1.06 bits/s for the multidirectional task, while the thumb-controlled LCS had a throughput of 1.80 bits/s (which can be considered near the device limit) for the same task. We can conclude that the lower lip achieved 59.2% of the thumb-controlled LCS throughput for the multidirectional task. Fig. 6 shows the percentages for each task.

Fig. 6. Lip/thumb throughput comparison percentage.

Fig. 6 shows a consistent percentage tendency across the three tasks, with a mean of 62.2% and a standard deviation of 3.2%. In short, the lower lip was able to control a human–computer interface, reaching 62.2% of the thumb throughput.
A. Tracing
The LCS uses a joystick to control the cursor, which permits soft movement, including diagonals. Fig. 7(a) shows the tracing of the multidirectional task for the thumb-controlled LCS and Fig. 7(b) shows the tracing of the multidirectional task for the lip-controlled LCS, both for D = 534 and W = 76, for participant #3, session 5 (the best result for the lip-controlled LCS).

Fig. 7. (a) Thumb-controlled LCS. (b) Lip-controlled LCS. (Tracing of the multidirectional task for the LCS, participant #3, session 5, with D = 534 and W = 76.)

The tracings in Fig. 7 show a similar behavior of the thumb and the lip with the LCS; the participant could explore diagonal movement in both cases. The individual throughputs in these tests were 2.71 bits/s for the thumb and 2.28 bits/s for the lip.


B. Comfort Assessment
The participants answered a questionnaire, defined in ISO/TS 9241-411:2012 [29], to assess the comfort of using the lip-controlled LCS; the questionnaire was adapted to cover the specific body parts used during the tests. The assessment consists of 12 questions rated from 1 to 7, where 7 is the most favorable answer.

Fig. 8. Lip-controlled LCS assessment results.

The assessment in Fig. 8 matches the verbal feedback given by the participants: some fatigue in the lips (4.17) and jaw (4.17), which affects the general comfort (3.83). The participants had no prior experience with the lip-controlled LCS, with the force necessary to control the joystick, or with the kinds of movements likely to cause muscular discomfort; regular use could strengthen the lip muscles, making the device more comfortable to use. We also observed that the participants used the jaw as a complement to the lower lip to move the joystick up and down. There was practically no complaint about the neck and shoulders.
The participants liked the operation speed (5.25) of the lip-controlled LCS, which helped it achieve a good rating (5.25) for overall device operation.
A participant with an orthodontic brace did not succeed in using the LCS with the lips and was replaced. This was an important observation made during the tests, showing that the lip movement over the teeth is intense.
VI. CONCLUSION AND FUTURE WORK


This paper presented an evaluation of the LCS according to Fitts' law (ISO/TS 9241-411:2012 standard). The tests showed the lower lip's potential to control an input device; the results showed viable throughputs (2.6 bits/s for the one-directional tasks and 1.06 bits/s for the multidirectional task) and, most important, the lower lip achieved 62.2% of the thumb throughput, confirming its potential to control human–computer interfaces.
These results encourage us to expand the use of the LCS to other applications (for instance, controlling a power wheelchair), to research the use of other lip-controlled input devices that may have better throughput than the joystick, and to develop a new input device specially designed to be controlled by the lower lip. We have two new ongoing works:
1) the development and testing of a new version of the LCS with a mini trackball instead of a thumb joystick; and
2) the evaluation of the LCS to control power wheelchairs.
ACKNOWLEDGMENT
The authors would like to sincerely thank S. MacKenzie for promptly sending the one-directional horizontal and multidirectional Java code, as well as information about Fitts' law, available on his website at http://www.yorku.ca/mack/index.html.
REFERENCES


[1] Y. Guo, L. Ma, M. Cristofanilli, R. P. Hart, A. Hao, and M. Schachner, "Transcription factor Sox11b is involved in spinal cord regeneration in adult zebrafish," Neuroscience, vol. 172, pp. 329–341, Jan. 2011.
[2] S. Thuret, L. D. Moon, and F. H. Gage, "Therapeutic interventions after spinal cord injury," Nat. Rev. Neurosci., vol. 7, no. 8, pp. 628–643, Aug. 2006.
[3] J. E. O'Doherty, M. A. Lebedev, P. J. Ifft, K. Z. Zhuang, S. Shokur, H. Bleuler, and M. A. L. Nicolelis, "Active tactile exploration using a brain-machine-brain interface," Nature, vol. 479, no. 7372, pp. 228–231, Nov. 2011.
[4] M. A. Nicolelis, "Actions from thoughts," Nature, vol. 409, no. 6818, pp. 403–407, Jan. 2001.
[5] H. A. Caltenco, B. Breidegard, B. Jonsson, and L. N. S. Andreasen Struijk, "Understanding computer users with tetraplegia: Survey of assistive technology users," Int. J. Human-Comput. Interaction, vol. 28, no. 4, pp. 258–268, Apr. 2012.
[6] C. G. Pinheiro, E. L. M. Naves, P. Pino, E. Losson, A. O. Andrade, and G. Bourhis, "Alternative communication systems for people with severe motor disabilities: A survey," Biomed. Eng. Online, vol. 10, no. 1, pp. 1–28, Jan. 2011.
[7] J. Abascal, "Users with disabilities: Maximum control with minimum effort," in Articulated Motion and Deformable Objects (Lecture Notes in Computer Science, vol. 5098). Berlin, Germany: Springer, 2008, pp. 449–456.
[8] B. Rebsamen, C. Guan, H. Zhang, C. Wang, C. Teo, M. H. Ang, and E. Burdet, "A brain controlled wheelchair to navigate in familiar environments," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 18, no. 6, pp. 590–598, Dec. 2010.
[9] J. R. Wolpaw, N. Birbaumer, W. J. Heetderks, D. J. McFarland, P. H. Peckham, G. Schalk, E. Donchin, L. A. Quatrano, C. J. Robinson, and T. M. Vaughan, "Brain-computer interface technology: A review of the first international meeting," IEEE Trans. Rehabil. Eng., vol. 8, no. 2, pp. 164–173, Jun. 2000.
[10] B. Obermaier, C. Neuper, C. Guger, and G. Pfurtscheller, "Information transfer rate in a five-classes brain–computer interface," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 9, no. 3, pp. 283–288, Sep. 2001.
[11] G. E. Fabiani, D. J. McFarland, J. R. Wolpaw, and G. Pfurtscheller, "Conversion of EEG activity into cursor movement by a brain–computer interface (BCI)," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 12, no. 3, pp. 331–338, Sep. 2004.
[12] D. J. McFarland, A. T. Lefkowicz, and J. R. Wolpaw, "Design and operation of an EEG-based brain-computer interface with digital signal processing technology," Behavior Res. Methods, Instrum., Comput., vol. 29, no. 3, pp. 337–345, Sep. 1997.
[13] X. Zhang and I. S. MacKenzie, "Evaluating eye tracking with ISO 9241 Part 9," in Proc. 12th Int. Conf. Human-Comput. Interaction, 2007, pp. 779–788.
[14] C. A. Chin, A. Barreto, and M. Adjouadi, "Integration of EMG and EGT modalities for the development of an enhanced cursor control system," Int. J. Artif. Intell. Tools, vol. 18, no. 3, pp. 399–414, Jun. 2009.
[15] M. Mazo, "An integral system for assisted mobility," IEEE Robot. Autom. Mag., vol. 8, no. 1, pp. 46–56, Mar. 2001.
[16] M. R. Williams and R. F. Kirsch, "Evaluation of head orientation and neck muscle EMG signals as command inputs to a human–computer interface for individuals with high tetraplegia," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 16, no. 5, pp. 485–496, Oct. 2008.
[17] G. Pacnik, K. Benkic, and B. Brecko, "Voice operated intelligent wheelchair (VOIC)," in Proc. IEEE Int. Symp. Ind. Electron., 2005, vol. 3, pp. 1221–1226.
[18] Z. Yi, L. Quan-jie, L. Yan-hua, and Z. Li, "Intelligent wheelchair multimodal human-machine interfaces in lip contour extraction based on PMM," in Proc. IEEE Int. Conf. Robot. Biomimetics, Dec. 2009, pp. 2108–2113.
[19] N. Pellegrini, B. Guillon, H. Prigent, M. Pellegrini, D. Orlikovski, J.-C. Raphael, and F. Lofaso, "Optimization of power wheelchair control for patients with severe Duchenne muscular dystrophy," Neuromuscular Disorders, vol. 14, no. 5, pp. 297–300, May 2004.
[20] H.-C. Chen, Y.-P. Liu, C.-L. Chen, and C.-Y. Chen, "Design and feasibility study of an integrated pointing device apparatus for individuals with spinal cord injury," Appl. Ergonomics, vol. 38, no. 3, pp. 275–283, May 2007.
[21] H.-C. Chen, C.-L. Chen, C.-C. Lu, and C.-Y. Wu, "Pointing device usage guidelines for people with tetraplegia: A simulation and validation study utilizing an integrated pointing device apparatus," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 17, no. 3, pp. 279–286, Jun. 2009.
[22] H. V. Christensen and J. C. Garcia, "Infrared non-contact head sensor for control of wheelchair movements," Assistive Technol.: Virtuality Reality, vol. 16, pp. 336–340, 2005.
[23] B. Yousefi, X. Huo, E. Veledar, and M. Ghovanloo, "Quantitative and comparative assessment of learning in a tongue-operated computer input device," IEEE Trans. Inf. Technol. Biomed., vol. 15, no. 5, pp. 747–757, Sep. 2011.
[24] X. Huo and M. Ghovanloo, "Using unconstrained tongue motion as an alternative control mechanism for wheeled mobility," IEEE Trans. Biomed. Eng., vol. 56, no. 6, pp. 1719–1726, Jun. 2009.
[25] M. E. Lund, H. V. Christiensen, H. A. Caltenco, E. R. Lontis, B. Bentsen, and L. N. S. Andreasen Struijk, "Inductive tongue control of powered wheelchairs," in Proc. IEEE 32nd Annu. Int. Conf. Eng. Med. Biol. Soc., 2010, pp. 3361–3364.
[26] X. Huo, J. Wang, and M. Ghovanloo, "A magneto-inductive sensor based wireless tongue-computer interface," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 16, no. 5, pp. 497–504, Oct. 2008.
[27] E. R. Kandel, J. H. Schwartz, and T. M. Jessell, Principles of Neural Science, 4th ed. New York, NY, USA: McGraw-Hill, 2000.
[28] P. M. Fitts, "The information capacity of the human motor system in controlling the amplitude of movement," J. Exp. Psychology, vol. 47, pp. 381–391, 1954.
[29] Ergonomics of Human-System Interaction - Part 411: Evaluation Methods for the Design of Physical Input Devices, ISO/TS Standard 9241-411, May 2012.
[30] R. W. Soukoreff and I. S. MacKenzie, "Towards a standard for pointing device evaluation, perspectives on 27 years of Fitts' law research in HCI," Int. J. Human-Comput. Studies, vol. 61, no. 6, pp. 751–789, Dec. 2004.
[31] I. S. MacKenzie, Human–Computer Interaction: An Empirical Research Perspective, 1st ed. San Mateo, CA, USA: Elsevier Morgan Kaufmann, 2013.
[32] I. S. MacKenzie, "Fitts' law as a research and design tool in human–computer interaction," Human-Comput. Interaction, vol. 7, pp. 91–139, 1992.
[33] C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., vol. 27, pp. 379–423, Jul. 1948.
[34] I. S. MacKenzie, T. Kauppinen, and M. Silfverberg, "Accuracy measures for evaluating computer pointing devices," in Proc. SIGCHI Conf. Human Factors Comput. Syst., 2001, pp. 9–16.
[35] I. S. MacKenzie and R. J. Teather, "FittsTilt: The application of Fitts' law to tilt-based interaction," in Proc. 7th Nordic Conf. Human-Comput. Interaction, 2012, pp. 568–577.

Marcelo Archanjo Jose (M'07) received the graduate degree in electrical engineering from the Universidade São Judas Tadeu, Brazil, in 1997, and the Master's degree in electrical engineering from the Escola Politécnica da Universidade de São Paulo (USP), Brazil, in 2008, where he is currently working toward the Ph.D. degree in electrical engineering.
In 1997, he joined Citibank and affiliates in Brazil. In 2006, he began working at Fidelity Processadora e Serviços, and in 2008 he rejoined Citibank. Since 2009, he has been a Researcher at NATE (Núcleo de Aprendizagem, Trabalho e Entretenimento) of the Laboratório de Sistemas Integráveis at POLI-USP. He works in research and development of hardware and software and heads his research teams, drawing on the personnel-management experience he obtained in the IT areas of Citibank and Fidelity. He specializes mainly in human–computer interaction, computer graphics, image processing, stereo vision, stereo matching, 3-D reconstruction, computer vision, accessibility (power wheelchairs), user-friendly interfaces, and games.
Mr. Jose has been a member of the Sociedade Brasileira de Computação since 1999.

Roseli de Deus Lopes received the Undergraduate, Master's, Doctorate, and Postdoctorate degrees in electrical engineering from the Escola Politécnica da Universidade de São Paulo, Brazil.
She is an Associate Professor in the Electronic Systems Department, School of Engineering, University of São Paulo (USP), and Vice-Chair of the Instrumentation Center of Interactive Technologies at USP. She was Vice-Chair (2006–2008) and Director (2008–Feb. 2010) of Estação Ciência, a center for scientific, technological, and cultural dissemination of USP. She has been a Researcher at the Laboratory for Integrated Systems since 1988, where she is a Principal Investigator of the Interactive Electronic Media Research Group (which includes research in computer graphics, digital image processing, techniques and devices for human–computer interaction, virtual reality, and augmented reality). She coordinates research projects in the area of interactive electronic media, with emphasis on applications related to education and health, as well as scientific dissemination initiatives and projects aimed at identifying and developing talents in science and engineering. She was responsible for the design and feasibility of FEBRACE (the Brazilian Fair of Science and Engineering), the largest national precollege science and engineering fair in Brazil, and since 2003 she has acted as its General Coordinator.
