
Motivation Points:

1. It offers a compelling proposition for BMI users controlling a robotic arm:
integrating AR feedback helps paralyzed people perform more accurate and
efficient grasp actions and improves their quality of life.
2. It demonstrates a practical example of applying AR in rehabilitation.

Introduction:
A Brain Machine Interface (BMI) is a communication system that allows a direct
connection between a human brain and a computer or other external device. It is
mainly designed to assist people with severe motor disabilities, helping them
re-establish communication and environmental control abilities.
Recent brain machine interface studies have shown that people with tetraplegia
can control robotic arms using signals recorded by intracortical electrodes.
However, controlling the object-grasping process is a complex task for BMI users.
High efficiency and accuracy are hard to achieve because the user lacks
sufficient feedback information to perform closed-loop control and must rely
exclusively on ordinary visual information. To compensate, AR feedback about the
gripper aperture is provided in real time, which closes the control loop and
enhances the accuracy and efficiency of the performed tasks.
The basic working principle of the system is that the user sits in front of a
monitor watching live video captured from the workspace while the robotic arm is
operated via EyeX and Emotiv. The robotic arm executes reaching, grasping,
delivering and releasing tasks in response to commands from the BMI user, and
augmented reality feedback is presented to the user on the computer screen during
object manipulation.
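As a rough illustration of this command flow, the minimal Python sketch below
strings the described phases together. GazeTracker, MotorImageryReader,
RoboticArm, and render_ar_overlay are hypothetical stand-ins for the EyeX
tracker, the Emotiv headset, the arm controller, and the AR rendering; they are
assumptions for illustration, not the authors' actual software.

```python
class GazeTracker:
    def object_under_gaze(self):
        # Placeholder: would return the workspace object the user is looking at.
        return "bottle"

class MotorImageryReader:
    def next_command(self):
        # Placeholder: would decode a motor-imagery command ("grasp", "rest", ...) from EEG.
        return "grasp"

class RoboticArm:
    def reach(self, target): print(f"reaching for {target}")
    def grasp(self): print("closing gripper")
    def deliver(self, destination): print(f"moving to {destination}")
    def release(self): print("opening gripper")

def render_ar_overlay(target):
    # Stand-in for drawing the aperture / contact-force overlay on the live video.
    print(f"AR overlay: gripper aperture and contact force drawn over {target}")

def run_trial(gaze, bmi, arm):
    """One reach-grasp-deliver-release sequence driven by gaze and motor imagery."""
    target = gaze.object_under_gaze()      # gaze-based object selection (EyeX-style)
    arm.reach(target)
    if bmi.next_command() == "grasp":      # motor-imagery command (Emotiv-style)
        arm.grasp()
        render_ar_overlay(target)          # AR feedback shown on the computer screen
    arm.deliver("target cap")
    arm.release()

run_trial(GazeTracker(), MotorImageryReader(), RoboticArm())
```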
Experimental Protocol:
1. The robotic arm starts in its initial position.
2. The user selects an object with the gaze point. When the gaze point rests
over an object, the object is highlighted in red; once the user decides to pick
that object, the highlight turns green.
3. The grasping process, in which the user decides when to stop closing the
gripper; without AR this decision relies on visual inspection alone.
4. AR is used to augment the feedback information in real time. When an
object has been grabbed, a simulated contact force is overlaid on the object
as two virtual arrows (see the sketch after this list).
5. When the grasp is completed, the user fixates their gaze on the target cap
and performs motor imagery to trigger the robotic arm to move to the target
cap automatically.
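The sketch below, referenced in step 4, shows one way the selection highlight
and the contact-force overlay could be decided. The red/green colors come from
the protocol above, but the millimetre values, thresholds, and function names
are illustrative assumptions rather than details from the paper.

```python
def highlight_color(gaze_on_object: bool, selection_confirmed: bool) -> str:
    """Step 2: red while the gaze rests on an object, green once it is selected."""
    if not gaze_on_object:
        return "none"
    return "green" if selection_confirmed else "red"

def ar_contact_overlay(gripper_aperture_mm: float, object_size_mm: float) -> str:
    """Step 4: once the gripper closes onto the object, show two virtual force arrows."""
    if gripper_aperture_mm <= object_size_mm:
        return "draw two virtual arrows (simulated contact force) over the object"
    return f"gripper still open at {gripper_aperture_mm:.0f} mm, no contact yet"

# Example: the gripper closing in discrete steps, as successive BMI commands would drive it
for aperture_mm in (80, 70, 62, 58):
    print(highlight_color(True, True), "|", ar_contact_overlay(aperture_mm, 60))
```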

Results:
In the experiments, one trial without AR feedback is followed by one trial with
AR feedback. The task is repeated 30 times: 15 times with AR feedback and 15
times with visual inspection only. Two indices are used to evaluate the
performance of the system:
1. Time used in the grasping process. The execution time for the grasping task
with AR feedback is generally shorter than without AR. When no AR feedback
is provided, it is hard for the user to clearly observe the state of the
grasping process from a single perspective, especially when the robotic arm
blocks the subject's view. Furthermore, in order to grasp the object tightly,
the user has to issue more commands through the BMI during the grasping
process without AR. With AR feedback, the state of the grasping process and
the simulated grasping force between the gripper and the object are shown in
real time, so the user can manage the process more easily.
2. The control error rate of the gripper aperture. It is defined as the ratio
of the difference between the object size and the gripper aperture to the
object size, as in the small example below.
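To make the definition concrete, here is a small sketch of the computation; the
60 mm and 66 mm values are invented for illustration and are not results from
the study.

```python
def aperture_error_rate(object_size_mm: float, gripper_aperture_mm: float) -> float:
    """Error rate = |gripper aperture - object size| / object size."""
    return abs(gripper_aperture_mm - object_size_mm) / object_size_mm

# Example: a 60 mm object grasped with the gripper stopping at 66 mm -> 10% error
print(f"{aperture_error_rate(60.0, 66.0):.1%}")
```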
