
ARMS

Automated Response Motion Sensor


Abstract - In this paper we present our approach for tracking and interpreting hand gestures and movements using the Microsoft Kinect for Windows. Given the skeletal tracking functionality of the Kinect, we are able to detect various body parts simultaneously. With this information we can specify particular positions and movements that a person can perform, and determine when they have been activated.

Introduction
In environments that are toxic or dangerous for humans, it is always best to send a robot. The main objective of this project is to develop a robotic arm controlled by familiar human gestures, to be used in situations or areas where we would rather not risk a life, for example when dealing with toxic or radioactive materials or disarming explosives. The arm will serve as a substitute for an actual person.
In this paper we will explain two main aspects of this project, as highlighted in the class diagram below: the StateManagement class and the ControlUnit class. The ControlUnit class manages and interprets gestures coming in from the Kinect and speaks directly to the hardware (the Kinect) via the API.
This information is then communicated to the communication class, which also manages information from the hardware section and relays this information to the StateManagement class. The purpose of the StateManagement class is to maintain accurate and up-to-date information on the state of the physical robotic arm; using this information, along with that from the ControlUnit, decisions are made on what to tell the arm to do.
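
Since the class diagram itself is not reproduced here, the following minimal C# sketch illustrates the relationship just described. Only the class names ControlUnit and StateManagement come from the paper (Communication stands in for the communication class); every method name is an assumption for illustration.

// Minimal sketch of the architecture described above. Only the class names
// come from the paper; the method names are assumptions.
public class StateManagement
{
    // Maintains an up-to-date model of the physical ARM's position.
    public void UpdateArmState(int commandNumber) { /* update stored state */ }
}

public class Communication
{
    private readonly StateManagement _state;

    public Communication(StateManagement state) { _state = state; }

    // Relays a command from the ControlUnit to the ARM hardware.
    public void SendCommand(int commandNumber) { /* write to the hardware link */ }

    // Called when the hardware echoes a command number back as an ACK.
    public void OnAcknowledgment(int commandNumber)
    {
        _state.UpdateArmState(commandNumber);
    }
}

public class ControlUnit
{
    private readonly Communication _comms;

    public ControlUnit(Communication comms) { _comms = comms; }

    // Interprets a recognized gesture and issues the matching ARM command.
    public void OnGestureRecognized(int commandNumber)
    {
        _comms.SendCommand(commandNumber);
    }
}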

Main Libraries Used

- Code4fun SDK toolkit
- Microsoft Kinect SDK
- Native C# libraries (for a WPF application)

View Modes
The main reason for the use of multiple view modes was to allow the user to monitor multiple conditions or states that must be maintained for the system to function effectively and efficiently. These view modes are controlled by the ControlUnit class.

Skeletal view mode

This view mode provides the user with real-time tracking information on what the system thinks the user is doing. If the user thinks that what is being interpreted by the Kinect is incorrect, the user may adjust himself to ensure proper tracking of his skeletal structure.

Distance view mode

This view mode shows the user their position relative to the Kinect sensor. It gives color-coded images of the user relative to the sensor; this helps the user stay within the acceptable ranges for the sensor to work at its best.

With this information we are able to determine when certain postures or movements occur. To do so we established some reference points on the body. These points include:
1. The Head
2. The Elbow
3. Shoulder center
4. Right shoulder
5. Left shoulder
6. Hip Center

A gesture is recognized based on the state of the tracked points being observed and how it is described in the code.
The image below shows a person's skeleton being tracked by the Kinect.
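
As a point of reference, this sketch shows how the tracked joint positions are typically obtained through the Kinect for Windows SDK (v1-era API); it assumes a connected sensor, and the handler body is an illustration rather than the paper's actual code.

using System.Linq;
using Microsoft.Kinect;

// Sketch: acquire tracked joints with the Kinect for Windows SDK (v1).
KinectSensor sensor = KinectSensor.KinectSensors
    .FirstOrDefault(s => s.Status == KinectStatus.Connected);
sensor.SkeletonStream.Enable();
sensor.SkeletonFrameReady += (sender, e) =>
{
    using (SkeletonFrame frame = e.OpenSkeletonFrame())
    {
        if (frame == null) return;
        var skeletons = new Skeleton[frame.SkeletonArrayLength];
        frame.CopySkeletonDataTo(skeletons);
        Skeleton user = skeletons
            .FirstOrDefault(s => s.TrackingState == SkeletonTrackingState.Tracked);
        if (user != null)
        {
            SkeletonPoint head = user.Joints[JointType.Head].Position;
            // head.X, head.Y and head.Z now hold the tracked position.
        }
    }
};
sensor.Start();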

Gesture Recognition (ControlUnit)

The Kinect is able to track the position of the joints on the body, as seen in the diagram below.

The points being observed may be the arm and the head; a simple gesture rule may say: if the hand's Y position is greater than that of the head, then the arm is above the head, hence a gesture has been activated and an operation must be done.
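
A minimal sketch of such a rule, assuming the right hand is the one being observed (the method name is an illustration, not the paper's code):

// Fires when the hand's Y position is greater than the head's Y position,
// i.e. the arm is raised above the head (skeleton-space Y points upward).
private static bool IsHandAboveHead(Skeleton skeleton)
{
    SkeletonPoint head = skeleton.Joints[JointType.Head].Position;
    SkeletonPoint hand = skeleton.Joints[JointType.HandRight].Position;
    return hand.Y > head.Y;
}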

This is where the data collected from the Kinect is communicated to the hardware (the ARM) and then executed.

The diagram below displays the distances and angles used to determine gestures. The information received from the Kinect is the X, Y and Z position of a given point.

State Management
Along with the potentiometer that keeps track of the state the robotic arm is in, the StateManagement class on the software side keeps track of the ARM's position. The interactions of the classes below describe how the ARM's state is managed, together with a general explanation of the ARM Control Protocol (ACP).

The quantities used to determine gestures are:

r - the distance between the head and the hand along the x axis.
y - the distance between the head and the hand along the y axis.
Angle - the angle between the hand and the head, which serves as depth information, i.e. how far out or in the hand is with respect to the head.
X, Y, Z - each point being tracked has an X, a Y and a Z (depth) value.

The Kinect measures distances in millimeters; to determine actions and gestures, thresholds and boundaries are set. For example, if the depth of the head is greater than 2000 mm then a state is activated.
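
A sketch of these quantities in code, again assuming the right hand. Note that skeleton-space coordinates in the SDK are reported in meters, so the millimeter threshold quoted above needs a conversion; that conversion is an assumption of this sketch.

SkeletonPoint head = skeleton.Joints[JointType.Head].Position;
SkeletonPoint hand = skeleton.Joints[JointType.HandRight].Position;

float r = hand.X - head.X;              // distance along the x axis
float y = hand.Y - head.Y;              // distance along the y axis
double angle = Math.Atan2(y, r);        // hand-to-head angle (depth cue)

// Example threshold from the text: head deeper than 2000 mm from the sensor.
bool stateActivated = head.Z * 1000f > 2000f;   // meters converted to mm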
Movement translation
1. A command is sent from the control unit to the communication module; each command has a specific command number.
2. That command is then relayed to the hardware.
3. The hardware then acknowledges the command by sending the command number back to the communication module.
4. That acknowledgment is then sent to the control unit.
5. The control unit then updates the state of the ARM via the state management module.
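
A sketch of one such exchange, assuming a serial link to the hardware (the paper does not specify the transport; the class and method names are illustrations):

using System.IO.Ports;

public class AcpLink
{
    private readonly SerialPort _port;   // assumed link to the ARM hardware

    public AcpLink(SerialPort port) { _port = port; }

    // Runs one movement-translation exchange for a single command.
    public bool ExecuteMovement(byte commandNumber)
    {
        _port.Write(new[] { commandNumber }, 0, 1);   // steps 1-2: send and relay
        int echoed = _port.ReadByte();                // step 3: hardware echoes the number
        return echoed == commandNumber;               // steps 4-5: the caller forwards the
                                                      // ACK and updates the ARM state
    }
}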

The StateManagement class is concerned with managing the state of the ARM under two conditions:
1. User Controlled
2. System Controlled
The diagram below shows how the state of the system is managed under a user-controlled condition.

The diagram below shows how the state of the system is managed under a system-controlled condition.

The state management module also takes into consideration state cases; these are scenarios that may occur which would cause the system to enter an inconsistent state, rendering the StateManagement class ineffective.
State Cases:
1. Unacknowledged Commands
2. Boundary State ACKs
3. Late ACKs
Each case must be managed in light of who is in control, i.e. the user or the system.
Unacknowledged Commands
1. User controlled movement
   a. Ignore until the number of unacknowledged commands becomes greater than 3.
   b. The user is then prompted to check the connection.
   c. After 3 consecutive successful ACKs the warning is removed.
2. System controlled movement
   a. After the first timeout the motion is canceled.
   b. The user is prompted to try again.
   c. After 3 consecutive unacknowledged commands, the user is prompted to check the connection.
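
A sketch of this policy in code; the thresholds of 3 come from the rules above, while the counters, method names, and prompt messages are assumptions.

// Sketch of the unacknowledged-command policy described above.
public class AckMonitor
{
    private int _missedAcks;
    private int _consecutiveOks;
    private bool _warningShown;

    public void OnCommandTimedOut(bool userControlled)
    {
        _missedAcks++;
        _consecutiveOks = 0;

        if (userControlled)
        {
            // User mode: ignore until more than 3 commands go unacknowledged.
            if (_missedAcks > 3 && !_warningShown)
            {
                _warningShown = true;
                Prompt("Check connection");
            }
        }
        else
        {
            // System mode: cancel the motion on the first timeout.
            CancelMotion();
            Prompt(_missedAcks >= 3 ? "Check connection" : "Try again");
        }
    }

    public void OnCommandAcknowledged()
    {
        _consecutiveOks++;
        // Clear the warning after 3 consecutive successful ACKs.
        if (_consecutiveOks >= 3)
        {
            _missedAcks = 0;
            _warningShown = false;
        }
    }

    private void Prompt(string message) { /* show message to the user */ }
    private void CancelMotion() { /* abort the in-progress movement */ }
}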

Boundary State ACKs
1. User controlled movement
   a. Inform the user that the motor limit has been reached.
   b. Transmission is paused during this state (redundancy implemented with the auto-correction algorithm on the hardware side).
2. System controlled movement
   a. The system is sent into an inconsistent state.
   b. The user is prompted to restart and reposition the arm in the default position.
   c. The system is reset to default values.
Late ACKs
Late ACKs are only considered in user mode. They are ignored until 3 or more late ACKs arrive that are below the correction threshold time for ACKs; the timeout is then adjusted by the mean excess time of those ACKs.

The correction threshold is the maximum value the timeout for the system ACKs can be set to without significantly affecting the performance of the system.
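
A sketch of this adjustment under one reading of the rule, where "adjusted by the mean excess time" is taken as extending the timeout by the mean of the observed excesses, capped at the correction threshold; all names here are assumptions.

using System;
using System.Collections.Generic;
using System.Linq;

public class LateAckTuner
{
    private const int RequiredLateAcks = 3;                  // from the rule above
    private readonly double _correctionThresholdMs;          // maximum allowed timeout
    private readonly List<double> _excessTimesMs = new List<double>();

    public double TimeoutMs { get; private set; }

    public LateAckTuner(double initialTimeoutMs, double correctionThresholdMs)
    {
        TimeoutMs = initialTimeoutMs;
        _correctionThresholdMs = correctionThresholdMs;
    }

    // Called with the round-trip time of an ACK that arrived after the timeout.
    public void OnLateAck(double ackTimeMs)
    {
        if (ackTimeMs >= _correctionThresholdMs) return;     // beyond threshold: ignored
        _excessTimesMs.Add(ackTimeMs - TimeoutMs);
        if (_excessTimesMs.Count >= RequiredLateAcks)
        {
            // Extend the timeout by the mean excess, capped at the threshold.
            TimeoutMs = Math.Min(TimeoutMs + _excessTimesMs.Average(),
                                 _correctionThresholdMs);
            _excessTimesMs.Clear();
        }
    }
}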
