Arms
Introduction
In environments that are toxic or dangerous for humans, it's always best to send a robot. The main objective of this project is to develop a robotic arm controlled by familiar human gestures, to be used in situations or areas where we would rather not risk a life, for example when dealing with toxic or radioactive materials or disarming explosives. The arm will serve as a substitute for an actual person.
In this paper we will explain two main aspects of this project, as highlighted in the class diagram below: the StateManagement class and the ControlUnit class. The ControlUnit class manages and interprets gestures coming in from the Kinect, and speaks directly to the hardware (the Kinect) via its API. This information is then communicated to the Communication class, which also manages information from the hardware section and relays it to the StateManagement class. The purpose of the StateManagement class is to maintain accurate and up-to-date information on the state of the physical robotic arm, using this information along with readings from the arm's potentiometer.
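The flow between these three classes can be sketched as below. Only the class names ControlUnit, Communication, and StateManagement come from the class diagram; every method, field, and the joint-angle representation are illustrative assumptions, not the project's actual API.

```python
# Minimal sketch of the class interaction described above.
# All method names and data shapes are assumptions for illustration.

class StateManagement:
    """Maintains an up-to-date model of the physical arm's state."""

    def __init__(self):
        self.joint_angles = {"shoulder": 0.0, "elbow": 0.0}

    def update(self, joint, angle):
        self.joint_angles[joint] = angle


class Communication:
    """Relays information between the hardware side and StateManagement."""

    def __init__(self, state):
        self.state = state

    def relay(self, joint, angle):
        self.state.update(joint, angle)


class ControlUnit:
    """Interprets gestures from the Kinect and forwards them as commands."""

    def __init__(self, comm):
        self.comm = comm

    def on_gesture(self, joint, angle):
        # In the real system the angle would be derived from Kinect
        # skeleton data obtained through the Kinect API.
        self.comm.relay(joint, angle)


state = StateManagement()
control = ControlUnit(Communication(state))
control.on_gesture("elbow", 45.0)
```

After `on_gesture` runs, `state.joint_angles["elbow"]` reflects the new position, mirroring how the StateManagement class stays in sync with the physical arm.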
View Modes
The main reason for using multiple view modes is to allow the user to monitor the multiple conditions or states that must be maintained for the system to function effectively and efficiently. These view modes are controlled by the ControlUnit class.
Head
Elbow
Shoulder Center
Right Shoulder
Left Shoulder
Hip Center
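The joints above are the skeleton points the system tracks. A sketch of filtering a skeleton frame down to just those joints follows; the dict-based frame layout is a hypothetical stand-in, not the actual Kinect SDK structure.

```python
# Joints the gesture recognizer cares about, from the list above.
TRACKED_JOINTS = [
    "Head",
    "Elbow",
    "ShoulderCenter",
    "ShoulderRight",
    "ShoulderLeft",
    "HipCenter",
]


def extract_tracked(skeleton_frame):
    """Keep only the tracked joints from a skeleton frame.

    `skeleton_frame` is assumed to be a dict mapping joint names to
    (x, y, z) positions in meters; the real Kinect SDK uses its own types.
    """
    return {j: skeleton_frame[j] for j in TRACKED_JOINTS if j in skeleton_frame}


frame = {
    "Head": (0.0, 0.6, 2.0),
    "HandRight": (0.3, 0.2, 1.9),
    "HipCenter": (0.0, -0.3, 2.0),
}
tracked = extract_tracked(frame)
# keeps Head and HipCenter, drops the untracked HandRight joint
```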
State Management
Along with the potentiometer that keeps track of the state the robotic arm is in, the StateManagement class on the software side will keep track of the arm's position.
x - Represents the distance between the head and the hand along the x axis.
y - Represents the distance between the head and the hand along the y axis.
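The two quantities above can be computed directly from the joint positions. The (x, y, z) tuple layout below is an assumption for illustration; the Kinect reports skeleton positions in meters.

```python
def hand_offsets(head, hand):
    """Distances between the head and the hand along the x and y axes.

    `head` and `hand` are assumed to be (x, y, z) positions in meters,
    as delivered by the skeleton tracker.
    """
    x = hand[0] - head[0]  # horizontal offset of the hand from the head
    y = hand[1] - head[1]  # vertical offset of the hand from the head
    return x, y


x, y = hand_offsets(head=(0.0, 0.5, 2.0), hand=(0.3, 0.1, 2.0))
# x ≈ 0.3 m (hand to the right of the head), y ≈ -0.4 m (hand below the head)
```

These offsets are what the ControlUnit can map to arm movements, since they describe where the operator's hand is relative to a fixed body landmark.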