
Palestine Technical University Khadoori (PTUK)

Faculty of Engineering and Technology

Electrical Systems Department

Name of the Project:

Controlling robot arm using Kinect

Name of students: Asim Turiaqi, Dawoud Sader,

Ammar Ashour, Islam Shraim

Supervisor: Dr. Mohammad Dreidy.

Graduation Project Submitted in Partial Fulfillment of the Requirements for the Degree

of Bachelor of Electrical and Mechatronics Engineering.

Tulkaram, Palestine
Year 2019

DEDICATION

This thesis is dedicated to: Allah, my Creator and my Master; my great teacher and messenger, Mohammed (may Allah bless him and grant him peace), who taught us the purpose of life; my homeland Palestine; the great martyrs and prisoners, the symbol of sacrifice; Khadoori University, my second magnificent home; my great parents, who never stop giving of themselves in countless ways; our friends and colleagues who stood with us through every obstacle; our dear supervisor Dr. Mohammad Dreidy; the dean of the Faculty of Engineering and Technology, Dr. Basem Al Sayed; and the head of the engineering department, Dr. Mohammed Nour, for their generous efforts in the service of
this project.

Abstract

Today, technology develops in line with rapidly increasing human needs. The work done to meet these needs makes life easier every day, and much of it is concentrated in robotics. In recent years, scientists have used the word "robot" to mean any man-made machine that can perform work or other actions normally performed by humans, either automatically or by remote control. Robots have become pervasive because of their accuracy, their ability to do things people cannot, and their capacity to work in dangerous regions where humans cannot. For all these reasons, scientists keep striving to make robots better by finding new controllers and designs that make them more efficient and reliable. In our project we have built a robot arm with 5 DOF (degrees of freedom). Several methods have been implemented to make a 5-DOF manipulator capable of performing pick-and-place operations, but the problem is that most controllers are inflexible: if we need to change the program of the arm, we have to rewrite it and upload it to the robot arm again, which is clearly not efficient. We therefore chose a different and unconventional method to control the robot arm, using an image-processing device called the Kinect.

Acknowledgments

Alhamdulillah, all praise and thanks to Allah for His guidance and benevolence. First, we would like to thank our supervisor Dr. Mohammad Dreidy for his good advice and guidance throughout this project, and for his suggestions and discussions of the results. Thanks also to him for helping us understand the robot system and for sharing information about the robot and the Kinect. We would also like to extend our special thanks to Palestine Technical University (Khadoorie) for providing academic and practical support to accomplish this project.

A special thanks to our parents, who stuck with us through our late working hours and restored our motivation when it was gone. Last but not least, a big thanks to our teachers and doctors for their support.

TABLE OF CONTENTS

Cover Page
DEDICATION
ABSTRACT
ACKNOWLEDGEMENT
Chapter 1: Introduction
1.1 Overview
1.2 Types of robots
1.2.1 Linear robots
1.2.2 Cylindrical robots
1.2.3 Parallel robots
1.2.4 Spherical robots
1.2.5 SCARA robots
1.2.6 Articulated robots
1.2.6.1 Robot arm
1.3 Project objectives
1.4 Project scope and methodology
Chapter 2: Hardware
2.1 Introduction
2.2 Electronic circuits
2.2.1 Arduino Mega
2.2.2 Servo motor
2.2.3 Power supply
2.2.4 Xbox Kinect
Chapter 3: Mechanical design
3.1 Robot arm mechanical design
3.1.1 Brackets assembly
3.1.2 Gripper assembly
3.1.3 Base assembly
Chapter 4: Software
4.1 Introduction
Chapter 5: Conclusion and limitations
5.1 Conclusion
5.2 Limitations
Chapter 6: Results and Testing
Appendix
Appendix (A1): Processing code
Appendix (A2): Arduino code
References

List of Figures:

Figure 1.1: Linear Robots
Figure 1.2: Cylindrical Robots
Figure 1.3: Parallel Robot
Figure 1.4: Spherical Robots
Figure 1.5: SCARA Robot
Figure 1.6: Articulated Robots
Figure 1.7: Scope and methodology of robot arm
Figure 2.1: The robot arm electronic circuit
Figure 2.2: Arduino Mega development board
Figure 2.3: Pulse width of servo motor
Figure 2.4: Servo motor
Figure 2.5: Power supply
Figure 2.6: Buck converter XL4016 board
Figure 2.7: Buck converter circuit diagram
Figure 2.8: Xbox 360 Kinect
Figure 2.9: Kinect adapter
Figure 3.1: Robot arm diagram
Figure 3.2: Servo brackets components
Figure 3.3: Servo bracket (side and front view)
Figure 3.4: Templates of the gripper
Figure 3.5: Assembled gripper
Figure 3.6: Assembling of servo horn on base
Figure 3.7: Assembling of aluminum piece on bearing
Figure 3.8: Assembling of U brackets
Figure 3.9: Attaching servos to the E brackets
Figure 3.10: A bracket
Figure 3.11: Double brackets
Figure 3.12: Robot arm after construction
Figure 4.1: Software block diagram
Figure 4.2: Processing algorithm
Figure 4.3: Arduino algorithm
Figure 5.1: Shoulder joint issue
Figure 6.1: Result and testing

Chapter 1: Introduction

1.1 Overview

A robot is a machine designed to execute one or more tasks automatically with speed and

precision. We need robots because they are often cheaper than human labor; in addition, it is easier to do some jobs using robots, and sometimes they are the only possible way to accomplish some tasks. Robots can explore inside gas tanks, inside volcanoes, travel the surface of Mars

or other places too dangerous for humans to go where extreme temperatures or contaminated

environments exist. Robotics is an interdisciplinary branch of engineering and science that

includes mechanical engineering, electrical engineering, computer science, and others.

Robotics deals with the design, construction, operation, and use of robots, as well as computer

systems for their control, sensory feedback, and information processing. Robotic systems have been widely used in manufacturing, the military, and surgery, since robots offer many advantages and can take over jobs that cannot be carried out well by humans. [1]

Robots are used in different fields such as industrial, military, space exploration, and

medical applications. These robots could be classified as manipulator robots and cooperate

with other parts of automated or semi-automated equipment to achieve tasks such as loading,

unloading, spray painting, welding, and assembling. Generally robots are designed, built and

controlled via a computer or a controlling device which uses a specific program or algorithm.

Programs and robots are designed in a way that when the program changes, the behavior of

the robot changes accordingly, resulting in a very flexible task-achieving robot. Robots are categorized by their generation, intelligence, structure, capabilities, application, and operational capabilities.

1.2 Types of robots

In this study robots are reviewed according to their structural properties:

1.2.1 Linear Robots

A robot whose linear actuators are linear motors linked to linear axes is known as a linear robot (also known as a gantry or Cartesian robot), as shown in figure 1.1. The link between the actuators and the robot can be a fixed or flexible connection, and the linear motor is attached directly to the linear axis. Robots which use two motors to control a linear axis are defined as gantry robots; each motor has a limited travel orthogonal to the linear axis. Ball-screw drives follow the same principles and use either linear or rotary motors. This kind of robot usually performs tasks such as palletizing, unitizing, stacking, order grasping, loading, and coordinate measuring. The manipulator (also known as the end-effector) of a linear robot is mounted overhead, which allows the robot to move easily along the horizontal plane, where each of these movements is perpendicular to the others.

Figure (1.1): Linear Robots

This type of robot has many advantages, such as high speed and stiffness, good performance, suitability for serving multiple machines and lines, and good handling of large loads. However, it has a large structural frame, complex mechanics for the linear sliding movements, energy inefficiency, a large floor-space requirement, a limited workspace, and common-workspace restrictions.

1.2.2 Cylindrical Robots

Cylindrical robots have two prismatic joints and one rotary joint for positioning tasks, and the end-effector of the robot sweeps out a cylindrical workspace. The main idea of the cylindrical robot is to mount a horizontal arm which moves in the forward and backward directions. The horizontal arm is linked to a carriage which goes up and down and is connected to the rotary base. Since the arm of the robot has one revolute and two prismatic joints, each point that can be reached by this robot can be represented in cylindrical coordinates. As shown in figure 1.2, the robot can move in and out in the 𝑧 direction, can elevate in the 𝑦 direction, and can rotate in the 𝜃 direction. The arm can move between specific upper and lower boundaries.

Figure (1.2): Cylindrical Robots

1.2.3 Parallel Robot

A parallel robot has an end-effector with 𝑛 DOF which is connected to a fixed base. The

connection is done by at least two independent kinematic chains which provide the

movements of the robot as shown in figure 1.3. A generalized parallel manipulator has a

closed-loop kinematic chain mechanism where the manipulator is linked to the base.

Figure (1.3): Parallel Robot

1.2.4 Spherical Robots

The spherical robot (also known as a polar robot) is large in size and has a telescopic arm. Its basic movements are rotation at the base and angular up-and-down motion of the arm. Spherical robots have at least two movable joints and one fixed joint. The motion of the spherical robot consists of the following three movements: the first is the rotation of the base about the vertical axis; the second is the rotation of the arm; and the third is the in-and-out motion of the arm. The workspace of the spherical robot depends on the volume of the sphere: it is the space between two concentric hemispheres. When the arm is fully retracted, the reach of the arm is the inner hemisphere, and when the arm is fully straightened, the reach is the outer hemisphere.

Figure (1.4): Spherical Robots

This type of robot has many advantages, such as light weight, simple kinematics, compatibility with other robots (especially ones in a common workspace), and good resolution due to the perpendicularity of the end-effector's errors. However, it needs variable torque due to its large size, counterbalancing is challenging, there is a chance of collision with obstacles due to its limited ability to avoid them, and it suffers large position errors due to the rotation and the proportional radius.

1.2.5 SCARA Robot

The Selective Compliance Assembly Robot Arm (SCARA) was first designed and built in Japan in the late 1970s. SCARA robots are ideal for applications which require high-speed, repetitive point-to-point movements, which is why SCARA robots are used widely in assembly operations. The special end-effector movement makes SCARA ideal for tasks which require uniform motion and acceleration along a circular path. A SCARA consists of two parallel rotary joints and a prismatic joint. The rotary joints move in the horizontal plane and the prismatic joint moves along the vertical axis. One special characteristic of SCARA is that the robot is compliant in the 𝑥 and 𝑦 axes but very rigid along the 𝑧 axis.

A SCARA arm is able to pick up a part vertically from a horizontally placed table, move along the horizontal plane to a desired point, and accomplish the assembly task by lowering the arm and placing the part at its proper location, as shown in figure 1.5.

Figure (1.5): SCARA Robot

1.2.6 Articulated Robots

Articulated robots (also known as revolute robots) have three or more rigid links connected by revolute joints to a rotating base, as shown in figure 1.6. All joints of an articulated arm are revolute, and the arm closely resembles the human arm.

A robotic arm is a typical example of an articulated robot. An important matter to consider is that the dimension of the configuration space increases with the number of joints; however, the operation speed is limited due to the different payloads at the manipulator and the nonlinear environment.

Figure (1.6): Articulated Robots

This type of robot has many advantages, such as great structural flexibility, compatibility with other robots operating in a common workspace, and high rotation speed. However, it has low accuracy and resolution because of the rotary joints and positional errors, counterbalancing difficulties due to the large and variable torque, and a high chance of collision and dynamic instability due to its higher moment of inertia and gravity loading.

1.2.6.1 Robot arm

Robotic manipulators resembling the human arm are known as robotic arms. They consist of a structure of robust links coupled by either rotational joints (also referred to as revolute joints) or translating joints (also referred to as prismatic joints). A robotic arm is thus a type of mechanical arm, usually programmable, with similar functions to a human arm. [2]

A typical robotic arm has the following components:

• Links and joints

• Actuators

• Controller

• End-effector

• Sensor (not present in our project)

1- Links and joints: A link is considered as a rigid body that defines the relationship

between two neighboring joint axes of a manipulator. Manipulators consist of rigid

links, which are connected by joints that allow relative motion of neighboring links.

The links move to position the end-effector.[3]

2- Actuators: Actuators play the same role the muscles play in the human arm - they

convert stored energy into movement. Actuators are used to move a robot's manipulator joints. An actuator can be pneumatic, hydraulic, or electrical; in this project we used electrical actuators (servo motors) (see 2.2.2). [4]

3- The controller: the main device that processes information and carries out

instructions in a robot. It is the robot's 'brain' and controls the robot's movements. It is

usually a computer of some type which is used to store information about the robot

and the work environment and to store and execute programs which operate the robot.

It contains programs, data algorithms, logic analysis and various other processing

9
activities which enable the robot to perform its intended function, most robots

incorporate computer or microprocessor-based controllers. These controllers perform

computational functions and interface with and control sensors, grippers, tooling, and

other peripheral equipment [5], so the controller that we used in this project is Arduino

mega microcontroller.

4- End-effector: In robotics, an end-effector is the device at the end of a robotic arm,

designed to interact with the environment. The exact nature of this device depends on

the application of the robot. Typical functions of the end-effector include grasping,

pushing and pulling, twisting, using tools, performing insertions, welding and various

types of assembly activities. Thus, the major types of robot end-effectors are grippers (see 3.1.2), material-removal tools, welding torches, and tool changers; in addition, surgical robots have end-effectors that are specifically manufactured for performing surgeries. [4]

This project is about a 5-DOF robot arm that can be used for plenty of purposes, such as moving or gripping things; it can also be used in medical tasks for operations that need accuracy and last a long time, and a robot arm can serve as an artificial arm for people with missing limbs. The main idea of this project is to use the Kinect as an image processor that takes images of the body and passes them to the controller to carry out the task.

1.3 Project objectives:

1- Design and fabricate a five-DOF robot arm with a gripper (end-effector) that can be used for demonstrative and educational purposes; in addition, it will be able to hold or repair things, since it will have a wide work envelope.

2- Use the Kinect device to control the robot arm so that it successfully achieves the task of lifting an object from one location to another as per the programmed sequence.

3- Enable the use of this arm in dangerous or poisonous environments to do some work following the orders of a human arm.

1.4 Project Scope and Methodology:

Figure (1.7) shows the scope and methodology of this project, starting from a review of existing robot types and ending with experimental testing of the arm:

1- Review all existing robot types.

2- Design a robot arm with a gripper.

3- Implement the mechanical part of the robot arm.

4- Implement the electronic part of the robot arm.

5- Test the robot arm movements based on human movements via the Xbox Kinect.

Figure (1.7): Scope and methodology of arm robot

Chapter 2: Hardware

2.1 Introduction:

This chapter reviews the overall project scheme with all the electrical components and circuits used in the project, including the microcontroller (Arduino), the motors used for the arm joints, and the power supply that feeds the whole circuit. The project is divided into two parts: the electronic circuit, and the mechanical design, which will be discussed in chapter 3.

2.2 Electronic circuit


The electronic circuit of the robot arm is shown in figure 2.1. The human stands in front of the Kinect, then the Processing software divides the body into joints and sends the angles of these joints to the Arduino. The Arduino Mega sends the control signals which represent the angles to the servos to apply the required movements.

Figure (2.1): the robot arm electronic circuit


2.2.1 Arduino Mega Board

Arduino has become very popular in recent times. In this project, the Arduino Mega is used to communicate with Processing and to control the robot arm servos. The reasons Arduino has spread at such a rapid rate are: 1) it can be used on all platforms due to the simplicity of the development environment; 2) with the help of its extensive libraries, even complex operations can be solved easily; 3) there is a lot of hardware that is compatible with Arduino and can work with it; 4) because it is open source, communication with the community is easy. The Arduino Mega and its construction are shown in figure 2.2.

Figure (2.2): Arduino Mega development board

2.2.2 Servo motor:

Servo motors are motors that can execute the commands we give them. They can operate steadily even at very small or very large speeds, and a large torque can be obtained from a small package. Primarily, servomotors are geared DC motors with positional feedback control that allows the rotor (shaft) to be positioned accurately. The position control signal is a single variable-width pulse, which can be varied from 1 to 2 ms. The width of the pulse controls the position of the servo motor shaft: a 1-ms pulse rotates the shaft to the extreme counterclockwise (CCW) position (−45°), a 1.5-ms pulse places the shaft in the neutral midpoint position (0°), and a 2-ms pulse rotates the shaft to the extreme clockwise (CW) position (+45°). The pulse-width signal is sent to the servomotor approximately 50 times per second (50 Hz). Figure 2.3 illustrates the relationship of pulse width to servomotor position. [6]

Figure (2.3): pulse width of servo motor

In this project, an MG996R servo motor is used to allow the robot arm to move as required. This servo motor is shown in figure 2.4.

Figure (2.4): servo motor (MG996R)

2.2.3 Power supply

In this project a power supply is required to feed the overall electronic circuit. This requirement is met by the power supply shown in figure 2.5. However, the power supply used in our project converts 220 V AC to 12 V DC, so we use an external converter called the "Buck converter XL4016" to step the voltage down to 5 V, as shown in figure 2.6.

Figure (2.5): Power supply

This type of converter is used because it has many features, including: 1) a wide 8 V to 40 V input voltage range; 2) an output adjustable from 1.25 V to 36 V; 3) an 8 A constant output current capability, which ensures that the converter can feed all the servo motors at the same time without overloading; 4) high efficiency up to 96%; 5) excellent line and load regulation; 6) a built-in current limit function; and 7) built-in short circuit and over-voltage protections. The schematic circuit of this buck converter is shown in figure 2.7.

Figure (2.6): Buck converter XL4016 Board

Figure (2.7): Buck converter “Circuit diagram”

2.2.4 Xbox Kinect

Microsoft Corporation announced and demonstrated a new add-on device for the Xbox 360 video game platform named Project Natal (later called the Kinect sensor) in June 2009, which attracted the robotics community to evaluate it as a potentially valuable device. The official release of the Kinect was on November 4th, 2010 in North America and November 10th, 2010 in Europe. [7]-[8]

The Kinect features an RGB camera, an IR depth sensor, a multi-array microphone, and a motor in the socket for tilting the Kinect. Both the RGB and IR cameras have a resolution of 640x480 pixels (VGA) and can deliver an image stream of 30 frames per second. In Xbox games the cameras are used to track the players, while the microphone array is used to issue voice commands. Using several microphones, the Kinect can locate the source of a sound, and thus it can know which of the players issued the command. Figure 2.8 shows the Kinect and the location of the different parts. For the depth sensor, the IR emitter is on the left, while the IR sensor is on the right. [9]

Figure (2.8): XBOX 360 Kinect

The "brain" of the Kinect is PrimeSense's PS1080 system-on-chip. The PS1080 controls the IR emitter and receives input from the cameras and microphones. The depth map is calculated on the chip, and all the sensor inputs are synchronized and sent to the host computer via a USB 2.0 interface. The Kinect uses more power than the USB connection can deliver, and thus it requires the extra power supply shown in figure 2.9. This power supply is in most cases included when you buy the Kinect, since only newer Xbox 360s can deliver enough power to the Kinect. [10]

Figure (2.9): Kinect adapter

- Programming the Kinect

The Kinect was intended as a gaming device to use with the Xbox 360, but it did not take long before the first open-source drivers for connecting the Kinect to a computer were available online. Many programmers saw the potential in this new technology and wanted to use it, which has led to open-source communities making and sharing code for the Kinect. [10]

Today there are three main sources of drivers for the Kinect. The first is the open-source community OpenKinect and its "libfreenect" software library, which provides drivers that give you direct access to the raw data from the Kinect. Another alternative is the software provided by OpenNI, an industry-led non-profit organization working on natural interaction between people and machines; one of the main contributors to OpenNI is PrimeSense™, the company that made the 3D technology used in the Kinect. The last alternative is to use the official drivers released by Microsoft. These drivers were released along with a software development kit (SDK) for the Kinect in May/June 2011, and further updated when Microsoft released Kinect for Windows in 2012. [10]

Chapter 3: Mechanical design
In designing machine components, it is necessary to have a good knowledge of many subjects, such as mathematics, engineering mechanics, strength of materials, theory of machines, and engineering drawing. Machines are always built the same way: they are a combination of linkages, gears, belts, and other mechanisms from which we make a complete mechanism to achieve a certain task. [11]

3.1 Robotics arm mechanical design

The first step of designing a robot is to decide the dimensions and workspace configuration according to the requirements; the next step is to decide the specification of each actuator. The arm is attached to a base, which is the bottommost part of the robot. It is important to mention that the base ought to be considerably heavy in order to maintain the general balance of the robot when grabbing an object. Although the idea of using stepper and gear motors is attractive, the physical movement of this robot is done using servo motors. The advantage of servos is that they can be programmed to return to their initial position, and since they operate using the signals received from the microcontroller, they can be programmed according to the requirements. The diagram of our robot arm and its coordinates before construction is shown in figure 3.1.

Figure (3.1): robot arm diagram

The robot developed in this study is a stationary articulated robotic arm with 5 DOF, comprising base, shoulder, elbow, wrist, and gripper, including two revolute joints (wrist and base). Before starting construction of the robotic arm, the 5-servo-motor controller should be built first, and before beginning construction we assembled a single servo motor bracket in order to verify its function before assembling the brackets into a robotic arm.

3.1.1 Bracket assembly

The servomotor bracket components are shown in figure 3.2. Each of the aluminum U brackets that make up the assembly has multiple holes for connecting a standard Hi-Tec or Futaba servomotor, as well as bottom and top holes for connecting U brackets and assemblies to one another. The servomotor horn holes in part B are compatible with the round horn standard included with Hi-Tec servomotors; these brackets may also be used with similar-size Futaba servomotors, in which case a suitable horn must be matched to part B. Each servomotor bracket assembly consists of the following components: two aluminum U brackets, labeled A and B, one binding-head post screw, four plastic pop rivets, and four sheet-metal screws for mounting a servomotor horn. When assembled with a compatible servomotor, the bracket becomes a modular motion-controlled component that may be attached to other brackets and components. [12]

Figure (3.2): Servo brackets components

The servomotor bracket allows the top and bottom components to swivel along the axis

of the servomotor’s shaft as shown in figure 3.3. [7].

Figure (3.3): Servo bracket (side and front view)

3.1.2 Gripper Assembly

The gripper of the arm is designed in a way which uses a single actuator and follows a basic gear concept. This means that when the mini servo actuates, it turns the gear attached to it, causing the gripper to open and close. Figure 3.4 shows the template of the gripper with its dimensions; the gripper after assembling these parts together with special screws and posts is shown in figure 3.5. [7]

Figure (3.4): Templates of the gripper

Figure (3.5): Assembled gripper.

3.1.3 Base assembly

We begin construction of the base by putting the base plate assembly together. Then the servomotor horn is attached to the aluminum spacer using two #2 x ¼" screws, as seen in figure 3.6.

Figure (3.6): assembling of servo horn on base

Then we assemble this horn with the bearing to allow the base to rotate, as shown in figure 3.7.

Figure (3.7): assembling of aluminum piece on bearing

Finally, we attached this bracket to the rotating base, as shown in figure 3.8.

Figure (3.8): assembling of U brackets

The next step is removing the servo motor horns from two servo motors. Then we attach the two servo motors to servo motor bracket D using plastic pop rivets, take the two servo motor horns removed in the last step, and attach them to bracket E, as seen in figure 3.9.

Figure (3.9): attaching servo’s to the E brackets

The servo motor horns should still be only loosely attached to the "E" bracket; it should be loose enough to move back and forth slightly. This is necessary so that we can tighten the "E" bracket onto the servo motors without having the two "D" bracket servo motors fight against one another. At this point, keep power applied to the servo motors and gently tighten the screws holding the servo motor horns to the "E" bracket. Next, the center servo motor screws are installed to lock the servo motor horns to the servo motors. Then, attach an empty (without servo) "A" bracket to the top of the "E" bracket using four machine screws with #4 split lock washers and nuts, as seen in figure 3.10.

Figure (3.10): A Bracket

Then we attach another inverted "A" bracket to the top of the previous one to make the link that connects the shoulder to the elbow joint, as seen in figure 3.11:

Figure (3.11): double brackets

Now, after making the link, we have to attach the elbow joint, so we add an "A" bracket with a servo motor to represent the elbow joint, in the same way as the previous steps. Then we attach a servo motor vertically on the elbow joint to represent the rotating wrist. Finally, we add the gripper hand to the wrist joint; with this, the robot arm is ready, as seen in figure 3.12.

Figure (3.12): robot arm after construction

Chapter 4: Software

4.1 Introduction:
This chapter will explain the block diagram of software for this project as shown in figure 4.1.

The flow shown in the diagram is:

1- Start.

2- A human stands in front of the Kinect.

3- The Kinect captures the human body.

4- The Processing software divides the body into joints and sends the joint angles serially (left elbow, left shoulder, right shoulder, right elbow, and neck angles).

5- The Arduino Mega receives all the angles from the serial port.

6- The Arduino Mega sends the corresponding control signals to the servo motors.

Figure (4.1): Software block diagram

The algorithm of this project is divided into two parts: the first part is the Processing program, shown in figure 4.2; the second is the Arduino program, shown in figure 4.3.

The Processing program proceeds as follows:

1- Start.

2- Import the required libraries.

3- Initialize the serial port.

4- Enable the depth map.

5- Enable skeleton generation for all joints using the Kinect.

6- Draw the skeleton joints.

7- Calculate the axes against which we want to measure our angles.

8- Calculate the angles between our joints (left elbow, left shoulder, right shoulder, right elbow, and neck) and send them serially, for example:

float shoulderAngle = angleOf(rightElbow2D, rightShoulder2D, torsoOrientation);
...
out[0] = byte(shoulderAngle); out[1] = byte(elbowAngle); ...
port.write(out[0]); port.write(out[1]); ...

Figure (4.2): Processing algorithm

1. Import the required libraries.
2. Define the servo motors.
3. Initialize the serial port.
4. Read the incoming array, which represents all the angles:

   incomingByte[i] = Serial.read();

5. Write each angle (left elbow, left shoulder, right shoulder, right elbow, and neck) to its corresponding servo, for example:

   servo1.write(incomingByte[0]);
   servo2.write(incomingByte[1]);

Figure (4.3): Arduino algorithm

Chapter 5: Conclusion and limitations

5.1 Conclusion

 From this project we conclude that the Kinect can be used for more than games: we have used it to control different types of machines.

 During the software phase of the project we found that the Processing language (based on Java) has made a revolution in the technology domain.

 This project is a prototype for an immense number of projects that can serve many purposes, such as military, medical, social, educational, and industrial uses, and it can help people with special needs.

 We also conclude that more than one programming language can be merged to run a project, as we have done here: we merged the Processing program (Java) with the Arduino program (C) through the serial port.

5.2 Limitations:
In this section we introduce some problems that we faced while working on the project:

 The first problem appeared when we tested the project: the shoulder joint had only one servo, so the torque was not enough to carry the load. We therefore decided to add another motor antiparallel to the first one.

 The second appeared while testing the arm with a simple servo sketch called “sweep”, which makes all the servos move freely in either direction. We found that the two shoulder servos were not moving in the same direction (see figure 5.1). We solved this problem by inverting the angle of one of the servos so that the pair moves together in the same direction.

Figure (5.1): shoulder joint issue

Chapter 6: Results and testing

After completing and reviewing the whole design process, the final test takes place. The robotic arm is assembled as shown in figure (3.12) and programmed in different sequences to check the result of the system.

The Processing program successfully responds to the Kinect: when a body stands in front of the Kinect and we run the Processing program, a scope window appears showing the human body skeleton divided into joints, as shown in the following figure:

Figure (6.1): skeleton including the body joints

As explained in figure (4.2), the Processing program responds to 5 joints of the body. When we connected the robot to power, all joints successfully produced the desired motion at the robot arm except the last one. When the left shoulder moves, the base of the arm rotates in either direction; when the left elbow moves, the shoulder of the arm moves; when the right shoulder moves, the elbow of the arm moves; and when the right elbow moves, the wrist of the arm rotates in either direction according to the angle.

The problem is that the neck joint did not give a precise response: the gripper stays open and moves only a few degrees, which is not enough to hold or grab things. This problem is caused by the Processing program command:

float NECKM = angleOf(NECK2D, torsoOrientation, torsoOrientation);

The neck angle is measured with respect to the torso of the body, but this does not work, so the angle needs to be measured with respect to something else.

On balance, this problem can be overcome by anyone who wants to try this project by adjusting the Processing program commands to get the desired result.

Appendix

A. Computer programming

A robotic system requires a specific set of instructions to achieve a given task; those
instructions are compiled in a computer program. Modern technology offers around 2,000
computer languages, but only some of them are widely used. In our project there are two
stages of programming: the first is the Processing program, which takes the data from the
Kinect and passes it to the Arduino program. So this project has two codes. The first is the
Processing code; we used Processing 2.2.1 as the compiler. The code we wrote is given
below in the Java language:

A1. Processing code

import SimpleOpenNI.*;
import processing.serial.*;
Serial port;
SimpleOpenNI kinect;
SimpleOpenNI context;
color[] userClr = new color[]{ color(255,0,0),
color(0,255,0),
color(0,0,255),
color(255,255,0),
color(255,0,255),
color(0,255,255)
};
PVector com = new PVector();
PVector com2d = new PVector();

void setup()
{
println(Serial.list());
String portName = Serial.list()[0];
port = new Serial(this, portName, 9600);
size(640,480);

context = new SimpleOpenNI(this);


if(context.isInit() == false)
{
println("Can't init SimpleOpenNI, maybe the camera is not connected!");
exit();
return;
}
// enable depthMap generation
context.enableDepth();
// enable skeleton generation for all joints
context.enableUser();
background(200,0,0);
stroke(0,0,255);
strokeWeight(3);
smooth();
}
void draw()
{
// update the cam
context.update();
// draw depthImageMap
image(context.depthImage(),0,0);
image(context.userImage(),0,0);

// draw the skeleton if it's available
int[] userList = context.getUsers();
for(int i=0;i<userList.length;i++)
{
if(context.isTrackingSkeleton(userList[i]))
{
stroke(userClr[ (userList[i] - 1) % userClr.length ] );
drawSkeleton(userList[i]);
}
// draw the center of mass
if(context.getCoM(userList[i],com))
{
context.convertRealWorldToProjective(com,com2d);
stroke(100,255,0);
strokeWeight(5);
beginShape(LINES);
vertex(com2d.x,com2d.y - 5);
vertex(com2d.x,com2d.y + 5);
vertex(com2d.x - 5,com2d.y);
vertex(com2d.x + 5,com2d.y);
endShape();
fill(0,255,100);
text(Integer.toString(userList[i]),com2d.x,com2d.y);
}
}
}
// draw the skeleton with the selected joints
void drawSkeleton(int userId)
{
// to get the 3d joint data

// PVector jointPos = new PVector();
// context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_NECK,jointPos);
// println(jointPos);

context.drawLimb(userId, SimpleOpenNI.SKEL_HEAD, SimpleOpenNI.SKEL_NECK);


context.drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_LEFT_SHOULDER);
context.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER,SimpleOpenNI.SKEL_LEFT_ELBOW);
context.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_ELBOW, SimpleOpenNI.SKEL_LEFT_HAND);
context.drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
context.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER,
SimpleOpenNI.SKEL_RIGHT_ELBOW);
context.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_ELBOW, SimpleOpenNI.SKEL_RIGHT_HAND);
context.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
context.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
context.drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_LEFT_HIP);
context.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_HIP, SimpleOpenNI.SKEL_LEFT_KNEE);
context.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_KNEE, SimpleOpenNI.SKEL_LEFT_FOOT);
context.drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_RIGHT_HIP);
context.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_HIP, SimpleOpenNI.SKEL_RIGHT_KNEE);
context.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_KNEE, SimpleOpenNI.SKEL_RIGHT_FOOT);

if ( context.isTrackingSkeleton(userId)) {
PVector rightHand = new PVector();
context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_RIGHT_HAND,rightHand);
PVector LEFTHand = new PVector();
context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_LEFT_HAND,LEFTHand);
PVector rightElbow = new PVector();
context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_RIGHT_ELBOW,rightElbow);
PVector LEFTElbow = new PVector();
context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_LEFT_ELBOW,LEFTElbow);
PVector rightShoulder = new PVector();
context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_RIGHT_SHOULDER,rightShoulder);

PVector LEFTShoulder = new PVector();
context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_LEFT_SHOULDER,LEFTShoulder);

// we need right hip to orient the shoulder angle


PVector RIGHTHip = new PVector();
context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_RIGHT_HIP,RIGHTHip);
PVector NECK = new PVector();
context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_NECK,NECK);
PVector LEFTHip = new PVector();
context.getJointPositionSkeleton(userId,SimpleOpenNI.SKEL_LEFT_HIP,LEFTHip);

// reduce our joint vectors to two dimensions


PVector rightHand2D = new PVector(rightHand.x, rightHand.y);
PVector NECK2D = new PVector(NECK.x, NECK.y);
PVector LEFTHand2D = new PVector(LEFTHand.x, LEFTHand.y);
PVector rightElbow2D = new PVector(rightElbow.x, rightElbow.y);
PVector LEFTElbow2D = new PVector(LEFTElbow.x, LEFTElbow.y);
PVector rightShoulder2D = new PVector(rightShoulder.x,rightShoulder.y);
PVector LEFTShoulder2D = new PVector(LEFTShoulder.x,LEFTShoulder.y);
PVector RIGHTHip2D = new PVector(RIGHTHip.x, RIGHTHip.y);
PVector LEFTHip2D = new PVector(LEFTHip.x, LEFTHip.y);
// calculate the axes against which we want to measure our angles
PVector torsoOrientation = PVector.sub(rightShoulder2D, RIGHTHip2D);
PVector upperArmOrientation = PVector.sub(rightElbow2D, rightShoulder2D);

// calculate the angles between our joints


float shoulderAngle = angleOf(rightElbow2D,rightShoulder2D,torsoOrientation);
float shouldermAngle = angleOf(LEFTElbow2D,LEFTShoulder2D,torsoOrientation);
float elbowAngle = angleOf(rightHand2D,rightElbow2D,upperArmOrientation);
float elbowMAngle = angleOf(LEFTHand2D,LEFTElbow2D,upperArmOrientation);
float NECKM = angleOf(NECK2D,torsoOrientation,torsoOrientation);

// show the angles on the screen for debugging
fill(255,0,0);
scale(3);
text("shoulder: " + int(shoulderAngle) + "\n" +
     " elbow: " + int(elbowAngle) + "\n" +
     " LEFT SHOULDER: " + int(shouldermAngle) + "\n" +
     " LEFT ELBOW: " + int(elbowMAngle) + "\n" +
     " NECK ANGLE: " + int(NECKM), 20, 20);
byte out[] = new byte[5];
out[0] = byte(shoulderAngle);
out[1] = byte(elbowAngle);
out[2] = byte(shouldermAngle);
out[3] = byte(elbowMAngle);
out[4] = byte( NECKM);
port.write(out[0]);
port.write(out[1]);
port.write(out[2]);
port.write(out[3]);
port.write(out[4]);
println (out[0]);
println (out[1]);
println (out[2]);
println (out[3]);
println (out[4]);
}}
float angleOf(PVector one, PVector two, PVector axis) {
PVector limb = PVector.sub(two, one);
return degrees(PVector.angleBetween(limb, axis));
}
// -----------------------------------------------------------------
// SimpleOpenNI events
void onNewUser(SimpleOpenNI curContext, int userId)
{
println("onNewUser - userId: " + userId);

println("\tstart tracking skeleton");
curContext.startTrackingSkeleton(userId);
}
void onLostUser(SimpleOpenNI curContext, int userId)
{
println("onLostUser - userId: " + userId);
}
void onVisibleUser(SimpleOpenNI curContext, int userId)
{
//println("onVisibleUser - userId: " + userId);
}
void keyPressed()
{
switch(key)
{
case ' ':
context.setMirror(!context.mirror());
break;
}
}
_________________________________________________________________________________

A.2 The Arduino code

#include <Servo.h>
Servo servo1;
Servo servo2;
Servo servo3;
Servo servo4;
Servo servo5;
Servo servo6;

/*
This sketch reads five angle bytes from the serial port and drives the servo
motors accordingly. (It was adapted from a sketch that read data from Vixen
to control solid state relays, which is why some names mention channels.)
*/
// Define pins on Arduino that will control the relay.
#define CHANNEL_06 7
#define CHANNEL_07 8
#define CHANNEL_08 9
#define CHANNEL_09 10
#define CHANNEL_10 11
#define CHANNEL_11 12
// Define size of array to hold channels
#define CHANNEL_COUNT 5
// Define array to hold channels
int channels[] =
{
CHANNEL_06, CHANNEL_07, CHANNEL_08, CHANNEL_09, CHANNEL_10
,CHANNEL_11
};
// Define array to hold incoming data stream from Vixen
int incomingByte[5];
// Define baud rate. This must match the rate used by the Processing sketch.
#define BAUD_RATE 9600
int ledPin = 13; // Status LED on digital pin 13
void setup()
{
servo1.attach(7);
servo2.attach(8);
servo3.attach(9);

servo4.attach(10);
servo5.attach(11);
servo6.attach(12);

// Begin serial communication


Serial.begin(BAUD_RATE);

pinMode(ledPin, OUTPUT); // Set pin as OUTPUT


// Set up each channel as an output (index the array, not the pin numbers)
for(int i = 0; i < 6; i++)
{
pinMode(channels[i], OUTPUT);
}
}
void loop()
{
if (Serial.available() > CHANNEL_COUNT)
{
// Read data from Vixen, store in array
for (int i = 0; i < CHANNEL_COUNT; i++)
{
incomingByte[i] = Serial.read();
}
//
// // Write data from array to a pin on Arduino
// for (int i = 0; i < CHANNEL_COUNT; i++)
// {
servo1.write(incomingByte[0]); // BASE RIGHT SHOULDER
servo2.write(incomingByte[1]); // RIGHT ELBOW
servo3.write(180-incomingByte[1]); // RIGHT ELBOW (mirrored second servo)

servo4.write(incomingByte[2]); // LEFT SHOULDER
servo5.write(incomingByte[3]); // LEFT ELBOW
servo6.write(incomingByte[4]-40); // NECK (gripper), with offset

delay(10);
}
}

References:

[1] Ismail, A. Z. Introduction and Objective: Robotic Arm. https://www.scribd.com/document/362905868/Introduction-and-Objective-Robotic-Arm

[2] Patil, C., Sachan, S., Singh, R. K., Ranjan, K., & Kumar, V. (2009). Self and Mutual Learning in Robotic Arm, Based on Cognitive Systems. West Bengal: Indian Institute of Technology Kharagpur.

[3] Craig, J. J. (2005). Introduction to Robotics: Mechanics and Control (3rd ed.). (M. J. Horton, Ed.) Upper Saddle River, USA: Pearson Prentice Hall.

[4] Angelo, J. A. (2007). Robotics: A Reference Guide to the New Technology. Westport: Greenwood Press.

[5] Carlson, M., Donley, E., Graf, K., & Jones, T. (2013). Helping Hand: Senior Design Final Documentation. Orlando: University of Central Florida.

[6] Ghiet, A. M. A. Robot Arm Control with Arduino. Mechanical and Aeronautical Engineering. https://www.researchgate.net/publication/317277584_ROBOT_ARM_CONTROL_WITH_ARDUINO

[7] Hussein, M. A., & Ali, A. S. Motion Control of Robot by Using Kinect Sensor. Department of Electronics Technology, Faculty of Industrial Education, Beni-Suef University, Egypt. https://www.researchgate.net/publication/290594972_Motion_Control_of_Robot_by_using_Kinect_Sensor

[8] Werber, K. Intuitive Human Robot Interaction and Workspace Surveillance by Means of the Kinect Sensor. Department of Automatic Control, Lund University. https://lup.lub.lu.se/luur/download?func=downloadFile&recordOId=2198971&fileOId=2214422

[9] Mojtahedzadeh, R. Robot Obstacle Avoidance Using the Kinect. https://www.nada.kth.se/utbildning/grukth/exjobb/rapportlistor/2011/rapporter11/mojtahedzadeh_rasoul_11107.pdf

[10] Kvalbein, M. The Use of a 3D Sensor (Kinect™) for Robot Motion Compensation. Faculty of Mathematics and Natural Sciences, University of Oslo.

[11] Allahyar, A. Automation: A Robotic Arm. Bachelor of Technology in Mechanical, Preston Institute of Management Sciences & Technology (PIMSAT), Karachi, Sindh, Pakistan. https://www.academia.edu/14933221/Automation_A_Robotic_Arm_FYP_Thesis

[12] Images SI Inc. Robotic Arm & Controller Manual & User's Guide, pp. 10-30.
