Scientific Research and Essays Vol. 6(22), pp. 4808-4820, 7 October, 2011
Available online at http://www.academicjournals.org/SRE
ISSN 1992-2248 ©2011 Academic Journals
Full Length Research Paper
A real-time object tracking by using fuzzy controller for vision-based mobile robot

Mustafa Serter Uzer¹ and Nihat Yilmaz²

¹Electronics Technology Program, Ilgin Technical Vocational School of Higher Education, Selcuk University, Konya, Turkey.
²Electrical-Electronics Engineering, Faculty of Engineering and Architecture, Selcuk University, Konya, Turkey.
Accepted 24 August, 2011
In this study, a real-time object tracking application based on robot vision is presented. The developed robot has features that enable it to be utilized also in expedition, security, object-based indoor navigation and observational activities. The aim of the study was to enable the autonomous motion of this vision-based mobile robot by using only robot vision methods and fuzzy logic control. A vision-based object tracking algorithm, a conventional PD control algorithm and fuzzy control are used and compared in the study. The robot system, focused on fuzzy control and image-based control, is useful as a test bench tool for graduate and undergraduate students who are interested in robot control and image processing.
Key words: Mobile robot, robot vision, fuzzy control, real-time control, object tracking.
INTRODUCTION
Mobile robots can be grouped according to their structures and location of utilization under four major groups: terrestrial robots, aquatic robots, airborne robots, and space robots (Dudek and Jenkin, 2000). Terrestrial robots can be classified into two different groups according to navigation, as indoor and outdoor robots (DeSouza and Kak, 2002). The indoor robots are mostly used for transport, service, safety, or cleaning purposes. The outdoor robots, however, are used for work that could be dangerous for human health (Backes et al., 2003; Briones et al., 1994). Mobile robots can also be classified according to their control capabilities as telerobots, semi-autonomous robots, and autonomous robots. Autonomous robots are robots that can both make decisions and carry out work by themselves to accomplish the given task. The operator in these situations only states the required task or programs the robot for a single task (De Almeida and Khatib, 1998; Peker and Zengin, 2010; Sagiroglu and Yilmaz, 2009; Uzer et al., 2010a; Yılmaz and Sağıroğlu, 2009; Yılmaz et al., 2006).
*Corresponding author. E-mail: msuzer@selcuk.edu.tr, nyilmaz@selcuk.edu.tr.
The path for studies on vision-based mobile robots was cleared when humans transferred their sense of sight, needed for discovering the environment and moving around, to mobile robots through the use of cameras. Studies toward enabling the recognition of object distance (from the observer), size, and movement by camera, just like the recognition of objects with the human eye, have gained importance over the past ten years (Bonin-Font et al., 2008; Tsalatsanis et al., 2007; Yılmaz et al., 2006). As a result of these studies, visual navigation systems for robots were developed (Bonin-Font et al., 2008). Implementations of real-time object following and obstacle avoidance were actualized in indoor environments by using stereo vision and a laser sensor (Tsalatsanis et al., 2007), and robot experiment studies were carried out controlling mobile robot navigation with vision, sonar and a Handy Board (Inanc and Dinh, 2009). Systems for the control of vision-based robotic motion in non-autonomous environments were also developed (Huang and Wu, 2009). Fuzzy logic controllers are rather popular for mobile robot navigation control systems (Parhi, 2005), such as a fuzzy logic controller lab using a line-following robot (Ibrahim and Alshanableh, 2009) and a vision-based fuzzy logic controller for mobile robot systems (Uzer et al., 2010b).
In this study, a vision-based mobile robotic platform
Figure 1. The designed mobile robot platform. Hardware: three-wheeled robot vehicle, portable PC, TV card, wireless camera, microcontroller, Bluetooth modem (BlueSMiRF). Software: control center (MATLAB), microcontroller (PIC C).
that was designed to form a basis for object-based indoor navigation, among the types of navigation, was developed. The developed robot has features that enable it to be utilized also in expedition, security and observational activities. The autonomous motion of this vision-based mobile robot was enabled by using image processing, robotic vision techniques, and a fuzzy controller. MATLAB was used for all the software in the system except the microcontroller software. A visual algorithm, which handles the robot's following of a ball according to its color and shape, was created, and related experiments were performed. Based on the test results, one of the findings is that the activity periods of the created vision-based mobile robot system could be significantly decreased; thus it can be used efficiently in multi-purpose implementations. As for the drive system of the robot, PD and fuzzy logic controllers were designed. Experiments related to the following of an object, which moves with a constant speed in linear and circular orbits, by the robot were implemented to test the performance of the system. The response period of this robot, which can follow objects with different colors (red, blue, green), communicate via wireless systems, and has a fuzzy drive system, was found to be in the interval between 50 and 60 ms, which is suitable for real-time processing. In following experiments of the colored object with an average speed of 7 cm/s, maximum deviations of 2 and 3 cm were determined for linear and circular orbits using fuzzy control, respectively.
THE DESIGNED VISION-BASED MOBILE ROBOT
The mobile robot platform designed and created in this study is displayed in Figure 1. The camera image is first transferred to the computer by using the transmitter of the wireless camera placed on the mobile robot and a receiver attached to the mobile computer; thus the image is captured. After this captured image is processed and evaluated by the motion controller, the resulting data, which enable the motion of the robot's drive system, are transferred to the robot via the Bluetooth adapter. The robot, according to this received data, controls the drive system with the microcontroller placed on it.
The vision-based mobile robot was designed to move without wires on smooth floors. Special care was taken that the robot had simple mechanics, an easy controlling system and features that could be useful for implementations such as observing and discovering.
Hardware
The general block diagram of the developed mobile robot system is given in Figure 2. The mobile robot system can be separated into four main parts as presented in Figure 2: the vehicle motion system, the communication system, the vision system and the control center. The vehicle motion system is a three-wheeled robot vehicle mechanism, in which motion is enabled by the two front wheels while the back wheel can easily move with their steering. The vision system is based on the principle of processing the image, captured by a wireless camera with transmitter and receiver, in the computer by using image processing techniques. Communication, on the other hand, is performed safely by using the Bluetooth modem and a developed safety algorithm. A mobile personal computer, termed the control center,
Figure 2. General block diagram of the designed mobile robot (control center: wireless camera receiver and video capture device (USB), Bluetooth adaptor (USB), LCD; mobile robot: wireless camera and transmitter, Bluetooth modem (BlueSMiRF), microcontroller (PIC18F4620), DC motor driver, two DC motors with gearboxes).
controls all the activities of the mobile robot.
A general-purpose mobile computer was chosen as the control center. This computer (Casper personal computer) contains a 2.40 GHz Core i5 processor, 3 GB DDR3 RAM, a 320 GB HDD and a DVD-RW. Besides these features, a video capture device and a Bluetooth USB adaptor were needed for the experiments.
A microcontroller (PIC18F4620) unit is placed on the robot for the purpose of controlling the movement. This microcontroller unit communicates with the control center with the help of the Bluetooth module. Furthermore, two direct current (DC) motors containing reducers and encoders, an H-bridge circuit to drive these motors, and one LCD are significant pieces of equipment used in the design of the mobile robot. Moreover, a differential drive system, which mechanically simplifies the system, was also used for the developed mobile robot.
Software
Two basic elements are contained in the software infrastructure of the designed robot. The first is the software that runs inside the microcontroller, where the robotic movements are enabled. The PIC C compiler was utilized to program this microcontroller. In the microcontroller software, the following functions are performed:

a) Communication with the control center through the RS232 port at a speed of 115200 baud with a data format of 8 data bits (no parity bit) and 1 stop bit,
b) Primitive control which actualizes the left-and-right and backward-and-forward movements of the motors.

The second basic element of the software is the control center main software, termed the control center for short, which performs all the vision-based robotic activities and controls the robotic movements with fuzzy logic. This software was developed by using the MATLAB programming language. The elements constituting the main software of the project are image capturing functions, image processing functions and communication functions.
THE PROPOSED VISION-BASED FUZZY MOTION CONTROL
The block diagram of the vision-based mobile robot with the fuzzy motion control system is shown in Figure 3.
Image processing
For the image processing, a method that simultaneously uses the form and the color of the object was developed to separate the object from the background. In this study, the object was chosen as a ball, since round objects can move easily on level surfaces. Image capturing is first performed in this method; the RGB color space is then used to actualize the color-based selection (Baykan et al., 2010). The related color pixels are found by applying the chromaticity method, according to the desired color, in the algorithm. The chromaticity values r, g, b are calculated by Equation 1
Figure 3. Block diagram of the vision-based mobile robot fuzzy motion control system (wireless camera on the robot; image transfer over RF to the video capture device; color- and shape-based object tracking algorithm extracting the object position; comparison with the image center; fuzzy logic controller; PWM-based DC motor speed control; image feedback).
Figure 4. Block diagram for image processing (camera; capture image; apply chromaticity method; histogram equalization; threshold process; remove small areas and fill image holes; calculate the roundness of objects; find the object with roundness and area greater than a certain value; find the properties of objects; find the center of gravity; normalize the horizontal and vertical positions to 0-255).
(Yılmaz and Sağıroğlu, 2009).
[r g b] = [ R/(R+G+B)   G/(R+G+B)   B/(R+G+B) ]   (1)
An improvement of the image is then performed by using the histogram equalization method, which decreases the differences between these pixel values. An image in binary format is then obtained by performing a threshold process. The sections in white represent the area of the object that is to be perceived. Aside from this, some spurious white regions (mistakes) can also appear. White areas of fewer than 20 pixels are eliminated and filled with black in order to minimize these mistakes. Likewise, the black sections falling inside the area of the object are turned into white.
Since a ball was used as the object in the system, an object with a roundness degree above a definite value and a footprint area bigger than a definite value is accepted as a circle. The roundness metric is calculated by Equation 2:

Roundness metric = 4πS / C²   (2)

where S is the surface area of the object and C is the contour (perimeter) of the object. The closer the roundness metric of an object is to 1, the higher its roundness degree. The block diagram for image processing is given in Figure 4 (Uzer et al., 2010a).
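As a concrete illustration, Equation 1, Equation 2 and the centre-of-gravity step can be sketched in a few lines. This is an illustrative Python re-implementation (the paper's own code is MATLAB), and the helper names are hypothetical:

```python
import math

def chromaticity(R, G, B):
    """Normalized chromaticity of one RGB pixel (Equation 1).
    Returns (r, g, b) with r + g + b == 1 for a non-black pixel."""
    s = R + G + B
    if s == 0:
        return (0.0, 0.0, 0.0)
    return (R / s, G / s, B / s)

def roundness(area, perimeter):
    """Roundness metric of a blob (Equation 2): 4*pi*S / C**2.
    Close to 1 for a circle, smaller for elongated shapes."""
    return 4.0 * math.pi * area / perimeter ** 2

def center_of_gravity(binary):
    """Centroid (column, row) of the white pixels in a binary image,
    given as a list of rows of 0/1 values."""
    xs = ys = n = 0
    for row_idx, row in enumerate(binary):
        for col_idx, value in enumerate(row):
            if value:
                xs += col_idx
                ys += row_idx
                n += 1
    if n == 0:
        return None
    return (xs / n, ys / n)
```

For an ideal circle of radius r (area πr², perimeter 2πr) the metric evaluates to exactly 1, which is why the threshold test in Figure 4 compares it against "a certain value" below 1.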
The identification of the object location in the image is implemented by finding the center of gravity of the binary image. The horizontal and vertical values of this center of gravity are used for determining the distance
Figure 5. The structure of the designed fuzzy logic controller (inputs: horizontal position and vertical position; fuzzification, rule table of if-then rules, inference mechanism, defuzzification; outputs: right motor speed and left motor speed).
values of the image, in pixels, from the left or right of the
center.
A fuzzy controller is utilized so that the location of the object ends up exactly in the middle of the image. This fuzzy controller produces speed values for the right and left motors. These values are then sent to the microcontroller located on the robot via Bluetooth. Consequently, the robot moves so that the location of the perceived object falls exactly in the middle of the image.
Fuzzy motion control
Since being introduced by L. A. Zadeh in 1965, fuzzy logic control has become a huge industry in many countries because of its ease of implementation on a standard computer (Zilberstein, 2011). There are specific components characteristic of a fuzzy controller that support a design procedure. The controller consists of pre- and post-processing, an inference engine, a rule base, fuzzification and defuzzification. The important ones are explained below:
Fuzzification: This process converts each piece of input
data to degrees of membership by a lookup in one or
several membership functions. The fuzzification thus
matches the input data with the conditions of the rules to
determine how well the condition of each rule matches
that particular input instance. There is a degree of
membership for each linguistic term that applies to that
input variable (Abdessemed et al., 2004).
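Since the controller described later uses triangular membership functions, the fuzzification step can be sketched as below. This is an illustrative Python sketch; the breakpoints in the usage are placeholders, not the exact values of Figure 7:

```python
def tri(x, a, b, c):
    """Degree of membership of x in a triangular fuzzy set with
    feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzify(x, sets):
    """Map a crisp input to a degree of membership for every
    linguistic term, e.g. the five position terms TN..TF."""
    return {name: tri(x, a, b, c) for name, (a, b, c) in sets.items()}
```

For example, with assumed sets `{"N": (0, 50, 100), "M": (50, 100, 150)}`, the crisp input 75 belongs to N and M with degree 0.5 each, which is exactly the overlap behaviour the rule matching relies on.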
Defuzzification: In this process the resulting fuzzy set must be converted to a number that can be sent to the process as a control signal. The resulting fuzzy set is thus defuzzified into a crisp control signal. The strategy adopted here is the height defuzzification method because of its simplicity and fast calculation. It uses the individual scaled control outputs and builds the weighted sum of the peak values, as is clear from Equation 3 (Polat et al., 2003; Uzer et al., 2010b).
y = C_F · [ Σ_{i=1}^{L} μ(y_i) · y_i ] / [ Σ_{i=1}^{L} μ(y_i) ]   (3)
where y_i is the support value at which the membership function reaches its maximum value, C_F is the scaling factor defined for the output universe of discourse, and L designates the number of rules used (Abdessemed et al., 2004; Yilmaz and Sagiroglu, 2007).
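A minimal sketch of the height defuzzification in Equation 3 follows; the function name and the example peak values are illustrative, not taken from the paper:

```python
def height_defuzzify(peaks, weights, scale=1.0):
    """Height defuzzification (Equation 3): weighted average of the
    peak positions y_i with weights mu(y_i), scaled by C_F."""
    num = sum(w * y for w, y in zip(weights, peaks))
    den = sum(weights)
    if den == 0:
        return 0.0          # no rule fired; emit a neutral command
    return scale * num / den
```

With peaks [-100, 0, 100] and firing strengths [0.2, 0.5, 0.3] the crisp output is 10, biased slightly toward the positive term, which matches the intuition behind the weighted sum of peak values.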
The drive system of the vision-based robot is actualized by using a Mamdani-type fuzzy logic controller. This fuzzy logic controller has two inputs, the horizontal and vertical positions (object position), and two outputs, which are the right motor speed and the left motor speed. The form of the membership functions of the controlling variables is chosen as triangular. The structure of the fuzzy logic controller designed for this study is given in Figure 5.
The names of the membership functions of the designed fuzzy logic controller are shown in Tables 1 and 2.
The rules of the fuzzy logic controller, designed so that the object stays in the center of the image, are presented in Figure 6 and Tables 3 and 4. The membership functions belonging to the input and output variables for the utilized fuzzy control are given in Figure 7. Surface views of the fuzzy logic controller outputs are given in Figures 8 and 9.
A PD controller was first utilized in the system. The PD controller is designed as given in Equations 4, 5 and 6, with K_P = 3.2, K_D = 0.0001 and Offset = 5000.
Table 1. The names of the input membership functions.

Inputs
Horizontal position (HP): Too near (TN), Near (N), Medium (M), Far (F), Too far (TF)
Vertical position (VP): Too near (TN), Near (N), Medium (M), Far (F), Too far (TF)
Table 2. The names of the output membership functions.

Outputs
Right motor speed (RMS): Negative fast (NF), Negative slow (NS), Stop (M), Positive slow (PS), Positive fast (PF)
Left motor speed (LMS): Negative fast (NF), Negative slow (NS), Stop (M), Positive slow (PS), Positive fast (PF)
1. If (HP is TN) and (VP is TN) then (RMS is M)(LMS is NF)
2. If (HP is TN) and (VP is N) then (RMS is M)(LMS is NF)
3. If (HP is TN) and (VP is M) then (RMS is PS)(LMS is NS)
4. If (HP is TN) and (VP is F) then (RMS is PF)(LMS is M)
5. If (HP is TN) and (VP is TF) then (RMS is PF)(LMS is M)
6. If (HP is N) and (VP is TN) then (RMS is NS)(LMS is NF)
7. If (HP is N) and (VP is N) then (RMS is M)(LMS is NS)
8. If (HP is N) and (VP is M) then (RMS is M)(LMS is NS)
9. If (HP is N) and (VP is F) then (RMS is PS)(LMS is M)
10. If (HP is N) and (VP is TF) then (RMS is PF)(LMS is PS)
11. If (HP is M) and (VP is TN) then (RMS is NF)(LMS is NF)
12. If (HP is M) and (VP is N) then (RMS is NS)(LMS is NS)
13. If (HP is M) and (VP is M) then (RMS is M)(LMS is M)
14. If (HP is M) and (VP is F) then (RMS is PS)(LMS is PS)
15. If (HP is M) and (VP is TF) then (RMS is PF)(LMS is PF)
16. If (HP is F) and (VP is TN) then (RMS is NF)(LMS is NS)
17. If (HP is F) and (VP is N) then (RMS is NS)(LMS is M)
18. If (HP is F) and (VP is M) then (RMS is NS)(LMS is M)
19. If (HP is F) and (VP is F) then (RMS is M)(LMS is PS)
20. If (HP is F) and (VP is TF) then (RMS is PS)(LMS is PF)
21. If (HP is TF) and (VP is TN) then (RMS is NF)(LMS is M)
22. If (HP is TF) and (VP is N) then (RMS is NF)(LMS is M)
23. If (HP is TF) and (VP is M) then (RMS is NS)(LMS is PS)
24. If (HP is TF) and (VP is F) then (RMS is M)(LMS is PF)
25. If (HP is TF) and (VP is TF) then (RMS is M)(LMS is PF)
Figure 6. If-then rules for the fuzzy logic controller.
e_x = x_desired − x_measured,   e_y = y_desired − y_measured   (4)

u_x = e_x · K_P + (de_x/dt) · K_D,   u_y = e_y · K_P + (de_y/dt) · K_D   (5)

v_r = u_x + u_y + Offset,   v_l = −u_x + u_y + Offset   (6)
where x_desired and y_desired are the position values of the
Table 3. Rule table for the right motor speed output (rows: vertical position; columns: horizontal position).

      TN    N     M     F     TF
TN    M     NS    NF    NF    NF
N     M     M     NS    NS    NF
M     PS    M     M     NS    NS
F     PF    PS    PS    M     M
TF    PF    PF    PF    PS    M
Table 4. Rule table for the left motor speed output (rows: vertical position; columns: horizontal position).

      TN    N     M     F     TF
TN    NF    NF    NF    NS    M
N     NF    NS    NS    M     M
M     NS    NS    M     M     PS
F     M     M     PS    PS    PF
TF    M     PS    PF    PF    PF
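The rule tables above can be encoded directly as lookup arrays and combined with min (AND) inference over the 25 rules of Figure 6. The sketch below is an illustrative Python encoding, not the authors' MATLAB implementation:

```python
TERMS = ["TN", "N", "M", "F", "TF"]

# Rows: vertical position; columns: horizontal position (Tables 3 and 4).
RIGHT = [
    ["M",  "NS", "NF", "NF", "NF"],
    ["M",  "M",  "NS", "NS", "NF"],
    ["PS", "M",  "M",  "NS", "NS"],
    ["PF", "PS", "PS", "M",  "M"],
    ["PF", "PF", "PF", "PS", "M"],
]
LEFT = [
    ["NF", "NF", "NF", "NS", "M"],
    ["NF", "NS", "NS", "M",  "M"],
    ["NS", "NS", "M",  "M",  "PS"],
    ["M",  "M",  "PS", "PS", "PF"],
    ["M",  "PS", "PF", "PF", "PF"],
]

def infer(hp_grades, vp_grades, table):
    """Min (AND) inference over all 25 rules. hp_grades/vp_grades map
    each linguistic term to a membership degree; returns the maximum
    firing strength accumulated for each output term."""
    out = {}
    for vi, v_term in enumerate(TERMS):
        for hi, h_term in enumerate(TERMS):
            strength = min(hp_grades[h_term], vp_grades[v_term])
            term = table[vi][hi]
            out[term] = max(out.get(term, 0.0), strength)
    return out
```

When the object is dead centre (HP = M, VP = M with full membership), only rule 13 fires and both motor outputs resolve to the Stop term, as the tables require.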
Figure 7. The membership functions of the fuzzy logic controller: (a) horizontal position input (TN, N, M, F, TF over approximately 50 to 250); (b) vertical position input (TN, N, M, F, TF over approximately 50 to 250); (c) right motor speed output (NF, NS, M, PS, PF over approximately -300 to 300); (d) left motor speed output (NF, NS, M, PS, PF over approximately -300 to 300).
image center, x_measured and y_measured are the position values of the object center, v_r is the right motor speed, v_l is the left motor speed, Offset is an approximate value related to the resolution of the PWM, and e_x is the position error
Figure 8. Right motor speed surface view of fuzzy logic controller.
Figure 9. Left motor speed surface view of fuzzy logic controller.
in the x axis, and e_y is the position error in the y axis. The same system is then controlled with the fuzzy logic controller.
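The PD law of Equations 4 to 6 can be sketched as follows. The class name and the discrete finite-difference approximation of de/dt are assumptions of this illustration, not details given in the paper:

```python
class PDDrive:
    """PD motor-speed computation from Equations 4-6, defaulting to the
    paper's gains K_P = 3.2, K_D = 0.0001 and Offset = 5000."""

    def __init__(self, kp=3.2, kd=0.0001, offset=5000.0):
        self.kp, self.kd, self.offset = kp, kd, offset
        self.prev_ex = 0.0
        self.prev_ey = 0.0

    def step(self, desired, measured, dt):
        """desired/measured are (x, y) in pixels; returns (v_r, v_l)."""
        ex = desired[0] - measured[0]                            # Equation 4
        ey = desired[1] - measured[1]
        ux = ex * self.kp + (ex - self.prev_ex) / dt * self.kd   # Equation 5
        uy = ey * self.kp + (ey - self.prev_ey) / dt * self.kd
        self.prev_ex, self.prev_ey = ex, ey
        return ux + uy + self.offset, -ux + uy + self.offset     # Equation 6
```

With zero position error, both outputs reduce to the Offset value, so the two motors receive the same PWM command and the robot holds its heading.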
SYSTEM MONITORING AND SIMULATION STAGE
Kinematic equations of robot movement mechanism are
required because of observation of movements of
developed mobile robot and determination of different
control algorithms responses in robot simulation
environment.
Developed mobile robot uses differential drive system.
Differential drive is the simplest drive mechanism for a
groundcontact mobile robot and often used on small, low
cost, indoor robots. Robot with differential drive consists
of two wheels mounted on a common axes controlled by
separate motors and is shown in Figure 10.
Here l is the distance along the axle between the centers of the two wheels, the left wheel moves with velocity v_l along the ground and the right with velocity v_r, and R is the signed distance from the ICC (instantaneous center of curvature) to the midpoint between the two wheels. At any instant in time, solving for R and ω results in Equations 7, 8, 9 and 10:
ω (R + l/2) = v_r   (7)
Figure 10. Differential drive robot control (Dudek and Jenkin, 2000).
ω (R − l/2) = v_l   (8)
R = (l/2) · (v_l + v_r) / (v_r − v_l)   (9)
ω = (v_r − v_l) / l   (10)
If the robot has pose (x, y, θ) at some time t, and if the left and right wheels have ground-contact velocities v_r and v_l during the period t to t + δt, then the ICC is given by Equation 11:

ICC = [x − R sin(θ), y + R cos(θ)]   (11)
And at time t + δt the pose of the robot is given by Equation 12:

[x'; y'; θ'] = [cos(ωδt)  −sin(ωδt)  0;  sin(ωδt)  cos(ωδt)  0;  0  0  1] · [x − ICC_x;  y − ICC_y;  θ] + [ICC_x;  ICC_y;  ωδt]   (12)

Equation 12 describes the motion of a robot rotating a distance R about its ICC with an angular velocity given by ω.
For the forward kinematics of differential drive robots, a robot capable of moving in a particular direction θ(t) at velocity v(t), the pose is given by Equations 13, 14 and 15:

x(t) = (1/2) ∫₀ᵗ [v_r(t) + v_l(t)] cos[θ(t)] dt   (13)

y(t) = (1/2) ∫₀ᵗ [v_r(t) + v_l(t)] sin[θ(t)] dt   (14)

θ(t) = (1/l) ∫₀ᵗ [v_r(t) − v_l(t)] dt   (15)
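As a quick consistency check, Equations 13 to 15 can be integrated numerically with a simple Euler scheme for constant wheel speeds. This is an illustrative sketch; the step size and function name are assumptions:

```python
import math

def simulate(v_r, v_l, l, t_end, dt=1e-4):
    """Euler integration of the forward kinematics (Equations 13-15)
    for constant wheel speeds, starting from pose (0, 0, 0)."""
    x = y = theta = 0.0
    t = 0.0
    while t < t_end:
        x += 0.5 * (v_r + v_l) * math.cos(theta) * dt
        y += 0.5 * (v_r + v_l) * math.sin(theta) * dt
        theta += (v_r - v_l) / l * dt
        t += dt
    return x, y, theta
```

With v_r = v_l = 5 cm/s and the paper's l = 8 cm, the pose stays on a straight line, matching Equation 19; with unequal speeds the final heading agrees with the closed form θ(t) = t(v_r − v_l)/l of Equation 18.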
For the inverse kinematics of differential drive robots, if it is assumed that v_l(t) = v_l, v_r(t) = v_r, and v_l ≠ v_r, then Equations 16, 17 and 18 yield:

x(t) = (l/2) · [(v_r + v_l)/(v_r − v_l)] · sin[t(v_r − v_l)/l]   (16)

y(t) = −(l/2) · [(v_r + v_l)/(v_r − v_l)] · cos[t(v_r − v_l)/l] + (l/2) · [(v_r + v_l)/(v_r − v_l)]   (17)

θ(t) = t(v_r − v_l)/l   (18)
where (x, y, θ) at t = 0 is (0, 0, 0). Given a goal time t and
Figure 11. Results for image processing.
goal position (x, y), Equations 16, 17 and 18 solve for v_r and v_l, but do not provide independent control of θ. If v_r = v_l = v, then the robot's motion can be simplified to Equation 19 and the robot can move in a straight line:
[x'; y'; θ'] = [x + v cos(θ) δt;  y + v sin(θ) δt;  θ]   (19)
And if we choose −v_l = v_r = v, then the robot's motion can be simplified to Equation 20 and the robot rotates in place:

[x'; y'; θ'] = [x;  y;  θ + 2vδt/l]   (20)
Robots do not use differential drive together with synchronous drive technology; therefore, one or more steered wheels are fixed to the axle in the system (Dudek and Jenkin, 2000).
These equations are used to calculate the system position during real-time operation. In our system, l is 8 cm.
EXPERIMENTAL RESULTS
For the image processing, a method that uses the form and color of the object was developed. The findings obtained by this method at a given moment are presented in Figure 11.
As can be seen from Figure 11, the first image indicates the captured image, while the second and third images represent the obtained binary image and the black point marking the round object's center of gravity, respectively.
Experiments related to the following of an object moving with a constant speed in linear and circular orbits, using the PD and fuzzy logic controllers, were performed on this vision-based mobile robot system. The robot's following of the ball, moving with a constant speed in the linear orbit, is displayed in Figure 12. In fact, our robot system was also tested on different, more complex orbits, but not all of them were suitable for comparable results. Therefore, we give experimental results only for the standard orbits (linear and circular).
The robot's following of the ball, moving with a
Figure 12. The robot's following of the ball moving in the linear orbit (the path of the moving ball, the path followed by the PD-controlled robot, and the path followed by the fuzzy-controlled robot; x and y in cm).
Figure 13. The robot's following of the ball moving in the circular orbit (the path of the moving ball, the path followed by the PD-controlled robot, and the path followed by the fuzzy-controlled robot; x and y in cm).
constant speed in the circular orbit, is displayed in Figure
13.
DISCUSSION
The mobile robot system was moved 20 times in linear and circular orbits by using the PD and fuzzy logic controllers, and the robot's maximum deviation distances from the orbits are presented in Table 5. The processing times of object tracking using fuzzy logic control are displayed in Table 6.
An experimental comparison of the motion controller with the PD controller and the fuzzy motion controller with image
Table 5. The robot's maximum deviation distances from the orbits.

Controller                              Linear orbit (cm)    Circular orbit (cm)
Two PD (x, y) (Uzer et al., 2010a)      3.5                  4.5
Fuzzy logic (proposed)                  2                    3
Table 6. Processing times of object tracking using fuzzy logic control.

Method              Processing time (ms)
Image processing    25
Fuzzy control       15
Communication       20
feedback was performed; and it was observed that the
fuzzy motion controller with image feedback was more
suitable for this system.
CONCLUSIONS
The design and manufacture of a vision-based mobile robot system was successfully implemented in this study. Experiments on the real-time following of a moving ball according to its color and form were performed on this robot system, and the experimental results of the robot following the ball, which moves with constant speed in linear and circular orbits, were presented. This application, more specifically the detection of different objects by the computer using the same method, would enable more comprehensive indoor navigation. The robot system, focused on fuzzy control and image-based control, is useful as a test bench tool for graduate and undergraduate students who are interested in robot control and image processing.
Activities related to the vision-based object identification of this designed mobile robot, the coordination of its movements by fuzzy control, the autonomous operation of the system and the communication with the control center were conducted in the study. The Bluetooth communication infrastructure added features that enable the system to be used for distant observation, safety and control purposes.
The rapid processing time of the algorithm is especially significant in this experiment, because the algorithm time is important for enabling a rapid response from the robot. By using MATLAB for the image processing in the robot's following of objects with different colors (red, blue, green), a fuzzy logic controller for the motion control, a wireless camera for transferring the image to the control center, and Bluetooth for sending the data from the control center, the response period of the robot could be decreased to values in the interval between 50 and 60 ms. These findings suggest that the proposed algorithm, by processing 16 input images per second, could be used for real-time implementations. It was observed in these experiments that the developed image processing algorithm achieved 89% identification success for the utilized colored objects.
ACKNOWLEDGEMENT
The authors are grateful to the Selçuk University Scientific Research Projects Coordinatorship for supporting the publication of the manuscript.
REFERENCES

Abdessemed F, Benmahammed K, Monacelli E (2004). A fuzzy-based reactive controller for a non-holonomic mobile robot. Robot. Auton. Syst., 47: 31-46.
Backes PG, Norris JS, Powell MW, Vona MA, Steinke R, Wick J (2003). The science activity planner for the Mars Exploration Rover mission: FIDO field test results. IEEE Aerospace Conf. Proc., 1-8: 3525-3539.
Baykan NA, Yilmaz N, Kansun G (2010). Case study in effects of color spaces for mineral identification. Sci. Res. Essays, 5(11): 1243-1253.
Bonin-Font F, Ortiz A, Oliver G (2008). Visual navigation for mobile robots: A survey. J. Intell. Robot. Syst., 53(3): 263-296.
Briones L, Bustamante P, Serna MA (1994). Wall-climbing robot for inspection in nuclear power plants. 1994 IEEE Int. Conf. Robot. Autom. Proc., 1-4: 1409-1414.
De Almeida AT, Khatib O (1998). Autonomous Robotic Systems (Lecture Notes in Control and Information Sciences). Springer-Verlag, p. 236.
DeSouza GN, Kak AC (2002). Vision for mobile robot navigation: A survey. IEEE Trans. Pattern Anal. Mach. Intell., 24(2): 237-267.
Dudek G, Jenkin M (2000). Computational Principles of Mobile Robotics. New York: Cambridge University Press.
Huang SJ, Wu SS (2009). Vision-based robotic motion control for non-autonomous environment. J. Intell. Robot. Syst., 54(5): 733-754.
Ibrahim D, Alshanableh T (2009). An undergraduate fuzzy logic control lab using a line following robot. Comput. Appl. Eng. Educ., pp. 1-10.
Inanc T, Dinh H (2009). A low-cost autonomous mobile robotics experiment: Control, vision, sonar, and Handy Board. Comput. Appl. Eng. Educ., pp. 1-11.
Parhi DR (2005). Navigation of mobile robots using a fuzzy logic controller. J. Intell. Robot. Syst., 42(3): 253-273.
Peker M, Zengin A (2010). Real-time motion-sensitive image recognition system. Sci. Res. Essays, 5(15): 2044-2050.
Polat K, Şahan S, Güneş S (2003). Finding the direction of a mobile robot using microcontroller based ultrasonic distance measuring device and fuzzy logic. IJCI Proc. Int. Conf. Signal Proc., 1: 313-316.
Sagiroglu S, Yilmaz N (2009). Web-based mobile robot platform for real-time exercises. Expert Syst. Appl., 36(2): 3153-3166.
Tsalatsanis A, Valavanis K, Yalcin A (2007). Vision based target tracking and collision avoidance for mobile robots. J. Intell. Robot. Syst., 48(2): 285-304.
Uzer MS, Yilmaz N, Bayrak M (2010a). A real-time tracking application of different coloured objects with a vision based mobile robot. J. Facul. Eng. Architect. Gazi Univ., 25(4): 759-766.
Uzer MS, Yılmaz N, Uzer D (2010b). Case study: A vision-based fuzzy logic controller for mobile robot systems. Int. Symp. Innovat. Intell. Syst. Appl., pp. 60-64.
Yilmaz N, Sagiroglu S (2007). Web-based maze robot learning using fuzzy motion control system. The Sixth Int. Conf. Mach. Learn. Appl., pp. 274-279.
Yılmaz N, Sağıroğlu Ş (2009). Real-time line tracking based on web robot vision. Comput. Appl. Eng. Educ., DOI: 10.1002/cae.20367.
Yılmaz N, Sağıroğlu Ş, Bayrak M (2006). General aimed web based mobile robot: SUNAR. J. Facul. Eng. Architect. Gazi Univ., 21(4): 745-752.
Zilberstein S (2011). What is 'fuzzy logic'? Are there computers that are inherently fuzzy and do not apply the usual binary logic? Retrieved from http://www.scientificamerican.com/article.cfm?id=what-is-fuzzy-logic-aret.