
Proceedings of the 2009 IEEE
International Conference on Mechatronics and Automation
August 9 - 12, Changchun, China

Moving Object Tracking based on Mobile Robot Vision


Lin Rui, Du Zhijiang and Sun Lining
State Key Laboratory of Robotics and System
Harbin Institute of Technology
Harbin, Heilongjiang Province, China
{linrui, duzj01, lnsun}@hit.edu.cn

Abstract—The paper describes a robotic application that tracks a moving object by utilizing a mobile robot with the capacity of avoiding obstacles. The real-time tracking algorithm is based on mobile robot vision using adaptive color matching and a Kalman filter. Adaptive color matching is used to obtain motion vectors of the moving object in the robot coordinate system. It can adjust the color matching threshold adaptively to reduce the influence of lighting variations in the scene. The Kalman filter is applied to design a view window. The view window, which contains the next position of the moving object estimated by the Kalman filter, is determined on the image plane to reduce the image processing area. The color matching threshold can adjust itself adaptively in the view window. A virtual-targets-based algorithm is also presented for real-time obstacle avoidance. Experimental results show that the tracking algorithm can adapt to lighting variations and has good tracking precision. It can also avoid obstacles smoothly while tracking the moving object. Furthermore, it can be implemented in real time.

Index Terms—Virtual targets, View window, Real-time tracking, Avoiding obstacles

I. INTRODUCTION

For many robot domains, such as robot soccer, the problem of tracking objects and avoiding obstacles has received a good deal of attention over the last few years, and many techniques exist for it. Many methods are proposed to track moving objects in [1]-[4]. A tracking algorithm using vision should adapt to the complex variations of lighting and backgrounds, as well as be of low computational complexity. Such methods have some limits in a dynamic environment where the camera is moving with the mobile robot or the background changes a lot. The mobile robot should also have the capacity of avoiding obstacles while tracking moving objects. Aiming at the problem of obstacle avoidance, several traditional control methods have been applied, like Potential Field [5] and Virtual Force Field [6]. But they may not work well in an unknown and dynamical environment. In addition, intelligent control methods such as fuzzy control and artificial neural networks [7] have been developed greatly.

In this paper, we propose a real-time tracking algorithm for a moving object using mobile robot vision. It is based on adaptive color matching and a Kalman filter. We focus on the problem of robot vision for a robot operating in a dynamic world. The key technique of tracking a moving object is to identify the moving object from the surrounding objects in image sequences, and then to calculate the motion vector of the moving object. Adaptive color matching is carried out to reduce the influence of lighting variations and to attain the motion vector of the moving object relative to the mobile robot, which is used in the obstacle avoidance algorithm. A view window is designed by the Kalman filter on the image plane synchronously. It is applied to limit the possible search space for the moving object in image sequences and to attain the best matching results when searching for the moving object.

While tracking the moving object, the mobile robot should have real-time obstacle avoidance capacity. We propose a virtual-targets-based algorithm to avoid collision in a dynamical environment with stationary and moving obstacles. Fuzzy control is used to obtain a set of virtual targets to avoid obstacles. Strategies for formulating the fuzzy rule sets for motion and obstacle avoidance are proposed. Then we apply the Virtual Force Method to calculate the control values: linear velocity v and angular velocity ω. The mobile robot then moves toward the virtual target directly. The method can fulfill the requirements of tracking an object and avoiding obstacles simultaneously in real time. Fig. 1 shows the whole algorithm architecture.

* This work was supported by National High Technology Research and Development Program of China (2005AA404290)

[Fig. 1 block diagram: Camera → Image Sequences → Adaptive Color Matching → Motion Vector of Moving Object → Kalman Filter → View Window, and Laser Sensor → Obstacles Detecting → Fuzzy Control → Virtual Targets → Virtual Force Field → Control Values, both feeding Robot Control on the onboard computer.]
Fig. 1. The whole algorithm architecture.

978-1-4244-2693-5/09/$25.00 ©2009 IEEE
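The two-branch pipeline of Fig. 1 can be summarized as a single control loop. The sketch below is purely schematic: every function is a placeholder stand-in with assumed stub values, not the paper's actual implementation.

```python
# Schematic sketch of the Fig. 1 architecture. All functions are hypothetical
# placeholders; returned values are dummy stubs for illustration only.

def adaptive_color_matching(frame, view_window):
    # Vision branch: returns the object's motion vector in robot coordinates.
    return (1.0, 0.5)  # stub: (x, y) in meters

def kalman_predict_view_window(motion_vector):
    # Predicts the object's next image position and sizes the view window.
    return (160, 120, 60, 60)  # stub: (cx, cy, width, height) in pixels

def fuzzy_virtual_target(laser_scan, target):
    # Laser branch: derives a temporary virtual target (d_v, alpha_v).
    return (0.8, 15.0)  # stub: distance [m], angle [deg]

def virtual_force_control(virtual_target):
    # Maps the virtual target to control values (linear v, angular w).
    d, a = virtual_target
    return (0.7 * d, 0.5 * a)  # stub gains

def control_step(frame, laser_scan, view_window):
    # One iteration of the loop: vision branch, prediction, laser branch,
    # and finally the control values sent to the onboard computer.
    target = adaptive_color_matching(frame, view_window)
    window = kalman_predict_view_window(target)
    vt = fuzzy_virtual_target(laser_scan, target)
    v, w = virtual_force_control(vt)
    return (v, w), window
```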


II. TRACKING ALGORITHM

A. Adaptive Color Matching

While an object is moving in a dynamic environment at higher speed, the color of the moving object on the image plane may change a lot because of lighting variations in the scene. In addition to lighting variations, there are additional problems of non-diffuse lighting causing specular reflections, shadows, and large variations in color from the sun-side to the shade-side. In order to reduce the influence of lighting variations, we utilize a lighting-compensating algorithm in image pre-processing. The lighting-compensating algorithm is carried out partially, in the view window region of the moving object only. As for color features of the moving object, we adopt the YCrCb color space to reduce the influence of illumination. We use the Cr and Cb values for color matching. The YCrCb color values can be obtained from the RGB color values by equation

M_YCrCb = M_d M_pc + M_T    (1)

Where, M_d is the conversion matrix; M_T is a constant matrix; M_pc is the RGB value matrix of the image after lighting compensating.

We select an initial label for image pixel classification. The combination of initial labeling and adaptive color matching provides the plasticity for lighting variations. To label pixels according to whether they belong to the moving object, we utilize an initial-labeling scheme. Essentially, we examine each pixel p on the image plane and estimate the likelihood of the features of each pixel. After initial labeling, the image is converted to a binary image in order to reduce the calculation cost. White speckles appear because there are some noise and other objects with similar colors in the scene. These speckle regions may not be small and therefore are not easily filtered. Such speckles are common in indoor scenes, and they will influence the search space of the moving object if they are not filtered precisely.

TABLE I
THE FEATURES USED TO FILTER REGIONS

Feature | Description
Area | Number of pixels in region
Shape | Ratio of width to height
Color uniformity | The color uniformity in region
Possible position | The location relative to the robot estimated by Kalman filter

In order to reduce the computational cost, we filter out the small regions whose pixel numbers are smaller than a threshold, such as the regions of a single speckle. After this coarse filtering, we define some features of the moving object to filter out the remaining noise regions. We set weight coefficients for these likelihood features. Table I lists the 4 features we use to filter noise regions. Shape is the ratio of width to height; for a ball, the ratio is approximately equal to one. Area is the number of pixels in the moving region. It is not constant and has a certain relation with the distance of the moving object relative to the mobile robot. Color uniformity represents the color uniformity level of the whole region of the moving ball. Possible position means the possible pixel position of the moving ball centroid estimated by the Kalman filter. Here the possible position is used to limit the search space on the image plane.

We assume that each feature is independent and set these features different weight coefficients. The features describe the likelihood of bounding box dimensions, locations of the regions compared to some expectation, and areas of regions. We assume that the features can be matched probabilistically and have a Gaussian uncertainty distribution. Lastly, the matching probability is given in (2).

P_j = Σ_{i=1}^{M} σ_i S_i,  j = 1 ... N    (2)

Where, P_j is the matching value of the region being matched; S_i represents the region features; σ_i is the weight coefficient; N is the number of regions to be matched after coarse filtering; M is the number of likelihood features, and here M is 4. Every possible region after coarse filtering has a matching value P_j. If the maximum matching value is bigger than the matching threshold, which is set by the user in advance, the region with the maximum matching value is reserved and the other regions are filtered out. Otherwise, the mobile robot does not find the moving object and begins to implement the task of searching for the moving object.

After threshold adapting and region searching, the moving object can be recognized on the image plane. The motion vector of the object can then be calculated in the robot coordinate system. After calculating the centroid vector M_k of the region on the image plane, we can obtain the position of the moving object relative to the mobile robot by

W_k = C^{-1} R_θ^{-1} R_α^{-1} P^{-1} M_k    (3)

Where, W_k represents the position of the moving object, which the mobile robot should move to, in the 3D robot coordinate system; C^{-1} R_θ^{-1} R_α^{-1} P^{-1} is the conversion matrix, which converts the image coordinates into the 3D robot coordinate system. The conversion matrix includes system parameters such as the focal length of the camera, the rolling angle θ_c and pitching angle α_c of the camera, the converting distance from the image coordinate to the origin of the 3D robot coordinate system, etc. W_k is used as the measurement by the Kalman filter. It is also applied to judge the position relation between the object, the obstacles and the mobile robot in the next section. The position relation is taken as an input value of the fuzzy control.
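The coarse filtering and weighted feature matching of (2) can be sketched as follows. This is a minimal illustration: the feature standard deviations, weights σ_i, and thresholds are assumed values, not the paper's calibrated ones, though the Gaussian likelihood model follows the text above.

```python
import math

# Sketch of coarse filtering plus the weighted matching score of Eq. (2):
# P_j = sum_i sigma_i * S_i. Weights and spreads are illustrative assumptions.

AREA_MIN = 20  # coarse filter: drop single-speckle regions (assumed value)
WEIGHTS = [0.3, 0.3, 0.2, 0.2]  # sigma_i for area, shape, uniformity, position

def gaussian_likelihood(value, expected, std):
    # Gaussian uncertainty model for one feature, normalized to (0, 1].
    return math.exp(-0.5 * ((value - expected) / std) ** 2)

def match_score(region, expected):
    # region/expected: dicts with 'area', 'shape', 'uniformity', 'position'.
    feats = ['area', 'shape', 'uniformity', 'position']
    stds = {'area': 50.0, 'shape': 0.3, 'uniformity': 0.2, 'position': 20.0}
    scores = [gaussian_likelihood(region[f], expected[f], stds[f]) for f in feats]
    return sum(w * s for w, s in zip(WEIGHTS, scores))

def select_region(regions, expected, threshold=0.5):
    # Coarse-filter tiny regions, then keep the best-matching region only if
    # its score clears the user-set matching threshold; otherwise report that
    # the moving object was not found (None).
    candidates = [r for r in regions if r['area'] >= AREA_MIN]
    if not candidates:
        return None
    best = max(candidates, key=lambda r: match_score(r, expected))
    return best if match_score(best, expected) > threshold else None
```

A region matching the expectation exactly scores Σσ_i = 1.0, so it is retained; a lone speckle is dropped by the coarse filter before any matching is done.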

B. View Window Designed by Kalman Filter

In this paper, we utilize a Kalman filter to estimate the next position and velocity of the moving object relative to the mobile robot and to set a view window on the image plane. In other words, we design the Kalman filter to reduce the space where the moving object is searched for, and to attain its estimated motion vector Ŵ_{k+1} from W_k at the next frame time. We assume that the velocity of the moving object relative to the mobile robot remains unchanged in the course of image processing at every frame time. This is equivalent to encoding a tracking history into the state parameters of the Kalman filter and estimating the motion vector of the moving object relative to the robot. In order to adjust the color matching threshold, we define a view window according to the resulting estimates. The view window is mainly used for updating the matching threshold. The robot tracks the centroid of the moving object in the robot coordinate system. The center of the view window is the possible position of the moving object centroid on the image plane, although there are position error and velocity error. The size of the moving object on the image plane changes along with the distance of the moving object relative to the mobile robot. The view window will change with the size of the moving object on the image and with its velocity, to ensure that it includes the region of the moving object. We can use (4) to calculate the coordinate of the moving object relative to the mobile robot.

(4)

The center of the view window is set by M_vw. Its size is determined by the distance and velocity of the moving object relative to the mobile robot. It is a possible region of the moving object at the next frame time. It is used to reduce the space of the image sequences where the lighting configuration is re-counted. We search for Y_m, which represents the Y value of the most pixels in the view window. As we know, the moving object in the possible area is the largest region in the view window, so the Y values in the view window mainly present the lighting configurations of the moving object. The Cr and Cb color configurations have a certain non-linear relation with the Y value. We can adjust the Cr and Cb color matching thresholds by equations:

Y_m = most( Σ_{i=1}^{M} Σ_{j=1}^{N} Y_{i·width+j} )    (5)

[Cr_th  Cb_th]^T = f(Y_m)    (6)

Where, Y_m represents the Y value whose pixels are the most numerous in the view window; Cr_th and Cb_th are the color matching thresholds at the next frame time. Then adaptive color matching evolves from the current frame to the next, capturing the changes in the image size of an object as it moves.

III. OBSTACLES AVOIDANCE ALGORITHM

A. Virtual Targets Derived from Fuzzy Control

While tracking the moving object, the mobile robot should have real-time obstacle avoidance capacity. In order to avoid collision with obstacles, the mobile robot system needs to judge the direction in which the robot should move and to generate proper instructions to the low-level drivers, by considering the obstacle-free distances and their time differences. Virtual targets are applied as the next temporary targets the mobile robot will move toward while tracking the moving object. To obtain effective virtual targets, the system should take full account of the parameters of the obstacles surrounding the mobile robot. Fuzzy control provides a useful mechanism for judging a situation correctly and generating a proper control instruction. The developed rule base for virtual targets consists of 108 rules. These rules are classified into the following 4 categories: the number of obstacle clusters, the direction of the real target relative to the mobile robot, the obstacle avoidance layers, and the position relation of the target and the obstacle clusters.

We set 2 meters as the available maximum range limit for detecting obstacles. obst_i(d_s, α_s, d_e, α_e, d_m, α_m) is applied to describe the contour parameters of obstacle i. Where, d_s, α_s, d_e, α_e, d_m and α_m are the initial, terminal, and nearest distance and angle parameters of obstacle i relative to the mobile robot in the laser polar coordinate system. Two neighboring obstacles i and j are combined into one cluster when

sqrt( d_{e,i}² + d_{s,j}² − 2 d_{e,i} d_{s,j} cos(α_{e,i} − α_{s,j}) ) < d_th    (7)

Where, d_th is a safety threshold of the size desired for the mobile robot to pass. N_o is the final number of obstacle clusters after combination. N_o is fuzzified and expressed by the fuzzy sets NZ, NL and NM, which stand for zero, less and more. The membership function is depicted in Fig. 2(a).

The obstacles exert strong effects on the motion of the mobile robot when they are in the immediate vicinity of the robot, and weak effects when they are further away. So a layered obstacle avoidance policy is applied according to the distance between the obstacles and the mobile robot. The mobile robot's work area is divided into three layers, the exigent obstacle avoidance layer, the normal obstacle avoidance layer and the directly advancing layer, according to the radii R_e and R_n.

The distributing area of the obstacle clusters surrounding the mobile robot is categorized into three memberships: LE (= exigent obstacle avoidance layer), LN (= normal obstacle avoidance layer), LD (= directly advancing layer), as shown in Fig. 2(b).

The direction of the real target relative to the mobile robot is fuzzified and expressed by the fuzzy sets FL, FR, BL and BR, which stand for front-left, front-right, back-left and back-right, respectively. The target may be between two obstacle clusters, in front of some obstacle cluster, or at the back of some obstacle cluster.

This position relation of the target and the obstacle clusters is fuzzified and expressed by the fuzzy sets PN, PF, and PB, respectively. Their membership functions are depicted in Fig. 2(c) and (d).

The virtual target, as the final output of the fuzzy control, is updated at every control interval. The membership functions of the output variables d_v and α_v are depicted in Fig. 2(e) and (f), respectively. Its coordinate is described as (d_v, α_v) in the laser polar coordinate system. Then it can be converted into P_v(x_v, y_v, α_v) in the 2D world coordinate system by:

x_v = x_r + d_v cos(θ_r + α_v − π/2)    (8)
y_v = y_r + d_v sin(θ_r + α_v − π/2)    (9)

Where, P_r(x_r, y_r, θ_r) is the present coordinate of the mobile robot in the 2D world coordinate system. Some instances of virtual targets derived from the fuzzy control are shown in Fig. 3.

[Fig. 2 shows the membership functions: (a) the number of obstacle clusters; (b) the obstacle avoidance layers (LE, LN, LD); (c) the position between obstacle clusters and the moving object (PN, PF, PB); (d) the direction of the moving object relative to the mobile robot (−180° to 180°); (e) the distance of the virtual target; (f) the angle of the virtual target relative to the mobile robot (−180° to 180°).]
Fig. 2. Fuzzy membership functions of input and output variables.

[Fig. 3 shows: (a) a virtual target at the same position as the real target; (b) a virtual target placed to detour an obstacle cluster.]
Fig. 3. Virtual target derived from fuzzy control.
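The conversion of (8)-(9) from the laser polar frame to world coordinates can be sketched directly; angles are taken in radians here, and the example pose values are illustrative.

```python
import math

# Sketch of Eqs. (8)-(9): converting the fuzzy-control output (d_v, alpha_v),
# given in the laser polar coordinate system, into world coordinates
# (x_v, y_v) from the robot pose P_r = (x_r, y_r, theta_r).

def virtual_target_to_world(x_r, y_r, theta_r, d_v, alpha_v):
    x_v = x_r + d_v * math.cos(theta_r + alpha_v - math.pi / 2.0)
    y_v = y_r + d_v * math.sin(theta_r + alpha_v - math.pi / 2.0)
    return x_v, y_v
```

For example, with the robot at the origin, θ_r = π/2 and a virtual target at (d_v, α_v) = (1, π/2), the target lands one meter along the world y-axis.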

B. "Local Minimum Traps" of Virtual Targets

One problem of obstacle avoidance based on virtual targets is the possibility for the mobile robot to get "trapped". This situation may occur when the virtual target is the same as the current position of the mobile robot; the robot will then not move any more. Traps can be created by a variety of different obstacle configurations, and different types of traps can be distinguished easily. This section presents a comprehensive set of rules to recover from the trap condition.

Fig. 4. Research platform HR-1 and coordinate systems.

In the ideal case, trap states may be discovered by simply monitoring the speed of the robot. If caught in a trap, the robot's speed will become zero as the robot converges to the equilibrium position. In a dynamic system, however, the robot may stop to avoid moving obstacles in the exigent obstacle avoidance layer and will either oscillate or run in a closed loop. Therefore, it is practical to monitor the position of virtual targets relative to the mobile robot. Firstly, our method for trap-state detection compares the virtual target with the current position of the mobile robot. If the virtual target is the same as the current position, or F_t is a zero vector, the mobile robot is very likely about to get trapped. Secondly, the virtual-target-to-robot directions are recorded. If the direction angles vibrate around zero and the robot's linear velocity is equal to zero, the robot is very likely about to get trapped. Thirdly, if the robot's accumulated direction of travel is more than a certain degree C_o (C_o > 360°) set beforehand, the robot is also very likely about to get trapped. C_o > 360° indicates that the robot has traveled at least one full loop.

Once a trap is detected, there are several ways to remedy the situation. In these trap conditions, a random angle α_tr and a random distance d_tr are added to the angle value α_v and distance value d_v of the virtual target, respectively. α_tr and d_tr should satisfy:

|α_tr| < C_a and |α_tr| ≠ 0    (10)
C_d < d_tr < 0    (11)

Where, C_a and C_d are constants. The recovery algorithm is computationally simple, and does not reduce the overall performance (if evoked only temporarily), even if the mobile robot would not be trapped.
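The three trap checks and the random perturbation of (10)-(11) can be sketched as follows. The constants C_a, C_d and C_o below are illustrative assumptions (the paper only requires C_o > 360°), and the state passed to the detector is a simplified stand-in for the robot's monitored quantities.

```python
import math
import random

# Sketch of the Section III-B trap detection and recovery. Constant values
# are assumptions for illustration, not the paper's tuned parameters.

C_A = 30.0    # deg, bound on the random recovery angle a_tr, Eq. (10)
C_D = -0.5    # m, lower bound on the random recovery distance d_tr, Eq. (11)
C_O = 400.0   # deg, accumulated-heading bound (> 360 deg means a full loop)

def is_trapped(robot_pos, virtual_target, linear_v, accumulated_heading_deg):
    # 1) the virtual target coincides with the robot's current position;
    # 2) the robot stands still while converging on that target;
    # 3) the accumulated direction of travel exceeds C_O (a closed loop).
    at_target = math.dist(robot_pos, virtual_target) < 1e-3
    standing = at_target and abs(linear_v) < 1e-3
    looped = accumulated_heading_deg > C_O
    return at_target or standing or looped

def recover(d_v, alpha_v, rng=random.Random(0)):
    # Perturb the virtual target per Eqs. (10)-(11): a nonzero random angle
    # a_tr with |a_tr| < C_a, and a random distance d_tr with C_d < d_tr < 0.
    a_tr = 0.0
    while a_tr == 0.0:
        a_tr = rng.uniform(-C_A, C_A)
    d_tr = rng.uniform(C_D, 0.0)
    return d_v + d_tr, alpha_v + a_tr
```

Pulling the virtual target slightly closer (d_tr < 0) while rotating it by a nonzero random angle breaks the equilibrium that caused the trap without sending the robot far off its course.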
the mobile robot begins to track the moving orange ball.
IV. CONTROL VALUES SENT TO ONBOARD COMPUTER

In order to control the mobile robot to track the moving object, two control values must be sent to its onboard computer to control the two tracking motors: linear velocity v and angular velocity ω. The virtual target (d_v, α_v) is the final output of the fuzzy control. It is also the temporary target the mobile robot advances toward at the current interval time. The final control values for linear velocity v and angular velocity ω can be calculated based on the Virtual Force Method by

(12)
(13)

Where, F_ct is a force constant; M_c is a coefficient matrix; F_t is an attracting force vector. It is generated by the virtual target point P_v(x_v, y_v), whose coordinates are outputs of the fuzzy control. At any time during tracking of the moving object, F_t pulls the mobile robot toward the virtual target. The repulsive force vector F_r generated by the obstacle clusters is omitted, so the mobile robot can keep moving toward virtual targets to detour obstacles while tracking the moving object.

V. EXPERIMENTAL RESULTS

This section presents some experimental results which illustrate the operational characteristics of the proposed method. Fig. 4 shows our robot research platform and robot coordinate systems. The camera and laser sensor are fixed on top of the mobile robot. We set the frame interval to 1/15 seconds, since the current implementation of the method requires such a processing time per frame for a video image (320x240 pixels with 24 bits). The maximum linear velocity v and angular velocity ω of the mobile robot are set to 700 mm/sec and 50 degree/sec, respectively. We evaluated the suggested methods under realistic tracking conditions. The initial model of the moving ball can be made manually by the user or automatically by some identification module. Then the mobile robot begins to track the moving orange ball.

As shown in Fig. 5, the green rectangle is the region detected by adaptive color matching at the current frame time. The red rectangle around the moving ball is the region estimated from the previous frame by the Kalman filter. It is also taken as a view window. While the ball moves at lower speed relative to the mobile robot, the green rectangle almost overlaps with the red rectangle on the image plane. When the ball moves at higher speed, the red rectangle is bigger than the green rectangle.

And the centers of the two rectangles have a little deviation because of position error and velocity error. When the moving ball is partly obstructed by obstacles, the mobile robot can track it well. The algorithm is able to adapt and continue to recognize the moving ball despite lighting variations. Although there are some objects with similar color and shape in the scene, the robot can also track the moving ball precisely. While tracking the moving ball, the mobile robot can detour the stationary and moving obstacles in the environment almost smoothly.

Fig. 5. Experiment results.

VI. CONCLUSIONS

In this paper, we have proposed a technique for real-time tracking of a moving object and an obstacle avoidance algorithm for robot vision domains, such as robot soccer. The key to the tracking algorithm resides in the adaptive color matching, which calculates the present motion vector of the moving object, and the Kalman filter, which predicts the next possible position and sets a view window used for adjusting the color matching threshold. The new method has a simple design and is implemented in real time with very low computational complexity. It is also robust to lighting variations. The obstacle avoidance algorithm uses fuzzy control to judge the status of the obstacles and to generate proper virtual targets for the mobile robot to advance toward. The mobile robot can then fulfill avoiding obstacles smoothly while tracking the moving object. The experimental results have shown that the algorithm has good tracking precision.

Our future work will focus on extending the tracking algorithm to work better for more practical and complex applications. For example, predicting the position of the moving ball when it is completely obstructed by obstacles, and recognizing the moving ball among other similar balls moving nearby, are currently in progress.

REFERENCES

[1] K. Takaya, R. Malhotra, "Tracking Moving Objects in a Video Sequence by the Neural Network Trained for Motion Vectors," in 2005 IEEE Pacific Rim Conference on Communications, Computers and Signal Processing, 2005, pp. 153-156.
[2] D. Jang, H. Choi, "Moving Object Tracking Using Active Models," in 1998 International Conference on Image Processing, Chicago, 1998, pp. 648-652.
[3] B. Browning, M. Veloso, "Real-Time, Adaptive Color-based Robot Vision," in 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2005, pp. 3871-3876.
[4] S. Pei, W. Kuo, and W. Huang, "Tracking moving objects in image sequences using 1-D trajectory filter," IEEE Signal Processing Letters, vol. 13, January 2006, pp. 13-16.
[5] O. Khatib, "Real time obstacle avoidance for manipulators and mobile robots," The International Journal of Robotics Research, vol. 5, no. 1, 1986, pp. 90-98.
[6] J. Borenstein, Y. Koren, "Real-time obstacle avoidance for fast mobile robots," IEEE Transactions on Systems, Man, and Cybernetics, vol. 19, September 1989, pp. 1179-1187.
[7] X. Liu, L. Peng, J. Li, Y. Xu, "Obstacle avoidance using fuzzy neural networks," in Proceedings of the 1998 International Symposium on Underwater Technology, Tokyo, 1998, pp. 282-286.
[8] C. Chen, C. Cheng, D. Page, A. Koschan, and M. Abidi, "A Moving Object Tracked by a Mobile Robot with Real-Time Obstacles Avoidance Capacity," in the 18th International Conference on Pattern Recognition, Hong Kong, 2006, pp. 1091-1094.
[9] A. G. Bors, I. Pitas, "Prediction and tracking of moving objects in image sequences," IEEE Transactions on Image Processing, vol. 9, August 2000, pp. 1441-1445.
[10] Y. Cho, J. Lee, "High Speed Tracking Algorithm of Moving Object," Industrial Electronics, vol. 3, June 2001, pp. 1563-1568.
[11] P. Lanvin, J.-C. Noyer, and M. Benjelloun, "Object detection and tracking using the particle filtering," in Conference Record of the Thirty-Seventh Asilomar Conference on Signals, Systems and Computers, vol. 2, November 2003, pp. 1595-1599.
[12] E. Malis, E. Marchand, "Experiments with robust estimation techniques in real-time robot vision," in Proceedings of the 2006 International Conference on Intelligent Robots and Systems, Beijing, 2006, pp. 223-228.
[13] F. Belkhouche, B. Belkhouche, "On the tracking and interception of a moving object by a wheeled mobile robot," Robotics, Automation and Mechatronics, vol. 1, December 2004, pp. 130-135.
[14] D. Hsu, J. Leu, S. Chen, W. Chang, W. Fang, "Using shape feature matching to track moving objects in image sequences," Multimedia and Expo, vol. 2, August 2000, pp. 785-788.
[15] B. Lei, W. Li, "A Fuzzy Behaviours Fusion Algorithm for Mobile Robot Real-time Path Planning in Unknown Environment," in the 2007 IEEE International Conference on Integration Technology, Shenzhen, 2007, pp. 173-178.
[16] A. F. Foka, P. E. Trahanias, "Predictive control of robot velocity to avoid obstacles in dynamic environments," in the 2003 IEEE International Conference on Intelligent Robots and Systems, Las Vegas, 2003, pp. 370-375.
[17] X. Chen and Y. Li, "Smooth Formation Navigation of Multiple Mobile Robots for Avoiding Moving Obstacles," International Journal of Control, Automation, and Systems, vol. 4, no. 4, 2006, pp. 466-479.

