

Localization of a Unicycle-like Mobile Robot Using
LRF and Omni-directional Camera
Tran Hiep Dinh, Manh Duong Phung, Thuan Hoang Tran, Quang Vinh Tran
VNU University of Engineering and Technology
Hanoi, Vietnam
tranhiep.dinh@vnu.edu.vn

Abstract— This paper addresses the localization problem. The extended Kalman filter (EKF) is employed to localize a unicycle-like mobile robot equipped with a laser range finder (LRF) sensor and an omni-directional camera. The LRF is used to scan the environment, which is described with line segments. The segments are extracted by a modified least square quadratic method into which a dynamic threshold is injected. The camera is employed to determine the robot’s orientation. The prediction step of the EKF is performed by extracting parameters from the kinematic model and the input signal of the robot. The correction step is conducted by implementing a line matching algorithm and comparing the line parameters of the local and global maps. In the line matching algorithm, a conversion matrix is introduced to reduce the computational cost. Experiments have been carried out on a real mobile robot system and the results prove the applicability of the method for the purpose of localization.

Index Terms— sensor fusion; Kalman filter; localization.

I. INTRODUCTION

Mapping and localization are fundamental to the automation of mobile robots. The more accurately a robot can determine its position, the more autonomously it can operate to complete given tasks. In general, the robot determines its position based on feedback measurements from a sensor system such as quadrature encoders, ultrasonic sensors, laser range finders (LRF) and visual cameras. Nevertheless, estimation based on data from a single sensor is often not reliable enough to solve the localization and mapping problems. Multi-sensor data fusion has therefore received much attention recently due to its ability to combine the advantages of each sensor type to extract more accurate information. Several combinations have been proposed, such as LRF and sonar, LRF and odometry, and LRF and CCD camera [1-3]. In [1], A. Diosi succeeded in combining an LRF with an advanced sonar sensor for better range and bearing measurements as well as classification of environment features. In [3], the operating environment was represented as a set of uncertain geometric features, including segments, corners and semi-planes, based on data from the LRF and a vision camera. These features were then fused together to extract the position and orientation of the mobile robot. A fusion approach to build a 2D local map of the environment using the LRF and omni-directional camera is proposed in [4]. The core of that method is the development of a point-to-point matching algorithm. Though this approach is efficient and requires low computational cost, its accuracy may be vastly reduced if the extracted feature points are not exactly maintained through the detection steps.

In this paper, the LRF and omni-directional camera are fused in a robot platform for the problem of localization. Instead of using feature points as in [4], the LRF is employed to retrieve the structure of the environment, represent it in the form of line segments, match the segments with a global map and finally estimate the robot position. The camera is employed to determine the robot’s orientation. Those data are then combined in an extended Kalman filter (EKF) to extract the best estimate of the robot’s pose in a statistical sense. The main contribution of the paper is the proposal of a novel line matching algorithm with the presence of a conversion matrix and a dynamic line-detection threshold. Simulations in MATLAB were carried out and experiments on a real robot system were implemented. The results confirm the effectiveness of the proposed fusion algorithm.

The paper is structured as follows. The localization algorithm is presented in section II. Section III describes the identification of the Kalman filter’s parameters from the LRF and omni-directional camera. Experiments are introduced in section IV. The paper concludes with an evaluation of the system.

II. LOCALIZATION ALGORITHM

In this work, the extended Kalman filter (EKF) is used as the localization algorithm. The EKF, a set of recursive equations that estimate the state of a process in a way that minimizes the mean squared error, has been proven effective for robot localization [2, 8, 9, 10]. This section describes the implementation of the EKF on the mobile robot.

[Figure 1. The robot’s pose and parameters]
The unicycle-like mobile robot with non-slipping and pure rolling is considered. Fig.1 shows the coordinate systems and notations for the robot, where (X_G, Y_G) is the global coordinate frame and (X_R, Y_R) is the local coordinate frame attached to the robot chassis. R denotes the radius of the driven wheels and L denotes the distance between the wheels.

Let T_s be the sampling period, and let \omega_L(k) and \omega_R(k) be the rotational speeds of the left and right wheels measured by the encoders at time k, respectively. The discrete kinematic model of the robot is given by:

x_{k+1} = x_k + \frac{R}{2} T_s (\omega_L(k) + \omega_R(k)) \cos\theta_k
y_{k+1} = y_k + \frac{R}{2} T_s (\omega_L(k) + \omega_R(k)) \sin\theta_k        (1)
\theta_{k+1} = \theta_k + \frac{R}{L} T_s (\omega_L(k) - \omega_R(k))
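As an illustration, the discrete model (1) maps directly to a short state-update routine. The following Python sketch is not part of the paper; the function name and defaults are ours, with R, L and T_s chosen to match the experimental setup of section IV (10 cm wheel diameter, 60 cm wheel distance, T_s = 100 ms).

import numpy as np

def predict_pose(pose, omega_l, omega_r, R=0.05, L=0.6, Ts=0.1):
    """Propagate the unicycle pose one step with the discrete model (1)."""
    x, y, theta = pose
    v = 0.5 * R * (omega_l + omega_r)      # translational speed
    w = (R / L) * (omega_l - omega_r)      # rotational speed, sign as in (1)
    return np.array([x + Ts * v * np.cos(theta),
                     y + Ts * v * np.sin(theta),
                     theta + Ts * w])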
In the presence of feedback measurements and disturbances, the system model is rewritten in state-space representation as follows. Let x = [x\ y\ \theta]^T be the state vector. This state can be observed by an absolute measurement z, described by a nonlinear function h of the robot coordinates and an independent Gaussian noise process v. Denoting the function (1) as f, with an input vector u and a disturbance w, the robot can be described as:

x_{k+1} = f(x_k, u_k, w_k)
z_k = h(x_k, v_k)        (2)

where the random variables w_k and v_k represent the process and measurement noise, respectively. They are assumed to be independent of each other, white, and normally distributed: w_k \sim N(0, Q_k);\ v_k \sim N(0, R_k);\ E(w_i v_j^T) = 0. The EKF is then computed in two phases, prediction and correction, as follows.

A. The prediction step with time update equations

The prediction step updates the robot state based on the system model and the input signal before the measurement is taken. The equations for this step are:

\hat{x}_k^- = f(\hat{x}_{k-1}, u_{k-1}, 0)        (3)

P_k^- = A_k P_{k-1} A_k^T + W_k Q_{k-1} W_k^T        (4)
where \hat{x}_k^- \in \mathbb{R}^n is the a priori state estimate at step k given knowledge of the process prior to step k, P_k^- denotes the covariance matrix of the state-prediction error, and A_k is the Jacobian matrix of the partial derivatives of f with respect to x:

A_k = \left.\frac{\partial f_k}{\partial x}\right|_{(\hat{x}_k^+, u_k, 0)} = \begin{bmatrix} 1 & 0 & -T_s v_c \sin\hat{\theta}_k^+ \\ 0 & 1 & T_s v_c \cos\hat{\theta}_k^+ \\ 0 & 0 & 1 \end{bmatrix}        (5)

with v_c = \frac{R}{2}(\omega_L(k) + \omega_R(k)) the translational velocity of the robot, as follows from (1). W_k is the Jacobian matrix of the partial derivatives of f with respect to w:

W_k = \left.\frac{\partial f_k}{\partial w}\right|_{(\hat{x}_k^+, u_k, 0)} = \begin{bmatrix} T_s \cos\hat{\theta}_k^+ & 0 \\ T_s \sin\hat{\theta}_k^+ & 0 \\ 0 & T_s \end{bmatrix}        (6)

Q_k is the input-noise covariance matrix. In the system, the input noise is modeled as being proportional to the angular speeds \omega_{L,k} and \omega_{R,k} of the left and right wheels, respectively. The variances thus equal \delta\omega_{L,k}^2 and \delta\omega_{R,k}^2, where \delta is a constant with the value 0.01 determined by experiments. The input-noise covariance matrix Q_k is defined as:

Q_k = \begin{bmatrix} \delta\omega_{R,k}^2 & 0 \\ 0 & \delta\omega_{L,k}^2 \end{bmatrix}        (7)
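Collecting (3)-(7), the whole prediction step condenses into a few lines. This is a minimal sketch rather than the authors’ implementation; it reuses the hypothetical predict_pose routine shown above.

import numpy as np

def ekf_predict(x_hat, P, omega_l, omega_r, R=0.05, L=0.6, Ts=0.1, delta=0.01):
    """One EKF prediction step, eqs. (3)-(7)."""
    x_pred = predict_pose(x_hat, omega_l, omega_r, R, L, Ts)   # (3)
    theta = x_hat[2]
    v_c = 0.5 * R * (omega_l + omega_r)
    A = np.array([[1.0, 0.0, -Ts * v_c * np.sin(theta)],       # (5)
                  [0.0, 1.0,  Ts * v_c * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    W = np.array([[Ts * np.cos(theta), 0.0],                   # (6)
                  [Ts * np.sin(theta), 0.0],
                  [0.0,                Ts]])
    Q = np.diag([delta * omega_r**2, delta * omega_l**2])      # (7)
    P_pred = A @ P @ A.T + W @ Q @ W.T                         # (4)
    return x_pred, P_pred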


B. The correction step with measurement update equations

The correction step adjusts the a priori estimate by incorporating measurements into the estimation. The measurements in this paper are represented in the form of line parameters extracted from the LRF data and the deflection angle determined from the omni-directional camera.

At the first scan of the LRF, a global map of the environment, which consists of a set of line segments described by the parameters \beta_j and \rho_j, is constructed. The line equation in normal form is:

x_G \cos\beta_j + y_G \sin\beta_j = \rho_j        (8)

When the robot moves, a new scan of the LRF is performed and a new map of the environment, namely the local map, is constructed. It also consists of a set of line segments, described by the equation:

x_R \cos\psi_i + y_R \sin\psi_i = r_i        (9)

where \psi_i and r_i are the parameters of the lines. The line segments of the global map are then transformed into the local coordinate frame (X_R, Y_R) and matched with the line segments of the local map (the line segment detection and map matching algorithms are presented in the next sections). The matched line parameters \psi_i and r_i from the current local map are collected in the measurement vector z_k. In order to enhance the accuracy, the robot orientation \varphi_k detected by the omni-camera is also added to z_k, resulting in:

z_k = [r_1, \psi_1, \ldots, r_N, \psi_N, \varphi_k]^T        (10)

This measurement vector is used as the input for the correction step of the EKF to update the robot’s state (eq. 15).

The parameters \beta_j and \rho_j of a matched line segment from the global map (expressed in the global coordinates) are transformed into the parameters \hat{\psi}_i and \hat{\rho}_i (expressed in the coordinates of the robot) by:

\hat{\rho}_i = \frac{-k}{\sqrt{m^2 + 1}}        (11)

\hat{\psi}_i = \operatorname{arctan2}\left(\frac{k}{1 + m^2}, \frac{-mk}{1 + m^2}\right)        (12)

where m and k are the slope and intercept of the line in the local coordinate frame, as computed in (19), (20). In the presence of the robot’s orientation, the measurement function h is described by:

\begin{bmatrix} \hat{\rho}_i \\ \hat{\psi}_i \\ \hat{\theta} \end{bmatrix} = h(\hat{x}_k, \beta_i, \rho_i) = \begin{bmatrix} \frac{-k}{\sqrt{m^2+1}} \\ \operatorname{arctan2}\left(\frac{k}{1+m^2}, \frac{-mk}{1+m^2}\right) \\ \hat{\theta} \end{bmatrix}        (13)
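As a worked illustration of (11)-(12), which share their form with (22)-(23) below, the mapping from a line’s slope and intercept to its (\rho, \psi) parameters might be written as follows. The helper is hypothetical and assumes that arctan2 takes its arguments in the (y, x) order in which (12) is printed.

import numpy as np

def line_to_polar(m, k):
    """Map a line y = m*x + k in the robot frame to (rho, psi), eqs. (11)-(12)."""
    rho = -k / np.sqrt(m**2 + 1)
    psi = np.arctan2(k / (1 + m**2), -m * k / (1 + m**2))
    return rho, psi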
The equations for the correction step are now described as follows:

K_k = P_k^- H_k^T (H_k P_k^- H_k^T + R_k)^{-1}        (14)

\hat{x}_k = \hat{x}_k^- + K_k \left(z_k - h(\hat{x}_k^-)\right)        (15)

P_k = (I - K_k H_k) P_k^-        (16)

where \hat{x}_k \in \mathbb{R}^n is the a posteriori state estimate at step k given the measurement z_k, K_k is the Kalman gain, H_k is the Jacobian matrix of the partial derivatives of h with respect to x, and R_k is the covariance matrix of the measurement noise. In the next section, algorithms to identify the parameters for the correction step are presented.
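For completeness, (14)-(16) admit an equally compact sketch. The stacking of H_k and R_k for the measurement vector (10) is omitted, and all names are illustrative rather than the authors’ code.

import numpy as np

def ekf_correct(x_pred, P_pred, z, h_pred, H, R):
    """One EKF correction step, eqs. (14)-(16).

    z      : measurement vector as in (10)
    h_pred : predicted measurement h(x_pred), eq. (13)
    H, R   : measurement Jacobian and noise covariance, eq. (25)
    """
    S = H @ P_pred @ H.T + R                     # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)          # (14)
    x_hat = x_pred + K @ (z - h_pred)            # (15)
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred   # (16)
    return x_hat, P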
III. IDENTIFICATION OF KALMAN FILTER’S PARAMETERS FROM THE LRF AND OMNI-DIRECTIONAL CAMERA

A. Identification of Kalman filter’s parameters from the LRF

1) Line segment extraction

This section describes the algorithm to extract line segments from a set of points based on the least square quadratic (LSQ) method. The point cloud was obtained from a SICK LMS 221 laser scanner with a 180° field of view and a 0.5° angular resolution.

The data collected by the LRF were first converted into Cartesian coordinates and stored in an array according to the equations:

x = r \cos\left(\varphi \frac{\pi}{180}\right)        (17)

y = r \sin\left(\varphi \frac{\pi}{180}\right)        (18)

where r and \varphi are the range and bearing from the robot position to obstacles, respectively. Points that lie too close to the robot position (< 40 cm) were treated as noise and eliminated.
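A small sketch of the conversion (17)-(18) together with the 40 cm noise gate follows. The millimetre units and the scan layout are assumptions inferred from the figures, not details given in the text.

import numpy as np

def scan_to_points(ranges_mm, start_deg=0.0, step_deg=0.5, min_range_mm=400):
    """Convert one LRF scan to Cartesian points, eqs. (17)-(18)."""
    phi = np.deg2rad(start_deg + step_deg * np.arange(len(ranges_mm)))
    r = np.asarray(ranges_mm, dtype=float)
    keep = r >= min_range_mm                  # drop readings closer than 40 cm
    return np.column_stack((r[keep] * np.cos(phi[keep]),
                            r[keep] * np.sin(phi[keep])))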
Because the LRF scans from right to left, neighboring points have a high chance of lying on the same landmark, such as a wall. The LSQ was applied to a group of N1 consecutive laser readings, starting from the first points, to find the best-fit line of these points. A distance threshold disT was predefined and compared with the distance from each point in the group to the best-fit line. If more than N2 of the points in the group (in our research, N2 = 0.75 N1) lay at a distance smaller than disT, a line was detected and a new best-fit line was calculated based on these points. The point with the smallest index was saved as one end of the line segment. The slope m and intercept k of the line were computed using the formulas:

m = \frac{\sum xy - \frac{(\sum x)(\sum y)}{n}}{\sum x^2 - \frac{(\sum x)^2}{n}}        (19)

k = \bar{y} - m\bar{x}        (20)

where n is the total number of data points, and \bar{x} and \bar{y} are the means of the x- and y-coordinates of the data points, respectively. The algorithm then examines the remaining points of the cloud to add more points to the line segment. Points that lie too far from their previous neighbors in the array are ignored. The last point that meets the distance threshold is treated as the other end of the found line segment. The algorithm is repeated with a new group until all elements of the collected point cloud have been checked.

Nevertheless, a fixed threshold disT did not work for all scans. Experimental results show that in about 10% of the scans, the threshold on the distance from a point to the best-fit line should be larger in order to detect lines from the data points. However, if a larger threshold were applied to all scans, the extracted line segments would not be smooth, which could affect the matching result. In this paper, a simple dynamic threshold was developed to solve this problem. A maximal threshold disTmax was defined as the largest distance used to decide whether a point belongs to a line. For each group of points, instead of comparing the distance from each point to the best-fit line with a fixed disT, all distances are stored in an array and sorted in ascending order. If the N2-th distance is smaller than the maximal threshold disTmax, it is chosen as the distance threshold for this line segment. The number of distance thresholds in each scan therefore equals the number of detected line segments; a sketch of this step is given below.
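The fitting formulas (19)-(20) and the dynamic-threshold rule can be sketched as follows; the grouping over N1 readings and the segment-growing loop are omitted, and the helper names are ours.

import numpy as np

def fit_line(pts):
    """Least-squares slope and intercept of a point group, eqs. (19)-(20)."""
    x, y = pts[:, 0], pts[:, 1]
    n = len(pts)
    m = (np.sum(x * y) - np.sum(x) * np.sum(y) / n) \
        / (np.sum(x**2) - np.sum(x)**2 / n)
    k = np.mean(y) - m * np.mean(x)
    return m, k

def dynamic_threshold(pts, m, k, N2, disT_max):
    """Per-segment threshold: sort the point-to-line distances and take the
    N2-th smallest, provided it stays below disT_max (else reject the group)."""
    d = np.sort(np.abs(m * pts[:, 0] - pts[:, 1] + k) / np.hypot(m, 1.0))
    return d[N2 - 1] if d[N2 - 1] < disT_max else None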
2) Local and global map matching

The extracted line segments belonging to the same landmarks in the local and global maps are then matched using a straightforward algorithm. The extracted line segments of the global map G_j and the local map L_i are described as follows:

G_j = [x1_{G,j}\ x2_{G,j}\ y1_{G,j}\ y2_{G,j}],\quad j = 1 \ldots n_G
L_i = [x1_{L,i}\ x2_{L,i}\ y1_{L,i}\ y2_{L,i}],\quad i = 1 \ldots n_L

where (x1, y1) and (x2, y2) are the Cartesian coordinates of the two end points of the extracted line segments, and n_G and n_L are the numbers of extracted global and local line segments. The detected global line segments are first transformed into the local map with the equation:

G_{T,j} = G \left( \begin{bmatrix} x1_{G,j} & y1_{G,j} \\ x2_{G,j} & y2_{G,j} \end{bmatrix} - \begin{bmatrix} x_0 & y_0 \\ x_0 & y_0 \end{bmatrix} \right)        (21)

where G = \begin{pmatrix} \cos\varphi_0 & -\sin\varphi_0 \\ \sin\varphi_0 & \cos\varphi_0 \end{pmatrix} is the conversion matrix and [x_0\ y_0\ \varphi_0]^T is the robot’s pose in the global map estimated by odometry. Each local line segment L_i is compared with all transformed global line segments G_{T,j}, and two line segments are considered “matched” if their range and bearing to the robot position are approximately the same and the overlapping rate between them is less than a previously defined threshold [2].

The range and bearing of the global and local line segments to the robot position in the robot’s coordinate frame are denoted (\rho_{G,j}, \psi_{G,j}) and (\rho_{L,i}, \psi_{L,i}) respectively (fig.2).

[Figure 2. The range and bearing of global and local line segments to the robot’s position in the robot’s coordinate frame]
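A sketch of the transformation (21) follows. Eq. (21) stacks the two end points as the rows of a 2x2 matrix; the sketch below arranges them as columns so that the conversion matrix G acts on each end point individually, which is how we read the equation. The helper name is ours.

import numpy as np

def transform_global_segment(seg_g, x0, y0, phi0):
    """Transform a global segment [x1, x2, y1, y2] into the robot frame
    estimated by odometry, eq. (21). Returns the same [x1, x2, y1, y2] layout."""
    G = np.array([[np.cos(phi0), -np.sin(phi0)],
                  [np.sin(phi0),  np.cos(phi0)]])
    P = np.array([[seg_g[0], seg_g[1]],       # x coordinates of both end points
                  [seg_g[2], seg_g[3]]])      # y coordinates of both end points
    T = G @ (P - np.array([[x0], [y0]]))      # translate, then rotate
    return np.array([T[0, 0], T[0, 1], T[1, 0], T[1, 1]])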

The overlapping rate between the local line segment L_i and the transformed global line segment G_{T,j} is defined as follows (fig.3):

O_k(a_k, b_k) = a_k + b_k - \|G_{T,j}\|
O_k(c_k, d_k) = c_k + d_k - \|G_{T,j}\|
k = (j-1) n_L + i,\quad k = 1, 2, \ldots, n_L n_G

[Figure 3. The overlapping parameters between global and local line segments]
where \|G_{T,j}\| is the length of the transformed global line segment, and a_k, b_k, c_k and d_k are the Euclidean distances between the end points of the line segments L_i and G_{T,j}.

The inequalities below represent the conditions for matching local and transformed global line segments in the robot’s coordinate frame. Two line segments L_i and G_{T,j} are matched if all of the following conditions are met (a sketch of this test follows):

O_k(a_k, b_k) < T
O_k(c_k, d_k) < T
(\rho_{L,i} - \rho_{G,j})^2 < T_\rho
(\psi_{L,i} - \psi_{G,j})^2 < T_\psi

where T, T_\rho and T_\psi are predefined thresholds.
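The four conditions translate directly into a predicate. The threshold values below are placeholders; the paper does not report the numbers used in the experiments.

def segments_match(ov_ab, ov_cd, rho_l, rho_g, psi_l, psi_g,
                   T=100.0, T_rho=1e4, T_psi=0.01):
    """Matching test for one local/global segment pair (conditions above)."""
    return (ov_ab < T and ov_cd < T
            and (rho_l - rho_g)**2 < T_rho
            and (psi_l - psi_g)**2 < T_psi)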
3) Estimation of the line parameters’ covariances

In order to compute the measurement covariance for the EKF, each extracted line segment can be represented by the parameters (\rho, \psi), where \rho stands for the perpendicular distance from the robot’s position to the line and \psi is the line orientation:

\rho = \frac{-k}{\sqrt{m^2 + 1}}        (22)

\psi = \operatorname{arctan2}\left(\frac{k}{1 + m^2}, \frac{-mk}{1 + m^2}\right)        (23)

where (m, k) are the slope and intercept of the detected line as computed in (19), (20).

From the solution given by Deriche [6], the measurement covariance matrix R can be calculated under the assumption that each data point has the same Cartesian uncertainty:

R_i = \frac{a_i \sigma_{yy}^2 - b_i \sigma_{xy}^2 + c_i \sigma_{xx}^2}{(a_i - c_i)^2 + b_i^2} \begin{pmatrix} 1 & -e \\ -e & e^2 \end{pmatrix} + \begin{pmatrix} 0 & 0 \\ 0 & \frac{\sigma_{yy}^2 \cos^2\varphi_i + \sigma_{xx}^2 \sin^2\varphi_i - 2\sigma_{xy}^2 \sin\varphi_i \cos\varphi_i}{n} \end{pmatrix} \cong \begin{bmatrix} \operatorname{var}(r_i) & 0 \\ 0 & \operatorname{var}(\psi_i) \end{bmatrix}        (24)

where e_i = \bar{y}\cos\psi_i - \bar{x}\sin\psi_i;\ \varphi_i = \psi_i + \frac{\pi}{2};\ \bar{x} = \frac{1}{n}\sum x_j;\ \bar{y} = \frac{1}{n}\sum y_j;\ a_i = \sum (x_j - \bar{x})^2;\ b_i = 2\sum (x_j - \bar{x})(y_j - \bar{y});\ c_i = \sum (y_j - \bar{y})^2.

In the presence of the covariance of the orientation determined by the omni-directional camera (shown in the next section), the measurement noise covariance matrix for the correction step of the EKF is written as:

R_k \cong \operatorname{diag}\left(\operatorname{var}(r_1), \operatorname{var}(\psi_1), \ldots, \operatorname{var}(r_N), \operatorname{var}(\psi_N), 0.0265\right)        (25)

B. Identification of Kalman filter’s parameters from the omni-directional camera

In this paper, the omni-directional camera is used as an absolute orientation measurement and is fused with the LRF sensor to enhance the localization accuracy of the robot. The method is based on the detection of a red vertical landmark located at a fixed position (x_m, y_m) and parallel to the optical axis of the camera. The conservation of line features ensures that the shape of the landmark is unchanged in both the omni-directional and panoramic images. Determining the robot’s orientation then becomes the problem of calculating the orientation \varphi_l of the landmark (fig.4). The algorithm can be summarized as follows. From the captured image, a digital filter is applied to eliminate random noise. The red area is then detected and the image is transformed from the RGB color space to grey scale. By applying the Hough transform, the vertical line is extracted and the value \varphi_l corresponding to the robot’s orientation is obtained.

In order to determine the variance of this estimate, the orientation values calculated from the omni-directional camera were experimentally compared with the values measured by a highly accurate compass sensor. In our system, the variance is determined to be 0.0265 rad².

[Figure 4. Line detection using Hough transform]
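The pipeline above (filtering, red-area detection, Hough transform) could be prototyped along the following lines. The OpenCV calls, the HSV bounds for “red”, and the final mapping from the detected line to \varphi_l are all assumptions; the paper does not give implementation details, and the landmark bearing ultimately depends on the camera geometry.

import cv2
import numpy as np

def landmark_line(panorama_bgr):
    """Detect the dominant line of the red landmark in a panoramic image."""
    img = cv2.medianBlur(panorama_bgr, 5)                    # digital filter
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))    # red area
    lines = cv2.HoughLines(mask, 1, np.pi / 180, 80)         # Hough transform
    if lines is None:
        return None
    rho, theta = lines[0][0]       # near-vertical line expected (theta ~ 0)
    return rho, theta              # rho locates the landmark column -> phi_l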

IV. EXPERIMENTS

In this section, experiments are conducted to evaluate the efficiency of the fusion algorithm.

A. Experimental setup

A rectangular flat-wall environment constructed from several wooden plates, surrounded by a cement wall, was set up. The robot is a unicycle-like mobile robot with two differentially driven wheels (fig.5). Its wheel diameter is 10 cm and the distance between the two drive wheels is 60 cm. The sampling period Ts of the EKF is 100 ms.

[Figure 5. A unicycle-like mobile robot]

B. Line segment extraction

The algorithm was programmed in MATLAB. The computation time for a data set of 360 points with the fixed threshold disT is 0.15 s; the computation time with the dynamic threshold ranges from 0.20 s to 0.25 s, depending on the number of detected line segments. Fig.6 shows the result with the fixed threshold disT = 200; fig.7 shows the result with different thresholds for different line segments (disT1 = 237, disT2 = 140, disT3 = 300, disT4 = 160, disT5 = 169).

[Figure 6. Extracted line segments with fixed threshold]

[Figure 7. Extracted line segments with dynamic threshold]

[Figure 8. Matching local and global line segments (blue: transformed global segments G1-G4; red: local segments L1-L5)]
C. Matching local and global maps

Fig.8 presents the result of the line segment matching, where the blue segments stand for the transformed global line segments and the red ones for the local line segments. In reality, there are line segments detected in the global map that do not appear in the local map, and vice versa; therefore, in this case, the authors only studied those appearing in both maps. With the matching criteria considered, (L3, G1), (L4, G2) and (L5, G3) met all the conditions and were paired up.

D. Localization using the EKF with the LRF and omni-directional camera

In order to evaluate the efficiency of the fusion algorithm for the problem of localization, different configurations of the EKF were implemented. Fig.9 shows the trajectories of a robot movement estimated by three methods: odometry, the EKF with the LRF, and the EKF with the combination of the LRF and omni-directional camera. The deviations between each estimated trajectory and the real one are presented in fig.10. It is concluded that the EKF algorithm improves the robot localization, and its combination with the LRF and omni-directional camera gives the best result.

[Figure 9. Estimated robot trajectories under different EKF configurations (odometry, LRF, LRF and Omni, true trajectory)]

[Figure 10. The deviation between the estimated positions and the real one]

V. CONCLUSION

In this paper, the EKF is implemented for the problem of mobile robot localization. A combination of the LRF and omni-directional camera is introduced as the measurement system. This measurement system, in combination with novel line extraction and map matching algorithms, simplifies the implementation while enhancing the accuracy of the EKF. Experiments confirmed the effectiveness and applicability of the proposed approach.

ACKNOWLEDGMENT

This work was supported by the Vietnam National Foundation for Science and Technology Development (NAFOSTED).

REFERENCES

[1] A. Diosi and L. Kleeman, “Advanced Sonar and Laser Range Finder Fusion for Simultaneous Localization and Mapping”, Proc. of 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1854-1859, Sendai, Japan, 2004.
[2] L. Teslić, I. Škrjanc and G. Klančar, “EKF-Based Localization of a Wheeled Mobile Robot in Structured Environments”, J. Intelligent and Robotic Systems, vol. 62, issue 2, pp. 187-203, 2011.
[3] J. A. Castellanos, J. Neira and J. D. Tardós, “Multisensor Fusion for Simultaneous Localization and Map Building”, IEEE Transactions on Robotics and Automation, vol. 17, no. 6, pp. 908-914, 2001.
[4] G. Gallegos and P. Rives, “Indoor SLAM Based on Composite Sensor Mixing Laser Scans and Omnidirectional Images”, IEEE International Conference on Robotics and Automation, pp. 3519-3524, Alaska, USA, May 2010.
[5] A. Diosi and L. Kleeman, “Laser Scan Matching in Polar Coordinates with Application to SLAM”, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3317-3322, Edmonton, Canada, 2005.
[6] R. Deriche, R. Vaillant and O. Faugeras, “From Noisy Edge Points to 3D Reconstruction of a Scene: A Robust Approach and Its Uncertainty Analysis”, Series in Machine Perception and Artificial Intelligence, vol. 2, World Scientific, pp. 71-79.
[7] S. Jia, A. Yasuda, D. Chugo and K. Takase, “LRF-based self-localization of mobile robot using Extended Kalman Filter”, SICE Annual Conference, pp. 2295-2298, 2008.
[8] J. Z. Sasiadek, “Sensor data fusion using Kalman filter”, Proceedings of the Third International Conference on Information Fusion, vol. 2, pp. WED5/19-WED5/25, 2000.
[9] A. Georgiev, “Localization methods for a mobile robot in urban environments”, IEEE Transactions on Robotics, vol. 20, issue 5, pp. 851-864, 2004.
[10] G. Welch and G. Bishop, “An Introduction to the Kalman Filter”, Proc. of SIGGRAPH, 2001.
