

Performance Assessment of an
Ultra-Tightly Coupled Vision-Aided INS/GNSS
Navigation System
Benoit Priot, Christelle Peillon, Vincent Calmettes and Mohamed Sahmoudi,
ISAE, Université de Toulouse, France

BIOGRAPHIES

Benoit Priot received the Engineering Diploma from ENSIL (Ecole Nationale Supérieure d'Ingénieurs de Limoges, France), with a speciality in Electronics, Telecommunications and Instrumentation, in 2004. Since 2007, he has been working as an engineer on INS and GNSS navigation systems at the French Institute of Aeronautics and Space (ISAE), Toulouse, France.

Christelle Peillon received the engineering diploma from the French civil aviation university ENAC (Ecole Nationale de l'Aviation Civile), Toulouse, France, in Electronic Engineering, Imagery and Embedded Systems, in 2008. Since 2009, she has been working as a research engineer on image processing for SLAM applications at the French Institute of Aeronautics and Space (ISAE), Toulouse, France.

Vincent Calmettes received the Ph.D. degree in signal processing from ISAE (French Institute of Aeronautics and Space, Toulouse, France). He is a training and research scientist at the signal, communications, antenna and navigation laboratory of ISAE. His research interests include the development of solutions based on DSP and programmable logic devices or ASICs for applications in digital communications and signal processing. He is currently working on new Galileo signal processing and GNSS receiver architectures. He is also involved in several projects devoted to positioning and attitude determination, with great interest in integrated navigation solutions for difficult environments.

Mohamed Sahmoudi received a PhD in signal processing and communications systems from Paris-Sud University, France, in collaboration with Telecom Paris, in 2004, and an M.S. degree in probability and statistics from Pierre and Marie Curie University in 2000. During his PhD, he was an assistant lecturer at Ecole Polytechnique, then a lecturer at Paris-Dauphine University. From 2005 to 2007, he was a post-doc researcher on GPS and signal processing at Villanova University, PA, USA. In August 2007, he joined the ETS School of Engineering at Montreal, Canada, to work on GNSS RTK for robust and precise positioning. Since December 2009, he has been an associate professor at the French Institute of Aeronautics and Space (ISAE). His research interests include weak multi-GNSS signal processing, multipath mitigation, multi-sensor fusion and robust signal processing for communication systems.

ABSTRACT

In this paper, we propose to design a multi-sensor navigation system for harsh GNSS environments, by deeply integrating a MEMS-based INS, GNSS signals and a vision system with a single camera for performing inertial sensor calibration. We advantageously process the images delivered by the camera to perform the calibration of the INS sensors, which is necessary for efficiently aiding the GNSS receiver tracking loops by providing a consistent estimate of the acceleration along the satellite-receiver lines of sight (LOS).

The camera is used as an optical odometer to achieve INS calibration in a GNSS-denied environment, by tracking a set of points (i.e. landmarks) included in the images. The key idea of the proposed method relies on the way the different sensor measurements are combined and the covariance matrices are managed, by considering the prior knowledge we have on the system. A second contribution is the improvement of the image processing methodology. The adopted method, based on the SURF algorithm, exploits a set of landmarks by relating their positions in the image to the platform position and attitude. First, we optimize the primary detection of landmarks when the navigation system faces problems under harsh GNSS environments. The direction and the distance of these landmarks are initialized and estimated over the tracking process. When new points are acquired, the state model is constructed by taking into account the knowledge we have on the vehicle position and attitude estimates at the acquisition epoch. Moreover, by exploiting the a priori covariance on the position and attitude estimates, taking benefit from the INS solution, the algorithm is able to

International Technical Meeting of The Institute of Navigation, San Diego, CA, January 24-26, 2011
bound the area of each landmark and to eliminate the outliers.

Another important contribution of this study is the experimental performance assessment of the system, from a measurement campaign. The data was collected using a terrestrial vehicle equipped with time-synchronized systems including:
- A Synchronized Position Attitude Navigation (SPAN) system based on a DGPS receiver combined with a tactical grade IMU, used as a reference navigation platform;
- A camera which collects the images from the main front direction; the camera is mounted with its front towards the primary viewing direction.

The recorded data is post-processed using a Matlab toolset developed by the Navigation team of the SCAN Lab at ISAE, Toulouse, France. The GNSS signal is modeled considering 7 satellites in an open environment. A MEMS-based IMU is simulated by degrading the tactical grade IMU measurements. An important effort is devoted to using each sensor measurement (GNSS, IMU, image sensor) efficiently, and to assessing the performance of this system under realistic conditions. Initially validated by simulations, the goal of this work is to ensure a coherent interaction and synchronization between each module of the system and to test its functionality from experimental data. The algorithm shows good performance according to the simulations and the processed data. When the carrier-to-noise density ratio (C/N0) decreases to low levels, the combined use of the MEMS-based INS and vision provides the vehicle attitude and position with good accuracy. The main benefit of this system is that it allows us to perform inertial sensor calibration and consequently to facilitate weak GNSS signal tracking.

I- INTRODUCTION

Although the market of satellite navigation systems is witnessing rapid growth, stand-alone GNSS devices cannot be used for the most demanding applications, such as navigation in deep urban canyons and indoors [7]. Thus, hybridization of complementary navigation sensors is intended to satisfy users' requirements [7]. Inertial Navigation Systems (INS) and GNSS are often integrated together to take advantage of their complementary strengths. Integration of a strapdown inertial navigation system (SINS) and a global navigation satellite system (GNSS) is often performed to improve the performance of navigation systems in a degraded GNSS environment (low C/N0 ratio, jamming, GNSS signal blockage, etc.) [1, 2, 4, 7]. Under high dynamic conditions, an ultra-tightly coupled SINS/GNSS navigation system is ideally adopted to improve GNSS signal tracking performance [1, 7]. This architecture integrates the inertial measurement unit (IMU) in the receiver to reduce the tracking loop bandwidth and to provide other valuable aids. The performance of such a navigation system depends on the quality of the inertial sensors. Tracking capability depends highly on the noise level, bias stability, scale factor linearity and bandwidth of the inertial sensors.

For low-cost applications, MEMS-based INS and GNSS sensors are coupled to provide low-cost systems for attitude, velocity and position estimation. In particular, MEMS gyroscopes and accelerometers are used to provide a low-cost navigation solution by taking advantage of a fast-growing market and rapidly progressing technology. However, tracking weak GNSS signals in a degraded environment cannot be achieved when a MEMS-based SINS/GNSS deeply integrated system is considered, because MEMS measurement errors grow rapidly over time. In that case, the IMU sensors must be calibrated using external systems. In general, improvements are obtained under two conditions:
- The inertial measurement unit (IMU) must be calibrated and the INS output must be compensated, by using GNSS measurements;
- The INS drift must remain bounded to bridge satellite outages.

In indoor environments, these conditions are not met and MEMS-based INS/GNSS systems are not able to provide the usual required level of performance. Two main issues are then to be considered: GNSS signal tracking and IMU sensor calibration. These operations cannot be carried out without the aid of alternative navigation sensors.

This paper proposes to assess the performance of a deeply integrated vision-aided INS/GNSS system, which has been designed to address the navigation issue in indoor environments. We show in this research that improvements are possible by processing the images provided by a low-cost camera mounted on the vehicle. Indeed, by tracking a set of points included in the image sequence of the navigation scene [6, 10, 11, 13], such a system is able to achieve inertial sensor calibration [10], to improve the aided-GNSS receiver performance [14], and to provide the vehicle position, velocity and attitude [2].

This paper is organized as follows: section II describes the system which has been specifically designed for addressing the navigation problem in harsh environments. Then, the experimental conditions are reported in section III. Section IV analyzes GNSS/INS performance, whereas Vision/INS integration is discussed in section V. Then the adopted multi-sensor fusion method is described in section VI. Finally, section VII summarizes some conclusions and lists our perspectives on the remaining issues to be considered in the future.

II- DEVELOPED NAVIGATION SYSTEM

1. Problematic

A stand-alone GNSS receiver is not suitable for navigation in indoor environments. First, the GNSS signal suffers from blockage and multipath problems. Moreover, GNSS signal attenuation (30 dB or more) leads to a strong degradation of the carrier-to-noise power spectral density ratio (C/N0). Consequently, GNSS receivers are not able to track the GNSS signal. To solve this problem, an INS is often coupled to a GNSS receiver. When MEMS-based INS systems are used to address mass-market consumer demand, the usual required level of performance is not reached. In that case, improvements are made by implementing additional sensors.

2. Deep integrated Vision aided-INS/GNSS

In the frame of this study, we propose to design a deeply integrated Vision aided-INS/GNSS system. This system is described in [2] and its architecture is shown in figure 1.

Figure 1- Deep integrated Vision aided-INS/GNSS (GNSS signal -> receiver tracking modules -> GNSS/INS estimator; digital camera -> image processing -> Vision/INS estimator; both estimators feed the fusion block)

This system is based on two separate estimators which are fused to face GNSS signal attenuation in indoor environments.

The GNSS/INS estimator is designed for calibrating the IMU sensors in an open GNSS environment. Such an estimator provides an estimate of the vehicle position, velocity and attitude and depends mainly on the GNSS conditions (satellite geometry, C/N0 ratios). It uses pseudorange measurements provided by a GNSS receiver. Here the GNSS receiver is based on an extended Kalman filter (EKF) which exploits external aids to reach the performance of a static receiver under high dynamic conditions, and consequently to track weak signals in poor environments.

The second estimator is an image-aided inertial navigation system based on a single camera, which exploits the location of landmarks in the image in order to update the inertial navigation solution and to calibrate the inertial sensors. Of course, this estimator is independent of the GNSS signal.

The global sensor fusion is performed using a combination of distributed local estimates: the GNSS/INS and INS/Vision estimates. The federated filter is a master filter which combines information from local filters of single or partially combined sensors to obtain a total system solution [2, 5]. The federated fusion approach allows several different information sharing strategies for different applications.

III- MEASUREMENT CAMPAIGN

The aim of this campaign was to collect synchronized data in order to assess the performance of the system described in figure 1. The experimental platform is shown in figure 2.

Figure 2- Experimental platform (DGPS receiver, IMU and camera mounted on the vehicle)

The platform is mainly composed of a Novatel DGPS/INS integrated system, which includes an iMAR FSAS Inertial Measurement Unit, and a single digital camera. The Novatel DGPS/INS integrated system is used as a reference. This system provides not only the platform position, velocity and attitude, but also the IMU sensor measurements. The camera is mounted in the direction of the main axis of the platform. It is based on a fixed focal-length lens.

1. GNSS signal

The signal tracking modules are based on a bank of local extended Kalman filters which use correlator outputs as measurements. Such a module should be composed of a signal sampler, numerically controlled oscillators, and a bank of correlators, each synchronized using an ultra-stable reference clock. For our purpose, it was easier to model the signal parameters and the correlator outputs. Using a complex notation (I and Q arms), the expression of the signal at the receiver input is:

    r_n = Sum_{k=1}^{Nk} u_n^k,   with Nk the number of satellites,

where

    u_n^k = C^k c(n - tau_n^k) exp(j phi_n^k) + w_n^k.
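As an illustration, the simulated correlator-input model above can be sketched in Python; the toy PRN code, amplitudes, delays and phases below are stand-ins for illustration, not the values used in the campaign.

```python
import cmath
import random

def received_sample(n, sats, code, noise_sigma):
    """One complex baseband sample r_n = sum_k C^k c(n - tau_k) exp(j phi_k) + w_n."""
    r = 0 + 0j
    for amp, tau, phi in sats:
        r += amp * code[(n - tau) % len(code)] * cmath.exp(1j * phi)
    # w_n: complex white Gaussian noise (I and Q arms)
    r += random.gauss(0.0, noise_sigma) + 1j * random.gauss(0.0, noise_sigma)
    return r

# Toy +/-1 PRN code and three satellites (amplitude, code delay in samples, carrier phase)
random.seed(0)
code = [random.choice((-1.0, 1.0)) for _ in range(1023)]
sats = [(1.0, 37, 0.4), (0.8, 512, 1.1), (0.5, 900, 2.0)]
samples = [received_sample(n, sats, code, noise_sigma=0.1) for n in range(4)]
print(len(samples), isinstance(samples[0], complex))
```

In the paper the delays and phases follow the pseudorange and Doppler models of section III.1; here they are fixed constants to keep the sketch self-contained.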

In this expression, C^k represents the kth satellite signal amplitude, c is a pseudo-random code sequence, and tau and phi are respectively the propagation delay and the carrier phase. The additive noise w is complex white Gaussian noise. By considering a pilot signal or an assisted GPS receiver, the GPS data message is not considered here.

The number of satellites is equal to 7 over the simulation. The satellite C/N0 ratios are controlled for modeling the open environment and the indoor environment. The other parameters of the signal are obtained from the satellite ephemeris and from the DGPS/INS system, which provide respectively the satellite positions and velocities, and the receiver's position and velocity.

    tau_n^k = [ sqrt( (x_n - x_n^k)^2 + (y_n - y_n^k)^2 + (z_n - z_n^k)^2 ) + b_n + n_k ] / c

    phi_n^k = phi_0^k + 2 pi (f_L1 Ts / c) Sum_{m=0}^{n} (V_m - V_m^k)

Where:
- [x_n y_n z_n]^T and [x_n^k y_n^k z_n^k]^T represent, respectively, the platform position and the kth satellite position at epoch n;
- V_n - V_n^k represents the platform to kth satellite relative velocity at epoch n;
- n_k is a random value which models the pseudorange error.

The receiver clock bias b_n is modeled under the assumption that the receiver is able to estimate the parameters of the predictable clock error. The expression of the clock bias is:

    b(t) = b(t_0) + a_0 (t - t_0) + (a_1 / 2) (t - t_0)^2 + Int_{t_0}^{t} n_c(u) du

Where the coefficients a_0 and a_1 characterize the oscillator bias and drift, and n_c represents a white noise. The non-predictable error is:

    b_np(t) = Int_{t_0}^{t} n_c(u) du

In our model, only the non-predictable error is considered. An ultra-stable Oven Controlled Crystal Oscillator is assumed to be used here. The evolution over time of the non-predictable error of this oscillator is represented in figure 3.

Figure 3- Non-predictable receiver clock error (of the order of 10 m over 10 min)

2. IMU outputs

The aim of this study is to show the interest of a vision sensor during satellite outages. The improvements brought by this sensor depend on the inertial sensor performance. The simulations presented in this paper were performed by considering poor-quality sensors in order to reduce the simulation duration. In particular, the power of the sensor driving noises was increased to accentuate the bias drift over the simulation period. Thus, the IMU measurements were obtained from the tactical grade iMAR IMU by adding a white noise signal and a random walk process with the following properties.

                                     Gyroscope        Accelerometer
    White noise PSD                  2e-2 deg/s/sqrt(Hz)    2e-2 m/s^2/sqrt(Hz)
    Random walk driving noise PSD    2e-2 deg/s^2/sqrt(Hz)  5e-2 m/s^3/sqrt(Hz)

Table 1- IMU sensor noises

3. Image sensor

A single camera was used to acquire the images collected over the path of a pedestrian. The acquisition rate was fixed at 10 Hz. A calibration was performed in order to determine the focal length of the camera lens. The camera was used for tracking singular points included in the image, as shown in figure 4.

Figure 4- Image processing (tracked interest points overlaid on a camera frame)

IV- GNSS/INS DEEP INTEGRATION

1. GNSS signals tracking

The receiver designed in the frame of this study is based on an extended Kalman filter. This estimator exploits, during the state propagation step, the knowledge of the vehicle acceleration in the satellite-receiver line of sight (LOS). It uses, for updating the a priori estimate, measurements obtained from the outputs of a set of correlators. This module is described in [2]. Its architecture is shown in figure 5.
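Degraded IMU measurements like those of section III.2 (Table 1) can be generated by adding sampled white noise and a random-walk bias to the tactical-grade outputs. The sketch below is illustrative: the PSD-to-discrete-sigma conversion (sigma_wn = PSD * sqrt(fs), sigma_rw = PSD * sqrt(dt)) and the 100 Hz rate are our assumptions, not the authors' code.

```python
import math
import random

def degrade_imu(true_rates, fs, wn_psd, rw_psd):
    """Add white noise plus a random-walk bias to tactical-grade gyro rates (deg/s).

    wn_psd: white-noise PSD in deg/s/sqrt(Hz); rw_psd: bias driving-noise PSD
    in deg/s^2/sqrt(Hz).  Assumed discretization: sigma_wn = wn_psd*sqrt(fs)
    per sample, sigma_rw = rw_psd*sqrt(1/fs) per bias increment.
    """
    dt = 1.0 / fs
    sigma_wn = wn_psd * math.sqrt(fs)
    sigma_rw = rw_psd * math.sqrt(dt)
    bias, out = 0.0, []
    for w in true_rates:
        bias += random.gauss(0.0, sigma_rw)      # random-walk bias drift
        out.append(w + bias + random.gauss(0.0, sigma_wn))
    return out

random.seed(1)
truth = [0.0] * 1000                 # 10 s of zero rotation at an assumed 100 Hz
meas = degrade_imu(truth, fs=100.0, wn_psd=2e-2, rw_psd=2e-2)
print(len(meas))
```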

Figure 5- GNSS signal tracking module (external aids a^(k) and f_G^(k) drive an EKF; the received signal feeds a coherent correlator whose local replica is produced by a signal generator; the EKF outputs tau^(k), f^(k), A^(k), phi^(k))

In an open environment, the GNSS signal provides measurements of high quality and the receiver shows very good performance. On the contrary, during satellite outages, the receiver cannot take advantage of the GNSS signal; in that case, the capability to track depends on the quality of the external aids. Figure 6 shows the performance obtained by processing the data collected during the measurement campaign. The tactical grade IMU is used here to carry out the estimation of the LOS satellite-vehicle acceleration.

Figure 6- GNSS signal tracking module performance (pseudorange error in m versus time in s, over about 1 min; C/N0 degradation from 50 to 30 dB-Hz and from 50 to 0 dB-Hz, after a convergence phase)

The error on the pseudorange estimate is expressed in meters. As the C/N0 ratio is decreased, a drift on the pseudorange estimate is observed. For a very poor C/N0 ratio, this drift depends only on the IMU quality. This analysis therefore highlights how important the external aid is during a satellite outage.

2. GNSS/INS integration

A deeply integrated INS/GNSS navigation system was designed [2]. This system is described in figure 7. Navigation is carried out by an INS based on the IMU whose sensors are defined in table 1. IMU calibration and INS output correction must be performed to satisfy the requirements related to the application. These operations are achieved here by using, as measurements in an EKF filter, the pseudoranges estimated by a set of 7 tracking modules (7 satellites are used).

Figure 7- GNSS/INS integration (the tracking modules for satellites 1 to 7 provide estimated LOS accelerations and velocities a_n^k, V_n^k and pseudoranges tau_n^k to a hybridization EKF for navigation error and bias estimation; calibration and corrections are fed back to the inertial sensors and the INS, which outputs the corrected position, velocity and attitude)

For degraded C/N0 ratios, improvements are brought through the feedback link; the calibrated inertial sensor measurements and the corrected velocity and attitude are processed to provide the LOS satellite-vehicle accelerations and velocities to the tracking modules. We will explain in section VI how the tracking modules and the main EKF filter covariance matrices are managed.

The performance obtained with this system is analyzed through figures 8-10. The data collected during the measurement campaign are processed over 100 seconds. Red curves are obtained when the tactical grade IMU is directly used, whereas blue ones are established from the degraded IMU measurements (see Table 1).

Conclusions can easily be drawn from this simulation. The designed INS/GNSS integrated system shows very poor performance when the MEMS sensors described in table 1 are used; this is due to the bias drift of the inertial sensors. In that case, the satellite tracking modules cannot perform GNSS signal tracking. On the contrary, the high-quality IMU allows these modules to remain locked during the considered satellite outage.

Figure 8- GNSS/INS integration: positioning error (in m, versus time in s, for the MEMS IMU and the IMAR-FSAS IMU)
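The LOS aids fed to each tracking module are obtained by projecting the INS-derived vehicle acceleration (and velocity) onto the satellite-receiver line of sight. A minimal sketch of that projection, with frame handling omitted and hypothetical coordinate values:

```python
import math

def los_unit(receiver, satellite):
    """Unit vector from receiver to satellite (same Cartesian frame, metres)."""
    d = [s - r for r, s in zip(receiver, satellite)]
    n = math.sqrt(sum(c * c for c in d))
    return [c / n for c in d]

def project_on_los(vec, receiver, satellite):
    """Project a vehicle acceleration or velocity vector onto the satellite LOS."""
    u = los_unit(receiver, satellite)
    return sum(v * c for v, c in zip(vec, u))

rx = (0.0, 0.0, 0.0)
sat = (20200e3, 0.0, 0.0)            # satellite 20 200 km along x (illustrative)
accel = (1.0, 2.0, 0.0)              # vehicle acceleration in the same frame
print(project_on_los(accel, rx, sat))  # only the x component lies on this LOS
```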

Figure 9- GNSS/INS integration: LOS velocity error (in m/s, versus time in s, for the MEMS IMU and the IMAR-FSAS IMU)

Figure 10- GNSS/INS integration: LOS acceleration error (in m/s^2, versus time in s, for the MEMS IMU and the IMAR-FSAS IMU)

As a corollary of these results, we can state that the level of performance of the MEMS-based INS/GNSS system should be increased by using additional sensors for calibrating the IMU sensors.

V- VISION/INS INTEGRATION

In this section, an image-aided inertial navigation system is considered. The image processing consists of extracting landmarks in the image and tracking these landmarks in order to update the INS errors and the sensor bias estimates. The system is described in [2], where simulated images are processed. In the frame of this paper, the processed images are provided by the camera mounted on the platform. This processing consists of several stages:
- Landmark detection and description,
- Landmark matching,
- Use of landmarks for updating the INS solution and for calibrating the IMU sensors.

We present in this paper the operations related to the algorithm which are not analyzed in [2].

1. Interest point detection

In order to detect points in images, we have chosen the algorithm used in SURF. The interest point detection uses a very basic Hessian-matrix approximation. We compute an approximation of the Hessian matrix with box filters. These calculations are done by using integral images. Integral images were made popular by Viola and Jones [12]; they reduce the computation time drastically. Then we compute the Hessian's determinant with a relative weight w on the filter responses, which is used to balance the expression of the determinant. For theoretical correctness, the weighting changes depending on the scale. In practice, [3] proposes to keep this factor constant because it does not have a significant impact on the results.

    Det(H_approx) = Dxx Dyy - (w Dxy)^2   with w = 0.912

The approximated determinant of the Hessian represents the blob response in the image at location x. These responses are stored in a blob response map over different scales. The detected interest points correspond to the maxima of the map.

2. Interest point description

Under the assumption of slow motion of the vehicle, some algorithms perform pixel detection and matching by searching in the new image, for each tracked pixel, a point in its vicinity. This method is not very robust. In the frame of this application, a descriptor is defined in order to compare the regions of the detected points and then to eliminate outliers. Indeed, in order to characterize each detected point, we used the descriptor described in [3]. This descriptor does not analyze the distribution of the gradient as in other algorithms, but the first-order Haar wavelet responses in the x and y directions. This processing is performed on integral images. Compared to the algorithm described in [3], the INS attitude and position can be used here to compute the interest point orientation. It provides, for each pixel, a descriptor vector of size 64.

3. Vision/INS integration

The algorithm is presented and analyzed in [2]. It is based on an extended Kalman filter, which is used to estimate the INS position, velocity and attitude errors, the IMU sensor biases, and the landmark parameters which are exploited by the measurement process, i.e. the initial optical ray direction vector of the point n in the navigation frame and the initial inverse distance of the point n. The tracked landmarks are used in the EKF for updating the estimates. The architecture of this system is shown in figure 11.
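The integral-image machinery behind the detector of section V.1 can be sketched as follows. In SURF, Dxx, Dyy and Dxy would each be built from a few such box sums at every scale; this is only a sketch of the underlying primitives, not the SURF implementation, and the flat test image is hypothetical.

```python
def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img over rows 0..y-1, cols 0..x-1."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def box_sum(ii, y0, x0, y1, x1):
    """Sum of img over rows y0..y1-1, cols x0..x1-1, in O(1) via the integral image."""
    return ii[y1][x1] - ii[y0][x1] - ii[y1][x0] + ii[y0][x0]

def hessian_det(dxx, dyy, dxy, w=0.912):
    """Approximated Hessian determinant with the constant relative weight w of [3]."""
    return dxx * dyy - (w * dxy) ** 2

img = [[1] * 8 for _ in range(8)]            # flat 8x8 test image
ii = integral_image(img)
print(box_sum(ii, 2, 2, 5, 6))               # 3x4 box of ones
```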

    State vector: x = [dr dv dpsi b_acc b_gyr v_1^N rho_1 ... v_20^N rho_20]^T

Figure 11- Vision/INS integration (image processing of the digital camera frames provides the pixel coordinates {P_m^M(k)} to a hybridization EKF for navigation error and bias estimation; calibration and corrections are fed back to the inertial sensors and the INS, which outputs position, velocity and attitude; P_m^M(k) represents the coordinates of the mth pixel)

The measurement update is based on image processing, following the approach described in [9]. As explained previously, the algorithm relies on pixel tracking. A set of twenty pixels is considered over the measurement campaign. The innovation, which represents the measurement error, is obtained over several steps.

First, interest points are detected over the whole image, using a very basic Hessian-matrix approximation. Then, only the relevant pixels are conserved. Outlier pixel exclusion requires defining the area where pixels should be located. For each tracked pixel, this area is obtained by using the state vector and the error covariance matrix P_k^-, obtained after the system propagation phase. This area is represented by an ellipsoid (see figure 12).

Figure 12- kth pixel area (the ellipse centered on the estimated pixel P_hat_k^M[k], with relevant pixels inside and outlier pixels outside)

Its center is defined by the estimated pixel coordinates, deduced from the vehicle position and attitude, and from the pixel parameters:

    P_hat_k^M[k] = h( r_INS + dr^-, psi_INS + dpsi^-, rho_k^-, v_k^- )

The ellipsoid equation, expressed in the frame defined in figure 12, is:

    [x y] [ H_k P_k^- H_k^T + R_k ]^(-1) [x y]^T = 3^2

Where H is the observation matrix and R is the measurement noise covariance matrix.

The next step consists of characterizing the relevant pixels. Once all the relevant pixels are identified, their descriptors are constructed. Then matching is performed; each descriptor is compared to the appropriate one: the descriptor of a pixel included in the kth ellipsoid is compared to the descriptor of the kth pixel. This comparison, which requires computing distances between descriptors, aims to select the most adapted pixel. Finally, the selected pixels are used to provide the measurement innovation.

The last step consists of acquiring new pixels when the number of tracked pixels is lower than 20. Pixel selection is an important issue. This selection eliminates pixels in the vicinity of the currently tracked points. New pixels and their descriptors are added to the set of tracked pixels.

4. Algorithm assessment

The performance of the algorithm was assessed within the frame of the measurement campaign. Figure 13 shows the number of tracked landmarks over time. Most of the time, the number of points is lower than 20. The algorithm favors robustness of the solution by selecting the most appropriate set of points.

Figure 13- Number of tracked landmarks (versus time in s)

The other figures (14, 15 and 16) show the performance obtained with the algorithm over the period of the measurement campaign. Position and velocity errors are strongly reduced when the images are exploited. The MEMS-based IMU leads to important position and velocity drifts, which are related to the random walk noise (see table 1). Using the vision, the performance is increased. Nevertheless, this performance cannot reach the level of the tactical grade IMU. In the same way, it is shown (figure 16) that the vision can be used for sensor calibration. In the context of this study, after 30 s, the projected acceleration error is reduced from 1.5 m/s^2 to 0.2 m/s^2 when the vision is used.
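The ellipsoid gating test used above for outlier pixel exclusion can be sketched for the 2D pixel case; the 2x2 innovation covariance S stands in for H_k P_k^- H_k^T + R_k, and its values here are hypothetical.

```python
def inside_gate(dx, dy, s, gate=3.0):
    """Check [dx dy] S^-1 [dx dy]^T <= gate^2 for a 2x2 innovation covariance S.

    (dx, dy) is the pixel offset from the predicted position P_hat_k^M[k];
    pixels failing the test fall outside the ellipse and are rejected as outliers.
    """
    a, b, c, d = s[0][0], s[0][1], s[1][0], s[1][1]
    det = a * d - b * c
    # Mahalanobis distance squared via the explicit 2x2 inverse
    m2 = (d * dx * dx - (b + c) * dx * dy + a * dy * dy) / det
    return m2 <= gate * gate

S = [[4.0, 0.0], [0.0, 1.0]]      # hypothetical innovation covariance in pixels^2
print(inside_gate(3.0, 0.0, S))   # m2 = 9/4, inside the 3-sigma ellipse
print(inside_gate(12.0, 0.0, S))  # m2 = 36, rejected as an outlier
```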

Figure 14- Vision/INS integration: positioning error (in m, versus time in s; INS MEMS, IMAR-FSAS, and INS MEMS + vision; satellite outage)

Figure 15- Vision/INS integration: velocity error (in m/s, versus time in s; INS MEMS, IMAR-FSAS, and INS MEMS + vision)

Figure 16- Vision/INS integration: acceleration error (in m/s^2, versus time in s; INS MEMS, IMAR-FSAS, and INS MEMS + vision)

VI- MULTI-SENSOR DATA FUSION

In the previous sections, two systems were analyzed: a GNSS/INS deeply integrated system and a Vision/INS integrated system. The first one exploits the GNSS signal as measurement. It has been designed for tracking low C/N0 signals. The second one performs image processing for tracking points of interest in the field of view in order to improve the INS solution by calibrating the inertial sensors. In this section, we propose to fuse these two distributed solutions to take advantage of their complementary properties. The architecture of the global system is presented in figure 18. A very simple strategy is adopted to manage this system. The two estimates are conducted independently. For each estimator, measurement outliers are detected and rejected. Concerning the vision, outlier pixels are excluded over two stages (see the previous section). First, pixels outside the bounding ellipse are removed. Then, the pixel descriptors are analyzed. Only pixels with a correct descriptor are selected.

Figure 17- Pixel selection (relevant pixels, selected pixel, Mahalanobis distance)

Concerning the GNSS tracking modules, the impact of a low C/N0 ratio is reduced by increasing the estimated pseudorange variance as follows:

    sigma_tau^2(k) <- sigma_tau^2(k) + 10 T_chip^2 / (C/N0)   (in s^2)

This degradation impacts the GNSS/INS estimates by reinforcing the weight of the IMU measurements.

Figure 18- Sensors fusion (the GNSS/INS estimator processes the pseudoranges tau_m(k) and the Vision/INS estimator processes the pixel coordinates {P_m^M(k)}; each performs state propagation and measurement update, providing (theta_k^(1), P_k^(1)) and (theta_k^(2), P_k^(2)) to the fusion block, which outputs (theta_k^(Fusion), P_k^(Fusion)))

Since the outliers are processed inside each estimator, their outputs are merged under the assumption of consistent local estimates. Indeed, the fusion is performed to provide the optimized estimate [8]:

    theta_hat_fusion = Sum_i a_i theta_hat_i   with   Sum_i a_i = 1
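For scalar states with independent, consistent local estimates, choosing the weights a_i proportional to the inverse of each estimate variance minimizes the fused error power. The sketch below is this scalar special case only; the system described here fuses full state vectors with covariance matrices, and the numeric values are hypothetical.

```python
def fuse(estimates, variances):
    """Weighted fusion theta = sum a_i theta_i with sum a_i = 1 and a_i ~ 1/P_i.

    Inverse-variance weighting minimises the fused error power for independent
    local estimates (scalar sketch of the federated fusion step).
    """
    inv = [1.0 / p for p in variances]
    total = sum(inv)
    a = [w / total for w in inv]
    theta = sum(ai * ti for ai, ti in zip(a, estimates))
    p_fused = 1.0 / total           # fused variance is never worse than the best input
    return theta, p_fused

# GNSS/INS estimate (variance 1.0) versus Vision/INS estimate (variance 4.0)
theta, p = fuse([10.0, 12.0], [1.0, 4.0])
print(round(theta, 2), p)   # the fused estimate leans toward the better sensor
```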

The weights are computed to minimize the power J of the error, which depends on the covariances of the estimates:

J = E{(θ̂_fusion − θ)²}

Up to now, two simple scenarios were considered for the simulations: an open GNSS environment and an indoor environment.

In the open environment, the sensor fusion favors the GNSS/INS system. This system is used to calibrate the vision unit; the calibration consists of landmark selection and landmark description.

In the indoor environment, the GNSS/INS solution is highly degraded. The global system takes advantage of the vision measurement, which allows us to achieve IMU calibration and INS correction. That is a clear benefit for the GNSS tracking loops.

VII- CONCLUSION

This study shows the interest of vision for improving the performance of a MEMS-based INS in the context of a deeply integrated vision-aided/GNSS navigation system. Exploiting vision allows us to provide an IMU whose performance is close to that obtained with a tactical-grade IMU. This research demonstrates that many applications could take advantage of low-cost systems for aiding a GNSS receiver in degraded environments, consequently improving the availability and continuity of service of a multisensor navigation system.

It should be noted that during a prolonged outage of GPS signals, the position-attitude estimation by the INS sensors will degrade over time if the vision information is not sufficient to update the calibration parameters of the gyros and accelerometers. This scenario can be encountered in indoor rooms with few features that may be exploited as landmarks by the vision system. The developed system also reaches its limitations in very high dynamic applications, because the landmark tracking algorithm is not able to handle this challenging case. Consequently, mechanisms should be considered to determine when the quality of the GPS signals, vision images, and INS solutions is not acceptable for use in the local filters and/or the global navigation solution.

Our future work will explore several novel aspects. First, a longer measurement campaign will be performed by considering a realistic MEMS-based IMU; this IMU is being designed in our group to address the issue of UAV navigation. Another aspect will involve fusion techniques addressing scenarios more complex than the open environment and the indoor environment.

ACKNOWLEDGMENTS

This work was supported by DGA (the French Defense Agency) in the context of the REI Project "Multi-Sensor Data Fusion for GNSS Positioning".

The authors would like to thank our colleague Jean-Philipe RANCE for his assistance with the experimental setup. We also thank our collaborators at the M3Systems Company.

REFERENCES

[1] S. Alban, D. M. Akos, S. M. Rock and D. Gebre-Egziabher, "Performance Analysis and Architectures for INS-aided GPS Tracking Loops," in Proceedings of the ION NTM 2003, 2003.

[2] V. Barreau, B. Priot, V. Calmettes and M. Sahmoudi, "A New Approach for Deep Integration of GNSS and Vision-Aided MEMS IMU," in Proceedings of the ION GNSS 2010, Portland, OR, USA, Sept. 2010.

[3] H. Bay, A. Ess, T. Tuytelaars and L. Van Gool, "SURF: Speeded Up Robust Features," ETH Zurich, Katholieke Universiteit Leuven.

[4] J. Besser et al., "TruNav: A Low-Cost Guidance/Navigation Unit Integrating a SAASM-Based GPS and MEMS IMU in a Deeply Coupled Mechanization," in Proceedings of the ION GPS 2002, Portland, OR, September 2002.

[5] N. A. Carlson, "Federated Filter for Distributed Navigation and Tracking Applications," in Proceedings of the ION 58th AM, pp. 340-353, 2002.

[6] E. Dill and M. Uijt de Haag, "Integration of 3D and 2D Imaging Data for Assured Navigation in Unknown Indoor Environments," in Proceedings of the IEEE/ION PLANS 2010, Indian Wells, CA, USA, May 2010.

[7] P. D. Groves, Principles of GNSS, Inertial, and Multisensor Integrated Navigation Systems, Artech House, London, 2008.

[8] N.-V. Nguyen, G. Shevlyakov and V. I. Shin, "A Robust Two-Stage Multisensory Fusion in Contaminated Gaussian Channel Noise," in Proceedings of the 3rd International Conference on Sensing Technology, Tainan, Taiwan, Nov. 30-Dec. 3, 2008.

[9] J. Sola, "Consistency of the Monocular EKF-SLAM Algorithm for 3 Different Landmark Parameterizations," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2010), Anchorage, Alaska, May 2010.

[10] A. Soloviev, D. Eaton, M. Uijt de Haag and Z. Zhu, "Integration of Ladar Vision and Inertial Data for GNSS Denied Navigation," in Proceedings of the ION GNSS 2009, Savannah, Georgia, USA, Sept. 2009.

[11] M. Veth, "Fusion of Imaging and Inertial Sensors for Navigation," Ph.D. dissertation, Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio, United States, 2006.

[12] P. A. Viola and M. J. Jones, "Rapid Object Detection Using a Boosted Cascade of Simple Features," in Proceedings of the CVPR (1), pp. 511-518, 2001.

[13] S. Winkler, H.-W. Schulz, M. Buschmann, T. Kordes and P. Vorsmann, "Improving Low-Cost GPS/MEMS-based INS Integration for Autonomous MAV Navigation by Visual Aiding," in Proceedings of the ION ITM 2004, Long Beach, CA, Sept. 2004.
