The received signal is the sum of the contributions of the visible satellites and of the noise component w_n of the GNSS signal:

    s_n = Σ_k u_n^k + w_n

where u_n^k = C^k c(n − τ_n^k) exp(j φ_n^k).

The code delay of the kth satellite at epoch n is

    τ_n^k = √((x_n − x_n^k)² + (y_n − y_n^k)² + (z_n − z_n^k)²) / c + b_n + n^k

and the carrier phase is

    φ_n^k = φ_0^k + 2π (f_L1 Ts / c) Σ_{m=0..n} (V_m − V_m^k)

where:
- [x_n y_n z_n]^T and [x_n^k y_n^k z_n^k]^T represent, respectively, the platform position and the kth satellite position at epoch n,
- V_n − V_n^k represents the platform-to-kth-satellite relative velocity at epoch n,
- b_n is the receiver clock bias and n^k is a random value which models the pseudorange error.

The non-predictable part of the receiver clock bias is modelled as

    b_n^p(t) = ∫_{t0}^{t} n_c(u) du

In our model, only the non-predictable error is considered. An ultra-stable Oven Controlled Crystal Oscillator is assumed to be used here. The evolution over time of the non-predictable error of this oscillator is represented in figure 3.

Figure 3- Non-predictable receiver clock error (of the order of 10 m over 10 mn)

The global sensor fusion is performed using a combination of distributed local estimates: GNSS/INS and Vision/INS.

Table 1- IMU sensor noises
                                     Gyroscopes       Accelerometers
    Random Walk driving noise PSD    2e-2 °/s²/√Hz    5e-2 m/s³/√Hz

3. Image sensor

Images were acquired over the path of a pedestrian. The acquisition rate was fixed to 10 Hz. A calibration was performed in order to determine the focal length of the camera lens. The camera was used for tracking some singular points included in the image, as shown in figure 4.

IV- GNSS/INS DEEP INTEGRATION

1. GNSS signals tracking

The receiver designed in the frame of this study is based on an extended Kalman filter. This estimator exploits, during the state propagation step, the knowledge of the vehicle acceleration in the satellite-receiver line of sight (LOS). It uses, for updating the a priori estimate, measurements obtained from the outputs of a set of correlators. This module is described in [2]. Its architecture is shown in figure 5.
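As a quick numerical sketch of the code-delay and carrier-phase models above (the constants and function names are ours, not the paper's):

```python
import numpy as np

C_LIGHT = 299_792_458.0   # speed of light, m/s
F_L1 = 1_575.42e6         # GPS L1 carrier frequency, Hz

def code_delay(p_platform, p_sat, clock_bias_s, noise_s=0.0):
    """tau_n^k: geometric range divided by c, plus the receiver clock
    bias b_n and the random pseudorange error n^k (all in seconds)."""
    rng = np.linalg.norm(np.asarray(p_platform, float) - np.asarray(p_sat, float))
    return rng / C_LIGHT + clock_bias_s + noise_s

def carrier_phase(phi_0, v_rel_history, Ts):
    """phi_n^k = phi_0^k + 2*pi*(f_L1*Ts/c) * sum_m (V_m - V_m^k),
    with v_rel_history the LOS relative velocities up to epoch n."""
    return phi_0 + 2.0 * np.pi * (F_L1 * Ts / C_LIGHT) * float(np.sum(v_rel_history))

# Example: a satellite 20 000 km away and a 1 microsecond clock bias
tau = code_delay([0.0, 0.0, 0.0], [20_000e3, 0.0, 0.0], clock_bias_s=1e-6)
```

The delay is dominated by the geometric range term; the clock bias b_n enters additively, which is why its non-predictable part must be modelled separately.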
2. GNSS/INS integration

Figure 10- GNSS/INS integration: LOS acceleration error over time (MEMS IMU and IMAR-FSAS IMU)

As a corollary of these results, we can state that the level of performance of a MEMS-based INS/GNSS system can be improved by using additional sensors to calibrate the IMU sensors.

V- GNSS/Vision INTEGRATION

In this section an image-aided inertial navigation system is considered. Image processing consists of extracting landmarks in the image and tracking these landmarks in order to update the INS error and sensor bias estimates. The system is described in [2], where simulated images are processed. In the frame of this paper, the processed images are provided by the camera mounted on the platform. This processing consists of several stages:
- landmark detection and description,
- landmark matching,
- use of landmarks for updating the INS solution and for calibrating the IMU sensors.

We present in this paper the operations related to the algorithm which are not analyzed in [2].

1. Interest point detection

The approximated determinant of the Hessian represents the blob response in the image at location x. These responses are stored in a blob response map over different scales. The detected interest points correspond to the maxima of the map.

2. Interest point description

Under the assumption of slow motion of the vehicle, some algorithms perform pixel detection and matching by searching in the new image, for each tracked pixel, a point in its vicinity. This method is not very robust. In the frame of this application, a descriptor is defined in order to compare the regions of the detected points and then to eliminate outliers. Indeed, in order to characterise each detected point, we used the descriptor described in [3]. This descriptor does not analyse the distribution of the gradient, as other algorithms do, but the first-order Haar wavelet responses in the x and y directions. This processing is performed on integral images. Compared to the algorithm described in [3], the INS attitude and position can be used here to compute the interest point orientation. It provides, for each pixel, a descriptor vector of size 64.

3. Vision/INS integration

The algorithm is presented and analyzed in [2]. It is based on an extended Kalman filter, which is used to estimate the INS position, velocity and attitude errors, the IMU sensor biases, and the landmark parameters exploited by the measurement process, i.e. the direction vector of the initial optical ray of point n in the navigation frame and the initial inverse distance of point n. Tracked landmarks are used in the EKF for updating the estimates. The architecture of this system is shown in figure 11, where PmM(k) represents the pixel coordinates of the mth pixel.

Figure 11- GNSS/INS integration (inertial sensors feeding the INS position, velocity and attitude)

When the number of tracked pixels is lower than 20, new pixels are acquired. Pixel selection is an important issue: this selection eliminates pixels in the vicinity of the currently tracked points. New pixels and their descriptors are added to the set of tracked pixels.
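The pixel top-up rule described above can be sketched as follows. Only the budget of 20 tracked pixels comes from the text; the exclusion radius and all names are illustrative assumptions:

```python
import math

TARGET_COUNT = 20       # tracked-pixel budget used in the paper
MIN_SEPARATION_PX = 15.0  # exclusion radius in pixels (assumed value)

def top_up_tracked_pixels(tracked, candidates):
    """Add candidate interest points that are not in the vicinity of
    currently tracked points, until TARGET_COUNT pixels are tracked."""
    selected = list(tracked)
    for cand in candidates:
        if len(selected) >= TARGET_COUNT:
            break
        # keep a candidate only if it is far from every selected point
        if all(math.dist(cand, pt) >= MIN_SEPARATION_PX for pt in selected):
            selected.append(cand)
    return selected
```

Rejecting candidates close to already-tracked points spreads the landmarks over the image, which is what makes the selection robust.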
The measurement update is based on image processing, following the approach described in [9]. As explained previously, the algorithm relies on pixel tracking. A set of twenty pixels is considered over the measurement campaign. The innovation, which represents the measurement error, is obtained over several steps. First, interest points are detected over the whole image using a very basic Hessian-matrix approximation. Then, a search area is defined for each tracked pixel k using the state vector and the error covariance matrix Pk⁻ obtained after the system-propagation phase. This area is centred on the predicted pixel coordinates P̂kM[k], and a detected point is associated with pixel k when its innovation [x y] satisfies the Mahalanobis distance test

    [x y] (Hk Pk⁻ HkT + Rk)⁻¹ [x y]T < 3²

4. Algorithm assessment

Performances of the algorithm were assessed within the frame of the measurement campaign. Figure 13 shows the number of tracked landmarks over time; most of the time, the number of points is lower than 20. The algorithm favours robustness of the solution by selecting the most appropriate set of points.

Figure 13- Number of tracked landmarks over time

The acceleration error is reduced to about 0.2 m/s² when the vision is used.

Figure 15- Vision/INS integration - Velocity error

Figure 17- Vision/INS integration - Velocity error
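The Mahalanobis gating test used during the measurement update can be sketched as below; the function name and the matrix arguments are ours, while the 3-sigma threshold follows the criterion in the text:

```python
import numpy as np

def gate_measurement(nu, H, P_minus, R, threshold=3.0):
    """Accept the innovation nu = [x y] when
    nu (H P- H^T + R)^-1 nu^T < threshold^2."""
    S = H @ P_minus @ H.T + R          # innovation covariance
    d2 = float(nu @ np.linalg.inv(S) @ nu)
    return d2 < threshold ** 2
```

Measurements whose innovation falls outside the gate are treated as outliers and discarded rather than fed to the EKF.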
Concerning the GNSS tracking modules, the impact of a low C/N0 ratio is reduced by increasing the estimated pseudorange variance as follows:

    στ²(k) = στ²(k) + 10 Tchip² / (C/N0)    (in s²)

This degradation impacts the GNSS/INS estimates by reinforcing the weight of the IMU measurements.

Figure 16- Vision/INS integration - Acceleration error (INS MEMS, IMAR-FSAS, INS MEMS + vision)

[Figure: multisensor fusion architecture - the GNSS/INS estimator (state propagation and measurement update driven by τm(k)) and the Vision/INS estimator (driven by the pixel measurements PmM(k)) produce the local estimates (θk(1), Pk(1)) and (θk(2), Pk(2)), which are merged by the fusion module into (θk(Fusion), Pk(Fusion))]
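The variance-inflation rule above can be sketched as follows; the function and argument names are ours, and C/N0 is expected in linear units (Hz), not dB-Hz:

```python
def inflated_code_variance(sigma_tau_sq, t_chip, c_n0_linear):
    """sigma_tau^2(k) <- sigma_tau^2(k) + 10 * Tchip^2 / (C/N0), in s^2."""
    return sigma_tau_sq + 10.0 * t_chip ** 2 / c_n0_linear

# Example: GPS C/A chip duration 1/1.023e6 s at C/N0 = 30 dB-Hz (= 1000 Hz linear)
sigma2 = inflated_code_variance(0.0, 1.0 / 1.023e6, 10 ** (30 / 10))
```

Inflating the pseudorange variance at low C/N0 automatically shifts the Kalman gain toward the IMU measurements, which is the behaviour described in the text.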
VI- MULTISENSORS DATA FUSION

In the previous sections two systems were analyzed: a GNSS/INS deeply integrated system and a Vision/INS integrated system. The first one exploits the GNSS signal as a measurement source.

Since outliers are processed inside each estimator, their outputs are merged under the assumption of consistent local estimates. Fusion is then performed to provide the optimized estimate [8]:

    θ̂_fusion = Σi ai θ̂i    with    Σi ai = 1

where the weights ai minimize the cost J = E{(θ_fusion − θ)²}.

Up to now, two simple scenarios were considered for the simulations: an open GNSS environment and an indoor environment.

In an open environment, the sensor fusion favors the GNSS/INS system. This system is used to calibrate the vision unit; this calibration consists of landmark selection and landmark description.

In an indoor environment, GNSS/INS is highly degraded. The global system takes advantage of the vision measurements, which allow us to achieve IMU calibration and INS correction. That is a clear benefit for the GNSS tracking loops.

VII- CONCLUSION

This study shows the interest of vision to improve the performance of a MEMS-based INS in the context of a deeply integrated vision-aided/GNSS navigation system. Exploiting vision allows us to provide an IMU whose performance is close to that obtained with a tactical-grade IMU. This research demonstrates that many applications could take advantage of low-cost systems for aiding a GNSS receiver in degraded environments, and consequently improve the availability and the continuity of service of a multisensor navigation system.

It should be noted that during a prolonged outage of GPS signals, the position-attitude estimation by the INS sensors will degrade over time if the vision information is not sufficient to update the calibration parameters of the gyros and accelerometers. This scenario can be encountered in indoor rooms with few features that may be exploited as landmarks by the vision system. The developed system also reaches its limitations in very high dynamics applications, because the landmark tracking algorithm cannot handle this challenging case. Consequently, a mechanism should be considered to determine when the quality of the GPS signals, vision images, and INS solutions is not acceptable for use in the local filters and/or the global navigation solution.

Our future work will explore several novel aspects. First, a longer measurement campaign will be performed by considering a realistic MEMS-based IMU. This IMU is being designed in our group to address the issue of UAV navigation. Another aspect will involve fusion techniques, in order to address more complex scenarios than the open environment and the indoor environment.

ACKNOWLEDGMENTS

This work was supported by DGA (the French Defense Agency) in the context of the REI Project "Multi-Sensor Data Fusion for GNSS Positioning". The authors would like to thank our colleague Jean-Philipe RANCE for his assistance with the experimental setup. We also thank our collaborators at the M3 Systems company.

REFERENCES

[1] S. Alban, D. M. Akos, S. M. Rock and D. Gebre-Egziabher, "Performance Analysis and Architectures for INS-aided GPS Tracking Loops," ION NTM 2003, 2003.

[2] V. Barreau, B. Priot, V. Calmettes and M. Sahmoudi, "A New Approach for Deep Integration of GNSS and Vision-Aided MEMS IMU," in Proceedings of the ION GNSS 2010, Portland, OR, USA, Sept. 2010.

[3] Herbert Bay, Andreas Ess, Tinne Tuytelaars and Luc Van Gool, "SURF: Speeded Up Robust Features," ETH Zurich, Katholieke Universiteit Leuven.

[4] J. Besser et al., "TruNav: A Low-Cost Guidance/Navigation Unit Integrating a SAASM-Based GPS and MEMS IMU in a Deeply Coupled Mechanization," in Proceedings of the ION GPS 2002, Portland, OR, September 2002.

[5] Neal A. Carlson, "Federated Filter for Distributed Navigation and Tracking Applications," in Proceedings of the ION 58th AM, pp. 340-353, 2002.

[6] E. Dill and M. Uijt de Haag, "Integration of 3D and 2D Imaging Data for Assured Navigation in Unknown Indoor Environments," in Proceedings of the IEEE/ION PLANS 2010, Indian Wells, CA, USA, May 2010.

[7] Paul D. Groves, Principles of GNSS, Inertial, and Multisensor Integrated Navigation Systems, Artech House, London, 2008.

[8] Nga-Viet Nguyen, Georgy Shevlyakov and Vladimir I. Shin, "A robust two-stage multisensory fusion in contaminated Gaussian channel noise," in Proceedings of the 3rd International Conference on Sensing Technology, Tainan, Taiwan, Nov. 30-Dec. 3, 2008.

[9] Joan Sola, "Consistency of the monocular EKF-SLAM algorithm for 3 different landmark