
Proceedings, 18th IFAC Symposium on System Identification
July 9-11, 2018. Stockholm, Sweden

Available online at www.sciencedirect.com

ScienceDirect

IFAC PapersOnLine 51-15 (2018) 43–48
Experimentally Validated Extended Kalman Filter for UAV State Estimation Using Low-Cost Sensors

S.P.H. Driessen ∗ N.H.J. Janssen ∗ L. Wang ∗∗ J.L. Palmer ∗∗∗ H. Nijmeijer ∗
∗ Eindhoven University of Technology, Den Dolech 2, 5600 MB Eindhoven, The Netherlands (e-mail: s.p.h.driessen@student.tue.nl)
∗∗ RMIT University, 124 La Trobe Street, Melbourne VIC 3000, Australia (e-mail: liuping.wang@rmit.edu.au)
∗∗∗ DST Group, 506 Lorimer Street, Port Melbourne VIC 3207, Australia (e-mail: jennifer.palmer@dst.defence.gov.au)
Abstract: Visually based velocity and position estimations are often used to reduce or remove the dependency of an unmanned aerial vehicle (UAV) on global navigation satellite system signals, which may be unreliable in urban canyons and are unavailable indoors. In this paper, a sensor-fusion algorithm based on an extended Kalman filter is developed for the velocity, position, and attitude estimation of a UAV using low-cost sensors. In particular, an inertial measurement unit (IMU) and an optical-flow sensor that includes a sonar module and an additional gyroscope are used. The algorithm is shown experimentally to be able to handle measurements with different sampling rates and missing data, caused by the indoor, low-light conditions. State estimations are compared to a ground-truth pose history obtained with a motion-capture system to show the influence of the optical-flow and sonar measurements on its performance. Additionally, the experimental results demonstrate that the velocity and attitude can be estimated without drift, despite the magnetic distortions typical of indoor environments.

© 2018, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.
Keywords: Unmanned aerial vehicle, Sensor fusion, Extended Kalman filter, Optical flow, Visual-inertial state estimation, Missing data, Multi-rate sampled data.
1. INTRODUCTION

Unmanned aerial vehicles (UAVs) can be used for a wide range of applications, such as rescue (Półka et al., 2017), visual inspection (Omari et al., 2015), and manufacturing (Khosiawan and Nielsen, 2016). Knowledge of the UAV's velocity, position, and attitude is essential in these applications. On-board sensors are generally required to be inexpensive, compact, light, and low-powered. An inertial measurement unit (IMU) satisfies these requirements and allows for tri-axial measurement of the acceleration and angular velocity. Often the IMU also features a magnetometer. As a result of sensor noise and a slowly varying sensor bias, combined with vibrations generated by the UAV's motors, direct integration of acceleration and angular velocity leads to poor velocity and attitude estimations that tend to drift over time. Sensor fusion can greatly increase the estimation performance. In this context, accelerometer and magnetometer measurements can be combined to retrieve attitude information. For translational velocity and position, global navigation satellite system (GNSS) signals can provide a reference. However, GNSS signals are not always reliable and are typically unavailable indoors.

Recently developed camera-based vision sensors are light and inexpensive and can also be used to estimate velocity (Mebarki et al., 2013) and position (Weiss and Siegwart, 2011). Within these sensors, sequential images are used to calculate the optical flow. To relate optical-flow measurements to the camera's translational velocity, the scene depth must be known. It may be measured directly using, for example, sonar or LiDAR sensors, or estimated using sensor fusion. The latter approach is taken in Bleser and Hendeby (2010), where an optical-flow-based extended Kalman filter (EKF) is developed and validated through simulation. A real-time implementation of an optical-flow-based EKF with a similar approach can be found in Weiss et al. (2012). LiDAR measurements are used in, for example, Yun et al. (2016) and Goppert et al. (2017).

In this paper, the low-cost PX4Flow optical-flow sensor (OFS) is used. This is an open-source software and hardware platform that uses a CMOS vision sensor, a gyroscope, and a sonar range finder (Honegger et al., 2013). The sonar module can be replaced with a LiDAR sensor to increase the data output rate and measurement quality. However, LiDAR sensors are considerably more expensive. Therefore a different approach is taken in this paper. Sonar measurements are not used directly to scale the flow velocity to the camera's translational velocity; instead, a state representing the UAV height, which is updated with sonar and IMU measurements, is used. This results in a more robust translational-velocity estimate. An EKF-based framework, modified to handle multi-rate sampled measurements and missing data, is used to perform this update and to estimate the UAV's velocity, position, and attitude. The resulting fusion algorithm is validated experimentally.
2. SENSOR MODELS

2.1 Notation

For simplicity, the IMU and OFS are assumed to be aligned and to produce measurements with respect to the same right-handed body frame, which is attached to the UAV. The right-handed world frame is fixed to the Earth, with the x-axis aligned with the local magnetic north and the z-axis pointing opposite to gravity. Superscripts b and w are used to indicate that a variable is expressed in the body or the world frame, respectively, while subscripts x, y, and z refer to the component of a variable along each axis. The rotation matrix representing the rotation of the world frame with respect to the body frame is denoted as $R_w^b(\Phi^w)$, with attitude $\Phi^w = [\phi\ \theta\ \psi]^T$, where φ, θ, and ψ are the roll, pitch, and yaw of the UAV, respectively, measured with respect to the x-, y-, and z-axes of the world frame. To transform the angular velocity from the world to the body frame, the matrix $T_w^b(\Phi^w)$ is used. However, if the IMU and OFS measure in different frames, due to, for example, misalignment, the EKF can be adapted to account for the difference in their reference frames. Even if the relative position and rotation are not known, the EKF can be used to estimate those quantities (Weiss and Siegwart, 2011).
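For concreteness, the two matrices can be written out in code. The sketch below is a minimal NumPy illustration assuming the common ZYX (yaw–pitch–roll) Euler convention; the paper does not state its convention explicitly, so the exact element layout, like the function names, is our assumption rather than the authors' definition.

```python
import numpy as np

def R_wb(phi, theta, psi):
    """World-to-body rotation R_w^b for an assumed ZYX (yaw-pitch-roll)
    Euler convention: yaw about z, then pitch, then roll."""
    cph, sph = np.cos(phi), np.sin(phi)
    cth, sth = np.cos(theta), np.sin(theta)
    cps, sps = np.cos(psi), np.sin(psi)
    Rx = np.array([[1, 0, 0], [0, cph, sph], [0, -sph, cph]])
    Ry = np.array([[cth, 0, -sth], [0, 1, 0], [sth, 0, cth]])
    Rz = np.array([[cps, sps, 0], [-sps, cps, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz  # maps world-frame vectors into the body frame

def T_wb(phi, theta):
    """Maps the Euler-angle rates (the paper's omega^w) to body-frame
    angular rates, under the same assumed convention."""
    cph, sph = np.cos(phi), np.sin(phi)
    cth, sth = np.cos(theta), np.sin(theta)
    return np.array([[1, 0, -sth],
                     [0, cph, sph * cth],
                     [0, -sph, cph * cth]])
```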
2.2 Sensor equations

The accelerometer, magnetometer, and gyroscopes measure in three perpendicular directions. These measurements are modelled with bias and noise terms, b and µ, respectively, which yields (Sabatini, 2011):

$$\begin{aligned} \tilde{a}^b &= R_w^b(\Phi^w)(a^w + g^w) + b_a + \mu_a,\\ \tilde{\omega}_I^b &= T_w^b(\Phi^w)\,\omega^w + b_{\omega_I} + \mu_{\omega_I},\\ \tilde{m}^b &= R_w^b(\Phi^w)\,m^w + b_m + \mu_m,\\ \tilde{\omega}_O^b &= T_w^b(\Phi^w)\,\omega^w + b_{\omega_O} + \mu_{\omega_O}, \end{aligned} \tag{1}$$

where a tilde (˜) indicates the measurement under consideration. Variables $a = [a_x\ a_y\ a_z]^T$, $\omega = [\omega_x\ \omega_y\ \omega_z]^T$, and $m = [m_x\ m_y\ m_z]^T$ represent the acceleration (m/s²), angular velocity (rad/s), and magnetic field (µT), respectively. Subscripts I and O distinguish between the angular velocity measured by the IMU and that from the OFS, respectively. The world's gravitational vector is given by $g^w = [0\ 0\ 9.81]^T$ (m/s²). Scale factors are omitted (i.e., chosen to be equal to 1), as they have a limited influence compared to the bias drifts in these sensors (Sabatini, 2011).

The OFS measures the optical flow, ρ (pixel), in the x- and y-directions of the body frame, between two consecutive images. This flow is induced by the translational and angular velocities of the UAV, $v = [v_x\ v_y\ v_z]^T$ (m/s) and ω, respectively (Honegger et al., 2013). To relate the optical flow to the UAV's velocity, the time between two consecutive images, τ (s), and the focal length of the sensor's optics, f (pixel), are needed. Furthermore, the scene depth is needed. It is related to the UAV's position, $p = [p_x\ p_y\ p_z]^T$ (m), in the z-direction, along with the roll and pitch angles. The sonar measures the scene depth, $s_z^b$ (m). The optical flow and scene depth can be expressed as

$$\begin{aligned} \tilde{\rho}_x^b &= \left(-\frac{\cos(\phi)\cos(\theta)\,v_x^b}{p_z^w} + \omega_y^b\right)\tau f + \mu_{\rho_x},\\ \tilde{\rho}_y^b &= \left(-\frac{\cos(\phi)\cos(\theta)\,v_y^b}{p_z^w} + \omega_x^b\right)\tau f + \mu_{\rho_y},\\ \tilde{s}_z^b &= \frac{p_z^w}{\cos(\phi)\cos(\theta)} + \mu_{s_z}. \end{aligned} \tag{2}$$

Noise terms are modelled as $\mu_i \sim \mathcal{N}(0, \Sigma_i)$, with $i \in \{a, \omega_I, m, \rho_x, \rho_y, s_z, \omega_O\}$ and $\Sigma_i$ diagonal, where $\mathcal{N}(0, \Sigma_i)$ denotes a normal (Gaussian) distribution with a zero mean and a covariance of $\Sigma_i$. Bias terms are often modelled independently, following a random-walk process (Sabatini, 2011):

$$\dot{b}_j = \mu_{b_j}, \quad j \in \{a, \omega_I, m, \omega_O\}, \tag{3}$$

with $\mu_{b_j} \sim \mathcal{N}(0, \Sigma_{b_j})$ and $\Sigma_{b_j}$ diagonal. Because magnetometer bias drift is usually negligible, $b_m$ is primarily used to compensate for magnetic distortions. Note that the variables v and ω in the model for the pixel flow are expressed in the body frame. Transformation to the world frame can be done using the rotation matrices.
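As an illustration of the flow model, the noise-free part of (2) can be evaluated directly. The following sketch uses only quantities defined above; the function name and argument layout are ours, not the authors'.

```python
import numpy as np

def predict_flow_and_sonar(v_b, omega_b, p_z_w, phi, theta, tau, f):
    """Noise-free predictions of the OFS outputs, following (2).

    v_b     : body-frame translational velocity [vx, vy, vz] (m/s)
    omega_b : body-frame angular velocity [wx, wy, wz] (rad/s)
    p_z_w   : world-frame height above the ground plane (m)
    tau     : time between the two consecutive images (s)
    f       : focal length of the sensor's optics (pixel)
    """
    ct = np.cos(phi) * np.cos(theta)
    rho_x = (-ct * v_b[0] / p_z_w + omega_b[1]) * tau * f  # x-flow (pixel)
    rho_y = (-ct * v_b[1] / p_z_w + omega_b[0]) * tau * f  # y-flow (pixel)
    s_z = p_z_w / ct                                       # sonar scene depth (m)
    return rho_x, rho_y, s_z
```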
3. SENSOR-FUSION ALGORITHM

An EKF is used to estimate the UAV's position, attitude, and velocity. Therefore, the state vector is defined as

$$\zeta = \left[(p^w)^T\ (v^w)^T\ (\Phi^w)^T\ (\omega^w)^T\ b_a^T\ b_{\omega_I}^T\ b_{\omega_O}^T\ b_m^T\right]^T. \tag{4}$$

The biases in the sensors are included in the state space, so they can be estimated by the EKF algorithm, which consists of multiple steps.

3.1 Prediction step

An a-priori state prediction is made using a discrete-time process model with time step t (s). The accelerometer and gyroscope from the IMU are used to define the input as $u^T = [\tilde{a}^T\ \tilde{\omega}_I^T]$. This allows for a state vector without acceleration states and therefore reduces its size. Fewer noise covariances must be estimated, which simplifies the tuning of the EKF. Furthermore, it saves one measurement update, which reduces the size of the matrix inversions (Bleser and Stricker, 2009). Rewriting (1) and (3), the discrete equations governing the state are obtained:

$$\begin{aligned} p_{k+1}^w &= p_k^w + t\,v_k^w,\\ v_{k+1}^w &= v_k^w + t\left[R_b^w(\Phi_k^w)(\tilde{a}^b - b_{a,k} - \mu_a) - g^w\right],\\ \Phi_{k+1}^w &= \Phi_k^w + t\,\omega_k^w,\\ \omega_{k+1}^w &= T_b^w(\Phi_k^w)(\tilde{\omega}_I^b - b_{\omega_I,k} - \mu_{\omega_I}),\\ b_{a,k+1} &= b_{a,k} + t\,\mu_{b_a,k},\\ b_{\omega_I,k+1} &= b_{\omega_I,k} + t\,\mu_{b_{\omega_I},k},\\ b_{\omega_O,k+1} &= b_{\omega_O,k} + t\,\mu_{b_{\omega_O},k},\\ b_{m,k+1} &= b_{m,k} + t\,\mu_{b_m,k}; \end{aligned} \tag{5}$$

or, in short,

$$\zeta_{k+1} = f(\zeta_k, u_k, w_k). \tag{6}$$
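A minimal sketch of the process model (5), with the noise terms set to zero as in the a-priori prediction below, could look as follows. It reuses the R_wb and T_wb helpers from the notation sketch above; the state layout and names are our assumptions, not the authors' code.

```python
import numpy as np

# Assumed state layout following (4): [p_w, v_w, Phi_w, omega_w,
# b_a, b_wI, b_wO, b_m] -> 24 states in blocks of three.
P, V, PHI, OM, BA, BWI, BWO, BM = [slice(3 * i, 3 * i + 3) for i in range(8)]
G_W = np.array([0.0, 0.0, 9.81])

def f_predict(zeta, a_tilde, omega_I_tilde, dt):
    """Discrete process model (5) with the noise terms set to zero."""
    phi, theta, psi = zeta[PHI]
    z = zeta.copy()
    z[P] = zeta[P] + dt * zeta[V]
    # R_b^w = (R_w^b)^T rotates the bias-corrected accelerometer
    # reading into the world frame before gravity is removed.
    z[V] = zeta[V] + dt * (R_wb(phi, theta, psi).T @ (a_tilde - zeta[BA]) - G_W)
    z[PHI] = zeta[PHI] + dt * zeta[OM]
    # T_b^w = (T_w^b)^{-1}; invertible away from theta = +/-90 deg.
    z[OM] = np.linalg.inv(T_wb(phi, theta)) @ (omega_I_tilde - zeta[BWI])
    # The bias states follow the random walk (3); their mean is constant.
    return z
```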
The process noise, $w_k \sim \mathcal{N}(0, Q)$, where Q is the covariance matrix, given by $w_k^T = [\mu_a^T\ \mu_{\omega_I}^T\ \mu_{b_a}^T\ \mu_{b_{\omega_I}}^T\ \mu_{b_{\omega_O}}^T\ \mu_{b_m}^T]$, is set to zero in the prediction because it is unknown. This results in the a-priori state prediction:

$$\hat{\zeta}_{k+1}^- = f(\hat{\zeta}_k, u_k, 0), \tag{7}$$

where the caret (ˆ) denotes that the variable is estimated and the superscript "−" denotes a prediction. The a-priori covariance matrix estimation is defined as

$$P_{k+1}^- = F_k P_k F_k^T + W_k Q W_k^T, \tag{8}$$

where the matrices $F_k$ and $W_k$ are given by

$$F_k = \frac{\partial f}{\partial \zeta}(\hat{\zeta}_k, u_k, 0), \tag{9}$$

$$W_k = \frac{\partial f}{\partial w}(\hat{\zeta}_k, u_k, 0). \tag{10}$$
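The Jacobians (9) and (10) can be derived analytically; where that is tedious, a central finite-difference approximation is a common stand-in. The sketch below shows that substitute approach together with the covariance propagation (8); it is an illustration, not the authors' implementation.

```python
import numpy as np

def numerical_jacobian(func, x0, eps=1e-6):
    """Central-difference Jacobian of func at x0 (x0 must be float);
    a simple numerical stand-in for the analytic F_k and W_k."""
    y0 = func(x0)
    J = np.zeros((y0.size, x0.size))
    for i in range(x0.size):
        dx = np.zeros_like(x0)
        dx[i] = eps
        J[:, i] = (func(x0 + dx) - func(x0 - dx)) / (2 * eps)
    return J

def predict_covariance(P_k, F_k, W_k, Q):
    """A-priori covariance update (8)."""
    return F_k @ P_k @ F_k.T + W_k @ Q @ W_k.T

# Example use with the prediction sketch above:
# F_k = numerical_jacobian(lambda z: f_predict(z, a_tilde, w_tilde, dt),
#                          zeta_hat)
```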
3.2 Measurement model

Data from the IMU and the gyroscope included in the OFS are assumed to have the same sampling rate and to be available at each time step. The corresponding measurement model is taken as:

$$\eta_{\mathrm{IMU},k} = \begin{bmatrix} R_w^b(\hat{\Phi}_k^w)\,g^w + \hat{b}_{a,k} + \mu_{a,k}\\ R_w^b(\hat{\Phi}_k^w)\,m^w + \hat{b}_{m,k} + \mu_{m,k}\\ T_w^b(\hat{\Phi}_k^w)\,\hat{\omega}_k^w + \hat{b}_{\omega_I,k} + \mu_{\omega_I,k}\\ T_w^b(\hat{\Phi}_k^w)\,\hat{\omega}_k^w + \hat{b}_{\omega_O,k} + \mu_{\omega_O,k} \end{bmatrix}. \tag{11}$$

Note that the first line in (11) assumes that the linear accelerations are small compared to the gravitational acceleration (Strohmeier and Montenegro, 2017). The measurement models for the sonar and optical-flow measurements are, in correspondence with (2), given by

$$\eta_{\mathrm{sonar},k} = \frac{\hat{p}_z^w}{\cos(\hat{\phi})\cos(\hat{\theta})} + \mu_{s_z}, \tag{12}$$

$$\eta_{\mathrm{OFS},k} = \begin{bmatrix} \left(-\dfrac{\cos(\hat{\phi})\cos(\hat{\theta})\,\hat{v}_{x,k}^b}{\hat{p}_{z,k}^w} + \hat{\omega}_{y,k}^b\right)\tau_k f + \mu_{\rho_x}\\[2ex] \left(-\dfrac{\cos(\hat{\phi})\cos(\hat{\theta})\,\hat{v}_{y,k}^b}{\hat{p}_{z,k}^w} + \hat{\omega}_{x,k}^b\right)\tau_k f + \mu_{\rho_y} \end{bmatrix}. \tag{13}$$

The OFS does not output new flow data at a constant rate, due to its automatic exposure control (AEC). The flow quality, α ∈ [0, 255], is an output of the PX4Flow that depends on the number of pixels that match in two consecutive images. Sometimes this flow-data quality is equal to α = 0, e.g., in poor lighting conditions, in which case no flow could be calculated. Because not all sensors generate a usable output at every time step, the variables β ∈ {0, 1} and γ ∈ {0, 1} are introduced. If new flow values have become available since the last time step and the corresponding quality is larger than zero, β equals 1. If new sonar information has become available, γ equals 1. Otherwise both variables equal zero. Four measurement models can be distinguished:

$$\eta_k = \begin{cases} \eta_{\mathrm{IMU},k}, & \text{if } \beta = 0 \wedge \gamma = 0,\\[1ex] \begin{bmatrix} \eta_{\mathrm{OFS},k}\\ \eta_{\mathrm{IMU},k} \end{bmatrix}, & \text{if } \beta = 1 \wedge \gamma = 0,\\[2ex] \begin{bmatrix} \eta_{\mathrm{sonar},k}\\ \eta_{\mathrm{OFS},k}\\ \eta_{\mathrm{IMU},k} \end{bmatrix}, & \text{if } \beta = 1 \wedge \gamma = 1,\\[2.5ex] \begin{bmatrix} \eta_{\mathrm{sonar},k}\\ \eta_{\mathrm{IMU},k} \end{bmatrix}, & \text{if } \beta = 0 \wedge \gamma = 1; \end{cases} \tag{14}$$

or, in short,

$$\eta_k = h(\zeta_k^-, \nu_k), \tag{15}$$

where $\nu \sim \mathcal{N}(0, R)$ defines the zero-mean Gaussian measurement noise and R is the noise covariance matrix.
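The case distinction in (14) amounts to stacking whichever measurement blocks are fresh at time step k. A small sketch of that gating logic (the function name is ours):

```python
import numpy as np

def stack_measurement(eta_imu, eta_ofs, eta_sonar, beta, gamma):
    """Assemble the time-varying measurement vector of (14).

    beta  = 1 when fresh flow data with quality alpha > 0 are available;
    gamma = 1 when a fresh sonar reading is available.
    The sonar/OFS/IMU ordering follows the cases in (14).
    """
    parts = []
    if gamma == 1:
        parts.append(np.atleast_1d(eta_sonar))  # scalar sonar depth
    if beta == 1:
        parts.append(eta_ofs)                   # two flow values
    parts.append(eta_imu)                       # always available
    return np.concatenate(parts)
```

The corresponding rows of H and R must be selected with the same gating, so the filter dimensions stay consistent at every time step.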
3.3 Kalman gain

The Kalman gain, K, is computed using

$$K_{k+1} = P_{k+1}^- H_{k+1}^T \left(H_{k+1} P_{k+1}^- H_{k+1}^T + V_{k+1} R V_{k+1}^T\right)^{-1}, \tag{16}$$

where the matrix $H_{k+1}$ is defined as

$$H_{k+1} = \frac{\partial h}{\partial \zeta}(\hat{\zeta}_{k+1}^-, 0) \tag{17}$$

and the matrix $V_{k+1}$ is defined as

$$V_{k+1} = \frac{\partial h}{\partial \nu}(\hat{\zeta}_{k+1}^-, 0) = I, \tag{18}$$

where I is the identity matrix.
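A direct transcription of (16), using a linear solve instead of an explicit matrix inverse for numerical robustness, might read:

```python
import numpy as np

def kalman_gain(P_pred, H, R):
    """Kalman gain (16); V = I as in (18), so it drops out."""
    S = H @ P_pred @ H.T + R                   # innovation covariance
    # K = P H^T S^{-1}; solving S X = H P and transposing avoids inv(S).
    return np.linalg.solve(S, H @ P_pred).T
```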
3.4 Correction step

Similar to the measurement model, the measurement vector also varies with time. The IMU and gyroscope of the OFS have new measurements at each time step, with the corresponding vector:

$$\tilde{\eta}_{\mathrm{IMU},k} = \begin{bmatrix} \tilde{a}_k^b\\ \tilde{m}_k^b\\ \tilde{\omega}_{I,k}^b\\ \tilde{\omega}_{O,k}^b \end{bmatrix}. \tag{19}$$

A new OFS measurement results in two flow values:

$$\tilde{\eta}_{\mathrm{OFS},k} = \begin{bmatrix} \tilde{\rho}_{x,k}^b\\ \tilde{\rho}_{y,k}^b \end{bmatrix}, \tag{20}$$

while the measurement vector of the sonar is given by

$$\tilde{\eta}_{\mathrm{sonar},k} = \tilde{s}_{z,k}^b. \tag{21}$$

Because there are multiple possibilities at each time step, the total measurement vector is defined as

$$\tilde{\eta}_k = \begin{cases} \tilde{\eta}_{\mathrm{IMU},k}, & \text{if } \beta = 0 \wedge \gamma = 0,\\[1ex] \begin{bmatrix} \tilde{\eta}_{\mathrm{OFS},k}\\ \tilde{\eta}_{\mathrm{IMU},k} \end{bmatrix}, & \text{if } \beta = 1 \wedge \gamma = 0,\\[2ex] \begin{bmatrix} \tilde{\eta}_{\mathrm{sonar},k}\\ \tilde{\eta}_{\mathrm{OFS},k}\\ \tilde{\eta}_{\mathrm{IMU},k} \end{bmatrix}, & \text{if } \beta = 1 \wedge \gamma = 1,\\[2.5ex] \begin{bmatrix} \tilde{\eta}_{\mathrm{sonar},k}\\ \tilde{\eta}_{\mathrm{IMU},k} \end{bmatrix}, & \text{if } \beta = 0 \wedge \gamma = 1. \end{cases} \tag{22}$$

The corrected state vector and covariance matrix can now be calculated as

$$\hat{\zeta}_{k+1} = \hat{\zeta}_{k+1}^- + K_{k+1}\left(\tilde{\eta}_{k+1} - \eta_{k+1}(\zeta_{k+1}^-, 0)\right) \tag{23}$$

and

$$P_{k+1} = (I - K_{k+1} H_{k+1}) P_{k+1}^-. \tag{24}$$
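The correction step (23)–(24) then reduces to a few lines; note that K, H, and R must have been built for the active β/γ case of (22). A sketch under the same assumed names as above:

```python
import numpy as np

def correct(zeta_pred, P_pred, K, eta_meas, eta_model, H):
    """Measurement update: state correction (23) and covariance
    correction (24). eta_meas and eta_model are the stacked measured
    and predicted measurement vectors for the current beta/gamma case."""
    zeta = zeta_pred + K @ (eta_meas - eta_model)    # (23)
    P = (np.eye(P_pred.shape[0]) - K @ H) @ P_pred   # (24)
    return zeta, P
```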
4. EXPERIMENTAL VALIDATION

Experiments are carried out to validate and tune the EKF.

4.1 Experimental setup

A stand-alone sensor module featuring a PX4Flow, an Adafruit Precision NXP 9-DOF Breakout Board, and a Teensy 3.6 development board is attached to a hexacopter with a 3DR Pixhawk flight controller. All sensors are calibrated prior to the experiments. A photograph of the hexacopter is shown in Fig. 1.

Fig. 1. UAV with sensor module attached (the original photograph also marks the sensor module and the active LED markers used with the motion-capture system)

The UAV is flown under manual remote control in an indoor flight laboratory, while sensor outputs are written to a micro-SD card by the Teensy board. The flight time, and therefore the measurement duration, is limited by battery capacity to approximately 150 s. This dataset is post-processed by the EKF. The position and attitude used as ground truth are measured by an OptiTrack motion-capture system, which yields positional and attitudinal data with uncertainties of <1 mm and <0.02 rad, respectively. The angular and translational velocities are obtained by differentiation.

As described in Section 3.2, flow data are recorded with a variable frequency due to the AEC. Thus, the firmware of the PX4Flow is altered so that the time between the two images used for the optical-flow calculation is also recorded, for use in (13). Furthermore, the built-in bias correction of the gyroscope is turned off, as the correction has been taken into account in the EKF. The sampling rate of the gyroscope of the PX4Flow and the IMU is chosen to be 200 Hz. The sonar sensor has a range of 0.3–5 m, and its measurements are limited to a sampling rate of ∼10 Hz.

The initial states are estimated by leaving the UAV on the ground for a period of time. This automatically implies the initial states $v_0 = 0$ and $\omega_0 = 0$. The positions $p_{x,0}$ and $p_{y,0}$ are chosen to be zero, and the initial vertical position, $p_{z,0}$, is the measured distance from the sonar to the ground. The initial attitude of the UAV is calculated using (Munguia and Grau, 2011)

$$\begin{aligned} \phi_0 &= \arctan\left(\frac{\bar{a}_y^b}{\bar{a}_z^b}\right),\\ \theta_0 &= \arctan\left(-\frac{\bar{a}_x^b}{\bar{a}_y^b\,s(\phi_0) + \bar{a}_z^b\,c(\phi_0)}\right),\\ \psi_0 &= \arctan\left(\frac{-\bar{m}_y^b\,c(\phi_0) + \bar{m}_z^b\,s(\phi_0)}{\bar{m}_x^b\,c(\theta_0) + \bar{m}_y^b\,s(\phi_0)s(\theta_0) + \bar{m}_z^b\,c(\phi_0)s(\theta_0)}\right), \tag{25} \end{aligned}$$

where ā and m̄ are the average acceleration and magnetic field measured during the initialisation period, and c(·) = cos(·) and s(·) = sin(·). To derive estimates of the covariance matrices Q and R, the UAV's motors are turned on while the UAV remains on a level surface. During this experiment the UAV does not move, yet vibrations and disturbances that are also present during flight are introduced. All accelerations other than that due to gravity are zero, because the UAV is not moving. The standard deviations of the values obtained from these measurements are used to estimate Q and R.

The magnetic field in the experimental environment is not constant during flight, due to the metallic structure of the laboratory, which corrupts the magnetometer measurements. Therefore the covariance of the magnetometer bias is set to a large value to compensate.
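The initialisation (25) is straightforward to transcribe. The sketch below uses arctan2 rather than the paper's arctan so the correct quadrant is resolved; otherwise it follows (25) directly, and the function name is ours.

```python
import numpy as np

def initial_attitude(a_bar, m_bar):
    """Initial roll, pitch, and yaw from the averaged standstill
    accelerometer and magnetometer readings, following (25)."""
    phi0 = np.arctan2(a_bar[1], a_bar[2])
    theta0 = np.arctan2(-a_bar[0],
                        a_bar[1] * np.sin(phi0) + a_bar[2] * np.cos(phi0))
    num = -m_bar[1] * np.cos(phi0) + m_bar[2] * np.sin(phi0)
    den = (m_bar[0] * np.cos(theta0)
           + m_bar[1] * np.sin(phi0) * np.sin(theta0)
           + m_bar[2] * np.cos(phi0) * np.sin(theta0))
    psi0 = np.arctan2(num, den)
    return phi0, theta0, psi0
```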
4.2 Results

The gathered dataset is fused in post-processing using the described EKF, and the state estimates are compared to the motion-capture results. The first 40 s of the measurement are used for sensor calibration and are not otherwise taken into account. The EKF and motion-capture datasets are synchronised by hand using data collected during the calibration period. The attitude estimation is shown in Fig. 2, where it can be seen to be estimated without drift. The magnetometer bias (not shown) has the same trends as the position of the UAV and is able to compensate for position-dependent magnetic distortions.

Fig. 2. Estimated attitude compared to ground truth

In Fig. 3, the translational-velocity estimates are shown, along with the flow quality. Between 40 and 53 s, reliable flow values are unavailable, which means that the velocity is estimated by integrating the accelerometer data. This clearly leads to errors in the horizontal translational-velocity estimates. When the OFS supplies reliable data, this error is corrected. The OFS improves the velocity estimate and compensates for the drift that is normally present as a result of integrating the IMU data. As long as the OFS outputs data with a sufficiently high rate and quality, the velocity error remains relatively low.

Fig. 3. Estimated velocity compared to ground truth
The results for the position estimate are shown in Fig. 4. Drift as a result of the bad flow quality at 40–50 s is clearly visible in the y-position. When the flow data are available, the drift becomes significantly smaller. Drift in the z-direction also occurs during the first few seconds, when the actual height of the UAV is less than 0.3 m and thus not within the range of the sonar. As soon as the UAV's height is within the sonar range, the position estimate in the z-direction is corrected and drift does not occur. The other two components of position do not have absolute measurements, and drift can therefore not be entirely prevented.

Fig. 4. Estimated position compared to ground truth

Table 1 summarises the estimation results in terms of root-mean-squared error (RMSE) values. The second column represents the RMSE of the state estimates using all sensor measurements in the EKF. These values correspond to the estimates shown in Figs. 2, 3, and 4. The third column represents a case when the optical-flow measurements are not used. The main difference is the RMSE of $v_x$ and $v_y$, which creates poor estimates of $p_x$ and $p_y$. The vertical velocity estimate ($v_z$) has RMSE values similar to those in column two, because the estimate is independent of the flow data. Furthermore, the influence on the attitude estimation is negligible. The fourth column represents the RMSE values when the sonar measurements are neglected and shows that the error in $p_z$ has increased significantly. Because $p_z$ is also used to scale flow data, all velocity estimates have higher RMSE values. Also the yaw error increases significantly, while other estimates do not show large differences. When neither the flow nor the sonar data are used, the state estimates have the RMSE values shown in column five. The effects described earlier are visible, but the yaw error does not increase as much as when the sonar sensor data alone are omitted. This is caused by the bad scaling of the flow data, due to the lack of an adequate range measurement, which corrupts the yaw estimate.

Table 1. RMSE values for different cases

                 Ref.     No flow   No sonar  No flow,   2nd
                                              no sonar   dataset
e_px (m)         0.4281   13.5987   1.3174    13.3775    0.7826
e_py (m)         0.8573   3.7590    1.6040    3.6392     1.0779
e_pz (m)         0.0782   0.0783    1.3594    3.9486     0.0810
e_vx (m/s)       0.1150   0.4850    0.3075    0.4846     0.1054
e_vy (m/s)       0.1666   0.3166    0.3407    0.3161     0.1224
e_vz (m/s)       0.0461   0.0461    0.1341    0.1361     0.0486
e_φ (rad)        0.0118   0.0125    0.0138    0.0125     0.0115
e_θ (rad)        0.0116   0.0131    0.0139    0.0131     0.0135
e_ψ (rad)        0.0051   0.0074    0.0283    0.0074     0.0067
e_φ̇ (rad/s)      0.0431   0.0450    0.0426    0.0450     0.0399
e_θ̇ (rad/s)      0.0266   0.0264    0.0267    0.0264     0.0306
e_ψ̇ (rad/s)      0.0176   0.0176    0.0176    0.0176     0.0186

To validate that the EKF also performs well with different datasets, and that the matrices Q and R do not have to be adapted to obtain good estimates, the unaltered fusion algorithm is used on measurement data from another flight. The RMSE values of the resulting estimates are given in the last column of Table 1. For this estimate, all available measurements were used. It can be seen that for this dataset the EKF provides results similar to those shown in the first column.

4.3 Discussion

During the experiments, the PX4Flow sensor is found to be very sensitive to the lighting conditions and ground surface. Measurements taken outside, on an asphalt surface, result in high-quality flow data without gaps. The indoor measurements are performed on a smooth, concrete surface, resulting in the flow quality shown in Fig. 3, in which gaps are clearly visible. Fast movements in low-lighting conditions also resulted in low-quality flow data.

Sonar measurements are greatly affected by the ground surface: a smooth, hard surface gives good-quality height measurements, compared to a softer and rougher surface.
5. CONCLUSIONS

In this paper, an EKF is proposed to fuse sensor data from an IMU, an OFS, and a sonar sensor. Bias compensation is used to correct for drift in these sensors. The EKF is validated by comparing the state estimates to the ground truth provided by a motion-capture system. The experimental results indicate that the translational velocity of the UAV, as well as its attitude and angular velocity, may be estimated without drift. The availability of flow data greatly reduces the error in velocity and position in the x–y plane. Sonar measurements result in better estimation of the velocity and position in the z-direction. The sonar and optical-flow measurements do not significantly influence the attitude estimates. Future extensions can be made by implementing the algorithm on-board the UAV and inputting estimates from the EKF output to a flight controller.

REFERENCES

Bleser, G. and Hendeby, G. (2010). Using optical flow for filling the gaps in visual-inertial tracking. European Signal Processing Conference, 1836–1840.
Bleser, G. and Stricker, D. (2009). Advanced tracking through efficient image processing and visual-inertial sensor fusion. Computers and Graphics, 33(1), 59–72.
Goppert, J., Yantek, S., and Hwang, I. (2017). Invariant Kalman filter application to optical flow based visual odometry for UAVs. Ninth International Conference on Ubiquitous and Future Networks, 99–104.
Honegger, D., Meier, L., Tanskanen, P., and Pollefeys, M. (2013). An open source and open hardware embedded metric optical flow CMOS camera for indoor and outdoor applications. IEEE International Conference on Robotics and Automation, 1736–1741.
Khosiawan, Y. and Nielsen, I. (2016). A system of UAV application in indoor environment. Production and Manufacturing Research, 4(1), 2–22.
Mebarki, R., Cacace, J., and Lippiello, V. (2013). Velocity estimation of an UAV using visual and IMU data in a GPS-denied environment. IEEE International Symposium on Safety, Security, and Rescue Robotics.
Munguia, R. and Grau, A. (2011). Attitude and heading system based on EKF total state configuration. Proceedings of the 2011 IEEE International Symposium on Industrial Electronics, 2147–2152.
Omari, S., Gohl, P., Burri, M., Achtelik, M., and Siegwart, R. (2015). Visual industrial inspection using aerial robots. Proceedings of the 3rd International Conference on Applied Robotics for the Power Industry, (1).
Półka, M., Ptak, S., and Kuziora, Ł. (2017). The use of UAV's for search and rescue operations. Procedia Engineering, 192, 748–752.
Sabatini, A.M. (2011). Kalman-filter-based orientation determination using inertial/magnetic sensors: Observability analysis and performance evaluation. Sensors, 11(10), 9182–9206.
Strohmeier, M. and Montenegro, S. (2017). Coupled GPS/MEMS IMU attitude determination of small UAVs with COTS. Electronics, 6(1), 15.
Waharte, S. and Trigoni, N. (2010). Supporting search and rescue operations with UAVs. International Conference on Emerging Security Technologies (ETS), 142–147.
Weiss, S., Achtelik, M.W., Lynen, S., Chli, M., and Siegwart, R. (2012). Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments. IEEE International Conference on Robotics and Automation, 957–964.
Weiss, S. and Siegwart, R. (2011). Real-time metric state estimation for modular vision-inertial systems. Proceedings of the IEEE International Conference on Robotics and Automation, 231855, 4531–4537.
Yun, S., Lee, Y.J., and Sung, S. (2016). Range/optical flow-aided integrated navigation system in a strapdown sensor configuration. International Journal of Control, Automation and Systems, 14(1), 229–241.