
Robotics & Embedded Systems

Cognitive Systems
VL Sommer 2014

Multi‐Sensor Data Fusion:


From Signal to State Estimate

Dr. Daniel Clarke


Multi‐Sensor Data Fusion
fortiss GmbH
An‐Institut Technische Universität München
Why do we need this lecture?
• Humans are highly capable cognitive systems, capable of:
– Sensing
– Processing
– Interpretation

• Robotic/Machine cognition is difficult because


– Crude physical interface between the analog and digital domain
• Apertures (lens, antenna), CCDs, ADCs etc.
– Sensor systems are limited and unsophisticated
• High precision costs money
– The physical state of the robot and its environment is only partially observable,
dynamic and cluttered
• A limitation of the field of view and sensor processing/fusion capabilities
– There is never enough training data
• Humans have several years of learning to process and interpret sensory information

2 Data Fusion © fortiss GmbH


Key Learning Points

• Signals
– Signals allow us to estimate the physical state of a target
• Fourier Transforms
– Fourier transforms are used to estimate the energy content of a signal
– Energy content is used to estimate physical state
• Sensor Systems
– Sensors are used to measure signals of different content (spectral and semantic)
– Interpret different physical states
• Kalman Filter
– Optimal Bayesian estimate under conditions of linear-Gaussian uncertainty
• Particle Filter
– Evolving set of particles providing an estimate of the state PDF
– Multi-modal non-Gaussian Probability Density Function

3 Data Fusion © fortiss GmbH München, 22 August 2014



Signal Sampling

[Figure: a continuous waveform and its sampled values, amplitude vs time]


Sensor System

[Block diagram: a Source illuminates a Target; the reflection of the source (or its
emission) passes through an Aperture to the Sensor, which produces a Signal
(typically a waveform); an ADC digitizes the signal and a Processor produces a
Measurement]

Waveform analysis is used to infer physical properties of the target: the interest
of this lecture.
Signal
Types of Signals
• A signal is “a function that conveys information about the behavior or
attributes of some phenomenon” [1]
• Mathematical representation of independent variables as a function of time
(radio) or position (camera)

• Some examples:
– 1D signal as a function of time, e.g. audio, radio frequency, light
– 2D signal as a function of space, e.g. greyscale imagery (RGB: 3×2D)
– 3D signal as a function of space and time, e.g. greyscale video (RGB: 3×3D)

1. Introductory Signal Processing – R. Priemer
Signals
Analog and Digital
• Signals come in many forms
– Analog (e.g. voltage or resistance)
– Analog waveform (e.g. sound)
– Digital level (i.e. on/off)
– Digital waveform
• Modern computers process digital
information
• Computers interface with the real world
using
– Analog to Digital Convertors (ADCs)
– Digital to Analog Convertors (DACs)

• Digital is a finite, sampled representation
of an analog form
Signals
Analog Signal
• Signals in nature are typically analog
– Continuous in time and amplitude
– This waveform is one period of a 50 Hz sine function

[Figure: one period of a 50 Hz sinusoid, amplitude vs time (0–20 ms)]
Digital Signals
Sampling Rates and Resolution
• Analog waveforms are time-varying signals
• The ADC will sample at some fixed frequency (x axis: the sample interval)
• The ADC will sample at some fixed resolution (y axis: the sample resolution)
• This waveform (50 Hz sine function)
– Sample rate of 1024 Hz (interval of roughly 1 ms)
– Sample resolution of 7 elements
– More data means more computation

[Figure: 50 Hz sinusoid sampled at a fixed interval and quantized to discrete
amplitude levels]
Sampling

• Sampling is not a reversible process


– If we lose information during sampling, it is not (in general) possible to recover it
– Under certain conditions an analog signal can be sampled without loss and
can be reconstructed perfectly (the Nyquist–Shannon sampling theorem)
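The information loss from undersampling can be checked numerically. The sketch below is illustrative (the 50 Hz tone and the 70 Hz sample rate are invented for the example, not taken from the slides): a 50 Hz sine sampled below its Nyquist rate produces exactly the same samples as a 20 Hz alias, so the original frequency is unrecoverable.

```python
import math

f_sig, fs = 50.0, 70.0          # tone and sample rate: fs < 2 * f_sig, so aliasing occurs
f_alias = f_sig - fs            # -20 Hz: the frequency the samples actually represent

true_samples  = [math.sin(2 * math.pi * f_sig   * k / fs) for k in range(64)]
alias_samples = [math.sin(2 * math.pi * f_alias * k / fs) for k in range(64)]

# The two sample sequences are indistinguishable: information was lost
max_diff = max(abs(a - b) for a, b in zip(true_samples, alias_samples))
print(max_diff < 1e-9)  # True
```

Sampling at any rate above 100 Hz would keep the two sequences distinct, which is exactly what the Nyquist condition guarantees.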

Signals
Aliasing
• Sinusoidal functions
– Most signals can be described as a sum
of sine functions
– Aliasing generally occurs when there
is periodicity in the function

• Aliasing:
– An effect which causes different
signals to become indistinguishable
from each other
– When sampled signals are used to
reconstruct a continuous signal,
artifacts due to aliasing can occur
– Can be avoided by correctly
applying Nyquist theory

[Figure: 50 Hz sinusoid, amplitude vs time]
Aliasing Example


Fourier Transform

Aphex Twin: an image hidden within the music signal (viewable
using spectral analysis)

Signal Transformations
The Fourier Transform
• Fourier Series
– Complicated (periodic) functions can be
written as the mathematical sum of sines and
cosines
• The Fourier Transform
– Mathematical transformation to transform
signals between the time and frequency
domain
– Provides an estimate of the energy content
of a signal as a function of the frequency
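This energy estimate can be sketched in a few lines with NumPy (the 1024 Hz sample rate and 50 Hz tone are illustrative choices): the magnitude spectrum of a sampled sine peaks at the tone's frequency.

```python
import numpy as np

fs = 1024                          # sample rate (Hz): one second of data
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 50 * t)     # 50 Hz tone in the time domain

spectrum = np.abs(np.fft.rfft(x))  # energy content as a function of frequency
peak_hz = np.argmax(spectrum) * fs / len(x)   # bin spacing is fs / N = 1 Hz here
print(peak_hz)  # 50.0
```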

Signal Interpretation
Fourier Transform

[Figure: Fourier transform relating a time-domain signal to its frequency-domain
spectrum]
Fourier Transform
Uncertainty
• Real-time processing considerations
– 10 MHz sample rate
– ADC (8–32 bit)
– Subsample of the waveform (1024–4096+ samples)
– Process one sample before the next
is available

• Uncertainty
– Signal processing has introduced
uncertainty in the energy content
– Filtering (high-pass, low-pass, band-pass etc.) can
help to reduce the uncertainty

[Figure: single-sided amplitude spectrum |F(f)| vs frequency (Hz), showing an
uncertain peak]
The Fourier Transform provides an estimate of the
energy content of a signal as a function of the frequency
Signal Processing
State Estimation
• We typically process a signal in order to estimate the physical properties
of a target object

• Even with a perfect signal, we have introduced uncertainty into our
estimate of the physical properties
– Detection
– Identification
– Localization

• We must account for this uncertainty in our subsequent use of sensor
measurements and state estimates
– Multi-dimensional uncertainty (Covariance)
– More later!


Typical Robotics Sensors

Robotics Sensors
Navigation Problem
• Traditional ‘navigation’ problem
– Where am I now?
– Where do I want to go?
– How do I get there (without
colliding with the environment)?

• Multi-sensor problem:
– Internal navigation sensors
• GPS
• IMU
– External sensors
• Camera
• RADAR
• LIDAR

Internal Navigation Sensors

Internal Navigation Sensors
Global Navigation Satellite Systems (GNSS)
• Globally visible constellation of satellites
– Precise ephemeris (ground station monitoring)
– Precise timing (atomic clock)

• Transmission of a navigation message
– Includes the precise location and time of the satellite
– CDMA spread spectrum: the low-bitrate navigation
message is encoded with a high-rate pseudo-random
noise (PRN) sequence that is different for each satellite

• Receiver Station
– Receiver decodes the navigation message
– Computes time of flight
– Multiple (noisy) range rings are used to generate a
navigation solution
GNSS Systems
Many systems

System comparison (table from Wikipedia):

• GPS (United States)
– Coding: CDMA
– Satellites: at least 24
– Frequencies: 1.57542 GHz (L1 signal), 1.2276 GHz (L2 signal)
– Status: operational
• GLONASS (Russian Federation)
– Coding: FDMA/CDMA
– Satellites: 31, including 24 operational, 1 in preparation, 2 on maintenance,
3 reserve, 1 on tests
– Frequencies: around 1.602 GHz (SP), around 1.246 GHz (SP)
– Status: operational, CDMA in preparation
• COMPASS (China)
– Coding: CDMA
– Satellites: 5 geostationary orbit (GEO) and 30 medium Earth orbit (MEO)
satellites
– Frequencies: 1.561098 GHz (B1), 1.589742 GHz (B1-2), 1.20714 GHz (B2),
1.26852 GHz (B3)
– Status: 15 satellites operational, 20 additional satellites planned
• Galileo (European Union)
– Coding: CDMA
– Satellites: 4 test bed satellites in orbit, 22 operational satellites budgeted
– Frequencies: 1.164–1.215 GHz (E5a and E5b), 1.260–1.300 GHz (E6),
1.559–1.592 GHz (E2-L1-E1)
– Status: 2 satellites launched, 5 additional satellites planned
• IRNSS (India)
– Coding: CDMA
– Satellites: 7 geostationary orbit (GEO) satellites
– Frequencies: N/A
– Status: in preparation

Internal Navigation Sensors
Inertial Measurement Units
• Basic Operating Principles
– Sense the attitude and acceleration of the platform
• Gyroscope
– Orientation
• Accelerometer
– Acceleration
• Magnetometer
– Magnetic Field (e.g. electro-compass)
• Barometer
– Air pressure
– Not strictly an IMU!
• An Inertial Navigation System
– Integrates the above to
generate a navigation solution

Inertial Measurement Units
Random Walk Error

[Figure 1: Velocity estimation error (m/s) vs time (s); Figure 2: Acceleration
estimation error (m/s²) vs time (s). Integrating noisy inertial measurements
produces a random-walk error]
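The random-walk behaviour in the figures can be reproduced numerically. The sketch below is illustrative (the 100 Hz rate and the 0.1 m/s² noise level are invented values, not taken from the plots): integrating pure zero-mean accelerometer noise yields a velocity error whose standard deviation grows with the square root of the elapsed time.

```python
import random

random.seed(1)
T = 0.01          # 100 Hz IMU sample period (illustrative)
sigma_a = 0.1     # accelerometer noise standard deviation, m/s^2 (illustrative)
runs, steps = 500, 500

final_errors = []
for _ in range(runs):
    v_err = 0.0
    for _ in range(steps):
        v_err += random.gauss(0.0, sigma_a) * T   # integrate pure noise into velocity
    final_errors.append(v_err)

# Empirical spread of the integrated error vs the random-walk prediction
std = (sum(e * e for e in final_errors) / runs) ** 0.5
predicted = sigma_a * T * steps ** 0.5   # grows as sqrt(number of steps)
print(std, predicted)  # the two agree closely
```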


Externally Looking Sensors

Radar
RADAR - Radio Detection and Ranging
• Basic Operational Principle
– A ‘pulse’ of energy is transmitted
from one location
– The pulse is ‘reflected’ from objects
in the environment
– The reflected pulse is then received
at another location
– Radio Frequency (RF) signal
processing

• Possible to measure
– Range (time of flight)
– Relative velocity (Doppler effect)
– Azimuth (antenna pattern direction)
– Absolute velocity (tracking)
Types of Radar

• Pulsed (transit time) Radar


– A pulse of energy is transmitted
– Time to receipt is measured to
determine range
• Pulse Doppler
– Change in frequency of received
pulse is measured to estimate
velocity
• CW Radar
– Energy at a constant frequency is
transmitted
– Doppler used to determine velocity
• FMCW Radar
– Frequency-modulated continuous wave: used to determine distance as well
as velocity

Radar
Challenges and Opportunities
• Most affordable radars are typically a single element system
– Poor angular resolution due to wide beam width
– Precise range and velocity of multiple objects within the beam

• Future developments
– Phased array radars providing high angular resolution

• Many Opportunities
– Relatively low cost
– Radio Frequency – weather independent
• Many more challenges
– Poorly understood capabilities in the automotive domain
– Under-researched in academia
– Dynamic environments are challenging: multi-pathing and clutter

Ultra-Sonic Sensors
A short note
• Ultrasonic sensors operate using similar principles to radar
– Generate high frequency pulse of sound waves
– Interpret the physical properties of objects based on properties of the echo

• Basic properties
– Extremely low cost (€10s)
– Good range resolution
– Poor range (typically <5m)
– Low power/computational cost
– Poor noise characteristics

Lidar
Light Detection and Ranging
• Basic Operational Principle
– A ‘pulse’ of laser light is transmitted
from one location
– The pulse is ‘reflected’ from objects
in the environment
– The reflected pulse is then received
• Possible to measure
– Range to object (time of flight)
– Azimuth to object
– Relative velocity (Doppler effect)
– Reflectivity
• Single Beam
– Mechanically scanned mirror to
create multi-dimensional scan

Lidar
Application Examples
• Used for Obstacle Detection and Identification
• Cars: K. Fuerstenberg et al.: “Pedestrian Recognition and Tracking of Vehicles using a vehicle
based Multilayer Laserscanner”, Intelligent Vehicle Symposium, IEEE (Volume:1 ), 2002

Image Sensor
Basic Principles
• A pixel grid measures incident photonic
energy (i.e. CCD or CMOS)
• Focused by a physical aperture or lens
(depends on wavelength)
• Sensor optimized for a specific radiation
wavelength (including filters): visible light, IR, mm-wave etc.
• Digital processing produces a grid of
normalized radiation intensity (the raw image),
which is passed to image processing

[Diagram: light source → object → camera → raw image → image processing]
Image Sensors
Computer Vision and Image Processing

Image Sensor
Challenges and Opportunities
• Computer vision allows us to infer physical and semantic state from visual
representations
– Pose (location and orientation) of an object
– Spectral and spatial characteristics
– Contextual information
• Principal benefits
– Low cost
– Passive (low power consumption)
– Diverse range of capabilities
– Conceptually easy to interpret (by humans)
• Principal Challenges
– Highly affected by clutter and obscuration
– Affected by lighting conditions
– Conceptually difficult to interpret (by computers)

Image Sensors
Diversity
• Image sensors have found a vast range of applications due to
– Diversity and Accuracy
– Human brain is a natural image processor

• We have shaped the world around us for biological image processing

Summary

• GNSS
– Localization
• Inertial Measurement Units
– Orientation
– Acceleration
• Imagery Systems
– 2D planar projection of 3D space
• Lidar
– Array of range at discrete
azimuth
• Radar
– Range and velocity of target
within volume
• Ultra Sound
– Range of target within volume


Uncertainty and Probability

Sensor Systems and Signal Processing
Uncertainty
• Sensor Systems
– The signal represents the physical state of a
target system
– Localization (GPS/IMU)
– Range (radar, lidar, ultrasound)
– Imagery (camera, event-based)

• Digital Signal Processing
– Digitally sample the signal (ADC)
– Process the digital signal to detect
and identify key characteristics

• Fourier Transform
– Interrogate the signal to understand
physical properties

[Figure: single-sided amplitude spectrum |F(f)| vs frequency (Hz), showing an
uncertain peak]
The Fourier Transform provides an estimate of the
energy content of a signal as a function of the frequency
Signal Processing
State Uncertainty
• We typically process a signal in order to estimate the physical properties
of a target object

• Even with a perfect signal, digital signal processing introduces uncertainty
– One of the least important sources of uncertainty in sensing for autonomous
driving

• Other sources of uncertainty (ordered roughly from easy to difficult to model):
– The physical signal is always noisy
– Electronics of the sensor system
– Mechanics of the sensor system (i.e. aperture, physical sensing etc.)
– Missing information (occlusions)
– Clutter (reflections/multi-path, targets of non-interest)
Basic Probability Theory
The normal distribution
• Normal or Gaussian distribution is a continuous probability distribution
• Probability density function (1 dimension):

p(x) = 1 / (σ√(2π)) · exp(−(x − μ)² / (2σ²))

– μ is called the mean
– σ is called the standard deviation

• Statistically well behaved


• Symmetric about the mean
• Many important applications
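These properties can be checked with a minimal sketch (the values μ = 2 and σ = 0.5 are arbitrary example choices):

```python
import math

def normal_pdf(x, mu, sigma):
    # p(x) = exp(-(x - mu)^2 / (2 sigma^2)) / (sigma * sqrt(2 pi))
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

mu, sigma = 2.0, 0.5   # arbitrary example values

# Symmetric about the mean
print(abs(normal_pdf(mu - 0.7, mu, sigma) - normal_pdf(mu + 0.7, mu, sigma)) < 1e-12)  # True

# Integrates to one (crude Riemann sum over +/- 6 sigma)
dx = 0.001
area = sum(normal_pdf(mu - 6 * sigma + i * dx, mu, sigma) * dx
           for i in range(int(12 * sigma / dx)))
print(round(area, 3))  # 1.0
```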

Basic Probabilities
Important probability distributions
• Poisson distribution
– Discrete distribution
– Probability that k events occur in a time interval
– Given that on average λ events occur per interval

– P(k) := λ^k · e^(−λ) / k!

• Uniform distribution
– Discrete or continuous
– All elements in the sample space have the same probability

• χ² distribution (chi-squared distribution)


– Used for hypothesis testing
– Computation of confidence intervals
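The Poisson mass function above can be evaluated directly (λ = 3 is an arbitrary example value):

```python
import math

def poisson_pmf(k, lam):
    # P(k) = lam^k * e^(-lam) / k!
    return lam ** k * math.exp(-lam) / math.factorial(k)

lam = 3.0   # on average 3 events per interval (arbitrary)

# Probabilities over all k sum to one, as a discrete distribution must
print(round(sum(poisson_pmf(k, lam) for k in range(60)), 6))  # 1.0
print(round(poisson_pmf(2, lam), 3))  # 0.224: probability of exactly 2 events
```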

Normal Distribution
Noise and Bias
• An observation of the state is
defined by: y = x + n + b
• State variable x
– Useful information
• Stochastic noise n
– Random variations
• Deterministic noise b
– A bias on the measurement
• We generally assume the bias to
be zero
– Mathematically more tractable
(easier)

Conditional Probabilities

Conditional Probabilities
Joint Probability
• Example: two random variables A and B on the same sample space,
uniformly distributed over the unit square
– The probability density is constant within the area 0 ≤ A ≤ 1, 0 ≤ B ≤ 1
– The probability is zero outside

• P(A, B) is called a joint probability distribution

Conditional Probabilities
Conditional probability
• A conditional probability is the probability of an event, given some other
event has already occurred
• The conditional probability can be defined by

P(A | B) = P(A ∩ B) / P(B)

[Diagram: sample space Ω with overlapping events A and B; the absolute
probability of A ∩ B compared with the probability conditioned on B]

Conditional Probabilities
Bayes’ theorem
• From the definition of conditional probability:
– P(A ∩ B) = P(A | B) · P(B)
– P(A ∩ B) = P(B | A) · P(A)

• We have:
P(A | B) · P(B) = P(B | A) · P(A) ⇒

• This is called Bayes’ theorem:
P(A | B) = P(B | A) · P(A) / P(B)
Conditional Probabilities
Bayes’ theorem – Example

• There is a rare disease which affects 1 out of 1000 people

• There is a test which guarantees:


– The test is 99% sensitive, i.e. 99% of ill patients are tested positive
– The test is 99% specific, i.e. 99% of healthy patients are tested negative

• What is the probability that:


a patient is actually sick, given he is tested positive?

Conditional Probabilities
Bayes’ theorem – Example

• Using Bayes’ theorem:

P(sick | positive) = (0.99 · 0.001) / (0.99 · 0.001 + 0.01 · 0.999) ≈ 0.09

• Only 9% of patients tested positive are actually sick!
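The computation on this slide can be reproduced in a few lines:

```python
p_ill = 0.001        # prior: the disease affects 1 in 1000 people
sens = 0.99          # sensitivity: P(positive | ill)
spec = 0.99          # specificity: P(negative | healthy)

# Law of total probability: P(positive) over both ill and healthy patients
p_positive = sens * p_ill + (1 - spec) * (1 - p_ill)

# Bayes' theorem: P(ill | positive)
p_ill_given_pos = sens * p_ill / p_positive
print(round(p_ill_given_pos, 2))  # 0.09
```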

Conditional Probabilities
Bayes’ theorem – Example
• What is the probability that:
a patient is actually sick, given he is tested positive?

                          Test positive   Test negative   Total
Patient is ill (A)        0.99 · 0.001    0.01 · 0.001    0.001
Patient is healthy (B)    0.01 · 0.999    0.99 · 0.999    0.999

• The answer is again: A / (A + B)
= 0.00099 / (0.00099 + 0.00999) ≈ 9%

Conditional Probabilities
Example – Stereo Range Finding

• Measuring the position of an object

• Two sensors from different viewpoints

Conditional Probabilities
Example – Stereo Range Finding
• Sensor model:

P(z | x) = N(d(x, s); z, σ)

– x = (x, y) is a possible object location
– s = (x_s, y_s, θ_s) is the pose of the sensor
– σ is the standard deviation of the error
• Same as
P(z | x) = 1 / (σ√(2π)) · exp(−(d(x, s) − z)² / (2σ²))
– d is a distance function
• Each location gets a probability
• The same holds for the other sensor


Conditional Probabilities
Example – Stereo Range Finding

• Combine both measurements

• Provided the measurements are independent
– The joint probability is
P(x | z₁, z₂) = C · P(z₁ | x) · P(z₂ | x)
– C is just a normalization constant

• What is the location with maximum probability?

Conditional Probabilities
Example – Stereo Range Finding

• What is the location with maximum probability?

x̂ = argmax_x P(x | z₁, z₂)

• Using maximum likelihood estimation:

x̂ = argmax_x ln P(x | z₁, z₂)

– The logarithm is a strictly increasing function!
– ln P is called the log-likelihood
– Allows simplification of our problem

Conditional Probabilities
Example – Stereo Range Finding
• We have:

x̂ = argmax_x ln P(x | z₁, z₂)
  = argmax_x [ln P(z₁ | x) + ln P(z₂ | x)]
  = argmax_x Σᵢ [ −(d(x, sᵢ) − zᵢ)² / (2σ²) − ln(σ√(2π)) ]
  = argmin_x Σᵢ (d(x, sᵢ) − zᵢ)²

Conditional Probabilities
Example – Stereo Range Finding

• Using maximum likelihood estimation we transformed a probabilistic
formulation into an elegant minimization problem!

– From x̂ = argmax_x P(x | z₁, z₂)
– To x̂ = argmin_x Σᵢ (d(x, sᵢ) − zᵢ)²

• Now it is possible to use various optimization


techniques:
– Gradient Descent
– Gauss-Newton
– Levenberg–Marquardt (damped)
– Evolutionary Algorithms (GA, PSO)
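As a sketch of this minimisation (the sensor positions, the target location and the step size below are invented for illustration; the slides do not prescribe them), plain gradient descent on the sum of squared range residuals recovers the object location from two noise-free range measurements:

```python
import math

sensors = [(0.0, 0.0), (10.0, 0.0)]           # assumed sensor positions
target = (3.0, 4.0)                           # ground truth, used only to simulate z_i
z = [math.dist(target, s) for s in sensors]   # noise-free range measurements

def gradient(p):
    # Gradient of sum_i (d(p, s_i) - z_i)^2 with respect to p
    gx = gy = 0.0
    for (sx, sy), zi in zip(sensors, z):
        d = math.dist(p, (sx, sy))
        r = d - zi                            # range residual
        gx += 2.0 * r * (p[0] - sx) / d
        gy += 2.0 * r * (p[1] - sy) / d
    return gx, gy

p = [5.0, 5.0]                                # initial guess, same side as the target
for _ in range(2000):
    gx, gy = gradient(p)
    p = [p[0] - 0.05 * gx, p[1] - 0.05 * gy]  # fixed-step gradient descent

print(round(p[0], 2), round(p[1], 2))  # 3.0 4.0: the assumed target location
```

With only two range measurements the cost has a mirror minimum on the other side of the sensor baseline, which is why the initial guess matters; the faster methods listed above (Gauss-Newton, Levenberg-Marquardt) attack exactly this least-squares structure.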


The Filtering/Tracking Problem

Filtering Problem
Hidden Markov Model
• System can be considered as a Hidden Markov Model

• System transitions from one state to another

• We observe a system property which relates to that state


– A system property could be an actual observation (i.e. current position)
– Or a property indicating the change in state (i.e. an odometer measurement)

Moving Robot


[Figure: position PDF P(A) over x (m) at four successive time steps]

The Kalman Filter

The Filtering Problem
Bayesian Filter
• Our goal is to estimate the state x_k given all the observed
properties up until that time step k:
– z_1:k = z_1, …, z_k, i.e. a sequence of measurements
• The Kalman Filter is a linear quadratic estimator
– Optimal Bayesian estimate under conditions of linear-Gaussian
uncertainty
– Noisy measurements recorded over time
– Produce estimates of state variables which are typically more precise
– Tracks the estimated state variable and its uncertainty
• Two step process
– State prediction step
– State update step
– Iterative nature provides real time operation
Kalman Filter Process

The Kalman Filter (1)
Unimodal Linear Motion
• Kalman filters are typically used to model a linear system described by the
following linear equations:

x_{k+1} = A x_k + B u_k + w_k
y_k = C x_k + z_k

x is the system state; u is the system control vector; y is the measured output;
w is the process noise; z is the measurement noise; A, B, C are matrices;
k is the time index; each of these elements is typically a vector or matrix of
appropriate dimensions

• The variable contains all information about the state of the system
– Cannot be measured directly!
– Our measurement y is an estimate of the state x, corrupted by noise z

The Kalman Filter
The filter
• We wish to use the available measurements ( ) to estimate the state of
the system
– We have knowledge (or can estimate) how the system transitions between
states and
– We know the relation between the system state and measurements

• Requirements
– Average value of the state estimate to be equal to the average of the true state
– Estimated state should vary from the true state as little as possible

The Kalman Filter
Equations
• The Kalman Filter equations are a series of matrix manipulation equations:

K_k = A P_k C^T (C P_k C^T + S_z)^(−1)                       – Kalman gain matrix
x̂_{k+1} = (A x̂_k + B u_k) + K_k (y_k − C x̂_k)                – state estimate
P_{k+1} = A P_k A^T + S_w − A P_k C^T S_z^(−1) C P_k A^T     – state error estimate
The Kalman Filter
Equations
• The Kalman Filter Equations:

x̂_{k+1} = (A x̂_k + B u_k) + K_k (y_k − C x̂_k)

• x̂_{k+1} – the state estimate is derived from
– State estimate at time k (multiplied by A)
– Known input at time k (multiplied by B)
– Correction term (how we influence the new state, given our measurement)
• A x̂_k + B u_k – generally a prediction term
– Depends upon the problem

The Kalman Filter
Equations
• The Kalman Filter Equations:

K_k = A P_k C^T (C P_k C^T + S_z)^(−1)

• K_k – Kalman gain matrix
– P_k – current state error estimate
– S_z – derived from the measurement noise

• The equation is dominated by the S_z term
– High measurement noise – low Kalman gain
– Low measurement noise – high Kalman gain

The Kalman Filter
Equations
• The Kalman Filter Equations:

P_{k+1} = A P_k A^T + S_w − A P_k C^T S_z^(−1) C P_k A^T

• P_k – state error estimate
– S_w = E(w_k w_k^T) – process noise covariance
– S_z = E(z_k z_k^T) – measurement noise covariance
– E(·) – expectation

• We assume/expect that both z and w are:
– Zero mean
– Uncorrelated


Example

The Kalman Filter
Unimodal Linear Motion
• Kalman filters are typically used to remove noise from a signal described
by the following linear equations:

x_{k+1} = A x_k + B u_k + w_k
y_k = C x_k + z_k

x is the system state; u is the system control vector; y is the measured output;
w is the process noise; z is the measurement noise; A, B, C are matrices;
k is the time index; each of these elements is typically a vector or matrix of
appropriate dimensions

• The variable contains all information about the state of the system
– Cannot be measured directly!
– Our measurement y is an estimate of the state x, corrupted by noise z

The Kalman Filter
Example – State Equations
• Model a vehicle moving in a straight line
– The state we wish to estimate is the position (p) and velocity (v)

– The state vector is given by x = [p, v]^T

– We know the acceleration, which is our control variable u
– We measure the position p every T seconds (where w_p, w_v are process noises)

v_{k+1} = v_k + T u_k + w_v                   (velocity)
p_{k+1} = p_k + T v_k + (T²/2) u_k + w_p      (position)

• As our measured output y_k is equal to the position, our state equations
become:

x_{k+1} = [1  T; 0  1] x_k + [T²/2; T] u_k + w_k
y_k = [1  0] x_k + z_k
Kalman Filter
Building the Example
• Matrices A, B and C are used to model our system

– Matrix A – Describes how we transition from one state to the next


– Matrix B – Input model which controls the input u
– Matrix C – Observation model which maps the true state space to observed
state space

Kalman Filter
Vehicle Navigation Example
• Consider our initial problem
– Vehicle travelling along a road (linear system)

• Remember our state equations:

x_{k+1} = [1  T; 0  1] x_k + [T²/2; T] u_k + w_k
y_k = [1  0] x_k + z_k

• State
– Position error is measured at a standard deviation of 10 m
– Input acceleration is 2 m/s² (with noise of 0.2 m/s²)
– Position recorded 10 times per second (T = 0.1 s)
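A minimal sketch of this example can be run directly. It uses the standard predict/update form of the filter (rather than the combined one-shot equations above) with the parameters from this slide; the simulation details, seed, and run length are illustrative assumptions.

```python
import random

random.seed(42)
T, u = 0.1, 2.0                  # time step (s) and known input acceleration (m/s^2)
sigma_a, sigma_z = 0.2, 10.0     # acceleration (process) and position (measurement) noise

Sz = sigma_z ** 2                # measurement noise variance
Sw = [[sigma_a**2 * T**4 / 4, sigma_a**2 * T**3 / 2],   # process noise covariance
      [sigma_a**2 * T**3 / 2, sigma_a**2 * T**2]]

x_true = [0.0, 0.0]              # true [position, velocity]
x = [0.0, 0.0]                   # estimated state
P = [[Sz, 0.0], [0.0, 1.0]]      # initial state error covariance (a guess)

meas_sq = est_sq = 0.0
N = 600                          # 60 seconds of driving
for _ in range(N):
    # Simulate the vehicle and a noisy position measurement
    w = random.gauss(0.0, sigma_a)
    x_true = [x_true[0] + T * x_true[1] + 0.5 * T * T * (u + w),
              x_true[1] + T * (u + w)]
    y = x_true[0] + random.gauss(0.0, sigma_z)

    # Predict: x = A x + B u,  P = A P A^T + Sw  with A = [[1, T], [0, 1]], B = [T^2/2, T]
    x = [x[0] + T * x[1] + 0.5 * T * T * u, x[1] + T * u]
    P = [[P[0][0] + T * (P[0][1] + P[1][0]) + T * T * P[1][1] + Sw[0][0],
          P[0][1] + T * P[1][1] + Sw[0][1]],
         [P[1][0] + T * P[1][1] + Sw[1][0],
          P[1][1] + Sw[1][1]]]

    # Update with C = [1, 0]: only the position is measured
    S = P[0][0] + Sz
    K = [P[0][0] / S, P[1][0] / S]           # Kalman gain
    innov = y - x[0]
    x = [x[0] + K[0] * innov, x[1] + K[1] * innov]
    P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
         [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]

    meas_sq += (y - x_true[0]) ** 2
    est_sq += (x[0] - x_true[0]) ** 2

rms_meas, rms_est = (meas_sq / N) ** 0.5, (est_sq / N) ** 0.5
print(round(rms_meas, 1), round(rms_est, 1))  # estimate error far below measurement error
```

The printed RMS values illustrate the point of the graphs that follow: the filtered position error is a small fraction of the raw 10 m measurement error.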

Kalman Filter
Example – Error Estimation
• S_z – measurement noise
– Position error is measured at a standard deviation of 10 m
– S_z = E(z_k²) = 10 × 10 = 100

• S_w – process noise
– Remember
x_{k+1} = [1  T; 0  1] x_k + [T²/2; T] u_k + w_k
– The acceleration noise of 0.2 m/s² propagates through T = 0.1 s:
the velocity noise standard deviation is 0.1 × 0.2 = 0.02
– Remember u is our known input acceleration

Kalman Filter
Example Graphs
[Figure 1: Vehicle position (m) vs time (s), showing the true, measured, and
estimated position]

Kalman Filter
Example Graphs
[Figure 2: Position error (m) vs time (s), comparing the error based on the raw
measurement with the error based on the filter estimate]

Kalman Filter
Example Graphs
[Figure 3: Velocity (m/s) vs time (s), showing the true and estimated velocity]

Kalman Filter
Example Graphs
[Figure 4: Velocity estimation error (m/s) vs time (s)]

Kalman Filter
Summary
• Estimates state of a system
– Position, velocity, timing
– Any other continuous state variable

• The Kalman Filter maintains


– Mean state vector
– Matrix of state uncertainty (Covariance Matrix)

• Two Step Process


– Sequential prediction
– Measurement update

• Standard Kalman filter is linear-Gaussian


– Linear system dynamics, linear sensor model
– Additive Gaussian noise (independent)
– Nonlinear extensions: extended KF, unscented KF


Particle Filter

Particle Filters
Motivation
• The Kalman Filter
– Linear quadratic estimator (recursive analytical solution)
– Provides an optimal Bayesian estimate under conditions of linear-Gaussian
uncertainty

• Numerical vs analytical methods


– Kalman Filter – An exact solution to an approximate model
– Particle Filter – An approximate solution to the exact model
– Numerical techniques allow the investigation of complex systems

The Multi-Modal Tracking Problem
Underlying Challenge
• The Particle Filter is a numerical
technique used for complex systems
– Non Linear
– Non-Gaussian
– Multi-modal

• Our goal is to estimate the system state given all the observed properties
up until that time step

Z. Wu et al., "Coupling Detection and Data Association for Multiple Object
Tracking," CVPR 2012

• Estimate the posterior distribution
P(x_0:t | z_1:t)

The Particle Filter
Sample Based PDF Representation
• Monte-Carlo characterization of the PDF
– Represent the posterior density using a set of random samples (particles) drawn
from the PDF
P(x_t | z_1:t)

– A large number of particles (N) is equivalent to a functional description of the PDF

– As N → ∞ we approach the optimal Bayesian estimate (i.e. we may need a lot of
particles)

• Discrete approximation of the state estimate PDF
P(x_t | z_1:t) ≈ Σ_i w_t^i δ(x_t − x_t^i)
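The sample-based representation can be made concrete with a small Monte-Carlo sketch (an illustration, not lecture code): once the posterior is represented by particles, any statistic of the PDF reduces to a plain average over the samples. The Gaussian below stands in for a posterior we happen to be able to sample from; its mean and spread are assumed values for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw N particles from a stand-in posterior: Gaussian, mean 3, std 2
N = 100_000
particles = rng.normal(loc=3.0, scale=2.0, size=N)

# Expectations under the PDF become simple averages over the particles
mean_est = particles.mean()           # approximates E[x]      (true: 3.0)
var_est = particles.var()             # approximates Var[x]    (true: 4.0)
p_positive = (particles > 0).mean()   # approximates P(x > 0)
print(mean_est, var_est, p_positive)
```

Rerunning with smaller N shows the approximation degrading: the error of such estimates shrinks only as 1/√N, which is why "we may need a lot of particles".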

The Particle Filter
Predict / Update
• Step 1 – Predict
– Uniformly weighted random measure
approximates the prediction density
P(x_t | z_1:t−1)

• Step 2 – Likelihood Function
– Compute each particle weight
w_t^i ∝ p(z_t | x_t^i)

• Step 3 – Resample
– Resample particles based on their weight

• Step 4 – Update
– ‘Move’ the particles based on prediction or
motion model

• Iterate for each measurement
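The predict / weight / resample loop above can be sketched for a 1-D target (an illustrative bootstrap particle filter, not lecture code; the random-walk motion model, the Gaussian sensor model, and all parameter values are assumptions). In this bootstrap form, the ‘move’ of Step 4 coincides with drawing from the motion model in the predict step:

```python
import numpy as np

rng = np.random.default_rng(1)

def pf_step(particles, z, motion_std=0.5, meas_std=1.0):
    """One predict / weight / resample cycle of a bootstrap particle filter
    for a 1-D random-walk target observed by a direct position sensor."""
    # Predict ('move'): push each particle through the noisy motion model
    particles = particles + rng.normal(0.0, motion_std, size=particles.shape)
    # Likelihood: weight each particle by p(z | x_i), a Gaussian sensor model
    w = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
    w /= w.sum()
    # Resample: draw particles in proportion to their weights
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

particles = rng.uniform(-20.0, 20.0, size=2000)  # diffuse prior over x (m)
for z in [4.8, 5.1, 5.0, 4.9]:                   # measurements near x = 5
    particles = pf_step(particles, z)
print(particles.mean())  # posterior mean, concentrated near the target
```

Nothing in the loop assumes linearity or Gaussianity of the posterior: swapping in a nonlinear motion model or a multi-modal likelihood only changes the two model lines.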


Handbook of Multi-Sensor Data Fusion, M. Liggins et al.

Moving Robot
Particle Filter Example

[Four panels: particle approximation of P(A) vs. x (m), showing the particle set evolving as the robot moves.]

Particle Filter
Explained (hopefully)
[Plot: particle approximation of P(A) vs. x (m).]

• Step 1
– Uniformly weighted random measure approximates the prediction density
P(x_t | z_1:t−1)
• Step 2 – Likelihood Function
– Compute each particle weight
w_t^i ∝ p(z_t | x_t^i)

Particle Filter
New Measurement
[Plot: particle approximation of P(A) vs. x (m) after a new measurement.]

• Step 3 – Resample
– Resample particles based on their weight

• Step 4 – Update
– ‘Move’ the particles based on prediction or motion model

Particle Filter
New Measurement
[Plot: particle approximation of P(A) vs. x (m) after a new measurement.]

• Step 3 – Resample
– Resample particles based on their weight

• Step 4 – Update
– ‘Move’ the particles based on prediction or motion model

• And Repeat …
The Particle Filter
Example – Single Moving Gaussian

Probabilistic Tracking and Reconstruction of 3D Human Motion in Monocular Video
Sequences, Hedvig Sidenbladh

Particle Filter
Obtaining the State Estimate
• The state variables
– Determined from the particle estimate of the PDF

• Weighted mean

• Maximum a posteriori probability (MAP) estimate


– Robust MAP estimate (mean with window about the MAP)
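Extracting these estimators from a weighted particle set is a one-liner each. A hedged sketch (the bimodal particle set and its weights are made-up values for illustration, not lecture data):

```python
import numpy as np

# Toy weighted particle set approximating a bimodal posterior:
# a weak mode near x = -5 and a dominant mode near x = +5
particles = np.array([-5.2, -5.0, -4.9, 4.8, 5.0, 5.1, 5.2])
weights = np.array([0.05, 0.05, 0.05, 0.20, 0.25, 0.25, 0.15])

# Weighted mean: E[x] under the particle approximation
mean_est = np.sum(weights * particles)

# MAP estimate: the particle carrying the highest weight
map_est = particles[np.argmax(weights)]

print(mean_est, map_est)
```

Note how the weighted mean falls between the two modes (a point of low probability), while the MAP estimate picks the dominant mode: this is why MAP-style estimators matter for multi-modal PDFs.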

Particle Filter
Example

http://robots.stanford.edu/

Kalman Filter
Summary
• Estimates state of a system
– Position, velocity, timing
– Any other continuous state variable

• The Kalman Filter maintains


– Mean state vector
– Matrix of state uncertainty (Covariance Matrix)

• Two Step Process


– Sequential prediction
– Measurement update

• Standard Kalman filter is linear-Gaussian


– Linear system dynamics, linear sensor model
– Additive Gaussian noise (independent)
– Nonlinear extensions: extended KF, unscented KF

Particle Filter
Summary
• Estimates state of a system
– Position, velocity, timing
– Any other continuous state variable (and discrete state variables)

• The Particle Filter maintains


– Set of particles providing an estimate of the state PDF

• Two Step Process


– Predictive sampling
– Computation of importance weighting and subsequent resampling

• Standard Particle filter


– Fully non-linear
– Simple to implement
– Computationally Expensive


Summary

Key Learning Points

• Signals
– Signals allow us to estimate the physical state of a target
• Fourier Transforms
– Fourier transforms are used to estimate the energy content of a signal
– Energy content is used to estimate physical state
• Sensor Systems
– Sensors are used to measure signals of different content (spectral and semantic)
– Interpret different physical states
• Kalman Filter
– Optimal Bayesian estimate under conditions of linear-Gaussian uncertainty
• Particle Filter
– Evolving set of particles providing an estimate of the state PDF
– Multi-modal, non-Gaussian Probability Density Function

Conclusions

• Sensor systems allow us to infer the physical and semantic state of the
world around us
– Signal processing is used to provide a digital representation of physical properties
– Sensor processing algorithms allow us to examine signals

• Sensor systems play an integral role in robotics and autonomous systems


– Dynamic environment requires real time sensing
– Safe operation requires both accurate and precise sensor information
– Situational awareness requires diverse range of sensor capabilities

• Filtering and Fusion allows us to


– Improve state estimates
– Combine information from disparate sources (location and capabilities)
– Estimate state variables which are otherwise unobservable



Daniel Clarke

Multi-Sensor Data Fusion Group


fortiss GmbH
An-Institut Technische Universität München
Guerickestraße 25 · 80805 München · Germany

tel +49 89 3603522 0 fax +49 89 3603522 50

clarke@fortiss.org
www.fortiss.org

101 Data Fusion © fortiss GmbH München, 2 July 2013
