
Beyond the Kalman Filter:

Particle filters for tracking applications

N. J. Gordon
Tracking and Sensor Fusion Group
Intelligence, Surveillance and Reconnaissance Division
Defence Science and Technology Organisation
PO Box 1500, Edinburgh, SA 5111, AUSTRALIA.
Neil.Gordon@dsto.defence.gov.au

N.J. Gordon : Lake Louise : October 2003 p. 1/47



Contents
General PF discussion
History
Review
Tracking applications
Sonobuoy
TBD
DBZ




What is tracking?
Use models of the real world to
estimate the past and present
predict the future
Achieved by extracting underlying information from
sequence of noisy/uncertain observations
Perform inference on-line
Evaluate evolving sequence of probability distributions




Recursive filter
System model
x_t = f_t(x_{t-1}, v_t)  <=>  p(x_t | x_{t-1})

Measurement model
y_t = h_t(x_t, w_t)  <=>  p(y_t | x_t)

Information available
y_{1:t} = (y_1, ..., y_t)
p(x_0)

Want
p(x_{0:t+i} | y_{1:t})
and especially
p(x_t | y_{1:t})




Recursive filter
Prediction

p(x_t | y_{1:t-1}) = ∫ p(x_t | x_{t-1}) p(x_{t-1} | y_{1:t-1}) dx_{t-1}

Update

p(x_t | y_{1:t}) = p(y_t | x_t) p(x_t | y_{1:t-1}) / p(y_t | y_{1:t-1})

p(y_t | y_{1:t-1}) = ∫ p(y_t | x_t) p(x_t | y_{1:t-1}) dx_t

Alternatively ...

p(x_{0:t} | y_{1:t}) = p(x_{0:t-1} | y_{1:t-1}) p(y_t | x_t) p(x_t | x_{t-1}) / p(y_t | y_{1:t-1})

p(x_t | y_{1:t}) = ∫ p(x_{0:t} | y_{1:t}) dx_{0:t-1}
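The prediction-update recursion can be exercised numerically. A minimal sketch (not from the talk; the random-walk model, grid limits, and noise levels are assumed for illustration) on a discrete state grid, where the prediction integral becomes a matrix-vector product:

```python
import numpy as np

# Discrete-grid recursive Bayes filter for an assumed 1-D random-walk state
# observed in Gaussian noise. The grid turns the prediction integral into a sum.
grid = np.linspace(-10.0, 10.0, 401)           # state grid
dx = grid[1] - grid[0]

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# p(x_t | x_{t-1}) as a transition matrix over the grid
trans = gauss(grid[:, None], grid[None, :], 0.5)
trans /= trans.sum(axis=0, keepdims=True) * dx  # each column integrates to 1

post = gauss(grid, 0.0, 2.0)                    # p(x_0)
for y in [0.5, 1.0, 1.3]:                       # assumed measurements y_t
    prior = (trans @ post) * dx                 # prediction integral
    post = gauss(y, grid, 1.0) * prior          # update: likelihood * prior
    post /= post.sum() * dx                     # divide by p(y_t | y_{1:t-1})

estimate = np.sum(grid * post) * dx             # posterior mean of x_t
```

The same two lines inside the loop are exactly the prediction and update equations above, just with integrals replaced by Riemann sums.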




Tracking : what are the problems?
On-line processing
Target manoeuvres
Missing measurements
Spurious measurements
Multiple objects and/or sensors
Finite sensor resolution
Prior constraints
Signature information

Nonlinear/non-Gaussian models




Analytic approximations
EKF and variants : linearisation, Gaussian approx,
unimodal
Score function moment approximation : Masreliez (75),
West (81), Fahrmeir (92), Pericchi, Smith (92)
Series based approximation to score functions : Wu,
Cheng (92)




Numerical approximations
Discrete grid: Pole, West (88)
Piecewise pdf: Kitagawa (87), Kramer, Sorenson (88)
Series expansion: Sorenson, Stubberud (68)
Gaussian mixtures: Sorenson, Alspach (72), West (92)
Unscented filter: Julier, Uhlmann (95)




Monte Carlo Approximations
Sequential Importance Sampling (SIS): Handschin &
Mayne, Automatica, 1969.
Improved SIS: Zaritskii et al., Automation and Remote
Control, 1975.
Rao-Blackwellisation: Akashi & Kumamoto,
Automatica, 1977...

=> Too computationally demanding 20-30 years ago




Sequential Monte Carlo (SMC)
SMC methods lead to an estimate of the complete probability distribution
Approximation centred on the pdf rather than compromising the state space model
Known as particle filters, SIR filters, bootstrap filters, Monte Carlo filters, Condensation, etc.




Why random samples?
nonlinear/non-Gaussian        constraints
whole pdf                     association hypotheses
moments/quantiles             independent over time
HPD interval                  multiple models trivial
re-parameterisation           scalable - how big is ∞?
parallelisable




Comparison
Kalman filter                    Particle filter
analytic solution                sequential MC solution
restrictive assumptions          based on simulation
deduce state from measurement    no modelling restrictions
KF optimal                       predicts measurements from states
EKF sub-optimal                  optimal (with ∞ computational resources)




Book Advert
Sequential Monte Carlo methods in practice
Editors: Doucet, de Freitas, Gordon
Springer-Verlag (2001)
Theoretical Foundations
Efficiency Measures
Applications :
Target tracking, missile guidance, image tracking, terrain referenced
navigation, exchange rate prediction, portfolio allocation, in-situ
ellipsometry, pollution monitoring, communications and audio engineering.




Useful Information
Books
Sequential Monte Carlo methods in practice, Doucet, de Freitas, Gordon,
Springer, 2001.
Monte Carlo strategies in scientific computing, Liu, Springer, 2001.
Beyond the Kalman filter : Tracking applications of particle filters, Ristic,
Arulampalam, Gordon, Artech House, 2003?
Papers
On sequential Monte Carlo sampling methods for Bayesian filtering,
Statistics and Computing, Vol 10, No 3, pp 197-208, 2000.
IEEE Trans. Signal Processing special issue, February 2002.
Web site
www.cs.ubc.ca/~nando/smc/index.html (includes software)




Particle Filter
Represent uncertainty over x_{1:t} using a diversity of weighted particles

{x^i_{1:t}, w^i_t}_{i=1}^N

Ideally:

x^i_{1:t} ~ p(x_{1:t} | y_{1:t}) = p(x_{1:t}, y_{1:t}) / p(y_{1:t})

where

p(y_{1:t}) = ∫ p(x_{1:t}, y_{1:t}) dx_{1:t}

What if we can't sample p(x_{1:t} | y_{1:t})?




Particle Filter - Importance Sampling
Sample from a convenient proposal distribution q(x_{1:t} | y_{1:t})
Use importance sampling to modify weights

∫ p(x_{1:t} | y_{1:t}) f(x_{1:t}) dx_{1:t}
    = ∫ [p(x_{1:t} | y_{1:t}) / q(x_{1:t} | y_{1:t})] q(x_{1:t} | y_{1:t}) f(x_{1:t}) dx_{1:t}
    ≈ Σ_{i=1}^N w^i_t f(x^i_{1:t})

where

x^i_{1:t} ~ q(x_{1:t} | y_{1:t}),    w^i_t ∝ p(x^i_{1:t} | y_{1:t}) / q(x^i_{1:t} | y_{1:t})




Particle Filter - Importance Sampling
Pick a convenient proposal
Define the un-normalised weight:

w̃^i_t = p(x^i_{1:t}, y_{1:t}) / q(x^i_{1:t} | y_{1:t})

Can then calculate an approximation to p(y_{1:t})

p(y_{1:t}) ≈ (1/N) Σ_{i=1}^N w̃^i_t

The normalised weight is

w^i_t = w̃^i_t / Σ_{j=1}^N w̃^j_t  ∝  p(x^i_{1:t} | y_{1:t}) / q(x^i_{1:t} | y_{1:t})
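These weight formulas are easy to check numerically. A sketch (the Gaussian target and proposal are toy choices of mine, not from the talk): estimate E_p[x] under a Gaussian target p by sampling from a broader Gaussian proposal q and self-normalising the weights:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Target p = N(1, 0.5^2), proposal q = N(0, 2^2).
x = rng.normal(0.0, 2.0, size=N)                 # x^i ~ q

def logpdf(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)

# Un-normalised weights w~ proportional to p/q at the samples
w_tilde = np.exp(logpdf(x, 1.0, 0.5) - logpdf(x, 0.0, 2.0))
w = w_tilde / w_tilde.sum()                      # normalised weights

mean_est = np.sum(w * x)                         # approximates E_p[x] = 1
```

Because only the ratio p/q enters, both densities need be known only up to a constant; the normalisation step absorbs it, which is exactly why the un-normalised/normalised split above is useful.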




Particle Filter - SIS
To perform Sequential Importance Sampling (SIS), factorise the proposal:

q(x_{1:t} | y_{1:t}) := q(x_{1:t-1} | y_{1:t-1}) · q(x_t | x_{t-1}, y_t)
                        [keep existing path]      [extend path]

The un-normalised weight then takes the appealing form

w̃^i_t = p(x^i_{1:t}, y_t | y_{1:t-1}) / q(x^i_{1:t} | y_{1:t})
      = [p(x^i_{1:t-1} | y_{1:t-1}) / q(x^i_{1:t-1} | y_{1:t-1})] · [p(x^i_t, y_t | x^i_{1:t-1}) / q(x^i_t | x^i_{t-1}, y_t)]
      = w^i_{t-1} · p(y_t | x^i_t) p(x^i_t | x^i_{t-1}) / q(x^i_t | x^i_{t-1}, y_t)
                    [incremental weight]
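With the prior as proposal, q(x_t | x_{t-1}, y_t) = p(x_t | x_{t-1}), the incremental weight reduces to the likelihood p(y_t | x^i_t). A minimal SIS step under that choice (the linear-Gaussian toy model is an assumption of mine, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5000

# Assumed toy model: x_t = 0.9 x_{t-1} + N(0, 0.5^2), y_t = x_t + N(0, 1)
x = rng.normal(0.0, 1.0, size=N)       # particles at t-1
w = np.full(N, 1.0 / N)                # weights at t-1

def sis_step(x, w, y, rng):
    # Extend each path by sampling the prior proposal p(x_t | x_{t-1})
    x_new = 0.9 * x + rng.normal(0.0, 0.5, size=x.size)
    # Incremental weight = likelihood p(y_t | x_t^i); then renormalise
    incr = np.exp(-0.5 * (y - x_new) ** 2)   # unit-variance measurement noise
    w_new = w * incr
    return x_new, w_new / w_new.sum()

x, w = sis_step(x, w, y=1.2, rng=rng)
posterior_mean = np.sum(w * x)
```

Repeating `sis_step` without ever resampling is exactly the setting in which the degeneracy discussed on the next slide appears: after a few steps most of the weight concentrates on a handful of particles.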




Illustration of SIS




Illustration of SIS - data conflict




SIS
Problem: Whatever the importance function, degeneracy is observed (Kong, Liu and Wong, 1994).
Introduce a selection scheme to multiply/discard particles x^i_{0:t} with respectively high/low importance weights
Resampling maps the weighted random measure (x^i_{0:t}, w^i_t) onto the equally weighted random measure (x^i_{0:t}, N^{-1})
Scheme generates N_i children such that Σ_{i=1}^N N_i = N and satisfies E(N_i) = N w^i_t
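One scheme satisfying E(N_i) = N w^i_t is systematic resampling: a single uniform offset generates N equally spaced pointers into the cumulative weights. A sketch (this is one of several valid schemes, not necessarily the one used in the talk):

```python
import numpy as np

def systematic_resample(weights, rng):
    """Return N child indices with E(count of index i) = N * weights[i]."""
    N = len(weights)
    positions = (rng.uniform() + np.arange(N)) / N   # one draw, N pointers
    return np.searchsorted(np.cumsum(weights), positions)

rng = np.random.default_rng(2)
w = np.array([0.5, 0.3, 0.1, 0.1])
idx = systematic_resample(w, rng)
counts = np.bincount(idx, minlength=len(w))          # children per particle
```

Because the pointers are equally spaced, each count N_i can only be the floor or ceiling of N w^i, so the scheme is unbiased and has low resampling variance compared with plain multinomial draws.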




Illustration of SIR




Illustration of SIR




Ingredients for Particle filter
Importance sampling function
  Prior p(x_t | x^(i)_{t-1})
  Optimal p(x_t | x^(i)_{t-1}, y_t)
  UKF, linearised EKF, ...
Redistribution scheme
  Multinomial
  Deterministic
  Residual
  Stratified
Careful initialisation procedure (for efficiency)
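Putting the ingredients together, the prior proposal plus multinomial redistribution gives the bootstrap/SIR filter. A compact sketch (the scalar linear-Gaussian model and its parameters are assumptions of mine for illustration):

```python
import numpy as np

def bootstrap_filter(ys, N, rng):
    """SIR with prior proposal p(x_t | x_{t-1}) and multinomial resampling.
    Assumed toy model: x_t = 0.9 x_{t-1} + N(0, 0.5^2), y_t = x_t + N(0, 1)."""
    x = rng.normal(0.0, 1.0, size=N)                 # initialisation from p(x_0)
    means = []
    for y in ys:
        x = 0.9 * x + rng.normal(0.0, 0.5, size=N)   # propose from the prior
        w = np.exp(-0.5 * (y - x) ** 2)              # weight by likelihood
        w /= w.sum()
        means.append(np.sum(w * x))                  # estimate before resampling
        x = x[rng.choice(N, size=N, p=w)]            # multinomial redistribution
    return np.array(means)

rng = np.random.default_rng(3)
est = bootstrap_filter(ys=[0.2, 0.5, 0.9, 1.1], N=4000, rng=rng)
```

Swapping the proposal line or the redistribution line changes which variant of the filter this is; the surrounding loop stays the same.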

Improvements to SIR
To alleviate degeneracy problems many other methods have been proposed
Local linearisation (Doucet, 1998; Pitt & Shephard, 1999) using the EKF to estimate the importance distribution, or the UKF (Doucet et al, 1999)
Rejection methods (Müller, 1991; Hürzeler & Künsch, 1998; Doucet, 1998; Pitt & Shephard, 1999)
Auxiliary particle filters (Pitt & Shephard, 1999)
Kernel smoothing (Gordon, 1993; Liu & West, 2000; Musso et al, 2000)
MCMC methods (Müller, 1992; Gordon & Whitby, 1995; Berzuini et al, 1997; Gilks & Berzuini, 1999; Andrieu et al, 1999)
Bridging densities (Clapp & Godsill, 1999)




Auxiliary SIR - ASIR
Introduced by Pitt and Shephard 1999.
Use importance sampling function q(x_t, i | y_{1:t})
Auxiliary variable i refers to the index of a particle at time t-1
Importance distribution chosen to satisfy

q(x_t, i | y_{1:t}) ∝ p(y_t | μ^i_t) p(x_t | x^i_{t-1}) w^i_{t-1}

μ^i_t is some characterisation of x_t given x^i_{t-1},
e.g. μ^i_t = E(x_t | x^i_{t-1}) or μ^i_t ~ p(x_t | x^i_{t-1})
This gives

w^j_t ∝ w^{i_j}_{t-1} · p(y_t | x^j_t) p(x^j_t | x^{i_j}_{t-1}) / q(x^j_t, i_j | y_{1:t}) = p(y_t | x^j_t) / p(y_t | μ^{i_j}_t)



ASIR
Naturally uses points at t-1 which are close to measurement y_t
If process noise is small then ASIR is less sensitive to outliers than SIR
This is because the single point μ^i_t characterises p(x_t | x^i_{t-1}) well
But if process noise is large then ASIR can degrade performance
Since a single point μ^i_t does not characterise p(x_t | x^i_{t-1})




Regularised PF - RPF
Resampling introduced to reduce degeneracy
But it also reduces diversity
RPF proposed as a solution
Uses a continuous kernel based approximation

p̂(x_t | y_{1:t}) ≈ Σ_{i=1}^N w^i_t K_h(x_t - x^i_t)
i=1




RPF




RPF
Kernel K(·) and bandwidth h chosen to minimise MISE

MISE(p̂) = E[ ∫ {p̂(x_t | y_{1:t}) - p(x_t | y_{1:t})}² dx_t ]

For equally weighted samples, the optimal choice is the Epanechnikov kernel

K_opt(x) = (n_x + 2)/(2 c_{n_x}) (1 - ||x||²)   if ||x|| < 1
         = 0                                     otherwise

(n_x the state dimension, c_{n_x} the volume of the unit hypersphere in R^{n_x})
Optimal bandwidth can be obtained as a function of the underlying pdf
Assume this is Gaussian with unit covariance matrix:

h_opt = [8 c_{n_x}^{-1} (n_x + 4)(2√π)^{n_x}]^{1/(n_x+4)} N^{-1/(n_x+4)}
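These formulas can be coded directly. A sketch of the optimal bandwidth under the unit-Gaussian assumption (taking c_{n_x} as the unit-hypersphere volume is my reading of the slide's notation):

```python
import math

def unit_ball_volume(n):
    # Volume of the unit hypersphere in R^n
    return math.pi ** (n / 2) / math.gamma(n / 2 + 1)

def h_opt(n_x, N):
    """Optimal kernel bandwidth for the RPF, assuming the underlying
    density is Gaussian with unit covariance (as on the slide)."""
    c = unit_ball_volume(n_x)
    A = (8.0 / c * (n_x + 4) * (2.0 * math.sqrt(math.pi)) ** n_x) ** (1.0 / (n_x + 4))
    return A * N ** (-1.0 / (n_x + 4))
```

As a sanity check, for n_x = 1 the constant A comes out near 2.345, the familiar one-dimensional Epanechnikov rule-of-thumb factor in h = A σ N^{-1/5}.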




MCMC Moves
RPF moves are blind
Instead, introduce a Metropolis style acceptance step
Resampled x^R_k and created x^R_{1:k} = {x^R_k, x^R_{1:k-1}}
Resampled x^R_k and then sampled from a proposal distribution:

x^P_k ~ q(· | x^R_k) and created x^P_{1:k} = {x^P_k, x^R_{1:k-1}}

Assume q(· | ·) symmetric

x_{1:k} = x^P_{1:k} with probability α, x^R_{1:k} otherwise

α = min(1, [p(x^P_{1:k} | y_{1:k}) q(x^R_k | x^P_k)] / [p(x^R_{1:k} | y_{1:k}) q(x^P_k | x^R_k)])
  = min(1, [p(y_k | x^P_k) p(x^P_k | x^R_{k-1})] / [p(y_k | x^R_k) p(x^R_k | x^R_{k-1})])




Tracking dim targets
Detection and tracking of low SNR targets - better not to threshold the sensor data !
The concept: track-before-detect
Conventional TBD approaches:
- Hough transform (Carlson et al, 1994)
- dynamic programming (Barniv, 1990; Arnold et al, 1993)
- maximum likelihood (Tonissen, 1994)
Performance is improved by 3-5 dB in comparison to the MHT (which uses thresholded data).




Recursive Bayesian TBD
Drawbacks of conventional TBD approaches: batch processing; prohibit or penalise deviations from straight line motion; require enormous computational resources.
A recursive Bayesian TBD (Salmond, 2001), implemented as a particle filter:
no need to store/process multiple scans
target motion - stochastic dynamic equation
valid for non-Gaussian and structured background noise
the effects of the point spread function, finite resolution, and unknown/fluctuating SNR are accommodated
target presence and absence explicitly modelled
Run Demo




Mathematical formulation
Target state vector

x_k = [x_k  ẋ_k  y_k  ẏ_k  I_k]^T

State dynamics:

x_{k+1} = f_k(x_k, v_k)

Target existence - two state Markov chain, E_k ∈ {0, 1}, with TPM

Π = [ 1-P_b    P_b  ]
    [ P_d    1-P_d  ]
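The existence chain is easy to simulate. A sketch with assumed birth/death probabilities (the values P_b and P_d here are illustrative, not from the talk):

```python
import numpy as np

def simulate_existence(T, Pb, Pd, rng):
    """Simulate the two-state target-existence chain E_k in {0, 1}.
    Pb = birth probability (0 -> 1), Pd = death probability (1 -> 0),
    matching the rows of the TPM above. Starts absent (E_0 = 0)."""
    E = np.zeros(T, dtype=int)
    for k in range(1, T):
        u = rng.uniform()
        if E[k - 1] == 0:
            E[k] = 1 if u < Pb else 0
        else:
            E[k] = 0 if u < Pd else 1
    return E

rng = np.random.default_rng(5)
E = simulate_existence(30, Pb=0.05, Pd=0.05, rng=rng)
```

In the PF-TBD each particle carries its own existence variable alongside the kinematic state, and the filter's probability of target presence is just the weighted fraction of particles with E_k = 1.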




Mathematical formulation (Cont'd)
Sensor model: a 2D map (image) of the region, with resolution cell sizes Δx × Δy.
At each resolution cell (i, j) the measured intensity is:

z^(i,j)_k = h^(i,j)_k(x_k) + w^(i,j)_k   if target present
          = w^(i,j)_k                    if target absent

where

h^(i,j)_k(x_k) ≈ (Δx Δy I_k)/(2πΣ²) exp( -[(iΔx - x_k)² + (jΔy - y_k)²] / (2Σ²) )
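A sketch of this point-spread contribution over a frame of cells (the cell sizes and blur parameter Σ are assumed values; the symbol names follow my reading of the slide):

```python
import numpy as np

def cell_intensity(i, j, xk, yk, Ik, dx=1.0, dy=1.0, sigma=0.7):
    """Mean intensity h_k^{(i,j)}(x_k) contributed to resolution cell (i, j)
    by a target at (xk, yk) with intensity Ik, blurred by the sensor point
    spread function (sigma plays the role of the slide's capital Sigma)."""
    d2 = (i * dx - xk) ** 2 + (j * dy - yk) ** 2
    return dx * dy * Ik / (2 * np.pi * sigma ** 2) * np.exp(-d2 / (2 * sigma ** 2))

# A 20x20 frame of mean intensities for an assumed target at (4.2, 7.7), Ik = 20
ii, jj = np.meshgrid(np.arange(20), np.arange(20), indexing="ij")
frame = cell_intensity(ii, jj, 4.2, 7.7, 20.0)
```

Because the target energy is smeared over neighbouring cells, thresholding a single cell throws information away; the TBD likelihood instead compares the whole measured frame against this predicted pattern for each particle.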




Example
30 frames, target present in frames 7 to 22
SNR = 6.7 dB (unknown)
20 × 20 cells
[Image panels: Frame 2, Frame 7, Frame 12, Frame 17, Frame 22, Frame 27]




Particle filter output (6 states)
Figures suppressed to reduce file size.




Particle filter output (6 states)
[Figure: left panel - probability vs frame number; right panel - true and estimated target path in the x-y plane, with the start point marked]




PF based TBD - Performance
Derived CRLBs (as a function of SNR)
Compared PF-TBD to CRLBs
Detection and Tracking reliable at 5 dB (or higher)




Sonobuoy and Submarine
Noisy bearing measurements from drifting sensors
Uncertainty in sensor locations
Sensor loss
High proportion of spurious bearing measurements
Run demo




Blind Doppler
Blind Doppler zone used to filter out ground clutter + ground moving targets
A simple EP measure against any CW or pulse Doppler radar
Causes track loss
Aided by on-board RWR or ESM




Tracking with hard constraints
Prior information on sensor constraints
Posterior pdf is truncated (non-Gaussian)
Example:
2-D tracking with CV model
(r, θ, ṙ) measurements
p_d < 1
EKF and Particle Filter with identical gating and track scoring




EKF
[Figure: (a) trajectory in X-Y [km]; (b) range-rate [km/h] vs time [s]; (c) track score vs time [s], showing track loss]




Particle Filter
[Figure: (a) trajectory in X-Y [km] with the cloud of particles at t = 108 s; (b) range-rate [km/h] vs time [s]; (c) track score vs time [s]]




Track continuity
[Figure: probability of track maintenance vs Doppler blind zone limit [km/h] for PF and EKF, at T = 2 s and T = 4 s]




Problems hindering SMC
Convergence results
Becoming available (Del Moral, Crisan, Chopin, Lyons, Doucet ...)
Communication bandwidth
Large particle sets impossible to send
Interoperability
Need to integrate with varied tracking algorithms
Computation
Expensive so look to minimise Monte Carlo
Multi-target problems
Rao-Blackwellisation




Final comments
Sequential Monte Carlo methods
Optimal filtering for nonlinear/non-Gaussian models
Flexible/parallelisable
Not a black-box: efficiency depends on careful design
If the Kalman filter is appropriate for your application, use it
A Kalman filter is a (one) particle filter

