Mestrado em Engenharia Electrotécnica e de Computadores
Mestrado em Tecnologias de Informação Visual

Visão por Computador
Assignment n. 6
Visual Tracking using Particle Filters
1 Estimators
Many problems in science and engineering require the estimation of a set of variables of some system from a sequence of noisy measurements made on variables that are related to that system. When such estimation is to be done by a computer, a state-space model of the dynamic system written in a discrete-time formulation is the most adequate choice. This means that the evolution of the system is modelled using difference equations, and the measurements are assumed to be available at discrete times. All the attention is focused on the state vector, which should contain all the information relevant to describe the system.

In a (visual) tracking problem, this information is normally related to the kinematic characteristics of the target, and the measurement vector represents noisy observations that are somehow related to the state vector. Although this is not a requirement, the measurement vector is frequently of lower dimension than the state vector.

Two models are required to analyse and make inferences about a dynamic system. The first, known as the system model, describes the evolution of the state with time; the second relates the noisy measurements to the state and is known as the measurement model. Different approaches can be used to derive estimators for these kinds of problems, and frequently the results are the same. Here we are interested in the so-called probabilistic approaches, which produce estimates in the form of probability distributions instead of deterministic values.

The probabilistic state-space formulation and the requirement to update the state information upon the reception of each new measurement
Visão por Computador, Paulo Menezes, 2011, DEEC
are well suited to a Bayesian approach. In such an approach one attempts to construct the posterior probability density function (pdf) of the state given all the available information, which includes the set of received measurements. For the cases where an estimate must be obtained whenever a new measurement is received, a recursive filter is an adequate solution. This filter is normally divided into two stages: (1) prediction and (2) update (or correction). In the first stage, the system model is used to predict the state pdf at the next measurement time from the previous one. Due to the presence of noise (which models the unknown disturbances that affect the system), the predicted pdf is generally translated, deformed and spread. The reception of a new measurement permits the adjustment of this pdf during the update operation. This is done using Bayes' theorem as the mechanism to update the knowledge about the system state upon the reception of new information.

Different filtering algorithms can be derived from the Bayesian formulation, the Kalman filter (or its extended version) being one of the most used. Its requirements are that the models be linear (or differentiable) and that the involved noise distributions be Gaussian. Particle filters are, in turn, particularly well suited to vision-based tracking applications, in particular due to the non-Gaussian characteristics of the measurement functions involved. They are also known as Monte Carlo methods, as they rely on sets of random samples that approximate the probability distributions we want to estimate.
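To make the two stages concrete, here is a minimal sketch (not part of the assignment) of the linear-Gaussian case: a Kalman filter for a hypothetical one-dimensional constant-velocity model, where the state vector is [position, velocity] and only a noisy position is measured, so the measurement vector has lower dimension than the state vector.

```python
import numpy as np

# Hypothetical constant-velocity model: state x = [position, velocity].
F = np.array([[1.0, 1.0],    # system model: position += velocity (dt = 1)
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])   # measurement model: observe position only
Q = 0.01 * np.eye(2)         # process-noise covariance (unknown disturbances)
R = np.array([[0.25]])       # measurement-noise covariance

def predict(x, P):
    """Stage 1: propagate the state pdf through the system model.
    The covariance grows, i.e. the predicted pdf spreads."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Stage 2: adjust the predicted pdf with the new measurement z
    (Bayes' theorem, in closed form for the linear-Gaussian case)."""
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.array([0.0, 0.0]), np.eye(2)
for z in [1.0, 2.1, 2.9, 4.2]:           # noisy positions of a moving target
    x, P = predict(x, P)
    x, P = update(x, P, np.array([z]))
```

After a few measurements the filter tracks both the position (near the last measurement) and the velocity (near 1) even though velocity is never observed directly. When the models are nonlinear or the noise is non-Gaussian, this closed form is no longer available, which is where particle filters come in.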
1.1 Sequential importance sampling algorithm
Suppose that there is a probability distribution $p(x)$ from which it is very difficult to draw samples, but which we know to be proportional to another distribution $\pi(x)$ that can easily be evaluated.

Let $x^{(i)} \sim q(x)$, $i = 1, \dots, N_s$, be samples that are easily generated from a proposal distribution $q(\cdot)$, called the importance density. Then a weighted approximation to $p(\cdot)$ is

$$p(x) \approx \sum_{i=1}^{N_s} w^{(i)} \, \delta(x - x^{(i)}) \qquad (1)$$

where $w^{(i)} \propto \pi(x^{(i)}) / q(x^{(i)})$ is the normalised weight of the $i$-th particle.

Thus, if the samples were drawn from an importance density $q(x_{0:k} \mid z_{1:k})$
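Equation (1) can be checked numerically. In the sketch below the target and proposal are hypothetical choices, not from the text: the target is known only up to a constant, $\pi(x) \propto \exp(-(x-2)^2/2)$, i.e. a $N(2,1)$ density, and the easy-to-sample importance density is $q = N(0, 3^2)$; the weighted samples should then recover the target mean of 2.

```python
import numpy as np

rng = np.random.default_rng(1)
N_s = 100_000

# Target known only up to a constant: pi(x) ∝ exp(-(x-2)^2 / 2), i.e. N(2, 1).
def pi_unnorm(x):
    return np.exp(-0.5 * (x - 2.0) ** 2)

# Easy-to-sample proposal (importance density): q = N(0, 3^2).
def q_pdf(x):
    return np.exp(-0.5 * (x / 3.0) ** 2) / (3.0 * np.sqrt(2.0 * np.pi))

x = rng.normal(0.0, 3.0, size=N_s)   # x^(i) ~ q(x)
w = pi_unnorm(x) / q_pdf(x)          # w^(i) ∝ pi(x^(i)) / q(x^(i))
w /= w.sum()                         # normalise the weights

mean_est = np.sum(w * x)             # weighted estimate of E_p[x]
```

The weights are normalised so they sum to one, matching the definition of $w^{(i)}$ above; `mean_est` comes out close to the true mean 2.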
then the weights are

$$w_k^{(i)} \propto \frac{p(x_{0:k}^{(i)} \mid z_{1:k})}{q(x_{0:k}^{(i)} \mid z_{1:k})} \qquad (2)$$

If we factorise the importance density as

$$q(x_{0:k} \mid z_{1:k}) = q(x_k \mid x_{0:k-1}, z_{1:k}) \, q(x_{0:k-1} \mid z_{1:k-1}) \qquad (3)$$

then we can obtain samples $x_{0:k}^{(i)} \sim q(x_{0:k} \mid z_{1:k})$ by augmenting the existing samples $x_{0:k-1}^{(i)} \sim q(x_{0:k-1} \mid z_{1:k-1})$ with the new state $x_k^{(i)} \sim q(x_k \mid x_{0:k-1}, z_{1:k})$.
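Operationally, the factorisation (3) means each stored sample path is extended by appending one freshly drawn state per time step. A minimal sketch, assuming a hypothetical Gaussian random-walk importance density that ignores the measurements for simplicity:

```python
import numpy as np

rng = np.random.default_rng(3)
N_s = 4

# paths[i] holds the sample path x_{0:k-1}^(i); start from initial states x_0^(i)
paths = [np.array([rng.normal()]) for _ in range(N_s)]

def augment(paths, q_std=0.3):
    """Draw x_k^(i) ~ q(x_k | x_{0:k-1}^(i), z_{1:k}) and append it, so each
    path becomes a sample x_{0:k}^(i).  Here q is a hypothetical random walk
    around the last state (it ignores the measurements for simplicity)."""
    return [np.append(p, p[-1] + rng.normal(0.0, q_std)) for p in paths]

for k in range(3):          # three time steps
    paths = augment(paths)  # each path now holds x_0 ... x_3
```

Each path grows by one state per step, exactly the augmentation the factorisation licenses.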
We start by expressing $p(x_{0:k} \mid z_{1:k})$ in terms of $p(x_{0:k-1} \mid z_{1:k-1})$, $p(z_k \mid x_k)$ and $p(x_k \mid x_{k-1})$:

$$p(x_{0:k} \mid z_{1:k}) = \frac{p(z_k \mid x_{0:k}, z_{1:k-1}) \, p(x_{0:k} \mid z_{1:k-1})}{p(z_k \mid z_{1:k-1})}$$

$$= \frac{p(z_k \mid x_{0:k}, z_{1:k-1}) \, p(x_k \mid x_{0:k-1}, z_{1:k-1}) \, p(x_{0:k-1} \mid z_{1:k-1})}{p(z_k \mid z_{1:k-1})}$$

$$= \frac{p(z_k \mid x_k) \, p(x_k \mid x_{k-1})}{p(z_k \mid z_{1:k-1})} \, p(x_{0:k-1} \mid z_{1:k-1})$$

$$\propto p(z_k \mid x_k) \, p(x_k \mid x_{k-1}) \, p(x_{0:k-1} \mid z_{1:k-1})$$

Substituting these results in the weight update equation (2),

$$w_k^{(i)} \propto \frac{p(z_k \mid x_k^{(i)}) \, p(x_k^{(i)} \mid x_{k-1}^{(i)}) \, p(x_{0:k-1}^{(i)} \mid z_{1:k-1})}{q(x_k^{(i)} \mid x_{0:k-1}^{(i)}, z_{1:k}) \, q(x_{0:k-1}^{(i)} \mid z_{1:k-1})}$$

$$= \frac{p(z_k \mid x_k^{(i)}) \, p(x_k^{(i)} \mid x_{k-1}^{(i)})}{q(x_k^{(i)} \mid x_{0:k-1}^{(i)}, z_{1:k})} \cdot \frac{p(x_{0:k-1}^{(i)} \mid z_{1:k-1})}{q(x_{0:k-1}^{(i)} \mid z_{1:k-1})}$$

$$= \frac{p(z_k \mid x_k^{(i)}) \, p(x_k^{(i)} \mid x_{k-1}^{(i)})}{q(x_k^{(i)} \mid x_{0:k-1}^{(i)}, z_{1:k})} \, w_{k-1}^{(i)}$$

If $q(x_k \mid x_{0:k-1}, z_{1:k}) = q(x_k \mid x_{k-1}, z_k)$, the importance density depends only on the previous state and the current measurement, so we can just store $x_k$ and discard the path $x_{0:k-1}$ and the measurement history $z_{1:k-1}$:

$$w_k^{(i)} \propto w_{k-1}^{(i)} \, \frac{p(z_k \mid x_k^{(i)}) \, p(x_k^{(i)} \mid x_{k-1}^{(i)})}{q(x_k^{(i)} \mid x_{k-1}^{(i)}, z_k)}$$
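Putting the pieces together: with the common (though here assumed) choice of the transition prior as importance density, $q(x_k \mid x_{k-1}, z_k) = p(x_k \mid x_{k-1})$, the ratio in the final expression reduces to the likelihood, so $w_k^{(i)} \propto w_{k-1}^{(i)} \, p(z_k \mid x_k^{(i)})$. A sketch for a hypothetical one-dimensional random-walk model with Gaussian measurement noise:

```python
import numpy as np

rng = np.random.default_rng(2)

def sis_step(particles, weights, z, q_std=0.2, r_std=0.5):
    """One sequential-importance-sampling step for a 1-D random-walk model,
    using the transition prior p(x_k | x_{k-1}) as importance density, so
    w_k ∝ w_{k-1} * p(z_k | x_k)."""
    # draw x_k^(i) ~ p(x_k | x_{k-1}^(i)): random walk with noise q_std
    particles = particles + rng.normal(0.0, q_std, size=particles.shape)
    # weight by the likelihood p(z_k | x_k^(i)): Gaussian measurement noise
    likelihood = np.exp(-0.5 * ((z - particles) / r_std) ** 2)
    weights = weights * likelihood
    weights /= weights.sum()             # renormalise
    return particles, weights

N_s = 5000
particles = rng.normal(0.0, 1.0, size=N_s)   # samples from the prior
weights = np.full(N_s, 1.0 / N_s)
for z in [0.2, 0.4, 0.6]:                    # a few noisy position measurements
    particles, weights = sis_step(particles, weights, z)
estimate = np.sum(weights * particles)       # posterior mean estimate
```

This is plain sequential importance sampling: without a resampling step the weights eventually concentrate on a few particles, which is why practical particle filters add resampling.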