
Hidden Markov Models

Introduction
• Most of the methods we've looked at
  • Have examined images and tried to compute a few pieces of information
  • These give us observations that tell us about the world in some way
• Hidden Markov Models (HMMs)
  • Provide a way of representing a model of what happens to cause these observations
  • We can use this to explain a sequence of observations, such as those made from a sequence of images
Hidden, Markov, Model
• Hidden
  • We can't observe the model directly, but we know that it is there, and we can observe things related to it
• Markov
  • The state of the model at any time depends only on the previous state, and not on those before (is Markovian)
Hidden Markov Models
• A HMM consists of two sets and three functions
• The sets:
  • A set of n states, S = {s1, s2, …, sn}
  • A set of m symbols that can be observed, A = {a1, a2, …, am}
• The functions:
  • A start state function, I(si), giving the prob. of starting in state si
  • A transition function, T(si, sj), giving the prob. of going from state si to state sj
  • An observation function, O(ak, si), giving the prob. of seeing symbol ak in state si
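As a concrete illustration of this structure (not part of the original slides), the two sets and three functions map directly onto plain Python data; every name and number below is a hypothetical placeholder:

```python
# A minimal sketch of an HMM as plain Python data: two sets and three
# probability tables (all values here are illustrative placeholders).
states = ["s1", "s2"]                       # the set S of n states
symbols = ["a1", "a2"]                      # the set A of m observable symbols

I = {"s1": 0.6, "s2": 0.4}                  # I(si): prob. of starting in si
T = {("s1", "s1"): 0.7, ("s1", "s2"): 0.3,  # T(si, sj): prob. of si -> sj
     ("s2", "s1"): 0.4, ("s2", "s2"): 0.6}
O = {("a1", "s1"): 0.9, ("a2", "s1"): 0.1,  # O(ak, si): prob. of seeing ak
     ("a1", "s2"): 0.2, ("a2", "s2"): 0.8}  # while in state si
```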
Example
• Suppose we have a camera watching some people
• We segment the scene into individuals
• We track each person to class their speed as still, slow, or fast
• We use stereo to class their height as low or high
• We expect people to have several actions
  • Sitting, standing, walking, or running
• Each of these is a state in our HMM for that person
• Each state has an expected observation (eg sitting people are still and appear low)
Eg: Sets and Start States
• The sets:
  • The set of states, S, is {sitting, standing, walking, running}
  • The observations, A, are {(still, low), (still, high), (slow, low), …}
• The function I tells us how people enter the scene
  • I(sitting) = 0
  • I(standing) = 0
  • I(walking) = 0.9
  • I(running) = 0.1
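In code, the example's sets and start-state function could look like this (a sketch following the slides' values; each symbol pairs a speed class with a height class):

```python
# The example model's sets and start-state function I, using the values
# given on the slides.
states = ["sit", "stand", "walk", "run"]
# Each observable symbol pairs a speed class with a height class.
symbols = [(speed, height) for speed in ("still", "slow", "fast")
                           for height in ("low", "high")]

I = {"sit": 0.0, "stand": 0.0, "walk": 0.9, "run": 0.1}
```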
Eg: Transition Function
• Tells us how people's actions change
• Given that we saw someone sitting in the last frame, what are they likely to be doing now?
  • They'll probably still be sitting, might have stood up or walked off, but won't have run
• Eg:
  • T(sit, sit) = 0.7
  • T(sit, stand) = 0.2
  • T(sit, walk) = 0.1
  • T(sit, run) = 0
• We repeat this for all other pairs of actions. Note that T(x, x) is often high
Eg: Transition Function
[State transition diagram: the four states with arrows labelled by their probabilities; the self-transitions are Sit 0.7, Stand 0.5, Walk 0.6, Run 0.4]
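As a sketch, the whole transition function can be stored as a nested dictionary. Only the 'sit' row and the self-transitions are given on the slides; the remaining entries below are illustrative guesses, chosen so that each row sums to 1:

```python
# Transition function T(si, sj), stored as T[prev][next]. The 'sit' row
# and the diagonal come from the slides; other entries are illustrative.
T = {
    "sit":   {"sit": 0.7, "stand": 0.2, "walk": 0.1, "run": 0.0},
    "stand": {"sit": 0.3, "stand": 0.5, "walk": 0.1, "run": 0.1},
    "walk":  {"sit": 0.1, "stand": 0.2, "walk": 0.6, "run": 0.1},
    "run":   {"sit": 0.0, "stand": 0.2, "walk": 0.4, "run": 0.4},
}
# Every row must be a probability distribution over the next state.
assert all(abs(sum(row.values()) - 1.0) < 1e-9 for row in T.values())
```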
Eg: Observation Function
• Tells us how each state looks
• Often a bit uncertain because of errors in measurements
• We assume there is a discrete set of measurements made (not necessary, but simpler)
• Eg: A sitting person is almost certainly 'low' and probably 'still' but not 'fast'
  • O((still, low), sit) = 0.6
  • O((slow, low), sit) = 0.3
  • O((fast, low), sit) = 0
  • O((still, high), sit) = 0.1
  • O((slow, high), sit) = 0
  • O((fast, high), sit) = 0
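A corresponding sketch of the observation function, stored here as O[state][symbol] for convenient lookup (note the reversed argument order relative to O(ak, si)). The 'sit' row follows the slides; the other rows are illustrative guesses:

```python
# Observation function, stored as O[state][symbol]. Each row must sum
# to 1 over the six symbols; only the 'sit' row is from the slides.
O = {
    "sit":   {("still", "low"): 0.6, ("slow", "low"): 0.3, ("fast", "low"): 0.0,
              ("still", "high"): 0.1, ("slow", "high"): 0.0, ("fast", "high"): 0.0},
    "stand": {("still", "low"): 0.05, ("slow", "low"): 0.05, ("fast", "low"): 0.0,
              ("still", "high"): 0.7, ("slow", "high"): 0.2, ("fast", "high"): 0.0},
    "walk":  {("still", "low"): 0.0, ("slow", "low"): 0.1, ("fast", "low"): 0.0,
              ("still", "high"): 0.1, ("slow", "high"): 0.7, ("fast", "high"): 0.1},
    "run":   {("still", "low"): 0.0, ("slow", "low"): 0.0, ("fast", "low"): 0.1,
              ("still", "high"): 0.0, ("slow", "high"): 0.1, ("fast", "high"): 0.8},
}
```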
Three Problems
• Given the model, what is the chance of some observation sequence occurring?
• Given the model and an observation sequence, what is the most likely sequence of states to explain that observation?
• Given a set of observations, what parameters is the model likely to have?
Mathematical Tools
• To do this we need some probability notation:
  • P(x) = probability of x happening
  • P(x|y) = probability of x happening given that y has happened
• Rules:
  • P(x and y) = P(x)P(y) if x and y are independent
  • The Markovian property gives us this independence
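To see why this matters: the Markov property lets the probability of a whole state sequence factor into pairwise terms (a worked line in the slides' notation, not from the original):

  P(y1 and y2 and … and yk) = P(y1) × P(y2|y1) × P(y3|y2) × … × P(yk|yk-1)

This product of a start term and one-step terms is exactly the I(…) × T(…) × … form used below.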
Problem 1
• We have a HMM
• We know all its parameters
• We have some observation – a sequence of symbols – that we are interested in
• What is the chance of this sequence occurring given the model?
• This allows us to predict how likely an event is, given our model
Problem 1
• A direct method:
  • Suppose the observation consists of k symbols, X = x1x2…xk
  • This must be caused by a sequence of k states, Y = y1y2…yk
  • If we figure out the chance of taking each path, Yi, and the chance of seeing X with each path, we can find the answer by summing over all paths:
    P(X) = Σi P(X|Yi) P(Yi)
Probability of a Path
• Given a HMM and a path, Yi, what's the probability that we take the path?
• The path is a sequence of states, Yi = yi,1, yi,2, yi,3, …, yi,k
• The probability of taking the path is
  P(Yi) = P(start in yi,1 and
            move from yi,1 to yi,2 and
            move from yi,2 to yi,3 and …
            move from yi,k-1 to yi,k)
Probability of a Path

P(Yi) = P(start in yi,1 and
          move from yi,1 to yi,2 and
          move from yi,2 to yi,3 and …
          move from yi,k-1 to yi,k)
      = I(yi,1) × T(yi,1, yi,2) × T(yi,2, yi,3) × … × T(yi,k-1, yi,k)
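A minimal sketch of this product in code, assuming the I dictionary and the nested T dictionary from the example slides above:

```python
# P(Yi): probability of taking a given path (sequence of states),
# computed as I(y1) times a product of one-step transition terms.
def path_probability(path, I, T):
    p = I[path[0]]                 # I(y_{i,1}): start in the first state
    for prev, nxt in zip(path, path[1:]):
        p *= T[prev][nxt]          # T(y_{t-1}, y_t) for each later step
    return p

# e.g. with the example tables:
# path_probability(["walk", "sit", "sit"], I, T) = 0.9 * T(walk, sit) * 0.7
```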
P(Observation|Path)
• Now that we know how likely each path is, we need to know how likely it is to make the observation along each path
  P(X|Yi) = P(see x1 in state yi,1 and
              see x2 in state yi,2 and …
              see xk in state yi,k)
          = O(x1, yi,1) × O(x2, yi,2) × … × O(xk, yi,k)
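Putting the pieces together, a brute-force sketch of Problem 1: P(X|Yi) as a product of O terms, then the sum over every possible path (reusing path_probability and the I, T, O tables sketched earlier):

```python
from itertools import product

# P(X|Yi): probability of the observed symbols given a particular path.
def observation_probability(obs, path, O):
    p = 1.0
    for symbol, state in zip(obs, path):
        p *= O[state][symbol]      # O(x_t, y_{i,t}) for each step
    return p

# Direct solution: P(X) = sum over all n^k paths of P(X|Yi) * P(Yi).
def observation_likelihood(obs, states, I, T, O):
    return sum(observation_probability(obs, path, O) *
               path_probability(path, I, T)
               for path in product(states, repeat=len(obs)))
```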
Problem 1
• This gives us a direct solution, but
  • If the path is of length k and there are n states, then there are n^k possible paths
  • Each path requires about 2k multiplications
  • So the algorithm is of order k·n^k
  • If n = 4 and k = 10 we need about 10 million computations; for k = 100 we need on the order of 10^62
• We need a more efficient solution; recursive algorithms exist (see the sketch below)
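The standard recursive solution is the forward algorithm (not spelled out on these slides). A minimal sketch, assuming the same I, T, O dictionaries as above: it maintains alpha_t(s) = P(x1…xt and in state s at time t), so the total work is of order k·n² rather than k·n^k:

```python
def forward_likelihood(obs, states, I, T, O):
    # Initialise: alpha_1(s) = I(s) * O(x1, s)
    alpha = {s: I[s] * O[s][obs[0]] for s in states}
    for x in obs[1:]:
        # Recurse: alpha_t(s) = O(x_t, s) * sum_r alpha_{t-1}(r) * T(r, s)
        alpha = {s: O[s][x] * sum(alpha[r] * T[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())     # P(X) = sum over s of alpha_k(s)
```

For short sequences this agrees with the brute-force observation_likelihood above, but it remains tractable for long ones.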
A Vision Example
• HMMs were introduced into computer vision by Yamato et al. 1992
• They trained HMMs to recognise six tennis strokes
• Each HMM is applied independently to the input image sequence
• The HMM most likely to produce the observations is taken to represent the stroke
A Vision Example

• HMMs are by far the most commonly used event detection mechanism
  • Their strength comes from the ability to learn models
• Many variations on the theme exist