MARKOV MODELS
OUTLINE
Markov model
Hidden Markov model (HMM)
Example:
Probability Recap
Conditional probability
Product rule
Chain rule
MOTIVATION
MARKOV CHAIN
[Figure lost in extraction: a motivating Markov chain over words, with transition probabilities (e.g. 0.6, 0.05) for predicting the next word in a sentence, paragraph, or message.]
MARKOV CHAIN: WEATHER EXAMPLE
Design a Markov chain to predict tomorrow's weather using information from previous days.
P(q2, q3 | q1) = P(q2 | q1) P(q3 | q1, q2)
              = P(q2 | q1) P(q3 | q2)        (Markov assumption)
              = P(Sunny | Sunny) P(Rainy | Sunny)
              = 0.8 × 0.05
              = 0.04
Exercise 2: Assume that yesterday's weather was Rainy and today is Cloudy. What is the probability that tomorrow will be Sunny?

P(q3 | q1, q2) = P(q3 | q2)
              = P(Sunny | Cloudy)
              = 0.2
Exercise 3: What is the probability of the sequence S, R, R, R, C, C?
The initial probabilities are given as π = {0.7, 0.25, 0.05}.
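As a sketch of how Exercise 3 would be computed: multiply the initial probability of the first state by each transition probability in turn. The slides give only a few transition entries (P(Sunny | Sunny) = 0.8, P(Rainy | Sunny) = 0.05, P(Sunny | Cloudy) = 0.2), so the full matrix below is an assumed placeholder; substitute the lecture's actual values.

```python
# Probability of a state sequence under a first-order Markov chain.
# pi is taken from the slides (assumed order Sunny, Rainy, Cloudy);
# the transition matrix A is an assumed placeholder except for the
# entries the slides state explicitly.

pi = {"Sunny": 0.7, "Rainy": 0.25, "Cloudy": 0.05}

# A[i][j] = P(next = j | current = i); each row sums to 1.
A = {
    "Sunny":  {"Sunny": 0.8, "Rainy": 0.05, "Cloudy": 0.15},
    "Rainy":  {"Sunny": 0.2, "Rainy": 0.6,  "Cloudy": 0.2},
    "Cloudy": {"Sunny": 0.2, "Rainy": 0.3,  "Cloudy": 0.5},
}

def sequence_probability(seq):
    """P(q1, ..., qT) = pi(q1) * prod over t of P(q_t | q_{t-1})."""
    p = pi[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        p *= A[prev][cur]
    return p

p = sequence_probability(["Sunny", "Rainy", "Rainy", "Rainy", "Cloudy", "Cloudy"])
print(p)
```

With the assumed matrix, the answer is just the product pi(S) × P(R|S) × P(R|R) × P(R|R) × P(C|R) × P(C|C).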
WHAT IS A MARKOV MODEL?
A Markov Model is a stochastic model which models
temporal or sequential data, i.e., data that are ordered.
U = Umbrella
N U = Not Umbrella
Let's assume that t days have passed. Therefore, we will have an observation sequence O = {o1, …, ot}, where oi ∈ {Umbrella, Not Umbrella}.
P(qi | oi) = P(oi | qi) P(qi) / P(oi)

For a sequence of length t:

P(q1, …, qt | o1, …, ot) = P(o1, …, ot | q1, …, qt) P(q1, …, qt) / P(o1, …, ot)
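To make the Bayes rule above concrete, here is a minimal numeric sketch for a single observation. The likelihoods and prior below are illustrative assumptions, not values from the slides.

```python
# Posterior probability of the hidden state given one observation,
# P(Rainy | Umbrella) = P(Umbrella | Rainy) P(Rainy) / P(Umbrella).
# All three input probabilities are assumed, illustrative values.

p_u_given_rain = 0.9   # assumed P(Umbrella | Rainy)
p_u_given_dry = 0.2    # assumed P(Umbrella | Not Rainy)
p_rain = 0.3           # assumed prior P(Rainy)

# Evidence P(Umbrella), obtained by marginalising over the hidden state.
p_u = p_u_given_rain * p_rain + p_u_given_dry * (1 - p_rain)

posterior = p_u_given_rain * p_rain / p_u
print(round(posterior, 4))
```

Note that the denominator P(o) is itself a sum over the hidden states; the same marginalisation appears later as the termination step of the forward algorithm.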
HMM PARAMETERS
An HMM is governed by the following parameters:
λ = {𝐴, 𝐵, 𝜋}
⚫ State-transition probability matrix 𝐴
⚫ Emission/Observation/State Conditional Output
probabilities 𝐵
⚫ Initial (prior) state probabilities 𝜋
The set of hidden states is S = {s_1, …, s_N}.

State-transition probability matrix: a_ij = P(q_{t+1} = s_j | q_t = s_i), the probability of moving from state s_i to state s_j.
Emission probabilities: a state will generate an observation (output), but a decision must be made on how to model the output, i.e., as discrete or continuous.
b_i(v_k) = P(o_t = v_k | q_t = s_i),  1 ≤ k ≤ W

        | b_1(v_1)  b_1(v_2)  …  b_1(v_W) |
B  =    | b_2(v_1)  b_2(v_2)  …  b_2(v_W) |
        |    ⋮         ⋮             ⋮    |
        | b_N(v_1)  b_N(v_2)  …  b_N(v_W) |
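A minimal sketch of how λ = {A, B, π} can be represented in code for N = 2 states and W = 3 discrete output symbols, with the stochasticity constraint (each row sums to 1) checked explicitly. The probability values are assumed placeholders.

```python
# HMM parameters lambda = {A, B, pi} as plain nested lists.
# All numeric values are assumed placeholders.

A = [[0.6, 0.4],        # a_ij = P(q_{t+1} = s_j | q_t = s_i)
     [0.5, 0.5]]
B = [[0.2, 0.4, 0.4],   # b_i(v_k) = P(o_t = v_k | q_t = s_i)
     [0.5, 0.4, 0.1]]
pi = [0.8, 0.2]         # initial (prior) state probabilities

# Sanity check: every row of A and B, and pi itself, must be a
# probability distribution (non-negative, summing to 1).
for row in A + B + [pi]:
    assert all(x >= 0 for x in row)
    assert abs(sum(row) - 1.0) < 1e-9
```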
HMM components: Here (Xi = Qi ) and (Ei = Oi)
Three Problems of HMMs
1. Likelihood: given an HMM λ = {A, B, π} and an observation sequence O, compute P(O | λ) (the forward algorithm).
2. Decoding: given λ and O, find the most likely hidden state sequence Q (the Viterbi algorithm).
3. Learning: given O and the set of states, estimate the parameters A and B (the Baum-Welch algorithm).
HMM Example
Markov model of Graz weather with state transition probabilities
Probability of Umbrella
Example 1

Example 2
Ice Cream Problem
Imagine that you are a climatologist in the year 2799 studying the history of global warming. You cannot find any records of the weather in Amaravati for the summer of 2023, but you do find Jason Eisner's diary, which lists how many ice creams Jason ate every day that summer.
Our goal is to use these observations to estimate the temperature every day.
We’ll simplify this weather task by assuming there are only two kinds of days:
cold (C) and hot (H).
So the Eisner task is as follows: Given a sequence of observations O (each an
integer representing the number of ice creams eaten on a given day) find the
‘hidden’ sequence Q of weather states (H or C) which caused Jason to eat the
ice cream.
Figure A.2 shows a sample HMM for the ice cream task.
The two hidden states (H and C) correspond to hot and cold weather, and the
observations (drawn from the alphabet O = {1,2,3}) correspond to the number
of ice creams eaten by Jason on a given day.
Question
Compute the probability of the ice-cream observation sequence 3 1 3 by summing over all possible weather sequences, weighted by their probability.
First, let's compute the joint probability of being in a particular weather sequence Q and generating a particular sequence O of ice-cream events.
The joint probability of our ice-cream observation 3 1 3 and one possible hidden state sequence hot hot cold decomposes as:

P(3 1 3, hot hot cold) = P(hot | start) P(hot | hot) P(cold | hot) × P(3 | hot) P(1 | hot) P(3 | cold)
N = number of hidden states = 2
T = number of observations = 3
Number of possible state sequences = N^T = 2^3 = 8 (HHC, CCC, HCH, …)

For N = 7 and T = 10, N^T = 7^10 = 282,475,249 sequences.
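The brute-force computation can be sketched directly: enumerate all N^T state sequences and sum their joint probabilities with O. The HMM parameters below are assumed placeholders (the figure values did not survive extraction), chosen in the spirit of the ice-cream example.

```python
# Brute-force likelihood: P(O) = sum over all N**T state sequences Q of
# P(O, Q) = pi(q1) b_{q1}(o1) * prod_t a_{q_{t-1} q_t} b_{q_t}(o_t).
# With N = 2 states and T = 3 observations there are 2**3 = 8 sequences.
# All parameter values are assumed placeholders.
from itertools import product

states = ["H", "C"]
pi = {"H": 0.8, "C": 0.2}
A = {"H": {"H": 0.6, "C": 0.4},
     "C": {"H": 0.5, "C": 0.5}}
B = {"H": {1: 0.2, 2: 0.4, 3: 0.4},
     "C": {1: 0.5, 2: 0.4, 3: 0.1}}

O = [3, 1, 3]

def joint(Q, O):
    """P(O, Q) for one hidden state sequence Q."""
    p = pi[Q[0]] * B[Q[0]][O[0]]
    for t in range(1, len(O)):
        p *= A[Q[t - 1]][Q[t]] * B[Q[t]][O[t]]
    return p

likelihood = sum(joint(Q, O) for Q in product(states, repeat=len(O)))
print(likelihood)
```

This is exactly the exponential enumeration the slide warns about: fine for 8 sequences, hopeless for 7^10.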
How to solve this?
Forward Trellis
Likelihood Computation: The Forward Algorithm
Steps in Forward Algorithm
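A minimal sketch of the forward algorithm, which computes the same likelihood in O(N²T) time using the trellis α_t(j) = P(o_1 … o_t, q_t = s_j) instead of enumerating all N^T sequences. The parameters are assumed placeholders.

```python
# Forward algorithm for a discrete HMM. Parameter values are assumed
# placeholders, not the lecture's figure values.

states = ["H", "C"]
pi = {"H": 0.8, "C": 0.2}
A = {"H": {"H": 0.6, "C": 0.4},
     "C": {"H": 0.5, "C": 0.5}}
B = {"H": {1: 0.2, 2: 0.4, 3: 0.4},
     "C": {1: 0.5, 2: 0.4, 3: 0.1}}

def forward(O):
    # Initialisation: alpha_1(j) = pi_j * b_j(o_1)
    alpha = [{s: pi[s] * B[s][O[0]] for s in states}]
    # Recursion: alpha_t(j) = (sum_i alpha_{t-1}(i) * a_ij) * b_j(o_t)
    for o in O[1:]:
        prev = alpha[-1]
        alpha.append({j: sum(prev[i] * A[i][j] for i in states) * B[j][o]
                      for j in states})
    # Termination: P(O) = sum_j alpha_T(j)
    return sum(alpha[-1].values())

p = forward([3, 1, 3])
print(p)
```

Because each trellis cell sums over the N predecessors, the total work is N² multiplications per time step rather than a full enumeration of sequences.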
Decoding: The Viterbi Algorithm
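A minimal sketch of Viterbi decoding, which replaces the forward algorithm's sum with a max and keeps backpointers so the best hidden state sequence can be recovered. The parameters are assumed placeholders.

```python
# Viterbi decoding for a discrete HMM: finds argmax_Q P(Q, O) by dynamic
# programming with backpointers. Parameter values are assumed placeholders.

states = ["H", "C"]
pi = {"H": 0.8, "C": 0.2}
A = {"H": {"H": 0.6, "C": 0.4},
     "C": {"H": 0.5, "C": 0.5}}
B = {"H": {1: 0.2, 2: 0.4, 3: 0.4},
     "C": {1: 0.5, 2: 0.4, 3: 0.1}}

def viterbi(O):
    # v[t][j]: probability of the best path ending in state j at time t
    v = [{s: pi[s] * B[s][O[0]] for s in states}]
    back = []
    for o in O[1:]:
        prev = v[-1]
        col, ptr = {}, {}
        for j in states:
            # Best predecessor i for state j (max instead of sum).
            best = max(states, key=lambda i: prev[i] * A[i][j])
            col[j] = prev[best] * A[best][j] * B[j][o]
            ptr[j] = best
        v.append(col)
        back.append(ptr)
    # Backtrace from the best final state.
    last = max(states, key=lambda s: v[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    path.reverse()
    return path, v[-1][last]

path, p = viterbi([3, 1, 3])
print(path, p)
```

With these assumed numbers the decoded sequence differs from simply picking the best state at each step in isolation, which is exactly why the backpointers are needed.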
Real HMM Examples
Speech recognition HMMs:
Observations are acoustic signals (continuous valued)
States are specific positions in specific words (so, tens of
thousands)
Robot tracking:
Observations are range readings (continuous)
States are positions on a map (continuous)