
Hidden Markov Models

Building further on the concept of a Markov chain is the hidden Markov model (HMM). A hidden Markov model is a statistical model in which the system being modeled is assumed to be a Markov process with unknown parameters or causal events. HMMs are primarily helpful in determining the hidden parameters from the observable parameters. Again, the figure below may help visualize the hidden Markov model concept.
[Figure: hidden Markov model diagram]
x1, x2, x3 = hidden/unknown states
a12, a21, a23, a32 = transition probabilities
y1, y2, y3 = observable states
e1, e2, e3 = emission probabilities
The key point of the figure above is that only the blue circles are seen, but you suspect or know that these observable states are directly related to or dependent on some hidden state. These hidden states (the red circles) are what actually dictate the outcomes of the observable states. The challenge is to figure out the hidden states, the emission probabilities, and the transition probabilities.
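As a concrete sketch, the structure in the figure can be collected into a transition matrix and an emission matrix. All of the numbers below are invented for illustration; only the labels a12, a21, a23, a32 and e1, e2, e3 come from the figure.

```python
import numpy as np

# Hidden states x1..x3 and observable states y1..y3 from the figure.
# A[i, j] = probability of moving from hidden state x_(i+1) to x_(j+1).
# The figure labels a12, a21, a23, a32; the remaining probability mass
# sits on the (unlabeled) self-transitions so each row sums to 1.
A = np.array([[0.7, 0.3, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

# E[i, k] = probability of observing y_(k+1) while in hidden state x_(i+1);
# the diagonal entries play the role of e1, e2, e3.
E = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])

# Every row of each matrix is a probability distribution
assert np.allclose(A.sum(axis=1), 1) and np.allclose(E.sum(axis=1), 1)
```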
For example, say you have a data acquisition program that continuously records the outlet temperature of your product stream, which you are heating through a heat exchanger. The heat exchanger provides a constant supply of heat. You observe that the temperature shows some step disturbances and is not always constant. The temperature is the observable parameter, but the cause of its deviations/states is actually the stream's flowrate. When the flowrate fluctuates, the outlet temperature also fluctuates, since the heat exchanger is providing the same amount of heat to the stream. So, the stream flowrate is the hidden parameter (since you do not have access to this measured variable), and the probabilities that the flowrate switches from, say, 100 to 150 to 300 L/h are the transition probabilities. The emission probability is the probability that the observed outlet temperature is directly caused by the flowrate. This emission probability is not necessarily 1, since temperature variations could also be due to noise, etc.
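To make this concrete, here is a small simulation sketch of the heat exchanger scenario. The three flowrates come from the example above, but the mean temperatures, noise level, and transition probabilities are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

flowrates = [100, 150, 300]                    # hidden states, L/h
mean_temp = {100: 80.0, 150: 70.0, 300: 55.0}  # hypothetical outlet temps, deg C

# Hypothetical transition probabilities: the flowrate usually stays put
P = np.array([[0.90, 0.08, 0.02],
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])

state = 0                                      # start at 100 L/h
for _ in range(10):
    # Emission: the observed temperature is driven by the hidden flowrate
    # plus sensor noise, which is why the emission probability is not 1
    temp = mean_temp[flowrates[state]] + rng.normal(0.0, 0.5)
    print(f"flowrate={flowrates[state]:3d} L/h  ->  T_out={temp:5.1f} degC")
    state = rng.choice(3, p=P[state])          # hidden-state transition
```

Only the temperature column would be visible to the data acquisition program; recovering the flowrate column from it is exactly the HMM inference problem.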
Another common scenario used to teach the concept of a hidden Markov model is the Occasionally Dishonest Casino. If a casino uses a fair die, each number has a 1/6 probability of being landed on. In HMM terms, this probability of observing a given roll from a given type of die is an emission probability. Sometimes a casino can tip the odds by using a loaded die, where one number is favored over the other five numbers, so the favored side has a probability higher than 1/6.
But how does one know that the casino is being dishonest by using a loaded die? Pretend that the die is loaded in such a way that it favors the number 6. It would be difficult to differentiate between a fair die and a loaded die after watching only one roll. You may be able to get a better idea after watching a few tosses and seeing how many times the die landed on 6. Let's say that you saw the following sequence of die tosses:
465136
It is still difficult to say with certainty whether the die is loaded or not. The above sequence is feasible for both a fair die and a loaded die. In this particular case, the numbers representing what was rolled are the observable parameters. The hidden parameter is the type of die used, simply because we do not know which type produced the above sequence of numbers.
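One simple first check is to compare how likely the observed sequence is under each die. As a sketch, assume (for illustration only) a loaded die that lands on 6 half the time and on each other face with probability 0.1, and no die switching within the sequence:

```python
# Probability of the observed rolls under each die type
# (rolls are independent given the die type)
rolls = [4, 6, 5, 1, 3, 6]

p_fair = {f: 1 / 6 for f in range(1, 7)}                       # fair die
p_loaded = {f: 0.5 if f == 6 else 0.1 for f in range(1, 7)}    # assumed loaded die

def likelihood(rolls, p):
    prob = 1.0
    for r in rolls:
        prob *= p[r]
    return prob

print(likelihood(rolls, p_fair))    # (1/6)^6 ~ 2.14e-05
print(likelihood(rolls, p_loaded))  # 0.1^4 * 0.5^2 = 2.5e-05
```

The two likelihoods come out the same order of magnitude, which matches the point above: six rolls are not enough to say which die was used.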
Instead of relying on a sneaking suspicion that the casino is being dishonest, one can use a hidden Markov model to provide evidence that a loaded die is being used occasionally. Note that if a loaded die were used all the time, it would be much more blatantly obvious. In order to get away with this slight unfairness, the casino will switch the fair die with a loaded one every so often. Let's say that if a fair die is being used, there is a 5% chance that it will be switched to a loaded die and a 95% chance that it will remain fair. These are known as transition probabilities because they represent the likelihood of one causal event changing to another.
Below is a picture representing the transition probabilities (the probability of staying with or changing the type of die) as well as the emission probabilities (the probability of landing on each number).

Using the basic diagram above, which shows the emission and transition probabilities, conditional statements in Excel can be used to model the probability that a loaded die is being used. Please refer to Example 2 for the hidden Markov model Excel sheet.
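A minimal sketch of the same calculation is the Viterbi algorithm, which recovers the most probable sequence of die types behind a sequence of rolls. The 95%/5% fair-die transition probabilities come from the text; the 10% chance of switching back from loaded to fair and the loaded-die emission probabilities are assumptions for illustration.

```python
import numpy as np

states = ["fair", "loaded"]

# trans[i, j] = probability of moving from die type i to die type j
trans = np.array([[0.95, 0.05],    # fair -> fair, fair -> loaded (from the text)
                  [0.10, 0.90]])   # loaded -> fair, loaded -> loaded (assumed)

def emit(state, roll):
    """Emission probability of a roll given the die type (loaded values assumed)."""
    if state == 0:                       # fair die: uniform over 1..6
        return 1 / 6
    return 0.5 if roll == 6 else 0.1     # loaded die favors 6

rolls = [4, 6, 5, 1, 3, 6, 6, 6, 6, 2, 6, 6]

# Viterbi in log space: track the most probable hidden-state path so far
logv = np.log(0.5) + np.log([emit(s, rolls[0]) for s in (0, 1)])
paths = [[0], [1]]
for r in rolls[1:]:
    new_logv = np.empty(2)
    new_paths = []
    for j in (0, 1):
        scores = logv + np.log(trans[:, j]) + np.log(emit(j, r))
        best = int(np.argmax(scores))
        new_logv[j] = scores[best]
        new_paths.append(paths[best] + [j])
    logv, paths = new_logv, new_paths

best_path = paths[int(np.argmax(logv))]
print([states[s] for s in best_path])    # most likely die type at each roll
```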
HMMs can also be modeled using Matlab. Follow the link to the site, which contains information on installing the HMM toolbox in Matlab:
HMMs in Matlab

Worked out Example 1: "What should I wear?"

As a student at the University of Michigan, one of your daily concerns is the weather. The weather affects important issues such as choice of clothing, shoes, washing your hair, and so many more. Considering the bizarre weather in Ann Arbor, you wish to be able to calculate the probability that it will rain given the data of the last three consecutive days. Applying your chemical engineering skills and empowered by the knowledge from this Wiki article, you sit down and decide to be the next Weather-Person.
You know the following information, having been an avid fan of the Weather Channel since age 5.

You then record the weather for the past 3 days.

Solution
Since the last day was a blizzard, you use a state vector expressing a 100% chance of a blizzard in the past, and multiply it by the transition probabilities for each weather type:

[0 0 1] × T = [0.60 0.25 0.15]  (sun, rain, blizzard)

This means that there is a 60% chance of sun, a 25% chance of rain, and a 15% chance of a blizzard. This is because a Markov process works based on the probability of only the immediately previous event. Thus, you decide that this is a low enough risk and happily decide to wear your best new outfit AND wash your hair.
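The same one-step calculation as a sketch: only the blizzard row of the transition table is recoverable from the solution above, so the sun and rain rows below are placeholders.

```python
import numpy as np

# State order: [sun, rain, blizzard].
# Row i = probabilities of tomorrow's weather given today's weather is state i.
T = np.array([[0.70, 0.20, 0.10],   # after sun (placeholder values)
              [0.30, 0.50, 0.20],   # after rain (placeholder values)
              [0.60, 0.25, 0.15]])  # after blizzard (from the solution)

today = np.array([0.0, 0.0, 1.0])   # 100% chance the last day was a blizzard
tomorrow = today @ T                 # one step of the Markov chain
print(tomorrow)                      # [0.6  0.25 0.15]
```

Because the process is Markov, only this single matrix-vector multiply is needed; earlier days do not change the prediction.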