
Markov Chains - Problem Solving

Imagine you're exploring a new city. You leave the train station with shops on every
side of you and your hotel in the distance. Once you leave the train station, the
probability of going back there is really low (we'll say 2%) compared to your
probability of going to one of the shops. You are given the probability that you'll
enter each shop and, once you have entered a shop, the probability that you'll move to
each shop adjacent to it. You also know that if you go into the hotel, you will
probably stay there for the evening, but there's a 20% chance that you'll go to one of
the two surrounding shops (i.e., you may stay in for the night, or you may decide to go
back out shopping).

A. We can use a Markov chain to generate all of the possible paths we might take
through the town. All we need to know at each step is where we are now (e.g., the town
square) and the probability that we'll go in each direction (e.g., to the hotel, or
back to the square). Represent these as a transition probability matrix showing the
probability of moving from each place to each other place.
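Since the problem doesn't fix the exact town layout or most of the probabilities, here is one possible sketch of such a matrix. The five states (station, square, two shops, hotel) and all entries other than the given 2% and the hotel's 80/20 split are illustrative assumptions:

```python
import numpy as np

# Hypothetical layout: station, town square, two shops, hotel.
# Only the 2% return-to-station chance and the hotel's 80% stay /
# 20% leave split come from the problem; the rest is assumed.
states = ["station", "square", "shop_a", "shop_b", "hotel"]

# P[i, j] = probability of moving from place i to place j in one step.
P = np.array([
    #  station  square  shop_a  shop_b  hotel
    [0.02,    0.50,   0.24,   0.24,   0.00],  # from station
    [0.02,    0.00,   0.30,   0.30,   0.38],  # from square
    [0.00,    0.40,   0.00,   0.30,   0.30],  # from shop_a
    [0.00,    0.40,   0.30,   0.00,   0.30],  # from shop_b
    [0.00,    0.00,   0.10,   0.10,   0.80],  # from hotel: 80% stay, 20% split
])

# A valid transition (stochastic) matrix: every row sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)
```

Whatever layout you choose, the defining check is the same: each row must sum to 1, since from any place you must go somewhere.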

B. Does this Markov chain have a stationary distribution? Give your reasons.
C. In the long run, what proportion of the time will we be in the hotel?
D. In the long run, what proportion of the time will we be in the town square?
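Parts B through D can be checked numerically. A minimal sketch, using an assumed five-place layout (station, square, two shops, hotel) in which every entry besides the given 2% and the hotel's 80/20 split is made up for illustration:

```python
import numpy as np

# Assumed transition matrix for a hypothetical layout; states in order:
# station, square, shop_a, shop_b, hotel.
states = ["station", "square", "shop_a", "shop_b", "hotel"]
P = np.array([
    [0.02, 0.50, 0.24, 0.24, 0.00],
    [0.02, 0.00, 0.30, 0.30, 0.38],
    [0.00, 0.40, 0.00, 0.30, 0.30],
    [0.00, 0.40, 0.30, 0.00, 0.30],
    [0.00, 0.00, 0.10, 0.10, 0.80],
])

# Power iteration: starting from any distribution, repeatedly taking a
# step of the chain converges to the stationary distribution, because
# this chain is irreducible (every place can reach every other) and
# aperiodic (the hotel and station have self-loops).
pi = np.full(len(states), 1 / len(states))
for _ in range(500):
    pi = pi @ P

# pi is stationary: one more step leaves it unchanged (pi P = pi).
assert np.allclose(pi @ P, pi)
print({s: round(p, 3) for s, p in zip(states, pi)})
```

The long-run proportion of time spent in the hotel (part C) and in the town square (part D) can then be read directly off the corresponding entries of `pi`. With these assumed numbers the hotel dominates, which matches the intuition that its 80% "stay" probability makes it a sticky state.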
