Module 1
Markov Chains - I
Dr. Alok Goswami, Professor, Indian Statistical Institute, Kolkata
1 A Coin-Tossing Game
1.1 A Simple Example to Illustrate the Idea:
A coin is tossed repeatedly. At each toss, you either WIN or LOSE 1 rupee, depending on whether the toss results in a HEAD or a TAIL. Your net (accumulated) fortune thus fluctuates, moving up or down by 1 at each toss.
Time-Homogeneous Markov Property: For every $n \ge 0$ and every $x_0, x_1, \ldots, x_{n-1}, x, y \in S$,
$$P(X_{n+1} = y \mid X_0 = x_0, \ldots, X_{n-1} = x_{n-1}, X_n = x) = \begin{cases} \theta & \text{if } y = x+1 \\ 1-\theta & \text{if } y = x-1 \\ 0 & \text{if } y \neq x \pm 1 \end{cases}$$
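The fortune process described above is easy to simulate. The sketch below (function name and parameters are illustrative, not from the notes) generates a sample path of the chain with HEAD-probability $\theta$:

```python
import random

def simulate_fortune(theta, n_steps, x0=0, seed=0):
    """Simulate the coin-tossing fortune chain: +1 with prob theta, -1 otherwise."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n_steps):
        step = 1 if rng.random() < theta else -1
        path.append(path[-1] + step)
    return path

# a sample path of length 1000 for a fair coin
path = simulate_fortune(theta=0.5, n_steps=1000)
```

Every step of the simulated path changes the fortune by exactly $\pm 1$, matching the transition kernel above.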
Definition 1. Let $S$ be a countable set. A sequence $\{X_n\}_{n \ge 0}$ of random variables taking values in $S$ is said to be a Markov Chain if, for every $n \ge 0$ and for every choice of $x_0, x_1, \ldots, x_{n-1}, x, y \in S$,
$$P(X_{n+1} = y \mid X_0 = x_0, X_1 = x_1, \ldots, X_{n-1} = x_{n-1}, X_n = x) = P(X_{n+1} = y \mid X_n = x) = P(X_1 = y \mid X_0 = x) = p_{xy}.$$
The definition captures two properties: one, the conditional probability on the left does not depend on $x_0, x_1, \ldots, x_{n-1}$ (first equality), and two, it does not depend on $n$ either (second equality).
Thus a Markov chain, by definition, satisfies both the Markov property and time-homogeneity.
The set $S$ is called the State Space of the chain and the elements of $S$ are called states. The numbers $p_{xy}$, $x, y \in S$, are called the transition probabilities of the chain: given the history of the chain up to any time point $n$, $p_{xy}$ is the (conditional) probability that the chain will move from its (given) present state $x$ to the state $y$ at the next time point. The following properties are obvious:
$$p_{xy} \ge 0 \quad \forall\, x, y \in S \qquad \text{and} \qquad \sum_{y \in S} p_{xy} = 1 \quad \forall\, x \in S.$$
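These two properties can be checked mechanically for any finite set of transition probabilities. A minimal sketch, storing the probabilities as a nested dict (the helper name `is_transition_matrix` is an assumption, not from the notes):

```python
def is_transition_matrix(p, tol=1e-12):
    """Check p[x][y] >= 0 for all x, y and that every row sums to 1."""
    return all(
        all(v >= 0 for v in row.values())
        and abs(sum(row.values()) - 1.0) < tol
        for row in p.values()
    )

# two-state chain with illustrative values alpha = 0.3, beta = 0.6
alpha, beta = 0.3, 0.6
p = {0: {0: 1 - alpha, 1: alpha}, 1: {0: beta, 1: 1 - beta}}
```

A row that sums to anything other than 1, or a negative entry, makes the check fail.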
3 Some Examples:
• On-Off chains
Imagine a system which can be in one of two states: 'on' or 'off'. If it is 'off' at time $n$, then the chance that it will be 'on' at time $n+1$ is $\alpha$, while the chance of going from an 'on' state to 'off' is $\beta$, irrespective of the past history. Denoting the 'on' and 'off' states as states 1 and 0 respectively, we get the simplest of Markov chains, with state space $S = \{0, 1\}$ and transition probabilities $p_{01} = \alpha$, $p_{10} = \beta$; of course, $p_{00} = 1 - \alpha$, $p_{11} = 1 - \beta$.
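As a sanity check, one can simulate the On-Off chain and verify that the empirical frequency of $0 \to 1$ transitions is close to $\alpha$. A sketch with illustrative values $\alpha = 0.2$, $\beta = 0.5$ (function name is an assumption):

```python
import random

def simulate_on_off(alpha, beta, n_steps, x0=0, seed=1):
    """Simulate the two-state On-Off chain: off->on w.p. alpha, on->off w.p. beta."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n_steps):
        x = path[-1]
        if x == 0:
            path.append(1 if rng.random() < alpha else 0)
        else:
            path.append(0 if rng.random() < beta else 1)
    return path

path = simulate_on_off(alpha=0.2, beta=0.5, n_steps=50_000)
# empirical estimate of p01: fraction of visits to 0 that are followed by a move to 1
visits0 = [(a, b) for a, b in zip(path, path[1:]) if a == 0]
p01_hat = sum(1 for _, b in visits0 if b == 1) / len(visits0)
```

With 50,000 steps the estimate `p01_hat` should sit within a couple of percentage points of $\alpha$.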
• Random Walk on Integers
Let $\{\xi_n\}_{n \ge 1}$ be i.i.d. integer-valued random variables with common distribution $P(\xi = i) = \theta_i$, $i \in \mathbb{Z}$, and let $\{X_n\}_{n \ge 0}$ be the integer-valued sequence of random variables defined as: $X_0$ = any $\mathbb{Z}$-valued random variable, independent of $\{\xi_n\}$, and
$$X_1 = X_0 + \xi_1, \quad X_2 = X_1 + \xi_2, \quad \ldots, \quad X_n = X_{n-1} + \xi_n = X_0 + \xi_1 + \cdots + \xi_n, \quad \ldots$$
It is easy to see that $(X_n)_{n \ge 0}$ is a Markov chain with state space $S = \mathbb{Z}$ and transition probabilities $p_{xy} = \theta_{y-x}$, $x, y \in \mathbb{Z}$ (write the conditional probability in the definition in terms of $X_0, \xi_1, \ldots, \xi_{n+1}$). It is called the Random Walk on integers: a particle moves along the integers through random i.i.d. steps/increments.
The coin-tossing game is the special case in which the increments $\xi_i$ take values $\pm 1$ only: this is called the Simple Random Walk.
An important property (space homogeneity): $p_{x+z,\, y+z} = p_{xy}$ for all $x, y, z \in \mathbb{Z}$, that is, $p_{xy}$ depends only on the difference $y - x$.
Exercise: Random Walks are the only Markov chains on the integers whose transition probabilities satisfy space homogeneity.
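Space homogeneity is a statement about the kernel $p_{xy} = \theta_{y-x}$ alone and can be checked directly. A small sketch with an illustrative three-point step distribution (the distribution and names are assumptions for the example):

```python
# illustrative step distribution: steps -1, 0, +1 with these probabilities
theta = {-1: 0.3, 0: 0.2, 1: 0.5}

def p(x, y):
    """Random-walk kernel: p_{xy} = theta_{y-x}, the probability of one step x -> y."""
    return theta.get(y - x, 0.0)

# space homogeneity: shifting both states by any z leaves the kernel unchanged
checks = [p(x, y) == p(x + z, y + z)
          for x in range(-3, 4) for y in range(-3, 4) for z in range(-3, 4)]
```

Every check passes because `p(x, y)` depends on its arguments only through `y - x`.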
• Renewal Chain
Running of a process depends on an item that has a random lifetime. When it breaks down, a new item is installed in its place at the start of the next day.
Assume: The lifetimes of successive replacements are i.i.d. random variables with common distribution $P(L = i) = \theta_i$, $i = 1, 2, \ldots$ ($L$: generic notation for lifetime).
Consider: $X_n$ = age of the item in use on day $n$, $n = 0, 1, \ldots$.
It is easy to see that $(X_n)_{n \ge 0}$ is a Markov chain with state space $S = \mathbb{N}$ and transition probabilities:
$$p_{x,1} = P(L = x \mid L \ge x) = \frac{\theta_x}{\sum_{i \ge x} \theta_i}, \qquad p_{x,x+1} = 1 - p_{x,1}, \qquad p_{xy} = 0 \text{ otherwise.}$$
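The age chain can be simulated by drawing i.i.d. lifetimes and tracking the age of the item currently in use. A sketch with an illustrative lifetime distribution on $\{1, 2, 3\}$ (helper names are assumptions):

```python
import random

def simulate_age_chain(lifetime_probs, n_days, seed=2):
    """Simulate X_n = age of the item in use on day n, ages counted from 1.
    lifetime_probs: dict i -> P(L = i), i = 1, 2, ... (assumed to sum to 1)."""
    rng = random.Random(seed)

    def draw_lifetime():
        u, acc = rng.random(), 0.0
        for i, prob in sorted(lifetime_probs.items()):
            acc += prob
            if u < acc:
                return i
        return max(lifetime_probs)  # guard against floating-point rounding

    ages, life, age = [], draw_lifetime(), 1
    for _ in range(n_days):
        ages.append(age)
        if age == life:  # item breaks down today; a fresh item (age 1) tomorrow
            life, age = draw_lifetime(), 1
        else:
            age += 1
    return ages

ages = simulate_age_chain({1: 0.5, 2: 0.3, 3: 0.2}, n_days=10_000)
```

From each age $x$ the simulated chain either moves up to $x+1$ or drops back to 1, exactly the two allowed transitions.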
• Queueing chain
Customers arrive at a service counter and queue up for service. If, at the start of a period, no customers are waiting for service, there is no service during that period; otherwise, only the first customer in the queue is served during that period.
Assume: The numbers of arrivals during successive periods are i.i.d. with common distribution $P(\xi = i) = \theta_i$, $i = 0, 1, 2, \ldots$. Consider $X_n$ = length of the queue at the start of the $n$th period.
It is easy to see that $(X_n)_{n \ge 0}$ is a Markov chain with state space $S = \mathbb{Z}_+$ and transition probabilities:
$$p_{xy} = \theta_{y-x+1} \ \text{ for } x \ge 1,\ y \ge x-1, \qquad \text{and} \qquad p_{0y} = \theta_y \ \text{ for } y \ge 0.$$
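These transition probabilities come from the one-step recursion $X_{n+1} = \max(X_n - 1, 0) + \xi_{n+1}$: one customer departs (if any is present) and $\xi_{n+1}$ new ones arrive. A simulation sketch (names and the arrival distribution are illustrative):

```python
import random

def simulate_queue(arrival_probs, n_periods, x0=0, seed=3):
    """Queue-length chain: X_{n+1} = max(X_n - 1, 0) + (arrivals in period n+1)."""
    rng = random.Random(seed)
    outcomes, weights = zip(*sorted(arrival_probs.items()))
    path = [x0]
    for _ in range(n_periods):
        xi = rng.choices(outcomes, weights=weights)[0]  # arrivals this period
        path.append(max(path[-1] - 1, 0) + xi)
    return path

path = simulate_queue({0: 0.5, 1: 0.3, 2: 0.2}, n_periods=10_000)
```

Each simulated increment over $\max(X_n - 1, 0)$ lies in the support of the arrival distribution, as the kernel above requires.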
• Ehrenfest Diffusion chain
Consider $d$ balls distributed between two boxes. Out of the $d$ balls, one is selected at random. The selected ball is taken out of the box it is in and transferred to the other box. The same process is repeated. Consider $X_n$ = number of balls in Box 1 after the $n$th turn.
It is easy to see that $(X_n)_{n \ge 0}$ is a Markov chain with state space $S = \{0, 1, \ldots, d\}$ and transition probabilities:
$$p_{x,x-1} = \frac{x}{d}, \qquad p_{x,x+1} = 1 - \frac{x}{d}, \qquad p_{xy} = 0 \ \text{ for } y \neq x-1,\ x+1.$$
Remark: Captures an essential feature of diffusion of particles: Whenever a box carries a very large
share of the balls, the chances of a ball moving out of it into the other box are high!
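A quick simulation illustrates the remark: sample paths of the Ehrenfest chain keep being pulled back toward $d/2$. A sketch with $d = 10$ (helper name is an assumption):

```python
import random

def simulate_ehrenfest(d, n_steps, x0=None, seed=4):
    """Ehrenfest chain: pick one of d balls uniformly and move it to the other box.
    X_n = number of balls in Box 1."""
    rng = random.Random(seed)
    x = d // 2 if x0 is None else x0
    path = [x]
    for _ in range(n_steps):
        # the selected ball is in Box 1 with probability x/d, so Box 1 loses a ball
        x = x - 1 if rng.random() < x / d else x + 1
        path.append(x)
    return path

path = simulate_ehrenfest(d=10, n_steps=20_000)
```

Over a long run the time average of the path stays close to $d/2$, reflecting the restoring drift described in the remark.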