
First Term, Homework 1

Markov Chains

Problem 1
(a) State = (stock yesterday, stock today)
    State 0 = (up, up)
    State 1 = (up, down)
    State 2 = (down, up)
    State 3 = (down, down)

(b) One-step transition matrix (rows and columns in state order 0, 1, 2, 3):

        ⎡ 1−α1   α1     0      0  ⎤
    P = ⎢   0     0   1−α3    α3  ⎥
        ⎢ 1−α2   α2     0      0  ⎥
        ⎣   0     0   1−α4    α4  ⎦

(c) The information from both days is included in the definition of the states, so the Markovian property holds: the next state depends only on the present state, even though that present state encodes two days of information.
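
As a quick sanity check of the matrix in (b), here is a minimal Python sketch that builds it for some assumed numeric values of α1–α4 (the values below are placeholders, not part of the problem) and verifies that every row is a probability distribution:

```python
import numpy as np

# Placeholder values for alpha1..alpha4 -- assumptions for illustration only.
a1, a2, a3, a4 = 0.7, 0.6, 0.4, 0.3

# States: 0 = (up, up), 1 = (up, down), 2 = (down, up), 3 = (down, down)
P = np.array([
    [1 - a1, a1,     0,      0 ],   # from state 0 = (up, up)
    [0,      0,      1 - a3, a3],   # from state 1 = (up, down)
    [1 - a2, a2,     0,      0 ],   # from state 2 = (down, up)
    [0,      0,      1 - a4, a4],   # from state 3 = (down, down)
])

assert np.allclose(P.sum(axis=1), 1.0)  # each row sums to 1
```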

Problem 2
(a) It can be modeled as a Markov chain with 8 states.
State 0 = (up, up, up) State 4 = (down, up, up)
State 1 = (up, up, down) State 5 = (down, up, down)
State 2 = (up, down, up) State 6 = (down, down, up)
State 3 = (up, down, down) State 7 = (down, down, down)
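
For reference, a small Python sketch (assuming the state is simply the up/down pattern of the last three days, indexed in the order listed above) that enumerates the 8 states:

```python
from itertools import product

# State = (two days ago, yesterday, today): up/down pattern of the last three days.
states = list(product(["up", "down"], repeat=3))
index = {s: i for i, s in enumerate(states)}

for s, i in index.items():
    print(f"State {i} = {s}")
# State 0 = ('up', 'up', 'up'), ..., State 7 = ('down', 'down', 'down')
```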

Problem 3
(a) State 0 = dry on day t, State 1 = rain on day t

    P = ⎡ 0.9   0.1 ⎤
        ⎣ 0.5   0.5 ⎦
(b) Using the n-step transition probabilities p01(n) and p11(n):
    P{X3 = 1} = P{X0 = 0}·p01(3) + P{X0 = 1}·p11(3) = 0.5(0.156) + 0.5(0.22) = 0.188
    P{X7 = 1} = P{X0 = 0}·p01(7) + P{X0 = 1}·p11(7) = 0.5(0.1664) + 0.5(0.1681) = 0.1672
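
A short Python check of these n-step probabilities, assuming the initial distribution P{X0 = 0} = P{X0 = 1} = 0.5 used above:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])          # State 0 = dry, State 1 = rain
pi0 = np.array([0.5, 0.5])          # initial distribution over (dry, rain)

for n in (3, 7):
    Pn = np.linalg.matrix_power(P, n)   # n-step transition matrix
    print(n, pi0 @ Pn)                  # distribution of X_n
# n = 3 -> P{X3 = 1} = 0.188
# n = 7 -> P{X7 = 1} ≈ 0.1672
```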
(c) Steady-state probabilities: π0 = 0.8333, π1 = 0.1667
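
These steady-state values can be reproduced by solving πP = π together with π0 + π1 = 1; a minimal sketch:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Solve pi(P - I) = 0 with the normalization pi0 + pi1 = 1:
# keep one balance equation and replace the other with the normalization row.
A = np.vstack([(P.T - np.eye(2))[:-1], np.ones(2)])
b = np.array([0.0, 1.0])
pi = np.linalg.solve(A, b)
print(pi)   # ≈ [0.8333, 0.1667]
```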

Problem 4
(a) Xn = the binary digit in the system after the nth transmission.
    State 0: Xn = 0, State 1: Xn = 1
(c) The probability that a digit is transmitted correctly through the last (10th) transmission is 0.9085 (from the 10-step transition matrix).
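
The 0.9085 figure can be reproduced with a two-state chain in which each transmission flips the digit with some error probability q. The value of q is not stated in this write-up; q = 0.01 is an assumption that is consistent with the answer above. A sketch under that assumption:

```python
import numpy as np

q = 0.01                          # assumed per-transmission error probability (not given above)
P = np.array([[1 - q, q],
              [q, 1 - q]])        # State 0: digit is 0, State 1: digit is 1

P10 = np.linalg.matrix_power(P, 10)   # 10-step transition matrix
print(P10[0, 0])                  # P{digit unchanged after 10 transmissions} ≈ 0.9085
```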
