PART – A
1. If the TPM of a Markov chain is

       P = [  0     1  ]
           [ 1/2   1/2 ],

   find the steady-state distribution of the Markov chain.

   Answer: (1/3, 2/3)
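The steady-state answer can be verified with a few lines of exact arithmetic. This sketch (not part of the question bank) assumes the TPM P = [[0, 1], [1/2, 1/2]] read off above:

```python
from fractions import Fraction as F

# TPM from Question 1
P = [[F(0), F(1)],
     [F(1, 2), F(1, 2)]]

# For a two-state chain the stationary equations reduce to balancing the
# probability flows 0 -> 1 and 1 -> 0 (pi0 * p01 = pi1 * p10), which gives
# the closed form pi0 = p10 / (p01 + p10).
pi0 = P[1][0] / (P[0][1] + P[1][0])
pi1 = 1 - pi0
print(pi0, pi1)  # 1/3 2/3
```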
2. If the initial state probability distribution of a Markov chain is
   p(0) = (5/6, 1/6) and the TPM of the chain is

       P = [  0     1  ]
           [ 1/2   1/2 ],

   find the probability distribution of the chain after 2 steps.

   Answer: p(2) = (11/24, 13/24)
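The two-step distribution can be checked by multiplying p(0) by P twice; a short sketch in exact arithmetic, assuming the data above:

```python
from fractions import Fraction as F

# TPM and initial distribution from Question 2
P = [[F(0), F(1)],
     [F(1, 2), F(1, 2)]]
p = [F(5, 6), F(1, 6)]  # p(0)

# p(n+1) = p(n) P, applied twice
for _ in range(2):
    p = [sum(p[i] * P[i][j] for i in range(2)) for j in range(2)]
print(p)  # [Fraction(11, 24), Fraction(13, 24)]
```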
3. A student’s study habits are as follows. If he studies one night, he is
   70% sure not to study the next night. On the other hand, the probability
   that he does not study two nights in succession is 0.6. Taking "study"
   and "no study" as the two states gives the TPM

       P = [ 0.3  0.7 ]
           [ 0.4  0.6 ].

   In the long run, how often does he study?

   Answer: 4/11 of the nights
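The long-run fraction 4/11 can be verified with the two-state closed form; the state labels S and N below are illustrative, not from the source:

```python
from fractions import Fraction as F

# States: S = "studies tonight", N = "does not study tonight"
# P(N|S) = 0.7 and P(N|N) = 0.6 give the TPM of Question 3
P = [[F(3, 10), F(7, 10)],   # row for S
     [F(4, 10), F(6, 10)]]   # row for N

# Two-state closed form: pi_S = p_NS / (p_SN + p_NS)
pi_S = P[1][0] / (P[0][1] + P[1][0])
print(pi_S)  # 4/11
```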
4. The transition probability matrix of a Markov chain {Xn, n = 1, 2, 3, ...}
   having 3 states 1, 2 and 3 is

       P = [ 0.1  0.5  0.4 ]
           [ 0.6  0.2  0.2 ]
           [ 0.3  0.4  0.3 ]

   and the initial distribution is p(0) = (0.7, 0.2, 0.1).
   Find P(X2 = 3) and P(X3 = 2, X2 = 3, X1 = 3, X0 = 2).

   Answers: P(X2 = 3) = 0.279; P(X3 = 2, X2 = 3, X1 = 3, X0 = 2) = 0.0048
   (all states of this chain are ergodic)
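The margin answer 0.0048 can be checked by chaining one-step probabilities. Note the third matrix row (0.3, 0.4, 0.3) and the initial probability P(X0 = 2) = 0.2 used below are assumptions filled in from the usual form of this exercise, since the extraction dropped them; only the first two rows appear in the text:

```python
from fractions import Fraction as F

# One-step transition probabilities for states 1, 2, 3 (1-indexed dicts).
# The third row and P(X0 = 2) = 0.2 are ASSUMED (standard version of this
# exercise); the first two rows are from the question.
P = {1: {1: F(1, 10), 2: F(5, 10), 3: F(4, 10)},
     2: {1: F(6, 10), 2: F(2, 10), 3: F(2, 10)},
     3: {1: F(3, 10), 2: F(4, 10), 3: F(3, 10)}}
p0_2 = F(2, 10)  # ASSUMED: P(X0 = 2)

# Markov property: P(X3=2, X2=3, X1=3, X0=2) = P(X0=2) * p23 * p33 * p32
ans = p0_2 * P[2][3] * P[3][3] * P[3][2]
print(float(ans))  # 0.0048
```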
10. There are 2 white balls in bag A and 3 red balls in bag B. At each step
    of the process, a ball is selected from each bag and the 2 balls
    selected are interchanged. Let the state a_i of the system be the number
    of red balls in A after i interchanges, so the TPM over states 0, 1, 2 is

        P = [  0     1     0  ]
            [ 1/6   1/2   1/3 ]
            [  0    2/3   1/3 ].

    i) What is the probability that there are 2 red balls in A after 3 steps?
    ii) In the long run, what is the probability that there are 2 red balls
    in bag A?

    Answers: i) 5/18   ii) 3/10
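Both answers can be checked numerically. The sketch below assumes the TPM read off above and that the chain starts in state 0 (bag A initially holds no red balls):

```python
from fractions import Fraction as F

# TPM for the number of red balls in bag A (states 0, 1, 2), Question 10
P = [[F(0),    F(1),    F(0)],
     [F(1, 6), F(1, 2), F(1, 3)],
     [F(0),    F(2, 3), F(1, 3)]]

def step(p):
    # one step of p(n+1) = p(n) P
    return [sum(p[i] * P[i][j] for i in range(3)) for j in range(3)]

# i) start with no red balls in A (state 0) and take 3 steps
p = [F(1), F(0), F(0)]
for _ in range(3):
    p = step(p)
print(p[2])  # 5/18

# ii) long-run behaviour: power iteration with floats until it settles
q = [1 / 3] * 3
for _ in range(100):
    q = [sum(q[i] * float(P[i][j]) for i in range(3)) for j in range(3)]
print(round(q[2], 6))  # 0.3
```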
12. Consider a Markov chain with state space S = {0, 1, 2} and one-step
    transition probability matrix

        P = [  0     0     1  ]
            [  1     0     0  ]
            [ 1/2   1/2    0  ].

    (Answers: i) the chain is irreducible; ii) the periods are all 1;
    iii) the stationary distribution is (2/5, 1/5, 2/5).)

    i) Show that the chain is irreducible.
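The margin answers (irreducible, periods 1, stationary distribution (2/5, 1/5, 2/5)) can be verified mechanically. The reachability search and period computation below are one possible check, assuming the TPM above:

```python
from fractions import Fraction as F
from math import gcd

# TPM from Question 12 over states S = {0, 1, 2}
P = [[F(0),    F(0),    F(1)],
     [F(1),    F(0),    F(0)],
     [F(1, 2), F(1, 2), F(0)]]

# i) irreducibility: every state can reach every other state
def reachable(i):
    seen, stack = {i}, [i]
    while stack:
        u = stack.pop()
        for v in range(3):
            if P[u][v] > 0 and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

irreducible = all(reachable(i) == {0, 1, 2} for i in range(3))
print(irreducible)  # True

# ii) period of state 0: gcd of all n with P^n[0][0] > 0
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

M, period = P, 0
for n in range(1, 10):
    if M[0][0] > 0:
        period = gcd(period, n)
    M = matmul(M, P)
print(period)  # 1

# iii) verify that pi = (2/5, 1/5, 2/5) satisfies pi P = pi
pi = [F(2, 5), F(1, 5), F(2, 5)]
print([sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)] == pi)  # True
```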