
TUTORIAL SHEET - UNIT - V

SRM Institute of Science and Technology


Department of Mathematics
18MAB204T-Probability and Queuing Theory
Unit – V

Sl.No. Questions Answers

PART – A

0 1 1 2 
If the tpm of a markov chain is P   1 1  . Find the steady   , 
1. 3 3 
 2 2 
state distribution of the markov chain.
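A quick way to verify this answer is to solve the balance equations \( \pi P = \pi \) together with \( \pi_0 + \pi_1 = 1 \):
\[ \pi_0 = \tfrac{1}{2}\pi_1, \qquad \pi_0 + \pi_1 = 1 \;\Rightarrow\; \pi_0 = \tfrac{1}{3},\ \pi_1 = \tfrac{2}{3}. \]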
2. If the initial state probability distribution of a Markov chain is \( p^{(0)} = \left(\tfrac{5}{6}, \tfrac{1}{6}\right) \) and the tpm of the chain is \( P = \begin{pmatrix} 0 & 1 \\ \tfrac{1}{2} & \tfrac{1}{2} \end{pmatrix} \), find the probability distribution of the chain after 2 steps.
Answer: \( p^{(2)} = \left(\tfrac{11}{24}, \tfrac{13}{24}\right) \)
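Sketch of the working: iterate the distribution two steps using \( p^{(n+1)} = p^{(n)}P \):
\[ p^{(1)} = \left(\tfrac{5}{6}, \tfrac{1}{6}\right)P = \left(\tfrac{1}{12}, \tfrac{11}{12}\right), \qquad p^{(2)} = \left(\tfrac{1}{12}, \tfrac{11}{12}\right)P = \left(\tfrac{11}{24}, \tfrac{13}{24}\right). \]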
3. A student's study habits are as follows. If he studies one night, he is 70% sure not to study the next night. On the other hand, the probability that he does not study two nights in succession is 0.6. In the long run, how often does he study?
Answer: with states (studies, does not study), \( P = \begin{pmatrix} 0.3 & 0.7 \\ 0.4 & 0.6 \end{pmatrix} \); long-run proportion of study nights \( = \tfrac{4}{11} \)
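Outline of one possible working: the stationary condition \( \pi P = \pi \) gives \( 0.7\pi_1 = 0.4\pi_2 \), and with \( \pi_1 + \pi_2 = 1 \),
\[ \pi_1 = \tfrac{4}{11}, \qquad \pi_2 = \tfrac{7}{11}, \]
so in the long run he studies on \( \tfrac{4}{11} \) of the nights.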
4. The transition probability matrix of a Markov chain \( \{X_n,\ n = 1,2,3,\dots\} \) having 3 states 1, 2 and 3 is \( P = \begin{pmatrix} 0.1 & 0.5 & 0.4 \\ 0.6 & 0.2 & 0.2 \\ 0.3 & 0.4 & 0.3 \end{pmatrix} \) and the initial distribution is \( P^{(0)} = (0.7, 0.2, 0.1) \). Find \( P(X_3 = 2, X_2 = 3, X_1 = 3, X_0 = 2) \).
Answer: 0.0048
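This value follows from the Markov property by multiplying the initial probability with the successive one-step probabilities:
\[ P(X_0 = 2)\,p_{23}\,p_{33}\,p_{32} = 0.2 \times 0.2 \times 0.3 \times 0.4 = 0.0048. \]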

5. A raining process is considered as a two-state Markov chain: if it rains, the chain is in state 0, and if it does not rain, the chain is in state 1. The tpm of the Markov chain is \( P = \begin{pmatrix} 0.6 & 0.4 \\ 0.2 & 0.8 \end{pmatrix} \). Given that it is raining today, find the probability that it will rain three days from today.
Answer: 0.376
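One possible working, starting from state 0 (rain) today and iterating \( p^{(n+1)} = p^{(n)}P \) for three steps:
\[ (1, 0) \to (0.6, 0.4) \to (0.44, 0.56) \to (0.376, 0.624), \]
so the probability of rain three days from today is 0.376.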
PART – B

6. Three boys A, B, C are throwing a ball to each other. A always throws the ball to B, and B always throws the ball to C, but C is just as likely to throw the ball to B as to A. Find the transition probability matrix and classify the states.
Answer: \( P = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ \tfrac{1}{2} & \tfrac{1}{2} & 0 \end{pmatrix} \); all three states are ergodic
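Brief justification of the classification: the chain is finite and irreducible (A → B → C → A), and state B can be re-entered in 2 steps (B → C → B) or 3 steps (B → C → A → B), so the period is \( \gcd(2,3) = 1 \); a finite, irreducible, aperiodic chain has all states ergodic.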

7. A man either drives a car or catches a train to go to the office each day. He never goes two days in a row by train; but if he drives one day, then the next day he is just as likely to drive again as he is to travel by train. Suppose that on the first day of the week the man tossed a fair die and drove to work if and only if a 6 appeared. Find (i) the probability that he takes the train on the third day, and (ii) the probability that he drives to work in the long run.
Answer: with states ordered (train, car), \( P = \begin{pmatrix} 0 & 1 \\ \tfrac{1}{2} & \tfrac{1}{2} \end{pmatrix} \); (i) \( \tfrac{11}{24} \), (ii) \( \tfrac{2}{3} \)
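A sketch of the working, under the usual convention that the die-toss day is day 1, with states ordered (train, car): \( p^{(1)} = \left(\tfrac{5}{6}, \tfrac{1}{6}\right) \), so
\[ p^{(2)} = p^{(1)}P = \left(\tfrac{1}{12}, \tfrac{11}{12}\right), \qquad p^{(3)} = p^{(2)}P = \left(\tfrac{11}{24}, \tfrac{13}{24}\right), \]
giving \( P(\text{train on day 3}) = \tfrac{11}{24} \). For the long run, \( \pi_{\text{train}} = \tfrac{1}{2}\pi_{\text{car}} \) and \( \pi_{\text{train}} + \pi_{\text{car}} = 1 \) give \( \pi_{\text{car}} = \tfrac{2}{3} \).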

Let X n , n  0,1,2,3,... be a markov chain with state space i) Ergodic


S  0,1,2, and one-step transition probability matrix
1 2 1
0 1 0 ii )    , , 
1 1 1 6 3 6
P
2 4 
8.
 04
 1 0
i) Is the chain ergodic? Explain

ii) Find the invariant probabilities.
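One way to obtain the invariant probabilities: \( \pi P = \pi \) gives \( \pi_0 = \tfrac{1}{4}\pi_1 \) and \( \pi_2 = \tfrac{1}{4}\pi_1 \), so with \( \pi_0 + \pi_1 + \pi_2 = 1 \),
\[ \tfrac{3}{2}\pi_1 = 1 \;\Rightarrow\; \pi_1 = \tfrac{2}{3}, \quad \pi_0 = \pi_2 = \tfrac{1}{6}. \]
The chain is ergodic because it is finite, irreducible, and aperiodic (state 1 has a self-loop, \( p_{11} = \tfrac{1}{2} \)).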

9. A fair die is tossed repeatedly. If \( X_n \) denotes the maximum of the numbers occurring in the first \( n \) tosses, find the transition probability matrix \( P \) of the Markov chain \( \{X_n\} \). Find also \( P(X_2 = 6) \) and \( P^2 \).
Answer:
\[ P = \begin{pmatrix}
\tfrac{1}{6} & \tfrac{1}{6} & \tfrac{1}{6} & \tfrac{1}{6} & \tfrac{1}{6} & \tfrac{1}{6} \\
0 & \tfrac{2}{6} & \tfrac{1}{6} & \tfrac{1}{6} & \tfrac{1}{6} & \tfrac{1}{6} \\
0 & 0 & \tfrac{3}{6} & \tfrac{1}{6} & \tfrac{1}{6} & \tfrac{1}{6} \\
0 & 0 & 0 & \tfrac{4}{6} & \tfrac{1}{6} & \tfrac{1}{6} \\
0 & 0 & 0 & 0 & \tfrac{5}{6} & \tfrac{1}{6} \\
0 & 0 & 0 & 0 & 0 & 1
\end{pmatrix}; \qquad P(X_2 = 6) = \tfrac{11}{36}. \]
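The value \( P(X_2 = 6) = \tfrac{11}{36} \) can be checked directly: the maximum of the first two tosses is 6 unless both tosses are at most 5, so
\[ P(X_2 = 6) = 1 - \left(\tfrac{5}{6}\right)^2 = \tfrac{11}{36}. \]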
10. There are 2 white balls in bag A and 3 red balls in bag B. At each step of the process, a ball is selected from each bag and the two balls selected are interchanged. Let the state \( a_i \) of the system be the number of red balls in A after \( i \) interchanges. (i) What is the probability that there are 2 red balls in A after 3 steps? (ii) In the long run, what is the probability that there are 2 red balls in bag A?
Answer: \( P = \begin{pmatrix} 0 & 1 & 0 \\ \tfrac{1}{6} & \tfrac{1}{2} & \tfrac{1}{3} \\ 0 & \tfrac{2}{3} & \tfrac{1}{3} \end{pmatrix} \); (i) \( \tfrac{5}{18} \), (ii) \( \tfrac{3}{10} \)
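Sketch of the working, starting from state 0 (no red ball in A) and iterating \( p^{(n+1)} = p^{(n)}P \):
\[ (1,0,0) \to (0,1,0) \to \left(\tfrac{1}{6}, \tfrac{1}{2}, \tfrac{1}{3}\right) \to \left(\tfrac{1}{12}, \tfrac{23}{36}, \tfrac{5}{18}\right), \]
so (i) is \( \tfrac{5}{18} \). For (ii), solving \( \pi P = \pi \) yields \( \pi = \left(\tfrac{1}{10}, \tfrac{3}{5}, \tfrac{3}{10}\right) \), so the long-run probability of 2 red balls in A is \( \tfrac{3}{10} \).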

Let X n , n  0,1,2,3,... be a markov chain with state space Ergodic


S  {1,2,3} and one-step transition probability matrix
0 1 0
11. 1 1
P 0 ,Is the chain ergodic? Explain
2 2
1 0 0 

12. Consider a Markov chain with state space \( S = \{0,1,2\} \) and one-step transition probability matrix \( P = \begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ \tfrac{1}{2} & \tfrac{1}{2} & 0 \end{pmatrix} \).
(i) Show that the chain is irreducible. (ii) Find the period. (iii) Find the invariant probabilities.
Answer: (i) the chain is irreducible; (ii) every state has period 1; (iii) \( \pi = \left(\tfrac{2}{5}, \tfrac{1}{5}, \tfrac{2}{5}\right) \)
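One possible working for (iii): \( \pi P = \pi \) gives \( \pi_0 = \pi_1 + \tfrac{1}{2}\pi_2 \), \( \pi_1 = \tfrac{1}{2}\pi_2 \) and \( \pi_2 = \pi_0 \); with \( \pi_0 + \pi_1 + \pi_2 = 1 \),
\[ \pi = \left(\tfrac{2}{5}, \tfrac{1}{5}, \tfrac{2}{5}\right). \]
For (ii), state 0 can be re-entered in 2 steps (0 → 2 → 0) and in 3 steps (0 → 2 → 1 → 0), so the period is \( \gcd(2,3) = 1 \).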


13. The tpm of a Markov chain with three states 0, 1, 2 is \( P = \begin{pmatrix} \tfrac{3}{4} & \tfrac{1}{4} & 0 \\ \tfrac{1}{4} & \tfrac{1}{2} & \tfrac{1}{4} \\ 0 & \tfrac{3}{4} & \tfrac{1}{4} \end{pmatrix} \) and the initial state distribution of the chain is \( P(X_0 = i) = \tfrac{1}{3},\ i = 0,1,2 \). Find (i) \( P(X_2 = 2) \), (ii) \( P(X_3 = 1, X_2 = 2, X_1 = 1, X_0 = 2) \), (iii) \( P(X_2 = 1, X_0 = 0) \).
Answer: (i) \( \tfrac{1}{6} \), (ii) \( \tfrac{3}{64} \), (iii) \( \tfrac{5}{48} \)
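A sketch of the computations:
\[ \text{(ii)}\;\; P(X_0 = 2)\,p_{21}\,p_{12}\,p_{21} = \tfrac{1}{3}\cdot\tfrac{3}{4}\cdot\tfrac{1}{4}\cdot\tfrac{3}{4} = \tfrac{3}{64}, \qquad \text{(iii)}\;\; P(X_0 = 0)\left(p_{00}p_{01} + p_{01}p_{11} + p_{02}p_{21}\right) = \tfrac{1}{3}\cdot\tfrac{5}{16} = \tfrac{5}{48}, \]
and (i) follows similarly by summing \( P(X_0 = i)\,[P^2]_{i2} \) over \( i = 0,1,2 \), which gives \( \tfrac{1}{6} \).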
14. Find the nature of the states of the Markov chain with the tpm \( P = \begin{pmatrix} 0 & 1 & 0 \\ \tfrac{1}{2} & 0 & \tfrac{1}{2} \\ 0 & 1 & 0 \end{pmatrix} \).
Answer: the chain is not ergodic
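Brief justification: every return to any state takes an even number of steps (the chain alternates between the set {0, 2} and state 1), so all states have period 2; since the states are periodic, the chain is not ergodic, even though all states are recurrent.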
