
UNIVERSITY OF RWANDA

Course: STOCHASTIC PROCESSES Assignment 1


Date: August 2, 2023

1. Assume that a student can be in 1 of 4 states: Rich, Average, Poor, In Debt.


Assume the following transition probabilities:
If a student is Rich, in the next time step the student will be Average with probability 0.75
If a student is Rich, in the next time step the student will be Poor with probability 0.2
If a student is Rich, in the next time step the student will be In Debt with probability 0.05

If a student is Average, in the next time step the student will be Rich with probability 0.05
If a student is Average, in the next time step the student will be Average with probability 0.2
If a student is Average, in the next time step the student will be In Debt with probability 0.45

If a student is Poor, in the next time step the student will be Average with probability 0.4
If a student is Poor, in the next time step the student will be Poor with probability 0.3
If a student is Poor, in the next time step the student will be In Debt with probability 0.2

If a student is In Debt, in the next time step the student will be Average with probability 0.15
If a student is In Debt, in the next time step the student will be Poor with probability 0.3
If a student is In Debt, in the next time step the student will be In Debt with probability 0.55

Model the above as a discrete-time Markov chain.

(a) Draw the corresponding Markov chain and obtain the corresponding stochastic matrix.
(b) If a student starts in the “Average” state, what is the probability that they reach
state “Rich” after
i) 1 time step?
ii) 2 time steps?
iii) 3 time steps?
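
A quick numerical check for parts (b)(i)–(iii): build the stochastic matrix from part (a) and read off the (Average, Rich) entry of its powers. A minimal Python sketch, assuming the one unlisted transition in each row carries the remaining probability so that every row sums to 1:

    import numpy as np

    # States ordered as: Rich, Average, Poor, In Debt.
    # Unlisted entries are taken as the remainder needed for each row to sum to 1.
    P = np.array([
        [0.00, 0.75, 0.20, 0.05],   # Rich     (Rich -> Rich assumed 0.00)
        [0.05, 0.20, 0.30, 0.45],   # Average  (Average -> Poor assumed 0.30)
        [0.10, 0.40, 0.30, 0.20],   # Poor     (Poor -> Rich assumed 0.10)
        [0.00, 0.15, 0.30, 0.55],   # In Debt  (In Debt -> Rich assumed 0.00)
    ])

    avg, rich = 1, 0
    for n in (1, 2, 3):
        Pn = np.linalg.matrix_power(P, n)
        print(f"P(Average -> Rich in {n} step(s)) = {Pn[avg, rich]:.4f}")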

2. Let Xn be a Markov chain with state space S = {1, 2, 3} and transition matrix

         0.1  0.4  0.5
    P =  0.3  0.5  0.2
         0.4  0.3  0.3

a) Find the limiting distribution of P.


b) Let the initial distribution of X0 be π0 = (0.4, 0.3, 0.2), meaning P (X0 = 1) = 0.4,
P (X0 = 2) = 0.3, P (X0 = 3) = 0.2. Find the distribution over the states 3 days (time steps) later.
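
Both parts can be verified numerically: the limiting distribution solves πP = π with the entries of π summing to 1, and the distribution after n steps is π0 P^n. A minimal Python sketch using numpy (the initial vector is taken exactly as printed in the problem):

    import numpy as np

    P = np.array([[0.1, 0.4, 0.5],
                  [0.3, 0.5, 0.2],
                  [0.4, 0.3, 0.3]])

    # Part (a): solve pi P = pi together with sum(pi) = 1
    # (i.e. the left eigenvector of P for eigenvalue 1, normalised).
    A = np.vstack([P.T - np.eye(3), np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("limiting distribution:", pi)

    # Part (b): distribution after 3 steps from the stated initial distribution.
    pi0 = np.array([0.4, 0.3, 0.2])   # values as printed in the problem
    print("distribution after 3 steps:", pi0 @ np.linalg.matrix_power(P, 3))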
3. Suppose that customers arrive according to a Poisson process with rate λ at a service center
that has a single server. Customers are served one at a time in order of arrival. Service times
are assumed to be i.i.d. Exponential(µ) random variables, independent of the arrival process.
Customers leave the system after being served. Our goal in this problem is to model the above
system as a continuous-time Markov chain. Let X(t) be the number of customers in the system
at time t, so the state space is S = {0, 1, 2, · · · }. Assume i > 0. If the system is in state i at
time t, then the next state will be either i + 1 (if a new customer arrives) or i − 1 (if a
customer leaves).

a. Suppose that the system is in state 0, so there are no customers in the system and the
next transition will be to state 1. Let T0 be the time until the next transition. Show
that T0 ∼ Exponential(λ).
b. Suppose that the system is currently in state i, where i > 0. Let Ti be the time until
the next transition. Show that Ti ∼ Exponential(λ + µ).
c. Suppose that the system is in state i. Find the probability that the next transition will
be to state i + 1.
d. Draw the jump chain, and provide the holding-time parameters λi.
e. Find the generator matrix.
f. Draw the transition rate diagram.
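
Parts (d)–(f) can be illustrated numerically by truncating the chain to finitely many states. A minimal Python sketch; the rates λ, µ and the truncation level N below are hypothetical and chosen only for illustration, since the true chain has infinitely many states:

    import numpy as np

    lam, mu, N = 2.0, 3.0, 5    # hypothetical arrival rate, service rate, truncation level

    # Generator matrix Q for states 0..N: off-diagonal entries are transition rates,
    # each diagonal entry makes its row sum to zero.
    Q = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        if i < N:
            Q[i, i + 1] = lam   # arrival:   i -> i + 1 at rate lambda
        if i > 0:
            Q[i, i - 1] = mu    # departure: i -> i - 1 at rate mu
        Q[i, i] = -Q[i].sum()

    # Holding-time parameters lambda_i = -Q[i, i] and jump-chain transition matrix.
    holding = -np.diag(Q)           # lambda_0 = lambda; lambda_i = lambda + mu for 0 < i < N
    jump = Q / holding[:, None]     # divide each row of Q by its holding-time parameter
    np.fill_diagonal(jump, 0.0)     # the jump chain never stays in place

    print(Q)
    print(holding)
    print(jump)                     # interior rows: lambda/(lambda+mu) up, mu/(lambda+mu) down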
