
Spring 2021, Procesos Estocásticos I

Homework 1
1. A Markov chain X0, X1, . . . on states 0, 1, 2 has the transition probability matrix

P = | 0.1  0.2  0.7 |
    | 0.9  0.1  0   |
    | 0.1  0.8  0.1 |

and initial distribution p0 = P[X0 = 0] = 0.3, p1 = P[X0 = 1] = 0.4, p2 = P[X0 = 2] = 0.3.
Determine P[X0 = 0, X1 = 1, X2 = 2].
Answer: 0
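The chain-rule computation behind this answer can be checked numerically; the sketch below (plain Python, no libraries) multiplies the initial probability by the successive one-step transition probabilities.

```python
# Problem 1: P[X0 = 0, X1 = 1, X2 = 2] = p0 * P(0,1) * P(1,2) by the chain rule.
P = [[0.1, 0.2, 0.7],
     [0.9, 0.1, 0.0],
     [0.1, 0.8, 0.1]]
p_init = [0.3, 0.4, 0.3]

prob = p_init[0] * P[0][1] * P[1][2]
print(prob)  # 0.3 * 0.2 * 0.0 = 0.0
```

The answer is 0 because state 1 cannot move to state 2 in a single step.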
2. A Markov chain X0, X1, . . . on states 0, 1, 2 has the transition probability matrix

P = | 0.7  0.2  0.1 |
    | 0    0.6  0.4 |
    | 0.5  0    0.5 |

Determine the conditional probabilities
P[X2 = 1, X3 = 1 | X1 = 0] and P[X1 = 1, X2 = 1 | X0 = 0].
Answer: 0.12, 0.12
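By time homogeneity and the Markov property, both conditional probabilities reduce to the same product of one-step entries; a quick numerical check:

```python
# Problem 2: P[X2=1, X3=1 | X1=0] = P[X1=1, X2=1 | X0=0] = P(0,1) * P(1,1).
P = [[0.7, 0.2, 0.1],
     [0.0, 0.6, 0.4],
     [0.5, 0.0, 0.5]]

prob = P[0][1] * P[1][1]
print(prob)  # 0.2 * 0.6 = 0.12 (up to float rounding)
```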
3. A Markov chain X0, X1, . . . on states 0, 1, 2 has the transition probability matrix

P = | 0.6  0.3  0.1 |
    | 0.3  0.3  0.4 |
    | 0.4  0.1  0.5 |

If it is known that the process starts in state X0 = 1, determine the probability P[X0 = 1, X1 = 0, X2 = 2].
Answer: 0.03
4. A Markov chain X0, X1, . . . on states 0, 1, 2 has the transition probability matrix

P = | 0.1  0.1  0.8 |
    | 0.2  0.2  0.6 |
    | 0.3  0.3  0.4 |

Determine P[X1 = 1, X2 = 1 | X0 = 0] and P[X2 = 1, X3 = 1 | X1 = 0].
Answer: 0.02, 0.02
5. A Markov chain X0, X1, . . . on states 0, 1, 2 has the transition probability matrix

P = | 0.3  0.2  0.5 |
    | 0.5  0.1  0.4 |
    | 0.5  0.2  0.3 |

and initial distribution p0 = 0.5, p1 = 0.5. Determine
P[X0 = 1, X1 = 1, X2 = 0] and P[X1 = 1, X2 = 1, X3 = 0].
Answer: 0.025, 0.0075
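The two answers can be checked by first propagating the initial distribution one step to get the law of X1, then applying the chain rule; a minimal sketch:

```python
# Problem 5: chain-rule check. p2 = 0 is implied by p0 + p1 = 1.
P = [[0.3, 0.2, 0.5],
     [0.5, 0.1, 0.4],
     [0.5, 0.2, 0.3]]
p = [0.5, 0.5, 0.0]

a = p[1] * P[1][1] * P[1][0]                                     # P[X0=1, X1=1, X2=0]
law_x1 = [sum(p[i] * P[i][j] for i in range(3)) for j in range(3)]
b = law_x1[1] * P[1][1] * P[1][0]                                # P[X1=1, X2=1, X3=0]
print(a, b)
```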
6. A simplified model for the spread of a disease goes this way: The total population size is
N = 5, of which some are diseased and the remainder are healthy. During any single period
of time, two people are selected at random from the population and assumed to interact.
The selection is such that an encounter between any pair of individuals in the population is
just as likely as between any other pair. If one of these persons is diseased and the other not,
then with probability α = 0.1 the disease is transmitted to the healthy person. Otherwise,
no disease transmission takes place. Let Xn denote the number of diseased persons in the
population at the end of the nth period. Specify the transition probability matrix.
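One way to set the matrix up (a sketch, not necessarily the required final form): with i diseased among N = 5, a uniformly random pair is mixed, one diseased and one healthy, with probability i(N − i)/C(N, 2), and only a mixed pair can raise the count by one.

```python
from math import comb

# Sketch for Problem 6: states i = 0..5 count the diseased.
# A uniformly chosen pair is "mixed" with probability i*(N-i)/C(N,2);
# transmission in a mixed pair (probability alpha) moves i -> i+1,
# otherwise the count stays at i.
N, alpha = 5, 0.1
P = [[0.0] * (N + 1) for _ in range(N + 1)]
for i in range(N + 1):
    up = alpha * i * (N - i) / comb(N, 2)
    if i < N:
        P[i][i + 1] = up
    P[i][i] = 1.0 - up
for row in P:
    print(row)
```

States 0 and 5 come out absorbing, as they should: with no mixed pair possible, the count can never change.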

Dr. José Juan Castro Alva

7. Consider the problem of sending a binary message, 0 or 1, through a signal channel consisting
of several stages, where transmission through each stage is subject to a fixed probability of
error α. Suppose that X0 = 0 is the signal that is sent and let Xn be the signal that is
received at the nth stage. Assume that {Xn} is a Markov chain with transition probabilities
P00 = P11 = 1 − α and P01 = P10 = α, where 0 < α < 1.

a) Determine P[X0 = 0, X1 = 0, X2 = 0], the probability that no error occurs up to stage
n = 2.
b) Determine the probability that a correct signal is received at stage 2. (Hint: This is
P[X0 = 0, X1 = 0, X2 = 0] + P[X0 = 0, X1 = 1, X2 = 0].)
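For a concrete feel, the two quantities for a sample value α = 0.1 (the exercise keeps α symbolic; any 0 < α < 1 works the same way):

```python
# Problem 7 sketch with a sample error rate; alpha = 0.1 is an
# illustrative choice, not part of the problem statement.
alpha = 0.1
part_a = (1 - alpha) ** 2                # no error through stage 2
part_b = (1 - alpha) ** 2 + alpha ** 2   # correct signal at stage 2
print(part_a, part_b)
```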

8. Consider a sequence of items from a production process, with each item being graded as good
or defective. Suppose that a good item is followed by another good item with probability α
and is followed by a defective item with probability 1 − α. Similarly, a defective item is
followed by another defective item with probability β and is followed by a good item with
probability 1 − β. If the first item is good, what is the probability that the first defective
item to appear is the fifth item?
9. The random variables ξ1, ξ2, . . . are independent with the common probability mass
function

k          0    1    2    3
P(ξ = k)   0.1  0.3  0.2  0.4

Set X0 = 0, and let Xn = max{ξ1, . . . , ξn} be the largest ξ observed to date. Determine the
transition probability matrix for the Markov chain {Xn}.
10. A particle moves among the states 0, 1, 2 according to a Markov process whose transition
probability matrix is

P = | 0    0.5  0.5 |
    | 0.5  0    0.5 |
    | 0.5  0.5  0   |

Let Xn denote the position of the particle at the nth move. Calculate P[Xn = 0 | X0 = 0]
for n = 0, 1, 2, 3, 4.
11. A Markov chain X0, X1, X2, . . . on states 0, 1, 2 has the transition probability matrix

P = | 0.6  0.3  0.1 |
    | 0.3  0.3  0.4 |
    | 0.4  0.1  0.5 |

If it is known that the process starts in state X0 = 1, determine the probability P [X2 = 2].
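P[X2 = 2 | X0 = 1] is the (1, 2) entry of the two-step matrix P²; a dependency-free check:

```python
# Problem 11: square the transition matrix and read off entry (1, 2).
P = [[0.6, 0.3, 0.1],
     [0.3, 0.3, 0.4],
     [0.4, 0.1, 0.5]]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)
print(P2[1][2])
```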
12. A Markov chain X0, X1, X2, . . . on states 0, 1, 2, 3 has the transition probability matrix

P = | 0.4  0.3  0.2  0.1 |
    | 0.1  0.4  0.3  0.2 |
    | 0.3  0.2  0.1  0.4 |
    | 0.2  0.1  0.4  0.3 |

Suppose that the initial distribution is pi = 1/4 for i = 0, 1, 2, 3. Show that P [Xn = k] = 1/4,
k = 0, 1, 2, 3, for all n. Can you deduce a general result from this example?
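A numerical check of the one-step invariance (the claim for all n then follows by induction):

```python
# Problem 12: verify that the uniform distribution is preserved by one step.
P = [[0.4, 0.3, 0.2, 0.1],
     [0.1, 0.4, 0.3, 0.2],
     [0.3, 0.2, 0.1, 0.4],
     [0.2, 0.1, 0.4, 0.3]]
pi = [0.25] * 4
after = [sum(pi[i] * P[i][j] for i in range(4)) for j in range(4)]
print(after)
```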

13. Consider the problem of sending a binary message, 0 or 1, through a signal channel consisting
of several stages, where transmission through each stage is subject to a fixed probability of
error α. Let X0 be the signal that is sent and let Xn be the signal that is received at the
nth stage. Suppose Xn is a Markov chain with transition probabilities P00 = P11 = 1 − α
and P01 = P10 = α, (0 < α < 1). Determine P[X5 = 0 | X0 = 0], the probability of correct
transmission through five stages.

14. Let Xn denote the quality of the nth item produced by a production system, with Xn = 0
meaning "good" and Xn = 1 meaning "defective." Suppose that Xn evolves as a Markov
chain whose transition probability matrix is

P = | 0.99  0.01 |
    | 0.12  0.88 |

What is the probability that the fourth item is defective given that the first item is defective?
15. A Markov chain Xn on states 0, 1 has the transition probability matrix

P = | α      1 − α |
    | 1 − β  β     |

Then Zn = (Xn−1 , Xn ) is a Markov chain having the four states (0, 0), (0, 1), (1, 0), and
(1, 1). Determine the transition probability matrix.

16. A Markov chain X0, X1, X2, . . . on states 0, 1, 2 has the transition probability matrix

P = | 0.7  0.2  0.1 |
    | 0.3  0.5  0.2 |
    | 0    0    1   |

The Markov chain starts at time zero in state X0 = 0. Let

T = min{n ≥ 0 : Xn = 2}

be the first time that the process reaches state 2. Eventually, the process will reach and
be absorbed into state 2. If in some experiment we observed such a process and noted that
absorption had not yet taken place, we might be interested in the conditional probability that
the process is in state 0 (or 1), given that absorption had not yet taken place. Determine
P[X3 = 0 | X0 = 0, T > 3]. (Hint: The event {T > 3} is exactly the same as the event
{X3 ≠ 2} = {X3 = 0} ∪ {X3 = 1}.)
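Following the hint, the conditional probability can be computed numerically from the three-step transition probabilities out of state 0; a sketch:

```python
# Problem 16: P[X3=0 | X0=0, T>3] = P3(0,0) / (P3(0,0) + P3(0,1)),
# where P3 is the three-step transition matrix.
P = [[0.7, 0.2, 0.1],
     [0.3, 0.5, 0.2],
     [0.0, 0.0, 1.0]]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P3 = matmul(matmul(P, P), P)
ans = P3[0][0] / (P3[0][0] + P3[0][1])
print(ans)
```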

17. Consider a spare parts inventory model in which either 0, 1, or 2 repair parts are demanded
in any period, with

P (ξn = 0) = 0.4, P (ξn = 1) = 0.3, P (ξn = 2) = 0.3,

and suppose s = 0 and S = 3. Determine the transition probability matrix for the Markov
chain {Xn}, where Xn is defined to be the quantity on hand at the end of period n.
18. Consider two urns A and B containing a total of N balls. An experiment is performed in
which a ball is selected at random (all selections equally likely) at time t (t = 1, 2, . . .) from
among the totality of N balls. Then an urn is selected at random (A is chosen with probability
p and B is chosen with probability q) and the ball previously drawn is placed in this urn.
The state of the system at each trial is represented by the number of balls in A. Determine
the transition matrix for this Markov chain.
19. Consider the inventory model. Suppose that S = 3, and that the probability distribution for
demand is P (ξ = 0) = 0.1, P (ξ = 1) = 0.4, P (ξ = 2) = 0.3, and P (ξ = 3) = 0.2. Set up the
corresponding transition probability matrix for the end-of-period inventory level Xn .
20. An urn contains six tags, of which three are red and three green. Two tags are selected from
the urn. If one tag is red and the other is green, then the selected tags are discarded and
two blue tags are returned to the urn. Otherwise, the selected tags are returned to the urn.
This process repeats until the urn contains only blue tags. Let Xn denote the number of red
tags in the urn after the nth draw, with X0 = 3. (This is an elementary model of a chemical
reaction in which red and green atoms combine to form a blue molecule.) Give the transition
probability matrix.

21. Two teams, A and B, are to play a best of seven series of games. Suppose that the outcomes
of successive games are independent, and each is won by A with probability p and won by B
with probability 1 − p. Let the state of the system be represented by the pair (a, b), where
a is the number of games won by A, and b is the number of games won by B. Specify the
transition probability matrix. Note that a + b ≤ 7 and that the series ends whenever a = 4
or b = 4.

22. A component in a system is placed into service, where it operates until its failure, whereupon
it is replaced at the end of the period with a new component having statistically identical
properties, and the process repeats. The probability that a component lasts for k periods
is αk , for k = 1, 2, . . . . Let Xn be the remaining life of the component in service at the
end of period n. Then Xn = 0 means that Xn+1 will be the total operating life of the next
component. Give the transition probabilities for the Markov chain {Xn }.

23. Two urns A and B contain a total of N balls. Assume that at time t there were exactly k
balls in A. At time t + 1 an urn is selected at random in proportion to its contents (i.e.,
A is chosen with probability k/N and B is chosen with probability (N −k)/N ). Then a ball is
selected from A with probability p or from B with probability q and placed in the previously
chosen urn. Determine the transition matrix for this Markov chain.

24. Consider a discrete-time, periodic review inventory model and let ξn be the total demand
in period n, and let Xn be the inventory quantity on hand at the end of period n. An (s, S)
inventory policy is used: If the end-of-period stock is not greater than s, then a quantity is
instantly procured to bring the level up to S. If the end-of-period stock exceeds s, then no
replenishment takes place.

a) Suppose that s = 1, S = 4, and X0 = S = 4. If the period demands turn out to be
ξ1 = 2, ξ2 = 3, ξ3 = 4, ξ4 = 0, ξ5 = 2, ξ6 = 1, ξ7 = 2, ξ8 = 2, what are the end-of-period
stock levels Xn for periods n = 1, 2, . . . , 8?
b ) Suppose that ξ1 , ξ2 , . . . are independent random variables where P (ξn = 0) = 0.1,
P (ξn = 1) = 0.3, P (ξn = 2) = 0.3, P (ξn = 3) = 0.2 and P (ξn = 4) = 0.1. Then
X0 , X1 , . . . is a Markov chain. Determine P41 and P04 .

25. A Markov chain X0, X1, . . . on states 0, 1, 2 has the transition probability matrix

P = | 0.7  0.2  0.1 |
    | 0    0.6  0.4 |
    | 0.5  0    0.5 |

Determine the limiting distribution.
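A generic way to approximate a limiting distribution numerically is to iterate the distribution-update step until it stabilizes; the sketch below does this in plain Python (the exercise presumably expects the exact solution of π = πP, Σπi = 1).

```python
# Problem 25: power iteration on the distribution; the fixed point solves pi = pi P.
P = [[0.7, 0.2, 0.1],
     [0.0, 0.6, 0.4],
     [0.5, 0.0, 0.5]]
pi = [1.0, 0.0, 0.0]   # any starting distribution works for this chain
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]
print(pi)
```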

26. A Markov chain X0, X1, . . . on states 0, 1, 2 has the transition probability matrix

P = | 0.1  0.1  0.8 |
    | 0.2  0.2  0.6 |
    | 0.3  0.3  0.4 |

What fraction of time, in the long run, does the process spend in state 1?

27. A Markov chain X0, X1, . . . on states 0, 1, 2 has the transition probability matrix

P = | 0.3  0.2  0.5 |
    | 0.5  0.1  0.4 |
    | 0.5  0.2  0.3 |

Every period that the process spends in state 0 incurs a cost of $2. Every period that the
process spends in state 1 incurs a cost of $5. Every period that the process spends in state 2
incurs a cost of $3. What is the long run cost per period associated with this Markov chain?

28. A Markov chain X0, X1, . . . on states 0, 1, 2, 3 has the transition probability matrix

P = | 0.1  0.2  0.3  0.4 |
    | 0    0.3  0.3  0.4 |
    | 0    0    0.6  0.4 |
    | 1    0    0    0   |

Determine the corresponding limiting distribution.


29. Suppose that the social classes of successive generations in a family follow a Markov chain
with transition probability matrix given by
                    Son's Class
                 Lower  Middle  Upper
Father's  Lower   0.7    0.2    0.1
Class     Middle  0.2    0.6    0.2
          Upper   0.1    0.4    0.5

What fraction of families are upper class in the long run?


30. Five balls are distributed between two urns, labeled A and B. Each period, an urn is selected
at random, and if it is not empty, a ball from that urn is removed and placed into the other
urn. In the long run, what fraction of time is urn A empty?
31. A Markov chain X0, X1, . . . on states 0, 1, 2, 3, 4, 5 has the transition probability matrix

P = | α1  α2  α3  α4  α5  α6 |
    | 1   0   0   0   0   0  |
    | 0   1   0   0   0   0  |
    | 0   0   1   0   0   0  |
    | 0   0   0   1   0   0  |
    | 0   0   0   0   1   0  |

where αi ≥ 0, i = 1, 2, . . . , 6, and α1 + · · · + α6 = 1. Determine the limiting probability of
being in state 0.
32. Which states are transient and which are recurrent in the Markov chain whose transition
probability matrix is

P = | 1/3  0    1/3  0  0  1/3 |
    | 1/2  1/4  1/4  0  0  0   |
    | 0    0    0    0  1  0   |
    | 1/4  1/4  1/4  0  0  1/4 |
    | 0    0    1    0  0  0   |
    | 0    0    0    0  0  1   |

33. A Markov chain on states {0, 1, 2, 3, 4, 5} has transition probability matrices

a) P = | 1/3  0    2/3  0    0    0   |
       | 0    1/4  0    3/4  0    0   |
       | 2/3  0    1/3  0    0    0   |
       | 0    1/5  0    4/5  0    0   |
       | 1/4  1/4  0    0    1/4  1/4 |
       | 1/6  1/6  1/6  1/6  1/6  1/6 |

b) P = | 1    0    0    0    0    0   |
       | 0    3/4  1/4  0    0    0   |
       | 0    1/8  7/8  0    0    0   |
       | 1/4  1/4  0    1/8  3/8  0   |
       | 1/3  0    1/6  1/4  1/4  0   |
       | 0    0    0    0    0    1   |

Find all communicating classes.
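Communicating classes can be found mechanically from the zero pattern of P: states i and j communicate when each is reachable from the other. A sketch using a transitive-closure pass, shown on matrix (a):

```python
# Communicating classes via reachability over the positive entries of P.
P = [[1/3, 0,   2/3, 0,   0,   0],
     [0,   1/4, 0,   3/4, 0,   0],
     [2/3, 0,   1/3, 0,   0,   0],
     [0,   1/5, 0,   4/5, 0,   0],
     [1/4, 1/4, 0,   0,   1/4, 1/4],
     [1/6, 1/6, 1/6, 1/6, 1/6, 1/6]]
n = len(P)
reach = [[P[i][j] > 0 or i == j for j in range(n)] for i in range(n)]
for k in range(n):                      # Floyd-Warshall style transitive closure
    for i in range(n):
        for j in range(n):
            reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
classes = {frozenset(j for j in range(n)
                     if reach[i][j] and reach[j][i]) for i in range(n)}
print(sorted(sorted(c) for c in classes))
```

The same pass applied to matrix (b), or to the matrix of Problem 32, identifies those classes as well.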

34. Determine the communicating classes and period for each state of the Markov chain whose
transition probability matrix is

P = | 1/2  0  0    0    1/2  0   |
    | 0    0  1    0    0    0   |
    | 0    0  0    1    0    0   |
    | 0    0  0    0    1    0   |
    | 0    0  0    0    0    1   |
    | 0    0  1/3  1/3  0    1/3 |
