
IE 325: Stochastic Models

Recitation 5 Solutions
Spring 2020

Question 1. Let (Nt )t be a Poisson process with rate λ > 0 and let Sn be
the n’th arrival time.
(a) Compute the conditional distribution of N1 given N4 = 7.
(b) Compute E[S4 |N1 = 2].
(c) Compute E[N1 − 2N3 + 3N10 |N4 = 7].
(d) Compute Var(N5 + 3N7 |N3 = 2).
Solution.
(a) We will compute the probability mass function of N1 given that N4 = 7.
For k = 0, 1, ..., 7, we have

P{N1 = k | N4 = 7} = P{N1 = k, N4 = 7} / P{N4 = 7}
                   = P{N1 = k, N4 - N1 = 7 - k} / P{N4 = 7}
                   = P{N1 = k} P{N4 - N1 = 7 - k} / P{N4 = 7}   (by independent increments)
                   = P{N1 = k} P{N3 = 7 - k} / P{N4 = 7}   (by stationary increments)
                   = [e^{-λ} λ^k / k!] [e^{-3λ} (3λ)^{7-k} / (7-k)!] / [e^{-4λ} (4λ)^7 / 7!]
                   = (7! / (k! (7-k)!)) (1/4)^k (3/4)^{7-k}.

Therefore N1 | N4 = 7 ∼ Binomial(7, 1/4).
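A quick Monte Carlo check of this conditional distribution (a Python sketch; the rate λ = 1.3 is an arbitrary choice, since the conditional law does not depend on λ):

    import math
    import numpy as np

    rng = np.random.default_rng(0)
    lam = 1.3                                  # arbitrary rate; the conditional law should not depend on it
    N1 = rng.poisson(lam, 500_000)             # arrivals in (0, 1]
    N4 = N1 + rng.poisson(3 * lam, 500_000)    # add the independent increment over (1, 4]
    kept = N1[N4 == 7]                         # condition on N4 = 7
    for k in range(8):
        emp = np.mean(kept == k)
        theo = math.comb(7, k) * 0.25**k * 0.75**(7 - k)   # Binomial(7, 1/4) pmf
        print(k, round(emp, 4), round(theo, 4))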


(b) Since S4 is a nonnegative random variable, we compute E[S4 | N1 = 2] using the conditional version of the formula E[X] = ∫_0^∞ P{X > s} ds:

E[S4 | N1 = 2] = ∫_0^∞ P{S4 > s | N1 = 2} ds
              = ∫_0^1 P{S4 > s | N1 = 2} ds + ∫_1^∞ P{S4 > s | N1 = 2} ds
              = ∫_0^1 1 ds + ∫_1^∞ P{S4 > s | N1 = 2} ds
              = 1 + ∫_1^∞ P{S4 > s | N1 = 2} ds.

(Given N1 = 2, there are only two arrivals in (0, 1], so S4 > 1 with probability one; this is why the first integrand equals 1.)

For s > 1, we have

P{S4 > s | N1 = 2} = P{S4 > s, N1 = 2} / P{N1 = 2}
                   = P{Ns ≤ 3, N1 = 2} / P{N1 = 2}
                   = [P{Ns = 3, N1 = 2} + P{Ns = 2, N1 = 2}] / P{N1 = 2}
                   = [P{Ns - N1 = 1, N1 = 2} + P{Ns - N1 = 0, N1 = 2}] / P{N1 = 2}
                   = [P{Ns - N1 = 1} P{N1 = 2} + P{Ns - N1 = 0} P{N1 = 2}] / P{N1 = 2}   (by independent increments)
                   = P{Ns - N1 = 1} + P{Ns - N1 = 0}
                   = P{N_{s-1} = 1} + P{N_{s-1} = 0}   (by stationary increments)
                   = e^{-λ(s-1)} [λ(s-1)/1! + (λ(s-1))^0/0!]
                   = e^{-λ(s-1)} (1 + λ(s-1)).

So we have
E[S4 | N1 = 2] = 1 + ∫_1^∞ P{S4 > s | N1 = 2} ds
              = 1 + ∫_1^∞ e^{-λ(s-1)} [λ(s-1)/1! + (λ(s-1))^0/0!] ds
              = 1 + ∫_0^∞ e^{-λu} (λu + 1) du   (after the change of variable u = s - 1)
              = 1 + 2/λ.
Now we provide another method for computing E[S4 | N1 = 2]:

E[S4 | N1 = 2] = 1 + ∫_1^∞ P{S4 > s | N1 = 2} ds
              = 1 + ∫_1^∞ P{Ns ≤ 3 | N1 = 2} ds
              = 1 + ∫_1^∞ P{Ns - N1 ≤ 1 | N1 = 2} ds
              = 1 + ∫_1^∞ P{Ns - N1 ≤ 1} ds   (by independent increments)
              = 1 + ∫_1^∞ P{N_{s-1} ≤ 1} ds   (by stationary increments)
              = 1 + ∫_1^∞ P{S2 > s - 1} ds
              = 1 + ∫_0^∞ P{S2 > u} du   (after the change of variable u = s - 1)
              = 1 + E[S2] = 1 + 2/λ.
Note that one can give a shorter argument using the memoryless property of the exponential interarrival times: given N1 = 2, the waiting time after time 1 for two further arrivals is again a sum of two Exp(λ) variables, so E[S4 | N1 = 2] = 1 + 2/λ.
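A short simulation sketch of this result (the rate λ = 2 is an arbitrary choice; the conditioning on N1 = 2 is done by rejection):

    import numpy as np

    rng = np.random.default_rng(1)
    lam = 2.0                                        # arbitrary rate
    gaps = rng.exponential(1 / lam, (300_000, 10))   # interarrival times; 10 per path is plenty here
    s = np.cumsum(gaps, axis=1)                      # arrival times S1, S2, ...
    keep = (s <= 1.0).sum(axis=1) == 2               # paths on which N1 = 2
    print(s[keep, 3].mean(), 1 + 2 / lam)            # simulated E[S4 | N1 = 2] vs 1 + 2/λ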

(c) We have

E[N1 −2N3 +3N10 |N4 = 7] = E[N1 |N4 = 7]−2E[N3 |N4 = 7]+3E[N10 |N4 = 7].

Since, given N4 = 7, N1 is Binomial(7, 1/4) and N3 is Binomial(7, 3/4), we have E[N1 | N4 = 7] = (1/4) × 7 = 7/4 and E[N3 | N4 = 7] = (3/4) × 7 = 21/4.

We have

E[N10 | N4 = 7] = E[(N10 - N4) + N4 | N4 = 7]
               = E[(N10 - N4) + 7 | N4 = 7]
               = 7 + E[N10 - N4 | N4 = 7]
               = 7 + E[N10 - N4]   (by independent increments)
               = 7 + E[N6]   (by stationary increments)
               = 7 + 6λ.

Therefore,

E[N1 - 2N3 + 3N10 | N4 = 7] = 7/4 - 2 × 21/4 + 3(7 + 6λ) = 49/4 + 18λ.

(d) One can compute E[N5 + 3N7 | N3 = 2] and E[(N5 + 3N7)^2 | N3 = 2] and then use Var(N5 + 3N7 | N3 = 2) = E[(N5 + 3N7)^2 | N3 = 2] - (E[N5 + 3N7 | N3 = 2])^2. We present a shorter computation here, using the fact that Var(X + Y) = Var(X) + Var(Y) when X and Y are independent:

Var(N5 + 3N7 | N3 = 2) = Var(3(N7 - N5) + 4(N5 - N3) + 4N3 | N3 = 2)
                       = Var(3(N7 - N5) | N3 = 2) + Var(4(N5 - N3) | N3 = 2) + Var(4N3 | N3 = 2)   (by independent increments)
                       = Var(3(N7 - N5) | N3 = 2) + Var(4(N5 - N3) | N3 = 2) + 0   (since constants have zero variance)
                       = Var(3(N7 - N5)) + Var(4(N5 - N3))   (by independent increments)
                       = 9 Var(N7 - N5) + 16 Var(N5 - N3)
                       = 9 Var(N2) + 16 Var(N2)   (by stationary increments)
                       = 9(2λ) + 16(2λ)
                       = 18λ + 32λ = 50λ.
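A simulation sketch of this conditional variance (λ = 1.7 is arbitrary; the increments over (0, 3], (3, 5] and (5, 7] are generated independently and the condition N3 = 2 is imposed by rejection):

    import numpy as np

    rng = np.random.default_rng(2)
    lam, n = 1.7, 400_000                 # arbitrary rate and sample size
    n3 = rng.poisson(3 * lam, n)          # N3
    a = rng.poisson(2 * lam, n)           # N5 - N3
    b = rng.poisson(2 * lam, n)           # N7 - N5
    keep = n3 == 2                        # condition on N3 = 2
    x = (n3 + a)[keep] + 3 * (n3 + a + b)[keep]   # N5 + 3 N7 on that event
    print(x.var(), 50 * lam)              # both should be close to 50 λ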

Question 2. Let (Nt )t be a Poisson process with rate λ > 0. Suppose that
each arrival represents a customer arriving at a store. The k’th customer

spends Yk, k ∈ N. We assume (Yk)k∈N and (Nt)t∈R+ are independent. Moreover, we assume that Y1, Y2, ... are i.i.d. with common mean µ and common variance σ^2. For each t ∈ R+, let Zt be the total amount spent by the customers that arrive during (0, t):

Zt = ∑_{k=1}^{Nt} Yk.

The process (Zt )t∈R+ is called a compound Poisson process.

(a) Compute E[Zt] and E[Zt^2].

(b) Now consider a more realistic situation for longer-term planning. Suppose that the time value of money is important and 1 TL at time t is worth e^{-rt} TL at time zero, where r > 0 is the continuously compounded annual interest rate. We then calculate the present worth of the money spent during (0, t) as
Z̃t = ∑_{k=1}^{Nt} Yk e^{-rSk}.

Compute E[Z̃t].

Solution.

(a) We will compute E[Zt] by conditioning on Nt:

E[Zt | Nt = k] = E[∑_{j=1}^{Nt} Yj | Nt = k]
              = E[∑_{j=1}^{k} Yj | Nt = k]
              = E[Y1 + Y2 + ... + Yk | Nt = k]
              = E[Y1 + Y2 + ... + Yk]   (since the processes (Yi)i and (Nt)t are independent)
              = k E[Y1]
              = kµ.

So we have

E[Zt] = ∑_{k=0}^{∞} E[Zt | Nt = k] P{Nt = k}
      = ∑_{k=0}^{∞} kµ P{Nt = k}
      = µ ∑_{k=0}^{∞} k P{Nt = k}
      = µ E[Nt]
      = λtµ.

We will compute E[Zt^2] in the same way. Let's start by conditioning on Nt:

E[Zt^2 | Nt = k] = E[(∑_{j=1}^{Nt} Yj)^2 | Nt = k]
                = E[(∑_{j=1}^{k} Yj)^2 | Nt = k]
                = E[(Y1 + Y2 + ... + Yk)^2 | Nt = k]
                = E[(Y1 + Y2 + ... + Yk)^2]   (since the processes (Yi)i and (Nt)t are independent)
                = E[Y1^2 + Y2^2 + ... + Yk^2 + ∑_{i≠j} Yi Yj]
                = E[Y1^2] + E[Y2^2] + ... + E[Yk^2] + ∑_{i≠j} E[Yi Yj]
                = E[Y1^2] + E[Y2^2] + ... + E[Yk^2] + ∑_{i≠j} E[Yi] E[Yj]   (since the Yi's are independent)
                = k(Var(Y1) + E[Y1]^2) + k(k - 1) E[Y1] E[Y1]
                = k(σ^2 + µ^2) + k(k - 1)µ^2
                = kσ^2 + k^2 µ^2.

So we have

E[Zt^2] = ∑_{k=0}^{∞} E[Zt^2 | Nt = k] P{Nt = k}
        = ∑_{k=0}^{∞} (kσ^2 + k^2 µ^2) P{Nt = k}
        = σ^2 ∑_{k=0}^{∞} k P{Nt = k} + µ^2 ∑_{k=0}^{∞} k^2 P{Nt = k}
        = E[Nt] σ^2 + E[Nt^2] µ^2
        = E[Nt] σ^2 + (Var(Nt) + E[Nt]^2) µ^2
        = (λt)σ^2 + (λt + (λt)^2)µ^2.

From these two we can compute

Var(Zt) = E[Zt^2] - E[Zt]^2 = (λt)σ^2 + (λt + (λt)^2)µ^2 - (λtµ)^2 = λt(σ^2 + µ^2).
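A simulation sketch that checks the mean and variance of Zt (all parameter values are arbitrary, and the spends Yk are taken to be normal with mean µ and standard deviation σ purely as an example):

    import numpy as np

    rng = np.random.default_rng(3)
    lam, t, mu, sigma = 2.0, 3.0, 1.5, 0.5          # arbitrary parameters
    counts = rng.poisson(lam * t, 100_000)          # N_t for each replication
    z = np.array([rng.normal(mu, sigma, k).sum() for k in counts])   # Z_t = sum of the N_t spends
    print(z.mean(), lam * t * mu)                   # E[Z_t] = λ t µ
    print(z.var(), lam * t * (sigma**2 + mu**2))    # Var(Z_t) = λ t (σ^2 + µ^2)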

(b) Let's first try computing E[Z̃t] in the same way, by conditioning on Nt:

E[Z̃t | Nt = k] = E[∑_{j=1}^{Nt} Yj e^{-rSj} | Nt = k]
               = E[∑_{j=1}^{k} Yj e^{-rSj} | Nt = k]
               ≠ E[∑_{j=1}^{k} Yj e^{-rSj}]

(although the processes (Yi)i and (Nt)t are independent, the arrival times (Sj)j and (Nt)t are not independent, so the conditioning cannot simply be dropped here).

Yet one can argue as follows. The quantity Z̃t = ∑_{j=1}^{Nt} Yj e^{-rSj} is a random sum, that is, a summation of random variables in which the upper limit of the summation is also random. To remove the randomness coming from the limit we previously conditioned on Nt. Another way to make the upper limit deterministic is to encode the randomness of the upper limit in the summand, namely

Z̃t = ∑_{j=1}^{∞} Yj e^{-rSj} 1{j ≤ Nt}.
Then we have
E[Z̃t] = E[∑_{j=1}^{∞} Yj e^{-rSj} 1{j ≤ Nt}]
      = ∑_{j=1}^{∞} E[Yj e^{-rSj} 1{j ≤ Nt}]   (the interchange is justified since all terms are nonnegative)
      = ∑_{j=1}^{∞} E[Yj e^{-rSj} 1{Sj ≤ t}]   (since {j ≤ Nt} = {Sj ≤ t})
      = ∑_{j=1}^{∞} E[Yj] E[e^{-rSj} 1{Sj ≤ t}]   (since the processes (Yi)i and (Sj)j are independent)
      = µ ∑_{j=1}^{∞} E[e^{-rSj} 1{Sj ≤ t}]
      = µ ∑_{j=1}^{∞} ∫_0^t e^{-rs} λe^{-λs} (λs)^{j-1}/(j-1)! ds   (Sj has the Gamma(j, λ) density)
      = µ ∫_0^t e^{-rs} λ ∑_{j=1}^{∞} e^{-λs} (λs)^{j-1}/(j-1)! ds
      = µ ∫_0^t e^{-rs} λ · 1 ds   (the sum runs over the probability mass function of a Poisson variable with parameter λs)
      = µ ∫_0^t λ e^{-rs} ds
      = λµ (1 - e^{-rt}) / r.

We now provide another method for computing E[Z̃t].

This method is based on the fact that, given Nt = k, the unordered arrival times of the customers (we do not refer to the Sj's here) are independent Uniform(0, t) variables. That is, let U1, ..., Uk be independent, identically distributed Uniform(0, t) variables and let U(1), ..., U(k) be their ordered versions:

U(1) = min{Uj : j = 1, ..., k} = min(U1, ..., Uk),

U(2) = min{Uj : Uj > U(1), j = 1, ..., k},

and so on. Notice that, given Nt = k, (S1, ..., Sk) has the same joint distribution as (U(1), ..., U(k)). So we have

E[Z̃t | Nt = k] = E[∑_{j=1}^{Nt} Yj e^{-rSj} | Nt = k]
               = E[∑_{j=1}^{k} Yj e^{-rSj} | Nt = k]
               = E[∑_{j=1}^{k} Yj e^{-rU(j)}]
                 (since the (Yi)i are independent of (Nt)t and, given Nt = k, (S1, ..., Sk) is distributed as (U(1), ..., U(k)))
               = ∑_{j=1}^{k} E[Yj e^{-rU(j)}]
               = ∑_{j=1}^{k} E[Yj] E[e^{-rU(j)}]   (since the (Yi)i and the (Uj)j are independent)
               = µ ∑_{j=1}^{k} E[e^{-rU(j)}]
               = µ E[∑_{j=1}^{k} e^{-rU(j)}]
               = µ E[∑_{j=1}^{k} e^{-rUj}]   (since we are summing over all of the Uj's, the order does not matter)
               = µ ∑_{j=1}^{k} E[e^{-rUj}]
               = µ k E[e^{-rU1}]
               = µ k (1/t) (1 - e^{-rt})/r.

Replacing k by Nt we have

E[Z̃t | Nt] = Nt µ (1/t) (1 - e^{-rt})/r.

So we have

E[Z̃t] = E[E[Z̃t | Nt]]
      = E[Nt] µ (1/t) (1 - e^{-rt})/r
      = λtµ (1/t) (1 - e^{-rt})/r
      = λµ (1 - e^{-rt})/r.
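A simulation sketch of E[Z̃t] (arbitrary parameters; given Nt = k the arrival times are generated as uniforms on (0, t), which is exactly the fact used in the second method, and the spends are exponential with mean µ as an example):

    import numpy as np

    rng = np.random.default_rng(4)
    lam, t, mu, r, n = 2.0, 3.0, 1.5, 0.4, 100_000   # arbitrary parameters
    tot = np.empty(n)
    for i in range(n):
        k = rng.poisson(lam * t)                 # number of arrivals in (0, t)
        s = rng.uniform(0, t, k)                 # their (unordered) arrival times
        y = rng.exponential(mu, k)               # spends with mean µ (example choice)
        tot[i] = np.sum(y * np.exp(-r * s))      # discounted total spend
    print(tot.mean(), lam * mu * (1 - np.exp(-r * t)) / r)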

Question 3. On a weekday, buses arrive at a certain stop according to a Poisson process with rate λ = 2 per hour. A regular weekday is assumed to start at 6:00 am.

(a) What is the probability that the tenth bus arrives before noon?

(b) What is the expected number of buses that arrive between noon and
3:00 pm?

(c) What is the probability that the time between the fifth and the tenth
bus is more than three hours?

(d) Given that exactly 10 buses have arrived before noon, what is the
probability that all of these buses have arrived before 11:00 am?

(e) Given that exactly 10 buses have arrived before noon, what is the
expected arrival time of the 10th bus?

Solution. Let Nt be the number of buses that arrive in the first t hours after 6:00 am. Then (Nt)t≥0 is a Poisson process with rate λ = 2. Let Si denote the arrival time of the i-th bus.

(a)

P{the 10th bus arrives before noon} = P{the 10th bus arrives within the first 6 hours}
                                    = P{S10 ≤ 6}
                                    = P{N6 ≥ 10}
                                    = 1 - ∑_{k=0}^{9} e^{-(2·6)} (2·6)^k / k!
                                    = 1 - e^{-12} ∑_{k=0}^{9} 12^k / k!.
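The sum has no simple closed form; a two-line Python evaluation of the probability:

    from math import exp, factorial

    p = 1 - sum(exp(-12) * 12**k / factorial(k) for k in range(10))
    print(p)   # P{S10 <= 6} = P{N6 >= 10}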

(b) By the stationary increments property of the Poisson process,

E[N9 - N6] = E[N3] = 3 · 2 = 6.

(c)

P{S10 - S5 > 3} = P{N_{S5+3} - N_{S5} ≤ 4}
                = ∫_0^∞ P{N_{t+3} - N_t ≤ 4 | S5 = t} f_{S5}(t) dt
                = ∫_0^∞ P{N3 ≤ 4} f_{S5}(t) dt   (by stationary and independent increments)
                = P{N3 ≤ 4}
                = ∑_{k=0}^{4} e^{-6} 6^k / k!
                = 115 e^{-6}.
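A quick simulation check of this probability:

    import numpy as np

    rng = np.random.default_rng(5)
    gaps = rng.exponential(1 / 2.0, (200_000, 10))            # interarrival times, rate 2 per hour
    s = np.cumsum(gaps, axis=1)                               # arrival times S1, ..., S10
    print(np.mean(s[:, 9] - s[:, 4] > 3), 115 * np.exp(-6))   # P{S10 - S5 > 3} vs 115 e^{-6}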

(d)

P{exactly 10 buses arrive before 11:00 am | exactly 10 buses arrive before noon}
  = P{exactly 10 arrivals in 5 hours | exactly 10 arrivals in 6 hours}
  = P{N5 = 10 | N6 = 10}
  = P{N5 = 10, N6 = 10} / P{N6 = 10}
  = P{N5 = 10, N6 - N5 = 0} / P{N6 = 10}
  = P{N5 = 10} P{N6 - N5 = 0} / P{N6 = 10}   (by independent increments)
  = P{N5 = 10} P{N1 = 0} / P{N6 = 10}   (by stationary increments)
  = [e^{-10} 10^{10}/10!] [e^{-2} 2^0/0!] / [e^{-12} 12^{10}/10!]
  = (5/6)^{10}.

(e) We are asked to compute E[S10 |N6 = 10]. Note that S10 is a non-
negative random variable. We have

E[S10 | N6 = 10] = ∫_0^∞ P{S10 > t | N6 = 10} dt.

Let's compute P{S10 > t | N6 = 10}.

For t > 6, P{S10 > t | N6 = 10} = 0. (The tenth arrival has already occurred by time 6, so the arrival time of the tenth bus cannot exceed t when t > 6.)

For t ≤ 6, we have

P{S10 > t | N6 = 10} = 1 - P{S10 ≤ t | N6 = 10}
                     = 1 - P{Nt = 10 | N6 = 10}
                     = 1 - (t/6)^{10},

since, given N6 = 10, Nt is Binomial(10, t/6) (as in Question 1(a)), so P{Nt = 10 | N6 = 10} = (t/6)^{10}.

Then,

E[S10 | N6 = 10] = ∫_0^∞ P{S10 > t | N6 = 10} dt
                 = ∫_0^6 (1 - (t/6)^{10}) dt
                 = 6 - 6/11
                 = 60/11.
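A rejection-sampling sketch of this conditional expectation (60/11 ≈ 5.45 hours after 6:00 am, i.e. roughly 11:27 am):

    import numpy as np

    rng = np.random.default_rng(6)
    gaps = rng.exponential(1 / 2.0, (400_000, 20))   # 20 interarrival times per path is plenty here
    s = np.cumsum(gaps, axis=1)
    keep = (s <= 6).sum(axis=1) == 10                # paths with exactly 10 arrivals before noon
    print(s[keep, 9].mean(), 60 / 11)                # E[S10 | N6 = 10]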

Question 4. A car needs frequent maintenance to work properly. The maintenance times occur according to a Poisson process with rate µ. Once the car receives no maintenance for a time interval of length h, it breaks down. It then needs to be repaired; the repair time is exponentially distributed with mean 1/λ, after which the car goes back to working properly.

(a) After the car starts working, find the probability that it will break down before receiving its first maintenance.

(b) Find the expected time for the first breakdown.

Solution. Let Nt be the number of maintenances that the car receives during [0, t). Then {Nt, t ≥ 0} is a Poisson process with rate µ.

(a) Let Ti denote the interarrival time between the (i - 1)-th and the i-th maintenance. Then Ti ∼ Exp(µ), and

P{the car breaks down before its first maintenance} = P{the first maintenance arrives later than h}
                                                    = P{T1 > h}
                                                    = ∫_h^∞ µe^{-µt} dt
                                                    = e^{-µh}.

(b) Let TB be the time of the first breakdown. Conditioning on the first maintenance time T1,

E[TB] = ∫_0^∞ E[TB | T1 = s] f_{T1}(s) ds
      = ∫_0^h E[TB | T1 = s] f_{T1}(s) ds + ∫_h^∞ E[TB | T1 = s] f_{T1}(s) ds
      = ∫_0^h (s + E[TB]) f_{T1}(s) ds + ∫_h^∞ h f_{T1}(s) ds

(if the first maintenance arrives at time s ≤ h, the process restarts from scratch at time s; if it arrives later than h, the car has already broken down at time h)

      = ∫_0^h s f_{T1}(s) ds + E[TB] ∫_0^h f_{T1}(s) ds + h ∫_h^∞ f_{T1}(s) ds
      = (1 - e^{-µh}(µh + 1))/µ + E[TB] P{T1 ≤ h} + h P{T1 > h}
      = (1 - e^{-µh}(µh + 1))/µ + E[TB](1 - e^{-µh}) + h e^{-µh}.

Solving for E[TB] we get

E[TB] = (1 - e^{-µh}) / (µ e^{-µh}) = (e^{µh} - 1)/µ.
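A simulation sketch of the time to the first breakdown (µ and h are arbitrary; the repair time plays no role in this part):

    import numpy as np

    rng = np.random.default_rng(7)
    mu, h, n = 1.5, 1.0, 100_000          # arbitrary parameters
    times = np.empty(n)
    for i in range(n):
        t = 0.0
        while True:
            g = rng.exponential(1 / mu)   # time until the next maintenance
            if g > h:                     # a maintenance-free stretch of length h: breakdown
                times[i] = t + h
                break
            t += g                        # maintenance arrives in time; the clock restarts
    print(times.mean(), (np.exp(mu * h) - 1) / mu)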
Question 5. Cars pass a certain street location according to a Poisson pro-
cess with rate λ. A person wanting to cross the street at that location waits
until she can see that no cars will come by in the next T time units. (Note,
for instance, that if no cars will be passing in the first T time units then the
waiting time is 0.)
(a) What is the probability that the waiting time of the person is 0?
(b) Find the expected time that the person waits before starting to cross
the street.
Solution.
(a)
P{waiting time is 0} = P{no car in the first T time units}
                     = e^{-λT} (λT)^0 / 0!
                     = e^{-λT}.

(b) Let W be the waiting time and X be the time until the first car arrival.

E[W] = ∫_0^∞ E[W | X = x] f_X(x) dx
     = ∫_0^T E[W | X = x] λe^{-λx} dx + ∫_T^∞ E[W | X = x] λe^{-λx} dx
     = ∫_0^T (E[W] + x) λe^{-λx} dx

(if the first car arrives at time x > T, the person crosses immediately and W = 0; if it arrives at x ≤ T, she waits x and then faces a fresh copy of the problem)

     = E[W] ∫_0^T λe^{-λx} dx + ∫_0^T λx e^{-λx} dx
     = E[W](1 - e^{-λT}) + (1/λ)(1 - e^{-λT}(1 + λT)).

So

E[W] e^{-λT} = (1/λ)(1 - e^{-λT}(1 + λT)).

Hence

E[W] = (1/λ)[e^{λT} - (1 + λT)].
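A simulation sketch of the expected waiting time (λ and T are arbitrary):

    import numpy as np

    rng = np.random.default_rng(8)
    lam, T, n = 1.2, 1.0, 100_000
    waits = np.empty(n)
    for i in range(n):
        w = 0.0
        while True:
            g = rng.exponential(1 / lam)  # time until the next car
            if g > T:                     # a car-free window of length T: she starts crossing
                waits[i] = w
                break
            w += g                        # a car passes within T time units; keep waiting
    print(waits.mean(), (np.exp(lam * T) - 1 - lam * T) / lam)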

Question 6. All the customers arrive at a coffee shop according to a Poisson


process with arrival rate λ = 1/6. 40% of these customers are full-service
customers and the rest are self-service customers.

(a) What is the probability that fewer than 3 self-service customers arrive
during any 60-minute period the shop is open?

(b) What is the expected number of full-service customers to arrive during


any 60-minute period?

(c) Suppose we know that 12 customers arrived during the last hour. What
is the probability that 3 of them were self-service customers?

Solution.

Let Nt be the number of customers arriving at the coffee shop during (0, t); (Nt) is a Poisson process with rate λ = 1/6.

Let N^f_t be the number of full-service customers arriving during (0, t); (N^f_t) is a Poisson process with rate λf = (1/6) · 0.4 = 1/15.

Let N^s_t be the number of self-service customers arriving during (0, t); (N^s_t) is a Poisson process with rate λs = (1/6) · 0.6 = 1/10.

By the decomposition (thinning) theorem, (N^f_t) ⊥⊥ (N^s_t).

(a)
P{N^s_{t+60} - N^s_t < 3} = P{N^s_60 < 3}   (by stationary increments)
                          = P{N^s_60 ≤ 2}   (N^s_60 is Poisson with mean λs · 60 = (1/10) · 60 = 6)
                          = ∑_{j=0}^{2} e^{-6} 6^j / j!.

(b) E[N^f_{t+60} - N^f_t] = E[N^f_60] = 4, since N^f_60 ∼ Poisson(λf · 60) = Poisson(4).

(c)
P{N^s_{t+60} - N^s_t = 3 | N_{t+60} - N_t = 12} = P{N^s_60 = 3 | N_60 = 12}

by the stationary increments property.

Note that N^s_60 given N_60 = 12 is Binomial(n = 12, p = 0.6). Then

P{N^s_60 = 3 | N_60 = 12} = (12! / (3! 9!)) (0.6)^3 (0.4)^9.
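A simulation sketch checking parts (a) and (c) via thinning (each customer is independently self-service with probability 0.6):

    import math
    import numpy as np

    rng = np.random.default_rng(9)
    total = rng.poisson(10, 500_000)          # N_60: Poisson with mean (1/6) * 60 = 10
    self_serv = rng.binomial(total, 0.6)      # each customer is self-service with probability 0.6
    # (a) fewer than 3 self-service customers in an hour
    print(np.mean(self_serv < 3), sum(math.exp(-6) * 6**j / math.factorial(j) for j in range(3)))
    # (c) exactly 3 self-service customers given 12 customers in total
    keep = total == 12
    print(np.mean(self_serv[keep] == 3), math.comb(12, 3) * 0.6**3 * 0.4**9)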
