
ECE 534: Random Processes

Spring 2016

Problem Set 5
Due Date: 4/6
1. Problem 1
(a)
$\mathrm{Var}(Y_n) = E[Y_n^2] - (E[Y_n])^2 = 2^n - 1$
(b) Note that
$Y_n = \prod_{k=1}^{n} X_k = Y_{n-1} X_n, \quad n \ge 2.$
Then, for $n \ge 2$,
$E[Y_n \mid Y_1, \dots, Y_{n-1}] = E[Y_{n-1} X_n \mid Y_1, \dots, Y_{n-1}] = Y_{n-1} E[X_n] = Y_{n-1},$
since $X_n$ is independent of $Y_1, \dots, Y_{n-1}$. When $n = 1$, $E[Y_1] = E[X_1] = 1$.
(c) From part (b),
$E[Y_n \mid Y_1, \dots, Y_{n-1}] = \begin{cases} Y_{n-1}, & n \ge 2 \\ 1, & n = 1 \end{cases}$
(d) Linear innovation sequence:
$\widetilde{Y}_1 = Y_1 - E[Y_1] = Y_1 - 1$
$\widetilde{Y}_n = Y_n - E[Y_n \mid Y_1, \dots, Y_{n-1}] = Y_n - Y_{n-1}, \quad n \ge 2.$
(e) Note that $E[Z] = m \ne 0$. Then,
$\hat{E}[Z \mid Y_1, \dots, Y_m] = \hat{E}[Z \mid \widetilde{Y}_1, \dots, \widetilde{Y}_m] = E[Z] + \sum_{i=1}^{m} \hat{E}[Z - E[Z] \mid \widetilde{Y}_i] = m + \sum_{i=1}^{m} \mathrm{Cov}(Z, \widetilde{Y}_i)\,\mathrm{Cov}(\widetilde{Y}_i)^{-1}\, \widetilde{Y}_i$
It can be checked that
$\mathrm{Cov}(Z, \widetilde{Y}_i) = 1, \qquad \mathrm{Cov}(\widetilde{Y}_i) = 2^{i-1}$
Hence,
$\hat{E}[Z \mid Y_1, \dots, Y_m] = m + \sum_{i=1}^{m} 2^{-(i-1)} \widetilde{Y}_i = m + (Y_1 - 1) + \sum_{i=2}^{m} 2^{-(i-1)} (Y_i - Y_{i-1})$
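The moment formulas in parts (a)-(b) are easy to spot-check by simulation. A minimal sketch, assuming the concrete choice $X_k \sim \mathrm{Exp}(1)$ (one distribution with $E[X_k] = 1$ and $E[X_k^2] = 2$, matching the setup); we expect $E[Y_3] = 1$ and $\mathrm{Var}(Y_3) = 2^3 - 1 = 7$:

```python
# Monte Carlo check of E[Y_n] = 1 and Var(Y_n) = 2^n - 1 for n = 3.
# Assumption: X_k ~ Exp(1), our illustrative choice with E[X] = 1, E[X^2] = 2.
import random

random.seed(1)
n, trials = 3, 500_000
samples = []
for _ in range(trials):
    y = 1.0
    for _ in range(n):
        y *= random.expovariate(1.0)  # multiply in one factor X_k
    samples.append(y)

mean = sum(samples) / trials
var = sum((y - mean) ** 2 for y in samples) / trials
print(round(mean, 2), round(var, 1))  # expect about 1.0 and 7.0
```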
2. Problem 2


(a) No. Let $f_S(s)$ be the density of $S$. Denote by $f_{X_2|X_1}(x_2|x_1)$ and $f_{X_2|X_1,X_0}(x_2|x_1,x_0)$ the conditional densities. We choose $x_0 = 5$, $x_1 = 4$, $x_2 = 1$. Then,
$f_{X_2|X_1}(x_2|x_1) = f_S(-3) > 0$
However,
$f_{X_2|X_1,X_0}(x_2|x_1,x_0) = \delta(x_2 - 3) = 0 \ne f_{X_2|X_1}(x_2|x_1)$
Hence, X is not a Markov process.
(b) Yes. At each time $t$, there is a one-to-one mapping between $X_t$ and $N_t$. Therefore, $X_t$ and $N_t$ have equivalent information. Since $N$ is Markov, so is $X$. Furthermore,
$p_{ij}(s,t) = P(X_t = j \mid X_s = i) = \begin{cases} P(N_t = j \mid N_s = i), & \text{if } i, j \in \mathbb{N} \\ 0, & \text{if } i \in \mathbb{N},\ j \notin \mathbb{N} \\ \text{not defined}, & \text{otherwise} \end{cases}$
When $i, j \in \mathbb{N}$,
$p_{ij}(s,t) = P(N_t = j \mid N_s = i) = P(N_t - N_s = j - i) = \frac{\lambda^k (t-s)^k e^{-\lambda(t-s)}}{k!}, \qquad k = j - i.$
Since $p_{ij}(s,t)$ depends on $s$ and $t$ only through $t - s$, $X$ is a time-homogeneous Markov process.
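The Poisson increment formula can be verified numerically. A quick sketch; the rate $\lambda = 2$ and gap $t - s = 0.7$ are arbitrary illustration values:

```python
# Compare P(N_t - N_s = k) estimated by simulation against
# lam^k (t-s)^k exp(-lam (t-s)) / k! for a rate-lam Poisson process.
import math, random

random.seed(0)
lam, dt, trials = 2.0, 0.7, 200_000

def poisson_count(rate, horizon):
    """Count arrivals in (0, horizon] by summing Exp(rate) inter-arrival gaps."""
    t, n = random.expovariate(rate), 0
    while t <= horizon:
        n += 1
        t += random.expovariate(rate)
    return n

counts = [poisson_count(lam, dt) for _ in range(trials)]
for k in range(5):
    empirical = counts.count(k) / trials
    formula = (lam * dt) ** k * math.exp(-lam * dt) / math.factorial(k)
    print(k, round(empirical, 3), round(formula, 3))
```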
3. Problem 3
(a) Since $N(t-1, t]$ is a Poisson random variable with mean $\lambda$, $E[X_t] = \lambda$. If $|s - t| \ge 1$, then $X_s$ is independent of $X_t$. If $|s - t| < 1$, then the intervals $(s-1, s]$ and $(t-1, t]$ have an overlap of length $1 - |s - t|$. For example, if $0 \le t - s < 1$, then $s - 1 \le t - 1 < s \le t$, and $X_s = N(s-1, t-1] + N(t-1, s]$, and $X_t = N(t-1, s] + N(s, t]$. Since the three random variables $N(s-1, t-1]$, $N(t-1, s]$ and $N(s, t]$ are independent,
$C_X(s,t) = \mathrm{Cov}(N(s-1, t-1] + N(t-1, s],\ N(t-1, s] + N(s, t]) = \mathrm{Var}(N(t-1, s]) = \lambda(s - t + 1) = \lambda(1 - |t - s|)$
A similar computation shows $C_X(s,t) = \lambda(1 - |t - s|)$ if $0 \le s - t < 1$ as well. Thus, $X$ is wide sense stationary, and $C_X(\tau) = \lambda(1 - |\tau|)^+$ for all $\tau$.
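The triangular covariance can also be seen in simulation. A sketch with the arbitrary choices $\lambda = 3$ and the pairs $(s, t) = (0, 0.4)$ (overlap $0.6$) and $(0, 1.5)$ (no overlap):

```python
# Monte Carlo estimate of Cov(X_s, X_t) for X_u = N(u-1, u], N Poisson(lam).
# Expect lam * (1 - |t - s|)^+ : here 3 * 0.6 = 1.8 and 0 respectively.
import random

random.seed(2)
lam, trials = 3.0, 100_000

def count_in(arrivals, a, b):
    # number of arrival times falling in the interval (a, b]
    return sum(1 for x in arrivals if a < x <= b)

def cov_estimate(s, t):
    xs, xt = [], []
    for _ in range(trials):
        # generate Poisson arrivals on (min(s,t)-1, max(s,t)]
        lo, hi = min(s, t) - 1.0, max(s, t)
        u, arrivals = lo + random.expovariate(lam), []
        while u <= hi:
            arrivals.append(u)
            u += random.expovariate(lam)
        xs.append(count_in(arrivals, s - 1.0, s))
        xt.append(count_in(arrivals, t - 1.0, t))
    ms, mt = sum(xs) / trials, sum(xt) / trials
    return sum((a - ms) * (b - mt) for a, b in zip(xs, xt)) / trials

c_overlap = cov_estimate(0.0, 0.4)   # expect about 1.8
c_disjoint = cov_estimate(0.0, 1.5)  # expect about 0
print(round(c_overlap, 2), round(c_disjoint, 2))
```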
(b) Yes and yes. Not only is $X$ WSS as found in part (a), but it is stationary. Starting with the first-order distribution: clearly $X_t = N(t-1, t]$ has the same $Poi(\lambda)$ distribution as $X_{t+\tau} = N(t+\tau-1, t+\tau]$ for any $\tau$. For higher-order distributions, consider the joint distribution of $(X_t, X_s)$: if we shift time by $\tau$, the joint distribution of $(X_{t+\tau}, X_{s+\tau})$ is the same, due to the facts that (i) if $X_t$ and $X_s$ have no overlap (i.e. $|t - s| \ge 1$), the joint distribution is just the product of the marginal distributions, and this does not change after a time shift; and (ii) if they do have overlap, the distribution corresponding to the overlap does not change after the time shift. Thus the joint distribution is invariant to time shifts, and the same logic applies to all higher-order distributions.
(c) No, X is not Markov. For example, think of t = 1 as the present time. Given X1 = 1,
we know there is one customer in the system at time 1. But if we also knew the past
of X, we would know when that customer arrived, so that we would know exactly when
that customer will depart. Therefore, the past tells us more about the future than the
present alone.
(d) The system is empty during $[0, 1]$ if and only if there are no arrivals during $(-1, 1]$. So
$P\{X_t = 0 \text{ for } t \in [0,1]\} = P\{N(-1, 1] = 0\} = e^{-2\lambda}.$
(e) The event is equivalent to there being at least one customer in the system at time zero, and a new customer arriving after time zero before the departure of the last customer to arrive before time zero. Let $T$ be the time of the first arrival after time zero, and let $-S$ be the time of the last arrival before time zero. Then $S$ and $T$ are independent, exponentially distributed random variables, and
$P\{X_t > 0 \text{ for } t \in [0,1]\} = P\{S + T \le 1\} = \int_0^1 \int_0^{1-t} \lambda^2 e^{-\lambda(s+t)}\, ds\, dt = 1 - (1 + \lambda)e^{-\lambda}.$
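A quick Monte Carlo check of the closed form $P\{S + T \le 1\} = 1 - (1 + \lambda)e^{-\lambda}$; the value $\lambda = 1.5$ is an arbitrary illustration choice:

```python
# Estimate P{S + T <= 1} for independent S, T ~ Exp(lam) and compare with
# the closed form 1 - (1 + lam) * exp(-lam) derived above.
import math, random

random.seed(3)
lam, trials = 1.5, 400_000
hits = sum(random.expovariate(lam) + random.expovariate(lam) <= 1.0
           for _ in range(trials))
empirical = hits / trials
closed_form = 1.0 - (1.0 + lam) * math.exp(-lam)
print(round(empirical, 3), round(closed_form, 3))
```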
4. Problem 4
(a)
$E[\tau_B] = E[\tau_B \mid X_2 = 1]P(X_2 = 1 \mid X_0 = 1) + E[\tau_B \mid X_2 = 3]P(X_2 = 3 \mid X_0 = 1) + E[\tau_B \mid X_2 = 5]P(X_2 = 5 \mid X_0 = 1)$
$= (E[\tau_B] + 2) \cdot 0.5 + 2 \cdot 0.25 + 2 \cdot 0.25,$
which implies that $E[\tau_B] = 4$.
(b)
$E[\tau_3] = E[\tau_3 \mid X_2 = 1]P(X_2 = 1 \mid X_0 = 1) + E[\tau_3 \mid X_2 = 3]P(X_2 = 3 \mid X_0 = 1) + E[\tau_3 \mid X_2 = 5]P(X_2 = 5 \mid X_0 = 1)$
$= (E[\tau_3] + 2) \cdot 0.5 + 2 \cdot 0.25 + E[\tau_3 \mid X_2 = 5] \cdot 0.25$
$E[\tau_3 \mid X_2 = 5] = E[\tau_3 \mid X_4 = 1, X_2 = 5]P(X_4 = 1 \mid X_2 = 5, X_0 = 1) + E[\tau_3 \mid X_4 = 3, X_2 = 5]P(X_4 = 3 \mid X_2 = 5, X_0 = 1) + E[\tau_3 \mid X_4 = 5, X_2 = 5]P(X_4 = 5 \mid X_2 = 5, X_0 = 1)$
$= (E[\tau_3] + 4) \cdot 0.25 + 4 \cdot 0.25 + (E[\tau_3 \mid X_2 = 5] + 2) \cdot 0.5$
Hence, solving this system gives $E[\tau_3] = 8$.
(c) Let $\tau_{ij}$ be the minimum number of steps needed to go from state $i$ to state $j$, where $i, j \in \{1, 3, 5\}$, $i \ne j$. By the symmetry of the arrangement of the states and of the transition probabilities, $\tau_{ij}$ has the same distribution as $\tau_3$. Hence,
$E[\tau_C] = E[\tau_B + \tau_{35} \mid X_{\tau_B} = 3]P(X_{\tau_B} = 3 \mid X_0 = 1) + E[\tau_B + \tau_{53} \mid X_{\tau_B} = 5]P(X_{\tau_B} = 5 \mid X_0 = 1)$
$= E[\tau_B \mid X_{\tau_B} = 3]P(X_{\tau_B} = 3 \mid X_0 = 1) + E[\tau_{35}]P(X_{\tau_B} = 3 \mid X_0 = 1) + E[\tau_B \mid X_{\tau_B} = 5]P(X_{\tau_B} = 5 \mid X_0 = 1) + E[\tau_{53}]P(X_{\tau_B} = 5 \mid X_0 = 1)$
$= E[\tau_B] + E[\tau_3] = 12$
(d) By a similar argument to part (c), one can get
$E[\tau_R] = E[\tau_B] + E[\tau_3] + E[\tau_3] = 20$
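The answers in parts (a) and (b) come from small linear systems, which can be checked mechanically; the names a, b, c below are just shorthand for the three conditional expectations:

```python
# Solve the linear equations of parts (a) and (b).
# a = E[tau_B]; b = E[tau_3]; c = E[tau_3 | X_2 = 5].
# Part (a): a = (a + 2) * 0.5 + 2 * 0.25 + 2 * 0.25  =>  0.5 a = 2  =>  a = 4
a = 2.0 / 0.5

# Part (b) rearranges to the 2x2 system
#    0.5 b - 0.25 c = 1.5
#   -0.25 b + 0.5 c = 3.0
# solved here by Cramer's rule (no numpy needed for a 2x2 system).
det = 0.5 * 0.5 - (-0.25) * (-0.25)
b = (1.5 * 0.5 - (-0.25) * 3.0) / det
c = (0.5 * 3.0 - 1.5 * (-0.25)) / det
print(a, b, c)  # expect 4.0 8.0 10.0
```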
5. Problem 5
(a) Let $k \ge 0$, and think of $k$ as the present time. The future $X_{k+1}, X_{k+2}, \dots$ is determined by $X_k$ and $U_k, U_{k+1}, \dots$. But $(U_k, U_{k+1}, \dots)$ is independent of $(X_0, \dots, X_k)$. So $X$ is Markovian. Since $E[X_{k+1} \mid X_0, \dots, X_k] = \frac{1}{2}(1 + X_k)$, which is not equal to $X_k$, the process $X$ is not a martingale. To see if $X$ is an independent increment process, we begin by asking: is $X_1 - X_0$ independent of $X_2 - X_1$? Or, equivalently, is $2U_0 - 1$ independent of $U_1 + 2U_0U_1 - 2U_0$? Given $2U_0 - 1 = 2u - 1$, the conditional distribution of $U_1 + 2U_0U_1 - 2U_0$ is uniform over the interval $[-2u, 1]$. This distribution depends on $u$, so these two increments of $X$ are not independent. Thus $X$ is not an independent increment process.
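The dependence of the conditional support on $u$ can be seen numerically. This sketch assumes the recursion $X_{k+1} = U_k(1 + X_k)$ with $X_0 = 1$ and $U_k$ i.i.d. Uniform$[0, 1]$, which is our reading of the model consistent with the increments computed above:

```python
# Empirical support of the second increment X_2 - X_1 given U_0 = u.
# Assumed model (inferred from the increments above): X_{k+1} = U_k (1 + X_k),
# X_0 = 1, so X_2 - X_1 = U_1 (1 + 2u) - 2u, uniform on [-2u, 1].
import random

random.seed(4)

def increment_range(u, trials=100_000):
    lo, hi = float("inf"), float("-inf")
    for _ in range(trials):
        inc = random.random() * (1.0 + 2.0 * u) - 2.0 * u
        lo, hi = min(lo, inc), max(hi, inc)
    return lo, hi

lo1, hi1 = increment_range(0.1)  # support about [-0.2, 1]
lo9, hi9 = increment_range(0.9)  # support about [-1.8, 1]
print(round(lo1, 2), round(hi1, 2), round(lo9, 2), round(hi9, 2))
```

Since the support $[-2u, 1]$ varies with $u$, the two increments cannot be independent.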
(b) As a first test of whether $Y$ is Markovian, let us see if the conditional independence condition $Y_k - Y_{k+1} - Y_{k+2}$ holds for $k \ge 2$. Equivalently, we check to see if
$\mathrm{Cov}(Y_k, Y_{k+2}) = \mathrm{Cov}(Y_k, Y_{k+1})\,\mathrm{Var}(Y_{k+1})^{-1}\,\mathrm{Cov}(Y_{k+1}, Y_{k+2})$
But this condition boils down to $1 = 2 \cdot (3)^{-1} \cdot 2$, which is false. Thus, $Y$ is not Markovian.
Note that for $k$ fixed, $(Y_0, \dots, Y_k)$ and $(V_0, \dots, V_k)$ are functions of each other. Thus for $k \ge 1$,
$E[Y_{k+1} \mid Y_0, \dots, Y_k] = E[V_{k+1} + V_k + V_{k-1} \mid V_k, V_{k-1}, \dots, V_0] = V_k + V_{k-1} \ne Y_k,$
so $Y$ is not a martingale. Since $\mathrm{Cov}(Y_1 - Y_0, Y_4 - Y_3) = \mathrm{Cov}(V_1, V_4 - V_1) = -1 \ne 0$, $Y$ does not have independent increments.
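The three covariance values used in the test are easy to confirm by simulation for $Y_k = V_k + V_{k-1} + V_{k-2}$; taking the $V_k$ to be standard normal is our illustrative choice (only mean $0$ and variance $1$ matter):

```python
# Check Var(Y_k) = 3, Cov(Y_k, Y_{k+1}) = 2, Cov(Y_k, Y_{k+2}) = 1
# for Y_k = V_k + V_{k-1} + V_{k-2} with V_k i.i.d. mean 0, variance 1.
import random

random.seed(5)
trials = 100_000

def cov(pairs):
    mx = sum(x for x, _ in pairs) / len(pairs)
    my = sum(y for _, y in pairs) / len(pairs)
    return sum((x - mx) * (y - my) for x, y in pairs) / len(pairs)

lag0, lag1, lag2 = [], [], []
for _ in range(trials):
    v = [random.gauss(0.0, 1.0) for _ in range(7)]
    y = [v[k] + v[k - 1] + v[k - 2] for k in range(2, 7)]  # Y_2 .. Y_6
    lag0.append((y[0], y[0]))
    lag1.append((y[0], y[1]))
    lag2.append((y[0], y[2]))

v0, v1, v2 = cov(lag0), cov(lag1), cov(lag2)
print(round(v0, 2), round(v1, 2), round(v2, 2))  # expect about 3, 2, 1
```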
6. Problem 6
(a) Since for each $t$, $W_t$ and $Z_t$ are functions of each other, they have equivalent information. Since $W$ is Markov, so is $Z$. Let $t_0 < t_1 < \dots < t_{n+1}$. Conditioning on $(Z_{t_0}, Z_{t_1}, \dots, Z_{t_n})$ is equivalent to conditioning on $(W_{t_0}, W_{t_1} - W_{t_0}, \dots, W_{t_n} - W_{t_{n-1}})$, which is independent of $W_{t_{n+1}} - W_{t_n}$. Therefore,
$E[Z_{t_{n+1}} \mid Z_{t_0}, \dots, Z_{t_n}] = E\left[ Z_{t_n} \exp\left( \sigma (W_{t_{n+1}} - W_{t_n}) - \frac{\sigma^2 (t_{n+1} - t_n)}{2} \right) \,\Big|\, Z_{t_0}, \dots, Z_{t_n} \right] = Z_{t_n}\, E\left[ \exp\left( \sigma (W_{t_{n+1}} - W_{t_n}) - \frac{\sigma^2 (t_{n+1} - t_n)}{2} \right) \right] = Z_{t_n}.$


So $Z$ is a martingale. The process $Z$ does not have independent increments. One way to prove it is to compute the covariance between two increments; here is a less tedious way. Let $A = Z_1 - Z_0$ and $B = Z_2 - Z_1$. Note that $B = (1 + A)R$, where $R = \exp\left(\sigma(W_2 - W_1) - \frac{\sigma^2}{2}\right) - 1$, and that $A$ is independent of $R$. So $E[B^2 \mid A] = E[R^2](1 + A)^2$, so that $E[B^2 \mid A]$ depends on $A$. Thus $A$ and $B$ are not independent, so $Z$ does not have independent increments.
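The martingale property in particular forces $E[Z_t] = Z_0 = 1$ for every $t$, which is simple to spot-check by simulating $Z_t = \exp(\sigma W_t - \sigma^2 t / 2)$; the values $\sigma = 0.8$ and $t = 2$ are arbitrary illustration choices:

```python
# Sanity check that E[exp(sigma * W_t - sigma^2 t / 2)] = 1 for W_t ~ N(0, t).
import math, random

random.seed(6)
sigma, t, trials = 0.8, 2.0, 400_000
total = 0.0
for _ in range(trials):
    w_t = random.gauss(0.0, math.sqrt(t))  # W_t ~ N(0, t)
    total += math.exp(sigma * w_t - sigma * sigma * t / 2.0)
z_mean = total / trials
print(round(z_mean, 2))  # expect about 1.0
```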
(b) $R$ has independent increments because $N$ has independent increments and the $D$'s are i.i.d. In addition, $R_0 = 0$, so $R$ is a Markov process. Note that for $t \ge 0$,
$E[R_t] = \sum_{n=0}^{\infty} E[R_t \mid N_t = n] P[N_t = n] = \sum_{n=0}^{\infty} \underbrace{E[D_1 + \dots + D_n]}_{0}\, P[N_t = n] = 0.$
So $R$ is a mean-zero independent increment process, implying that $R$ is also a martingale.
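A quick simulation of one concrete instance: jumps $D_i$ uniform on $[-1, 1]$ (our illustrative choice of a mean-zero distribution), rate $\lambda = 2$, horizon $t = 1.5$; the sample mean of $R_t$ should be near $0$:

```python
# Simulate the compound sum R_t = D_1 + ... + D_{N_t} and check E[R_t] = 0.
# Assumed illustration values: lam = 2, t = 1.5, D ~ Uniform[-1, 1].
import random

random.seed(7)
lam, t, trials = 2.0, 1.5, 200_000
total = 0.0
for _ in range(trials):
    clock, r = random.expovariate(lam), 0.0
    while clock <= t:
        r += random.uniform(-1.0, 1.0)  # one mean-zero jump D
        clock += random.expovariate(lam)
    total += r
mean_r = total / trials
print(round(mean_r, 3))  # expect about 0.0
```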


7. Problem 7
(a)
$P = \begin{pmatrix} 0.6 & 0.4 & 0 \\ 0.8 & 0 & 0.2 \\ 0 & 0.4 & 0.6 \end{pmatrix}$
(b) The equilibrium distribution can be solved from $\pi = \pi P$, which gives $\pi = \left(\frac{4}{7}, \frac{2}{7}, \frac{1}{7}\right)$.
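The equilibrium can be confirmed numerically by power iteration, a standard way to approximate the $\pi$ satisfying $\pi = \pi P$:

```python
# Power iteration: pi P^n converges to the equilibrium of this irreducible,
# aperiodic chain; expect pi = (4/7, 2/7, 1/7).
P = [[0.6, 0.4, 0.0],
     [0.8, 0.0, 0.2],
     [0.0, 0.4, 0.6]]

pi = [1.0, 0.0, 0.0]
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

print([round(x, 4) for x in pi])  # expect [0.5714, 0.2857, 0.1429]
```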
(c)
$a_1 = E[\tau \mid X_0 = 1] = E[\tau \mid X_1 = 1]P(X_1 = 1 \mid X_0 = 1) + E[\tau \mid X_1 = 2]P(X_1 = 2 \mid X_0 = 1) = (1 + a_1) \cdot 0.6 + (1 + a_2) \cdot 0.4$
Similarly,
$a_2 = (1 + a_1) \cdot 0.8 + (1 + a_3) \cdot 0.2$
Since $a_3 = 0$, solving gives
$a_1 = 17.5, \qquad a_2 = 15.$
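The two-equation system for $a_1, a_2$ solves directly; a quick check of the arithmetic:

```python
# With a3 = 0, the equations above reduce to
#   a1 = (1 + a1) * 0.6 + (1 + a2) * 0.4   =>  0.4 a1 - 0.4 a2 = 1
#   a2 = (1 + a1) * 0.8 + 0.2              =>  a2 = 1 + 0.8 a1
# Substituting the second into the first: (0.4 - 0.32) a1 = 1 + 0.4.
a1 = (1.0 + 0.4) / (0.4 - 0.4 * 0.8)
a2 = 1.0 + 0.8 * a1
print(round(a1, 6), round(a2, 6))  # expect 17.5 15.0
```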

8. Problem 8
(a) The state space is {00, 01, 10, 11}.
(b) During the time interval $[t, t+h)$: if there is no packet in stage 1 at time $t$, a packet arrives with probability $1 - e^{-\lambda h} = \lambda h + o(h)$; a packet in stage 1 is transferred to stage 2 with probability $\mu_1 h + o(h)$; a packet leaves stage 2 with probability $\mu_2 h + o(h)$. Hence, the transition rate diagram is: $00 \xrightarrow{\lambda} 10$, $01 \xrightarrow{\lambda} 11$, $10 \xrightarrow{\mu_1} 01$, $01 \xrightarrow{\mu_2} 00$, $11 \xrightarrow{\mu_2} 10$.


(c) With the states ordered $(00, 01, 10, 11)$,
$Q = \begin{pmatrix} -\lambda & 0 & \lambda & 0 \\ \mu_2 & -(\lambda + \mu_2) & 0 & \lambda \\ 0 & \mu_1 & -\mu_1 & 0 \\ 0 & 0 & \mu_2 & -\mu_2 \end{pmatrix}$
(d) The equilibrium distribution can be solved from $\pi Q = 0$. Given that $\lambda = \mu_1 = \mu_2 = 1$, $\pi = \left(\frac{1}{5}, \frac{1}{5}, \frac{2}{5}, \frac{1}{5}\right)$. Hence, the throughput is
$\eta = \lambda(\pi_{00} + \pi_{01}) = \frac{2}{5}.$
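The generator, equilibrium, and throughput can be checked together numerically. A sketch using uniformization ($P = I + Q/c$ has the same stationary distribution as $Q$ for $c \ge \max_i |q_{ii}|$), with states ordered $(00, 01, 10, 11)$ and $\lambda = \mu_1 = \mu_2 = 1$:

```python
# Solve pi Q = 0 via the uniformized chain P = I + Q / c, then compute the
# throughput lam * (pi_00 + pi_01); expect pi = (0.2, 0.2, 0.4, 0.2) and 0.4.
Q = [[-1.0,  0.0,  1.0,  0.0],   # 00: arrival -> 10
     [ 1.0, -2.0,  0.0,  1.0],   # 01: departure -> 00, arrival -> 11
     [ 0.0,  1.0, -1.0,  0.0],   # 10: transfer -> 01
     [ 0.0,  0.0,  1.0, -1.0]]   # 11: departure -> 10

c = 4.0  # any c >= max |q_ii| works
P = [[(1.0 if i == j else 0.0) + Q[i][j] / c for j in range(4)] for i in range(4)]
pi = [0.25, 0.25, 0.25, 0.25]
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(4)) for j in range(4)]

throughput = 1.0 * (pi[0] + pi[1])
print([round(x, 3) for x in pi], round(throughput, 3))
```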
