
ECE 534 RANDOM PROCESSES FALL 2011

PROBLEM SET 6 Due Thursday, November 17


6. Basic Calculus of Random Processes
Assigned Reading: Chapter 7 and Sections 11.3-11.5 of the notes.
Reminder: Exam 2, covering lectures, reading, and homework for problem sets 1-5, with an
emphasis on problem sets 4 & 5, will be held on Monday, November 14, 7-8:15 p.m., in Room
103 Talbot, our usual classroom. You may bring two sheets of notes, two-sided, font size 10 or
larger or equivalent handwriting size, to consult during the exam. Otherwise, the exam is closed
notes. There will be no lecture on Tuesday, November 8.
Problems to be handed in:
1 Residual lifetime process of a Poisson process
Suppose $N = (N_t : t \ge 0)$ is a Poisson process with some rate $\lambda > 0$. Let $R = (R_t : t \ge 0)$ be
defined by $R_t = \min\{\tau > 0 : N_{t+\tau} \ge N_t + 1\}$. That is, $R_t$ is the amount of time remaining at time
$t$ until the next count of $N$.
(a) Sketch a typical sample path of R, describe the sample paths of R in words, and explain in
words why R is a Markov process.
(b) What is the distribution of $R_t$ for $t \ge 0$?
(c) For $t \ge 0$, $x > 0$, and $\tau > 0$, what is the conditional distribution of $R_{t+\tau}$ given $R_t = x$? (Hint:
Given $R_t = x$, the process from time $t + x$ onward is statistically identical to the original process
shifted in time to the right by $t + x$.)
(d) In which of the five sense(s), m.s., p., d., a.s. at each $t$, a.s. sample-path, is the process $R$
continuous?
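Before tackling the parts analytically, a Monte Carlo sketch can make the residual lifetime concrete. The code below (assuming NumPy; the rate `lam` and observation time `t` are illustrative choices, not given in the problem) builds Poisson arrival times as cumulative sums of exponential gaps and records $R_t$ on each path.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, n_paths = 2.0, 5.0, 100_000

# Arrival times T_1, T_2, ... are cumulative sums of i.i.d. Exp(lam) gaps;
# 60 gaps per path is ample since lam * t = 10 arrivals occur by time t on average.
gaps = rng.exponential(1.0 / lam, size=(n_paths, 60))
arrivals = np.cumsum(gaps, axis=1)

# R_t is the first arrival strictly after t, minus t.
first_idx = (arrivals > t).argmax(axis=1)
residuals = arrivals[np.arange(n_paths), first_idx] - t

# Compare the sample mean (and a histogram) of R_t with your answer to (b).
print(residuals.mean())
```

Repeating the experiment for several values of `t` supports the claim that the distribution found in part (b) does not depend on $t$.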
2 Prediction using a derivative
Suppose $X = (X_t : t \in \mathbb{R})$ is a mean zero, WSS random process with $R_X(\tau) = \frac{1}{1+\tau^2}$. This problem
concerns estimating $X$ at one point in time based on observation of the m.s. derivative of $X$ at
some earlier time.
(a) Find an explicit expression for $\widehat{E}[X_\tau \mid X'_0]$ and the resulting MSE, for $\tau \ge 0$.
(b) Find the value of $\tau$ that minimizes the MSE found in part (a).
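For part (a), the two second-order quantities needed for the linear estimator are $\mathrm{Cov}(X_\tau, X'_0) = -R'_X(\tau)$ and $\mathrm{Var}(X'_0) = -R''_X(0)$, standard facts about m.s. derivatives of WSS processes from Chapter 7 of the notes. A short SymPy sketch (symbol names are illustrative) evaluates these derivatives for the given $R_X$ so you can check your hand computation:

```python
import sympy as sp

tau = sp.symbols('tau', nonnegative=True)
R = 1 / (1 + tau**2)                      # given autocorrelation R_X(tau)

cov = sp.simplify(-sp.diff(R, tau))       # Cov(X_tau, X'_0) = -R_X'(tau)
var = -sp.diff(R, tau, 2).subs(tau, 0)    # Var(X'_0) = -R_X''(0)

print(cov)
print(var)
```

From these, the linear estimator $\widehat{E}[X_\tau \mid X'_0] = \frac{\mathrm{Cov}(X_\tau, X'_0)}{\mathrm{Var}(X'_0)}\, X'_0$ and its MSE follow as in the notes.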
3 Prediction of future integral of a Gaussian Markov process
Suppose $(X_t : t \in \mathbb{R})$ is a mean zero Gaussian Markov process with $R_X(\tau) = e^{-|\tau|}$. Let
$J = \int_0^\infty e^{-t} X_t \, dt$, so that $J$ is the weighted time average of $(X_t : t \ge 0)$, computed using the
exponential density $e^{-t}$, which has mean one.
(a) Describe the probability distribution of J.
(b) Find $E[J \mid X_0]$, and the resulting mean square error, $\mathrm{MSE} = E[(J - E[J \mid X_0])^2]$.
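A Monte Carlo sketch can support your answer to part (a). It uses the exact one-step update $X_{t+\Delta} = e^{-\Delta} X_t + \sqrt{1 - e^{-2\Delta}}\, Z$ for a stationary Gaussian Markov process with $R_X(\tau) = e^{-|\tau|}$, and approximates $J$ by a Riemann sum truncated at `t_max` (the step size and truncation point are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
dt, t_max, n_paths = 0.01, 12.0, 20_000
rho = np.exp(-dt)                          # one-step correlation e^{-dt}

x = rng.standard_normal(n_paths)           # stationary start: X_0 ~ N(0, 1)
J = np.zeros(n_paths)
t = 0.0
for _ in range(int(t_max / dt)):
    J += np.exp(-t) * x * dt               # Riemann sum for int_0^inf e^{-t} X_t dt
    x = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
    t += dt

# Compare the sample mean and variance of J (and a normal probability plot of
# the samples) with your description in part (a).
print(J.mean(), J.var())
```

The truncation at $t = 12$ is harmless here because the weight $e^{-t}$ makes the neglected tail's contribution negligible.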
4 A two-state stationary Markov process
Suppose $X$ is a stationary Markov process with mean zero, state space $\{-1, 1\}$, and transition rate
matrix $Q = \begin{pmatrix} -\alpha & \alpha \\ \alpha & -\alpha \end{pmatrix}$, where $\alpha \ge 0$. Note that $\alpha = 0$ is a possible case.
(a) Find the autocorrelation function, $R_X(\tau)$.
(b) For what value(s) of $\alpha \ge 0$ is $X$ m.s. continuous?
(c) For what value(s) of $\alpha \ge 0$ is $X$ m.s. continuously differentiable?
(d) For what value(s) of $\alpha \ge 0$ is $X$ mean ergodic in the m.s. sense?
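To check your formula in part (a) empirically, the chain can be sampled exactly on a time grid: over a step of length $dt$, a two-state chain with the rate matrix above stays in its current state with probability $(1 + e^{-2\alpha\,dt})/2$. A sketch (the values of `alpha` and `dt` are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, dt, n = 1.5, 0.05, 400_000

# Exact grid sampling: the state is unchanged after dt with probability
# p_same = (1 + exp(-2*alpha*dt)) / 2, independently at each step.
p_same = 0.5 * (1 + np.exp(-2 * alpha * dt))
flips = np.where(rng.random(n) < p_same, 1, -1)
x = int(rng.choice([-1, 1])) * np.cumprod(flips)   # stationary +/-1 start

# Empirical autocorrelation at a few lags; compare with your answer to (a).
for lag in [0, 5, 10, 20]:
    print(lag * dt, np.mean(x[:n - lag] * x[lag:]))
```

Rerunning with `alpha = 0` illustrates the degenerate case noted in the problem statement: the path never moves.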
5 Some Fourier series representations
(a) Find the coefficients of the function $f(t) = \frac{t}{T}$ for $t \in [0, T]$ relative to the complete orthonormal
basis for $L^2[0, T]$ given by $\phi_1(t) = \frac{1}{\sqrt{T}}$, $\phi_{2k}(t) = \sqrt{\frac{2}{T}} \cos\!\left(\frac{2\pi k t}{T}\right)$, and $\phi_{2k+1}(t) = \sqrt{\frac{2}{T}} \sin\!\left(\frac{2\pi k t}{T}\right)$ for
$k \ge 1$.
(b) Given $N \ge 1$, let $f^{(N)}$ be the function minimizing the $L^2$ norm of the approximation error,
$\|f - f^{(N)}\|$, over all functions $f^{(N)}$ with $N$ or fewer nonzero coordinates relative to the basis in
part (a). Describe $f^{(N)}$ and find $N$ such that $\|f - f^{(N)}\|^2 \le (0.01)\|f\|^2$. (You need not find the
smallest such $N$, but try to get close; using an integral bound for a series can help.)
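For parts (a) and (b), numerical inner products make a good check on the closed-form coefficients and on the Parseval energy bookkeeping. The sketch below fixes $T = 1$ (an illustrative choice; the coefficients scale with $\sqrt{T}$) and uses the trapezoid rule on a fine grid:

```python
import numpy as np

T = 1.0
t = np.linspace(0, T, 200_001)
h = t[1] - t[0]
f = t / T

def coef(phi):
    # inner product <f, phi> on L^2[0, T], trapezoid rule on the grid
    y = f * phi
    return float(np.sum(y[:-1] + y[1:]) * h / 2)

c1 = coef(np.ones_like(t) / np.sqrt(T))
print("constant-term coefficient:", c1)
for k in (1, 2, 3):
    c_cos = coef(np.sqrt(2 / T) * np.cos(2 * np.pi * k * t / T))
    c_sin = coef(np.sqrt(2 / T) * np.sin(2 * np.pi * k * t / T))
    print(k, c_cos, c_sin)          # compare with the closed forms from (a)

# Energy captured by the constant term and the first 49 sine/cosine pairs,
# versus the total energy ||f||^2 = T/3.
energy = c1**2 + sum(
    coef(np.sqrt(2 / T) * np.sin(2 * np.pi * k * t / T)) ** 2
    + coef(np.sqrt(2 / T) * np.cos(2 * np.pi * k * t / T)) ** 2
    for k in range(1, 50)
)
print(energy, T / 3)
```

The printed `energy` stays just below $\|f\|^2$, as Bessel's inequality requires; watching how fast the gap closes as more pairs are added suggests the size of $N$ needed in part (b).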
(c) Given $N \ge 1$, let $W^{(N)}$ be the best approximation to the Brownian motion process $W = (W_t :
0 \le t \le T)$ on the interval $[0, T]$ (in the sense of minimizing the expected energy of the error:
$E[\|W - W^{(N)}\|^2]$) over all random processes $W^{(N)}$ expressible as a random linear combination
of $N$ functions on the interval $[0, T]$. Describe $W^{(N)}$ and find $N$ such that $E[\|W - W^{(N)}\|^2] \le
(0.01)E[\|W\|^2]$. (Hint: Use the basis functions $(\phi_n : n \ge 0)$ associated with the KL expansion of
$W$, given in the notes.)
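For part (c), once the KL eigenvalues of Brownian motion are in hand, finding a suitable $N$ reduces to a tail sum of eigenvalues. Assuming the standard eigenvalues $\lambda_n = \left(\frac{T}{(n+1/2)\pi}\right)^2$, $n \ge 0$ (check these against the notes), a sketch with $T = 1$:

```python
import numpy as np

T = 1.0
n = np.arange(5000)
lam = (T / ((n + 0.5) * np.pi))**2      # KL eigenvalues of W on [0, T]

total = T**2 / 2                        # E||W||^2 = int_0^T E[W_t^2] dt = int_0^T t dt
print(lam.sum(), total)                 # the eigenvalues sum to the total energy

# Tail energy after keeping the first k+1 eigenvalues, and the smallest N
# with tail <= 0.01 * E||W||^2.
tail = total - np.cumsum(lam)
N = int(np.argmax(tail <= 0.01 * total)) + 1
print(N)
```

An integral bound on $\sum_{n \ge N} (n + 1/2)^{-2}$, as the problem's hint suggests, recovers the same order of magnitude by hand.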
6 First order differential equation driven by Gaussian white noise
Let $X$ be the solution of the ordinary differential equation $X' = -X + N$, with initial condition $x_0$,
where $N = (N_t : t \ge 0)$ is a real valued Gaussian white noise with $R_N(\tau) = \sigma^2 \delta(\tau)$ for some con-
stant $\sigma^2 > 0$. Although $N$ is not an ordinary random process, we can interpret this as the condition
that $N$ is a Gaussian random process with mean $\mu_N = 0$ and correlation function $R_N(\tau) = \sigma^2 \delta(\tau)$.
(a) Find the mean function $\mu_X(t)$ and covariance function $C_X(s, t)$.
(b) Verify that $X$ is a Markov process by checking the necessary and sufficient condition: $C_X(r, s)C_X(s, t) =
C_X(r, t)C_X(s, s)$ whenever $r < s < t$. (Note: The very definition of $X$ also suggests that $X$ is a
Markov process, because if $t$ is considered to be the present time, the future of $X$ depends only
on $X_t$ and the future of the white noise. The future of the white noise is independent of the past
$(X_s : s \le t)$. Thus, the present value $X_t$ contains all the information from the past of $X$ that is
relevant to the future of $X$. This is the continuous-time analog of the discrete-time Kalman state
equation.)
(c) Find the limits of $\mu_X(t)$ and $R_X(t + \tau, t)$ as $t \to \infty$. (Because these limits exist, $X$ is said to be
asymptotically WSS.)
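An Euler-Maruyama simulation gives a numerical check on parts (a) and (c). Interpreting the equation as $dX = -X\,dt + \sigma\,dW$, each time step adds an independent increment $\sigma\sqrt{dt}\,Z$; the parameter values below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma2, dt, t_max, n_paths = 4.0, 0.005, 10.0, 20_000
sig = np.sqrt(sigma2)

x0 = 3.0                                   # illustrative deterministic x_0
x = np.full(n_paths, x0)
for _ in range(int(t_max / dt)):
    # dX = -X dt + sigma dW, with dW ~ N(0, dt)
    x += -x * dt + sig * np.sqrt(dt) * rng.standard_normal(n_paths)

# By t = 10 the transient from x_0 has decayed; compare the empirical mean
# and variance across paths with the limits found in part (c).
print(x.mean(), x.var())
```

Printing the mean and variance at intermediate times as well traces out $\mu_X(t)$ and $C_X(t, t)$ from part (a), transient included.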