
ECE 534 RANDOM PROCESSES FALL 2011

SOLUTIONS TO PROBLEM SET 4


1 A randomly scaled logistic function
(a) Each sample path is a translation and scaling in time of the logistic function $\phi(\theta) = \frac{1}{1+e^{-\theta}}$. Thus each sample path is an increasing function with limit zero at $-\infty$ and limit one at $+\infty$.
(b) All sample paths converge to one as $t \to \infty$. So $\lim_{t\to\infty} X_t = 1$ in the a.s. sense, and therefore in the p. and d. senses too. Since $X$ is bounded, the limit holds also in the m.s. sense.
(c) For any $a, b \in \mathbb{R}$ with $a < b$, $P\{X_b - X_a > 0\} = 1$. Thus, $E[X_b - X_a] > 0$, or $E[X_b] > E[X_a]$. That is, $E[X_t]$ is strictly increasing in $t$.
Note that $\phi(\theta) - \frac{1}{2} = \frac{e^{\theta/2} - e^{-\theta/2}}{2(e^{\theta/2} + e^{-\theta/2})} = \frac{1}{2}\tanh\left(\frac{\theta}{2}\right)$, which is an odd function of $\theta$. Thus, $X_0 - \frac{1}{2} = \frac{1}{2}\tanh\left(-\frac{U}{2V}\right)$, which is an odd function of $\frac{U}{V}$. Since $-U$ and $U$ have the same distribution, and $U$ and $V$ are independent, it follows that $\frac{U}{V}$ and $-\frac{U}{V}$ have the same distribution. So $\frac{U}{V}$ has an even pdf. Thus, $E\left[\tanh\left(-\frac{U}{2V}\right)\right] = 0$, or $E[X_0] = \frac{1}{2}$. Since $\lim_{t\to\infty} X_t = 1$ in the m.s. sense and, for a m.s. convergent sequence, the mean of the limit is the limit of the means, it follows that $\lim_{t\to\infty} E[X_t] = 1$. Similarly, $\lim_{t\to-\infty} E[X_t] = 0$. (Note: It can also be shown that $E[X_t] - 0.5$ is an odd function of $t$.)
(d) No, in fact $X$ is not even wide sense stationary, because, for example, $E[X_t]$ depends on $t$.
(e) No. Given $r < s < t$ and constants $a$ and $b$ with $0 < a < b < 1$, the condition $(X_r = a, X_s = b)$ determines $U$ and $V$, and therefore it also determines $X_t$. On the other hand, given $X_s = b$ there is a nondegenerate conditional density of $X_t$. So the conditional distribution of $X_t$ given $X_s = b$ is not equal to the conditional distribution of $X_t$ given $(X_r = a, X_s = b)$. So $X$ is not a Markov process.
2 A simple model of a neural spike train
(a) Let $T$ denote the time elapsed after a given spike until the next spike. The failure rate function of $T$ is
$$h(t) = \begin{cases} \frac{1}{4} & 0 \le t < 1 \\ 1 & t \ge 1 \end{cases}$$
Therefore,
$$1 - F_T(t) = \exp\left(-\int_0^t h(s)\,ds\right) = \begin{cases} e^{-t/4} & 0 \le t < 1 \\ e^{-(1/4 + t - 1)} & t \ge 1 \end{cases}$$
and
$$f_T(t) = h(t)(1 - F_T(t)) = \begin{cases} \frac{1}{4}e^{-t/4} & 0 \le t < 1 \\ e^{-(1/4 + t - 1)} & t \ge 1. \end{cases}$$
(b) By the area rule for expectations, $E[T] = \int_0^\infty (1 - F_T(t))\,dt = 4 - 3e^{-1/4} \approx 1.6636$.
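A short numerical check of this value, integrating the survival function from part (a) (scipy is used only for the quadrature):

import numpy as np
from scipy.integrate import quad

def survival(t):
    # 1 - F_T(t) from part (a)
    return np.exp(-t / 4.0) if t < 1.0 else np.exp(-(0.25 + t - 1.0))

mean_T = quad(survival, 0.0, 1.0)[0] + quad(survival, 1.0, np.inf)[0]
print(mean_T, 4.0 - 3.0 * np.exp(-0.25))      # both ~ 1.6636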
3 A compound Poisson process with mean zero
(a) $E[Y_t \mid N_t = n] = E\left[\sum_{i=1}^n J_i\right] = \sum_{i=1}^n E[J_i] = 0$ for all $n$, so $E[Y_t \mid N_t]$ is identically zero. Thus, $E[Y_t] = E[E[Y_t \mid N_t]] = 0$.
(b) The increment of $Y$ over an interval $[a, b]$ is the sum of $N_b - N_a$ independent random variables, all with the same distribution as $J_1$. By the independent increment property of the Poisson process $N$, for $m$ disjoint intervals $[a_1, b_1], \ldots, [a_m, b_m]$, the corresponding $m$ increments of $Y$ are given by sums of independent numbers of $J$'s, which are all independent and with the same distribution as $J_1$. Those increments of $Y$ are thus independent.
(c) $E[Y_t^2 \mid N_t = n] = E\left[\left(\sum_{i=1}^n J_i\right)^2\right] = \mathrm{Var}\left(\sum_{i=1}^n J_i\right) = n\sigma_J^2$, so $E[Y_t^2] = E[N_t\sigma_J^2] = \lambda t \sigma_J^2$.
Similarly, $E\left[e^{juY_t} \mid N_t = n\right] = E\left[e^{ju(J_1 + \cdots + J_n)} \mid N_t = n\right] = \Phi_J(u)^n$, so $\Phi_Y(u) = E\left[\Phi_J(u)^{N_t}\right] = e^{\lambda t\{\Phi_J(u) - 1\}}$.
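The moment and characteristic-function formulas can be spot-checked by Monte Carlo. The sketch below assumes, purely for illustration, jumps $J_i = \pm 1$ with probability $\frac{1}{2}$ each (so $E[J_1] = 0$ and $\sigma_J^2 = 1$) and rate $\lambda = 2$; these specific choices are not from the problem.

import numpy as np

rng = np.random.default_rng(1)
lam, t, n = 2.0, 3.0, 200_000                 # illustrative rate, time, sample size
N = rng.poisson(lam * t, size=n)              # Poisson counts N_t
# For J = +/-1 with prob 1/2 each, a sum of N i.i.d. signs is 2*Binomial(N, 1/2) - N.
Y = 2.0 * rng.binomial(N, 0.5) - N

print("E[Y_t]   ~", Y.mean(), "(theory 0)")
print("Var(Y_t) ~", Y.var(), "(theory", lam * t, ")")

u = 0.7
phi_J = np.cos(u)                             # characteristic function of J = +/-1 at u
print("Phi_Y(u) ~", np.exp(1j * u * Y).mean().real,
      "(theory", np.exp(lam * t * (phi_J - 1.0)), ")")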
(d) Let $c = \lambda\sigma_J^2$ so that $E[Z_t] = E[Z_0]$. (If any choice of $c$ works it must be this one.) Let $t_1 < \cdots < t_{n+1}$. Think of $t_n$ as the present time, and let $\mathcal{H}_{t_n}$ represent the past information: $\mathcal{H}_{t_n} = (Y_{t_1}, \ldots, Y_{t_n})$. Then,
$$\begin{aligned}
E[Z_{t_{n+1}} \mid \mathcal{H}_{t_n}] &= E[Y_{t_{n+1}}^2 - ct_{n+1} \mid \mathcal{H}_{t_n}] \\
&= E[Y_{t_n}^2 \mid \mathcal{H}_{t_n}] + 2E[(Y_{t_{n+1}} - Y_{t_n})Y_{t_n} \mid \mathcal{H}_{t_n}] + E[(Y_{t_{n+1}} - Y_{t_n})^2 \mid \mathcal{H}_{t_n}] - \lambda\sigma_J^2 t_{n+1} \\
&= Y_{t_n}^2 + 2\underbrace{E[(Y_{t_{n+1}} - Y_{t_n}) \mid \mathcal{H}_{t_n}]}_{0}\,Y_{t_n} + \lambda(t_{n+1} - t_n)\sigma_J^2 - \lambda\sigma_J^2 t_{n+1} \\
&= Z_{t_n}
\end{aligned}$$
Therefore, $E[Z_{t_{n+1}} \mid Z_{t_1}, \ldots, Z_{t_n}] = E\left[E[Z_{t_{n+1}} \mid \mathcal{H}_{t_n}] \mid Z_{t_1}, \ldots, Z_{t_n}\right] = Z_{t_n}$. Thus, $Z$ is a martingale.
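Two consequences of the martingale property can be spot-checked by Monte Carlo with the same illustrative jump law $J = \pm 1$ as above: $E[Z_t]$ is constant (zero), and the increment $Z_{t_2} - Z_{t_1}$ is uncorrelated with the past value $Y_{t_1}$.

import numpy as np

rng = np.random.default_rng(2)
lam, t1, t2, n = 2.0, 1.0, 3.0, 500_000       # illustrative rate and times

N1 = rng.poisson(lam * t1, size=n)            # number of jumps in [0, t1]
dN = rng.poisson(lam * (t2 - t1), size=n)     # number of jumps in (t1, t2]
Y1 = 2.0 * rng.binomial(N1, 0.5) - N1         # Y_{t1} for J = +/-1 jumps
Y2 = Y1 + 2.0 * rng.binomial(dN, 0.5) - dN    # add the independent increment

Z1 = Y1**2 - lam * t1                         # c = lam * sigma_J^2 = lam here
Z2 = Y2**2 - lam * t2
print("E[Z_t1] ~", Z1.mean(), "  E[Z_t2] ~", Z2.mean())       # both ~ 0
print("E[(Z_t2 - Z_t1) Y_t1] ~", ((Z2 - Z1) * Y1).mean())     # ~ 0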
4 Hitting the corners of a triangle
(a) Let $h_i$ be the mean time to hit $\{3, 4, 5\}$ from initial state $i$. We are interested in $h_1 = E[\tau_B]$. By symmetry, $h_2 = h_6$. By conditioning on the first step, we find:
$$h_1 = 1 + h_2 \qquad\qquad h_2 = 1 + \tfrac{1}{2}h_1$$
Solving yields $(h_1, h_2) = (4, 3)$. Thus, $E[\tau_B] = 4$.
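The first-step equations above correspond to a simple random walk on the cycle of states $1, \ldots, 6$, stepping to each neighbor with probability $\frac{1}{2}$; this reading of the chain is an assumption, since the problem statement is not reproduced here. A Monte Carlo sketch of $E[\tau_B]$ under that reading:

import numpy as np

rng = np.random.default_rng(3)
target = {3, 4, 5}

def hit_time(start, rng):
    # Simple random walk on the 6-cycle, stopped on first visit to {3, 4, 5}.
    s, steps = start, 0
    while s not in target:
        s = ((s - 1 + rng.choice([-1, 1])) % 6) + 1   # move to a uniformly chosen neighbor
        steps += 1
    return steps

times = [hit_time(1, rng) for _ in range(50_000)]
print("E[tau_B] ~", np.mean(times))    # theory: 4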
(b) Redefine $h_i$ to be the mean time to hit state 3 from initial state $i$. By symmetry, $h_1 = h_5$ and $h_2 = h_4$. By conditioning on the first step, we find
$$h_1 = 1 + (h_2 + h_6)/2 \qquad\qquad h_2 = 1 + \tfrac{1}{2}h_1 \qquad\qquad h_6 = 1 + h_1$$
Solving yields $(h_1, h_2, h_6) = (8, 5, 9)$. Thus, $E[\tau_3] = 8$.
(c) The mean time to reach $\{3, 4, 5\}$ is four. When $\{3, 4, 5\}$ is first reached, the state is either 3 or 5. If 3 is reached first, then the process continues until state 5 is reached. If 5 is reached first, then the process continues until state 3 is reached. Once either state 3 or 5 is reached, by symmetry, the mean amount of additional time needed to reach the other is the same as $E[\tau_3]$ in part (b). So $E[\tau_C] = 4 + 8 = 12$.
(d) The difference $\tau_R - \tau_C$ is the time needed to reach state 1 starting from either state 3 or 5, and by symmetry, the mean of this difference is $E[\tau_3]$. Thus, $E[\tau_R] = 4 + 8 + 8 = 20$.
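Reading $\tau_C$ as the time, starting from corner 1, until both corners 3 and 5 have been visited, and $\tau_R$ as the time until the walk then returns to corner 1 (the definitions implied by the argument above, stated here as an assumption), a Monte Carlo check of parts (c) and (d):

import numpy as np

rng = np.random.default_rng(4)

def corner_times(rng):
    # Start at corner 1; tau_C = first time both corners 3 and 5 have been visited,
    # tau_R = first return to corner 1 after that.
    s, steps = 1, 0
    to_visit = {3, 5}
    tau_C = None
    while True:
        s = ((s - 1 + rng.choice([-1, 1])) % 6) + 1
        steps += 1
        to_visit.discard(s)
        if tau_C is None and not to_visit:
            tau_C = steps
        if tau_C is not None and s == 1:
            return tau_C, steps

samples = [corner_times(rng) for _ in range(20_000)]
tc, tr = np.mean(samples, axis=0)
print("E[tau_C] ~", tc, "  E[tau_R] ~", tr)    # theory: 12 and 20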
5 Marginal distributions for a continuous-time three state Markov process
(a) With $\alpha$ denoting the transition rate from state 0 (and from state 2) into state 1, and $\beta$ the rate from state 1 into each of states 0 and 2,
$$Q = \begin{pmatrix} -\alpha & \alpha & 0 \\ \beta & -2\beta & \beta \\ 0 & \alpha & -\alpha \end{pmatrix}.$$
(b) Solving $\pi(\infty)Q = 0$ for the probability vector $\pi(\infty)$ gives $\pi(\infty) = \left(\frac{\beta}{2\beta+\alpha}, \frac{\alpha}{2\beta+\alpha}, \frac{\beta}{2\beta+\alpha}\right)$.
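A numerical check of the stationary vector, using illustrative rate values $\alpha = 1.3$, $\beta = 0.7$ (arbitrary choices, not from the problem):

import numpy as np
from scipy.linalg import null_space

alpha, beta = 1.3, 0.7                      # illustrative rate values (assumed)
Q = np.array([[-alpha,       alpha,     0.0],
              [  beta, -2.0 * beta,    beta],
              [   0.0,       alpha,  -alpha]])

pi = null_space(Q.T).ravel()                # solves pi Q = 0 up to scale
pi = pi / pi.sum()                          # normalize to a probability vector
d = 2.0 * beta + alpha
print(pi)                                   # should match the closed form below
print(np.array([beta, alpha, beta]) / d)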
(c) We shall solve the forward Kolmogorov equations $\frac{\partial \pi(t)}{\partial t} = \pi(t)Q$. The second of the equations, upon substituting $\pi_0(t) + \pi_2(t) = 1 - \pi_1(t)$, becomes
$$\frac{\partial \pi_1(t)}{\partial t} = \alpha\pi_0(t) - 2\beta\pi_1(t) + \alpha\pi_2(t) = \alpha - (2\beta + \alpha)\pi_1(t)$$
which, with the initial condition $\pi_1(0) = 0$, yields
$$\pi_1(t) = \frac{\alpha}{2\beta + \alpha}\left(1 - e^{-(2\beta+\alpha)t}\right).$$
The other two equations, $\frac{\partial \pi_0(t)}{\partial t} = -\alpha\pi_0(t) + \beta\pi_1(t)$ and $\frac{\partial \pi_2(t)}{\partial t} = -\alpha\pi_2(t) + \beta\pi_1(t)$, can be combined to yield $\frac{\partial(\pi_0(t) - \pi_2(t))}{\partial t} = -\alpha(\pi_0(t) - \pi_2(t))$. Solving, with the initial condition $\pi_0(0) - \pi_2(0) = 1$, yields $\pi_0(t) - \pi_2(t) = e^{-\alpha t}$. Also, $\pi_0(t) + \pi_2(t) = 1 - \pi_1(t)$. Putting it together yields
$$\pi_0(t) = \frac{\beta}{2\beta+\alpha} + \frac{1}{2}\left[\frac{\alpha e^{-(2\beta+\alpha)t}}{2\beta+\alpha} + e^{-\alpha t}\right]$$
$$\pi_2(t) = \frac{\beta}{2\beta+\alpha} + \frac{1}{2}\left[\frac{\alpha e^{-(2\beta+\alpha)t}}{2\beta+\alpha} - e^{-\alpha t}\right]$$
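The closed-form marginals can be compared with the matrix-exponential solution $\pi(t) = \pi(0)e^{Qt}$, again with the illustrative rates $\alpha = 1.3$, $\beta = 0.7$ and the initial condition $\pi(0) = (1, 0, 0)$ implied by the derivation above:

import numpy as np
from scipy.linalg import expm

alpha, beta, t = 1.3, 0.7, 0.9              # illustrative rates and time (assumed)
Q = np.array([[-alpha,       alpha,     0.0],
              [  beta, -2.0 * beta,    beta],
              [   0.0,       alpha,  -alpha]])

pi_t = np.array([1.0, 0.0, 0.0]) @ expm(Q * t)   # pi(t) = pi(0) e^{Qt}, start in state 0

d = 2.0 * beta + alpha
pi1 = alpha / d * (1.0 - np.exp(-d * t))
pi0 = beta / d + 0.5 * (alpha * np.exp(-d * t) / d + np.exp(-alpha * t))
pi2 = beta / d + 0.5 * (alpha * np.exp(-d * t) / d - np.exp(-alpha * t))
print(pi_t)
print(np.array([pi0, pi1, pi2]))            # should agree with pi_t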
6 Conditioning a Gauss-Markov process
(a) Since $E[X_t] = 0$ for all $t$, $\mathrm{Cov}(X_t, X_0) = R_X(t)$ and
$$\mu^o_X(t) = E[X_t \mid X_0 = 0] = E[X_t] + \frac{\mathrm{Cov}(X_t, X_0)}{\mathrm{Var}(X_0)}\,(0 - E[X_0]) = 0.$$
(b) The desired quantity, $R^o_X(s,t)$, is the upper right entry of the conditional covariance matrix of $\binom{X_s}{X_t}$ given $X_0$. By the formula for the covariance of error for MMSE estimation, said matrix is given by:
$$\mathrm{Cov}\left(\binom{X_s}{X_t} \,\middle|\, X_0\right) = \mathrm{Cov}\binom{X_s}{X_t} - \mathrm{Cov}\left(\binom{X_s}{X_t}, X_0\right)\mathrm{Var}(X_0)^{-1}\,\mathrm{Cov}\left(X_0, \binom{X_s}{X_t}\right)$$
$$= \begin{pmatrix} 1 & e^{-|s-t|} \\ e^{-|s-t|} & 1 \end{pmatrix} - \begin{pmatrix} e^{-|s|} \\ e^{-|t|} \end{pmatrix}\begin{pmatrix} e^{-|s|} & e^{-|t|} \end{pmatrix}$$
which yields $R^o_X(s,t) = e^{-|s-t|} - e^{-(|s|+|t|)}$.
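A small numerical check of this formula against the Gaussian conditioning (MMSE error covariance) computation, using $R_X(\tau) = e^{-|\tau|}$ as in the matrices above:

import numpy as np

def R(tau):
    return np.exp(-abs(tau))                  # R_X(tau) = e^{-|tau|}

for s, t in [(0.5, 2.0), (-1.0, 1.5), (-2.0, -0.3)]:
    # Joint covariance of (X_s, X_t, X_0), then condition on the last coordinate.
    Sigma = np.array([[R(0),     R(s - t), R(s)],
                      [R(s - t), R(0),     R(t)],
                      [R(s),     R(t),     R(0)]])
    cond = Sigma[:2, :2] - np.outer(Sigma[:2, 2], Sigma[2, :2]) / Sigma[2, 2]
    print(cond[0, 1], np.exp(-abs(s - t)) - np.exp(-(abs(s) + abs(t))))   # should agree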
(c) Yes, the conditional distribution has the Markov property. Note that $C^o_X(s,t) = R^o_X(s,t) = 0$ if $st \le 0$ (i.e. if $s$ and $t$ don't have the same sign), so that under the conditional distribution, the process for positive time is independent of the process for negative time, and the correlation structure is symmetric in time. It hence suffices to prove that the process restricted to times in $(0, +\infty)$ is Markov. To that end, we find the correlation coefficient between $X_s$ and $X_t$ under the conditional distribution, for $0 < s \le t$:
$$\rho^o_X(s,t) = \frac{e^{-(t-s)} - e^{-(s+t)}}{\sqrt{1 - e^{-2s}}\sqrt{1 - e^{-2t}}} = \frac{e^{s}(1 - e^{-2s})}{\sqrt{1 - e^{-2s}}\;e^{t}\sqrt{1 - e^{-2t}}} = \frac{\sqrt{e^{2s} - 1}}{\sqrt{e^{2t} - 1}}$$
Therefore, for $0 < r < s < t$, $\rho^o_X(r,t) = \rho^o_X(r,s)\,\rho^o_X(s,t)$, so that the process under the conditional distribution is Markov.
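A numerical spot-check of the factorization $\rho^o_X(r,t) = \rho^o_X(r,s)\,\rho^o_X(s,t)$ for $0 < r < s < t$, built from the conditional covariance of part (b):

import numpy as np

def R_cond(s, t):
    return np.exp(-abs(s - t)) - np.exp(-(abs(s) + abs(t)))   # from part (b)

def rho(s, t):
    return R_cond(s, t) / np.sqrt(R_cond(s, s) * R_cond(t, t))

r, s, t = 0.4, 1.1, 2.7
print(rho(r, t), rho(r, s) * rho(s, t))       # the two numbers should agree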
Alternative argument: Another way to show that the process restricted to positive times is Markov is to refer directly to the Markov property of $X$ itself. Given $0 < r < s < t$, by the Markov property of $X$, the conditional distribution of $X_t$ given $X_0, X_r, X_s$ is the same as the conditional distribution of $X_t$ given $X_s$, which is the same as the conditional distribution of $X_t$ given $X_0, X_s$. Thus, for the conditional distributions given $X_0 = 0$, the conditional distribution of $X_t$ given $X_r, X_s$ is the same as the conditional distribution of $X_t$ given $X_s$. This, together with the fact that the conditional distribution is Gaussian, implies it has the Markov property.