
ECE 4110: Random Signals in Communications and Signal Processing

ECE Department, Cornell University, Fall 2013

Homework 5


Due Friday, October 11 at 5:00 p.m. Feel free to turn it in earlier!!!

1. Let $Y_i = X + Z_i$ for $i = 1, 2, \ldots, n$ be $n$ observations of a signal $X \sim \mathcal{N}(0, P)$. The additive noise components $Z_1, Z_2, \ldots, Z_n$ are zero-mean jointly Gaussian random variables that are independent of $X$. Furthermore, assume that the noise components $Z_1, \ldots, Z_n$ are uncorrelated, each with variance $N$. Find the best MSE estimate of $X$ given $Y_1, Y_2, \ldots, Y_n$ and its MSE.

Hint: It might be convenient to assume a form of the estimator and use the orthogonality principle to claim optimality.
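As a sanity check on whatever estimator you derive, the model in Problem 1 is easy to simulate. The sketch below assumes illustrative values $P = 1$, $N = 0.5$, $n = 10$ (none of these are specified in the problem) and fits the single coefficient of a linear estimator of the sample mean empirically, rather than asserting any closed form.

```python
import numpy as np

# Simulation sketch of the observation model in Problem 1.
# Assumed illustrative parameters (not given in the problem): P, N, n below.
rng = np.random.default_rng(0)
P, N, n, trials = 1.0, 0.5, 10, 200_000

X = rng.normal(0.0, np.sqrt(P), size=trials)          # X ~ N(0, P)
Z = rng.normal(0.0, np.sqrt(N), size=(trials, n))     # uncorrelated zero-mean noise
Y = X[:, None] + Z                                    # Y_i = X + Z_i

# By symmetry, a linear estimator here can plausibly depend on the
# observations only through their sample mean; fit the single
# coefficient a in Xhat = a * Ybar by least squares and measure
# the resulting empirical MSE.
Ybar = Y.mean(axis=1)
a = (X @ Ybar) / (Ybar @ Ybar)
mse_hat = np.mean((X - a * Ybar) ** 2)
print(f"fitted coefficient a = {a:.4f}, empirical MSE = {mse_hat:.4f}")
```

Comparing these empirical numbers against your derived estimator and MSE (evaluated at the same parameter values) is a quick way to catch algebra mistakes.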
2. Suppose that $g(\vec{Y})$ is the linear least-square error (LLSE) estimator for $X$ given $\vec{Y}$:
$$g(\vec{Y}) = L[X \mid \vec{Y}] = K_{X\vec{Y}} K_{\vec{Y}}^{-1} \left(\vec{Y} - E[\vec{Y}]\right) + E[X].$$
Determine the mean square error $E[(X - g(\vec{Y}))^2]$ in terms of the means, covariances, and cross-covariances of $X$ and $\vec{Y}$.
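The LLSE formula in Problem 2 can be exercised numerically before deriving the MSE in closed form. The sketch below assumes an arbitrary illustrative joint Gaussian distribution for $(X, \vec{Y})$ with a two-dimensional $\vec{Y}$; the particular mean vector and covariance matrix are made up for the example.

```python
import numpy as np

# Numerical sketch of the LLSE estimator in Problem 2.
# The joint distribution of (X, Y1, Y2) below is an arbitrary assumption.
rng = np.random.default_rng(1)
mean = np.array([1.0, 0.0, 2.0])          # [E[X], E[Y1], E[Y2]]
cov = np.array([[2.0, 0.8, 0.3],
                [0.8, 1.0, 0.2],
                [0.3, 0.2, 1.5]])
S = rng.multivariate_normal(mean, cov, size=200_000)
X, Y = S[:, 0], S[:, 1:]

K_XY = cov[0, 1:]                          # cross-covariance of X and Y
K_Y = cov[1:, 1:]                          # covariance matrix of Y
# g(Y) = K_XY K_Y^{-1} (Y - E[Y]) + E[X], exactly as in the problem.
g = K_XY @ np.linalg.solve(K_Y, (Y - mean[1:]).T) + mean[0]
mse = np.mean((X - g) ** 2)
print(f"empirical MSE of the LLSE estimator: {mse:.4f}")
```

Whatever closed-form expression you derive should reproduce this empirical value (up to simulation noise) when evaluated at the same means and covariances.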
3. Prove that
(a) $L[X_1 + X_2 \mid \vec{Y}] = L[X_1 \mid \vec{Y}] + L[X_2 \mid \vec{Y}]$
(b) $L[L[X \mid \vec{Y}, \vec{Z}] \mid \vec{Y}] = L[X \mid \vec{Y}]$
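Both identities in Problem 3 are ultimately statements about covariance algebra, so they can be spot-checked numerically before writing the proof. The sketch below draws a random positive-definite covariance for scalar $(X, Y, Z)$ and checks the tower identity (b) by composing LLSE coefficients, using linearity, identity (a), along the way; `llse_coeffs` is a hypothetical helper, not course code.

```python
import numpy as np

# Numerical spot-check of Problem 3(b) on an arbitrary assumed
# jointly Gaussian example with scalar X, Y, and Z.
rng = np.random.default_rng(2)
A = rng.normal(size=(3, 3))
K = A @ A.T                      # random positive-definite covariance of (X, Y, Z)
m = np.array([1.0, -2.0, 0.5])   # arbitrary means

def llse_coeffs(K, m, target, obs):
    """Return (b, c) such that L[X_target | X_obs] = b @ X_obs + c."""
    K_to = K[np.ix_([target], obs)]          # cross-covariance row
    K_o = K[np.ix_(obs, obs)]                # observation covariance
    b = np.linalg.solve(K_o, K_to.ravel())
    return b, m[target] - b @ m[obs]

# L[X | Y, Z] = b1*Y + b2*Z + c is itself a linear function of (Y, Z).
b_yz, c_yz = llse_coeffs(K, m, 0, [1, 2])
b_y, c_y = llse_coeffs(K, m, 0, [1])
b_z, c_z = llse_coeffs(K, m, 2, [1])

# By linearity, L[b1*Y + b2*Z + c | Y] = b1*Y + b2*L[Z | Y] + c;
# identity (b) says this must coincide with L[X | Y].
tower_b = b_yz[0] + b_yz[1] * b_z[0]
tower_c = b_yz[1] * c_z + c_yz
print(tower_b, b_y[0])   # these two should agree
print(tower_c, c_y)      # and so should these
```

The agreement is exact (up to floating point), which hints at the structure of the proof: the inner LLSE is linear in the observations, so projecting it again onto $\vec{Y}$ reuses the same normal equations.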
4. Let $\vec{X}$ be a Gaussian random vector with mean $[1 \; 4 \; 6]^T$ and covariance matrix
$$\begin{bmatrix} 3 & 1 & 0 \\ 1 & 2 & 1 \\ 0 & 1 & 1 \end{bmatrix}.$$
(a) Compute $E[X_1 \mid X_2]$ and $E[(X_1 - E[X_1 \mid X_2])^2]$.
(b) Compute $E[X_1 \mid X_3]$ and $E[(X_1 - E[X_1 \mid X_3])^2]$.
(c) Compute $E[X_1 \mid X_2, X_3]$ and $E[(X_1 - E[X_1 \mid X_2, X_3])^2]$.
(d) Note that $X_1$ and $X_3$ are uncorrelated, and hence independent. Yet $E[X_1 \mid X_2, X_3]$ is a function of both $X_2$ and $X_3$. Why is that?
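The standard Gaussian conditioning formula, $E[X_i \mid \vec{X}_o] = m_i + K_{io} K_o^{-1}(\vec{x}_o - m_o)$ with residual variance $K_{ii} - K_{io} K_o^{-1} K_{oi}$, gives a numerical cross-check for parts (a)-(c) of Problem 4. The sketch below evaluates the coefficients and residual variances for the mean and covariance stated above; `cond_gauss` is a hypothetical helper name, and indices are 0-based ($X_1 \to$ index 0).

```python
import numpy as np

# Cross-check sketch for Problem 4 via the Gaussian conditioning formula.
m = np.array([1.0, 4.0, 6.0])
K = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 1.0]])

def cond_gauss(m, K, i, obs):
    """Coefficients b with E[X_i | X_obs = x] = m[i] + b @ (x - m[obs]),
    plus the residual variance E[(X_i - E[X_i | X_obs])^2]."""
    K_io = K[i, obs]                 # cross-covariance of X_i with observations
    K_o = K[np.ix_(obs, obs)]        # covariance of the observed components
    b = np.linalg.solve(K_o, K_io)
    var = K[i, i] - b @ K_io
    return b, var

for obs in ([1], [2], [1, 2]):
    b, var = cond_gauss(m, K, 0, obs)
    print(f"conditioning on indices {obs}: coefficients {b}, "
          f"residual variance {var:.4f}")
```

Note how the coefficient on $X_3$ is nonzero once $X_2$ is also observed, which is exactly the phenomenon part (d) asks you to explain.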
5. Let $X_n$ be a sequence of i.i.d. equiprobable Bernoulli random variables and let $Y_n = 2^n X_1 X_2 \cdots X_n$.
(a) Does this sequence converge almost surely, and if so, to what limit?
(b) Does this sequence converge in mean square, and if so, to what limit?
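A simulation makes the two modes of convergence in Problem 5 easy to contrast empirically. The sketch below reads "equiprobable Bernoulli" as $X_i \in \{0, 1\}$ with $P(X_i = 1) = 1/2$ (an assumption about the intended convention) and tracks both the fraction of sample paths absorbed at zero and the empirical second moment of $Y_n$.

```python
import numpy as np

# Simulation sketch for Problem 5: Y_n = 2^n * X_1 * X_2 * ... * X_n,
# with X_i assumed i.i.d. Bernoulli(1/2) taking values in {0, 1}.
rng = np.random.default_rng(3)
n_max, paths = 30, 100_000

X = rng.integers(0, 2, size=(paths, n_max))           # X_i in {0, 1}
prod = np.cumprod(X, axis=1)                          # running product X_1...X_n
Y = (2.0 ** np.arange(1, n_max + 1)) * prod           # Y_n along each path

frac_zero = np.mean(Y[:, -1] == 0)                    # paths stuck at 0 by n = 30
second_moment = np.mean(Y ** 2, axis=0)               # empirical E[Y_n^2]
print(f"fraction of paths with Y_30 = 0: {frac_zero:.5f}")
print(f"empirical E[Y_n^2] at n = 1, 5, 10: {second_moment[[0, 4, 9]]}")
```

The two printed quantities pull in opposite directions, pathwise behavior versus second moments, which is precisely the tension parts (a) and (b) ask you to resolve.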
