
MILITARY TECHNICAL COLLEGE Department: Communications

Approved by ( ) Term: Winter 2019/2020


Commandant of the M. T. C. Spec.: Postgraduate
Questions for the Final Exam
Subject : Stochastic Signal Detection and Estimation
Examiner: Lt. Col. Dr. Amr M. Ashry Time: 180 minutes

Answer All Questions Number of Questions: 6 Number of Pages: 3


Question 1:
Let 𝑌 ~ 𝒩(0, σ²), and let the minimum mean square error (MMSE) estimator of a related
random variable θ based on 𝑌 be
        θ̂_MMSE(𝑌) = 𝑌².
a- Could θ and 𝑌 be jointly Gaussian? Explain briefly.
b- Determine θ̂_LLSE(𝑦), the linear least squares estimate of θ based on 𝑌.
   Hint: 𝔼[𝑌³] = 0
c- If the error variance of the LLSE estimator is 3σ⁴, determine the error variance of
   the MMSE estimator.
   Hints:
   1- θ̂_LLSE(𝑌) − θ = [θ̂_MMSE(𝑌) − θ] + [θ̂_LLSE(𝑌) − θ̂_MMSE(𝑌)]
   2- 𝔼[𝑌⁴] = 3σ⁴
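
As a quick numerical cross-check of the hints (not part of the required derivation), the Python sketch below simulates one hypothetical θ consistent with θ̂_MMSE(𝑌) = 𝑌², namely θ = 𝑌² + V with V independent, zero-mean and Var(V) = σ⁴; this particular construction is an assumption chosen only so that the stated LLSE error variance 3σ⁴ is reproduced.

import numpy as np

rng = np.random.default_rng(0)
sigma, n = 1.3, 2_000_000

# Y ~ N(0, sigma^2); a hypothetical theta with E[theta | Y] = Y^2
# (theta = Y^2 + V, with V independent, zero-mean, Var(V) = sigma^4, is one such choice)
Y = rng.normal(0.0, sigma, n)
V = rng.normal(0.0, sigma**2, n)
theta = Y**2 + V

print("E[Y^3]           ~", (Y**3).mean())             # hint: should be near 0
print("E[Y^4] / sigma^4 ~", (Y**4).mean() / sigma**4)  # hint: should be near 3

# Cov(theta, Y) = E[Y^3] = 0, so the LLSE collapses to the constant E[theta] = sigma^2
llse_err = np.mean((theta.mean() - theta) ** 2)
mmse_err = np.mean((Y**2 - theta) ** 2)
print("LLSE error variance / sigma^4 ~", llse_err / sigma**4)   # about 3
print("MMSE error variance / sigma^4 ~", mmse_err / sigma**4)   # about 1
print("gap  E[(Y^2 - sigma^2)^2] / sigma^4 ~",
      np.mean((Y**2 - sigma**2) ** 2) / sigma**4)               # about 2

The printed ratios illustrate the orthogonality decomposition in hint 1: the LLSE error variance equals the MMSE error variance plus 𝔼[(θ̂_LLSE(𝑌) − θ̂_MMSE(𝑌))²].
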
Solution:
Question 2:
Suppose for 𝑖 = 1, 2,
𝑌𝑖 = 𝜃 + 𝑁𝑖 ,
where 𝜃 is an unknown constant and N1 and N2 are statistically independent, zero-mean
Gaussian random variables with:
        Var(N1) = 1
        Var(N2) = { 1,  θ ≥ 0
                    2,  θ < 0
a- Calculate and plot the Cramer-Rao bound for unbiased estimators of θ based on the
   observation 𝑦⃑ = [y1  y2]ᵀ.
b- Show that a minimum variance unbiased estimator θ̂_MVUE(𝑌⃑) does not exist.
   Hints:
   1- θ̂1 = (1/2) Y1 + (1/2) Y2
   2- θ̂2 = (2/3) Y1 + (1/3) Y2
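
For intuition only, the following Python sketch evaluates the bound using the fact that Fisher information adds over independent Gaussian observations, I(θ) = 1/Var(N1) + 1/Var(N2), and estimates the variances of the two hinted estimators by Monte Carlo; the θ values and trial count are arbitrary illustration choices, and the sketch is not a substitute for the requested plot and argument.

import numpy as np

rng = np.random.default_rng(1)

def crb(theta):
    # Fisher information adds over the two independent observations:
    # I(theta) = 1/Var(N1) + 1/Var(N2), and the bound is 1/I(theta)
    var_n2 = 1.0 if theta >= 0 else 2.0
    return 1.0 / (1.0 / 1.0 + 1.0 / var_n2)

def est_var(w1, w2, theta, trials=300_000):
    # Monte Carlo variance of the unbiased estimator w1*Y1 + w2*Y2 (w1 + w2 = 1)
    var_n2 = 1.0 if theta >= 0 else 2.0
    y1 = theta + rng.normal(0.0, 1.0, trials)
    y2 = theta + rng.normal(0.0, np.sqrt(var_n2), trials)
    return (w1 * y1 + w2 * y2).var()

for theta in (-1.0, 1.0):
    print(f"theta = {theta:+.0f}:  CRB = {crb(theta):.3f},  "
          f"var(theta_hat_1) = {est_var(0.5, 0.5, theta):.3f},  "
          f"var(theta_hat_2) = {est_var(2/3, 1/3, theta):.3f}")
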
Solution:
Question 3:
Consider the estimation of a non-random but unknown parameter θ from an observation of
the form 𝑌 = 𝜃 + 𝑊, where 𝑊 is a random variable. For parts (a) and (b) suppose 𝑊 is a
zero-mean Laplacian random variable, i.e.,
        f(w) = (α/2) e^(−α|w|)
for some 𝛼 > 0.
a- Find the maximum likelihood (ML) estimator θ̂_ML(𝑌) and the associated mean-square
   error.
   Hint: ∫_{−∞}^{∞} (α/2) w² e^(−α|w|) dw = 2/α²
b- Does an efficient estimator for θ exist? If so, determine θ̂_eff(𝑌). If not, explain.
c- Suppose 𝑊 is a zero-mean random variable with pdf p(w) depicted in the figure
   below. Does an efficient estimator for θ exist? If so, determine θ̂_eff(𝑌). If not,
   explain.
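
For part (a), a small simulation sketch may help check the hinted moment: with a single observation, the likelihood (α/2) e^(−α|y−θ|) is maximized at θ = y, so the mean-square error of that estimate is just 𝔼[W²]. NumPy's Laplace pdf is (1/(2b)) e^(−|w|/b), so its scale b corresponds to 1/α; the numerical values of α, θ and the sample size below are arbitrary.

import numpy as np

rng = np.random.default_rng(2)
alpha, theta, n = 2.0, 0.7, 1_000_000

# NumPy's Laplace scale b corresponds to the exam's alpha via b = 1/alpha
W = rng.laplace(loc=0.0, scale=1.0 / alpha, size=n)
Y = theta + W

# With a single observation the likelihood (alpha/2) exp(-alpha*|y - theta|)
# is maximized at theta = y, so the mean-square error of that estimate is E[W^2]
print("empirical MSE of theta_hat(Y) = Y ~", np.mean((Y - theta) ** 2))
print("hinted value 2/alpha^2            =", 2.0 / alpha**2)
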
Solution:
Question 4:
a- Consider the signal detection problem:
        H0: 𝑌⃑ = 𝑠⃑0 + 𝑁⃑
        H1: 𝑌⃑ = 𝑠⃑1 + 𝑁⃑
   where 𝑁⃑ ~ 𝒩(0⃑, Σ_N) with
        Σ_N = [ 2  0
                0  1 ].
   Find the pair 𝑠⃑0 and 𝑠⃑1 which maximizes the detection performance (i.e. PD is
   maximized subject to any given PF ≤ α) under the constraint ‖𝑠⃑0‖² = ‖𝑠⃑1‖² = 1.
   A brief graphical argument is sufficient.

b- Consider the problem given in part (a) with the variance of the first component of
   the noise reduced by 50%, i.e., Σ_N = 𝐼. Qualitatively compare the ROC in this
   scenario (with the performance-maximizing 𝑠⃑0, 𝑠⃑1 pair) vs. the ROC of part (a).
   Briefly explain your answer.
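
Part (a) only asks for a graphical argument, but a few ROC points can be computed from the standard known-signal Gaussian result PD = Q(Q⁻¹(PF) − d) with d² = (𝑠⃑1 − 𝑠⃑0)ᵀ Σ_N⁻¹ (𝑠⃑1 − 𝑠⃑0). In the Python sketch below, the antipodal unit-energy pair along the second (low-noise) coordinate is used only as an assumed candidate for illustration; it is not asserted here to be the optimizing pair the question asks for.

from statistics import NormalDist
import numpy as np

std_normal = NormalDist()

def roc_points(s0, s1, Sigma, pf_grid):
    # Known signals in Gaussian noise: the LRT gives PD = Q(Q^{-1}(PF) - d)
    # with d^2 = (s1 - s0)^T Sigma^{-1} (s1 - s0); Q is the standard normal tail
    diff = np.asarray(s1, float) - np.asarray(s0, float)
    d = float(np.sqrt(diff @ np.linalg.solve(np.asarray(Sigma, float), diff)))
    return [(pf, round(1.0 - std_normal.cdf(std_normal.inv_cdf(1.0 - pf) - d), 4))
            for pf in pf_grid]

pf_grid = (0.01, 0.05, 0.10)
# an assumed candidate: unit-energy antipodal signals along the low-noise (second) axis
print("part (a), Sigma = diag(2, 1):", roc_points([0, -1], [0, 1], [[2, 0], [0, 1]], pf_grid))
print("part (b), Sigma = I         :", roc_points([0, -1], [0, 1], [[1, 0], [0, 1]], pf_grid))
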
Solution:

Question 5:
Assume that you have 𝑛 observations from an 𝑖𝑖𝑑 sequence of exponential random variables.
Consider the following hypothesis testing problem:
    H0: f_{Y_k|H0}(y_k|H0) = { a e^(−a y_k),  y_k ≥ 0        k = 1, 2, …, n
                               0,             y_k < 0
    H1: f_{Y_k|H1}(y_k|H1) = { b e^(−b y_k),  y_k ≥ 0        k = 1, 2, …, n
                               0,             y_k < 0

Assume that 𝑎 > 0 is known, 𝑏 > 0 is unknown, and 𝑏 ≠ 𝑎.


a- Assuming that 𝑏 > 𝑎, determine the form of the LMP test. Hint: You do not need
to solve for threshold 𝜂0 .
b- Does a UMP test exist when 𝑏 > 𝑎? Explain. If your answer is yes, give the test.
c- Does a UMP test exist if it is not known whether 𝑏 > 𝑎 or 𝑎 > 𝑏? (Explain your
answer.)
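
For intuition about parts (a) and (b): the log-likelihood ratio n·ln(b/a) − (b − a) Σ y_k is monotone decreasing in T = Σ y_k whenever b > a, so a test of the form "decide H1 when T ≤ τ" arises and its threshold can be set from H0 alone. The Python sketch below illustrates this numerically; the values of a, b, n and the 5% false-alarm level are arbitrary illustration choices.

import numpy as np

rng = np.random.default_rng(5)
a, n, trials, pf_target = 1.0, 10, 200_000, 0.05

# When b > a, the likelihood ratio is monotone (decreasing) in T = sum(y_k),
# so "decide H1 when T <= tau" is the natural one-sided test; tau depends on H0 only
T_h0 = rng.exponential(1.0 / a, size=(trials, n)).sum(axis=1)
tau = np.quantile(T_h0, pf_target)   # threshold giving PF ~ pf_target

for b in (1.5, 3.0, 6.0):            # hypothetical b > a values, for illustration only
    T_h1 = rng.exponential(1.0 / b, size=(trials, n)).sum(axis=1)
    print(f"b = {b}:  empirical PF ~ {np.mean(T_h0 <= tau):.3f},  "
          f"PD ~ {np.mean(T_h1 <= tau):.3f}")
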
Solution:
Question 6:
Let Θ be a random variable with pdf w(θ) and let the conditional pdf of Θ given the
observation 𝑌⃑ be w(θ|𝑌⃑). Show that the minimum mean absolute error (MMAE) estimate
θ̂_MMAE(𝑦⃑) is the conditional median (as given in the lecture notes) of w(θ|𝑌⃑).
(Hint: For a non-negative random variable X, E[X] = ∫_0^∞ P(X > x) dx.)
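
The result can also be checked numerically: for samples drawn from any posterior, the empirical mean absolute error E|θ − c| is smallest when c is the sample median. The Python sketch below uses a Gamma-shaped posterior purely as a hypothetical example.

import numpy as np

rng = np.random.default_rng(6)
# samples from a hypothetical skewed posterior w(theta | y) (a Gamma, for illustration only)
post = rng.gamma(shape=2.0, scale=1.5, size=500_000)

candidates = {"posterior median": np.median(post),
              "posterior mean  ": post.mean(),
              "mean + 0.5      ": post.mean() + 0.5}
for name, c in candidates.items():
    print(f"{name}: c = {c:6.3f},  E|theta - c| ~ {np.abs(post - c).mean():.4f}")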

Solution:

See Page 155, Case VI.B.2 in Poor’s Book (Attached)

====================GOOD LUCK===================
