
E1 244 - Detection and Estimation Theory (Jan 2022)

Assignment 4 Due Date: March 27, 2022


Total Marks: 60

General Assignment Guidelines:


• Submission via Teams: Assignments will be allocated to you via the Teams “Assignments” feature. You
will have to upload your answer sheet via the same feature in Teams. Answer sheets sent to me or
the TAs by e-mail will not be considered. Please familiarize yourself with this feature.
• Late Submission Policy: Assignment submission beyond the deadline is allowed as per the below policy:
– Delay of 24 hours will attract 20% penalty
– Delay of 48 hours will attract 40% penalty
– Delay of 72 hours will attract 60% penalty
– Assignments submitted beyond 72 hours will not be considered
The upload time reflected in Teams will be considered final. You are highly encouraged to upload your
answer sheet well before the deadline so that any potential connection/technical issues can be resolved
in time (recommended: at least 1 hour before the deadline).
• File Logistics: You can scan your handwritten answers, use a tablet, or typeset your answers in LaTeX.
The following logistics should be followed while uploading the answer sheet:
– Make sure your scans are clearly legible. You can use a scanning app (e.g., Adobe Scan) to get
better results.
– The total file size should not be too large (ideally, less than 5 MB). You can use available
apps/software to reduce the file size. Make sure that the answers remain clearly visible after
reducing the size.
– Only a single PDF file may be uploaded (other formats will not be accepted).
– The order of answers in your file should be the same as the order of questions. Check that you
have included all the pages in your file before uploading it.
– Name your file as: DET Assignmentx FirstName, where x is the assignment number (x = 1, 2, ...)
– Mention your name, course name and submission date on the first page.
• Collaboration Policy: You are allowed to discuss the concepts/questions with your classmates. However,
the final answer should reflect your own understanding. Merely copying solutions from class-
mates/online sources will attract a significant penalty and strict disciplinary action (refer to Sections 13.2
and 13.3 of the IISc student information handbook - link). If you have collaborated/discussed with other
classmates or referred to an online resource for a particular question, clearly mention the names of
the classmates/online resources at the beginning of your answer.

All questions carry equal marks but are not equally difficult, so you may want to strategize accordingly.

Problem 1 Let the conditional distribution of observation Y given a real random parameter Θ = θ > 0 be:

f(y/θ) = θ e^{−θy} 1{y≥0}.

The prior distribution of θ is given by

f(θ) = α e^{−αθ} 1{θ≥0},

where α > 0 is a known constant.


(a) Compute θ̂MAE(y) and its associated error covariance.
(b) Compute θ̂MAP(y) and its associated error covariance.
(c) Compute θ̂MMSE(y) and its associated error covariance.
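
A quick numerical sanity check (a minimal Python sketch; the values α = 2 and y = 1.5 are hypothetical): it evaluates the posterior on a fine grid so that its mean, mode and median can be compared against your closed-form answers.

import numpy as np

# Hypothetical values, for checking only: prior rate alpha and a single observation y.
alpha = 2.0
y = 1.5

# Fine grid over theta > 0 and the unnormalized posterior f(y/theta) * f(theta).
theta = np.linspace(1e-4, 20.0, 200_000)
post = theta * np.exp(-theta * y) * alpha * np.exp(-alpha * theta)
post /= np.trapz(post, theta)                       # normalize numerically

dtheta = theta[1] - theta[0]
post_mean = np.trapz(theta * post, theta)           # compare with your theta_hat_MMSE
post_mode = theta[np.argmax(post)]                  # compare with your theta_hat_MAP
post_median = theta[np.searchsorted(np.cumsum(post) * dtheta, 0.5)]  # compare with your theta_hat_MAE
print(post_mean, post_mode, post_median)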

Problem 2 Let the conditional distribution of observation Y given a real random parameter Λ = λ ≥ 0 be
f(y/λ) ∼ Poisson(λ). The distribution of Λ is given by

f(λ) = (e−λ λα−1 / Γ(α)) 1{λ≥0},

where α > 0 is a known constant, and Γ(·) is the Gamma function defined as:
Γ(x) = ∫₀^∞ z^{x−1} e^{−z} dz.

(a) Compute λ̂MAP(y) and its associated error covariance.


(b) Find the MMSE estimate of Θ given Y, where Θ = e^{−Λ}.
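
A similar grid-based check for this problem (the values α = 3 and y = 4 are hypothetical): it computes the posterior mode and the posterior mean of e^{−Λ} numerically, for comparison with parts (a) and (b).

import numpy as np
from scipy.special import gammaln

# Hypothetical values, for checking only: prior shape alpha and one observed count y.
alpha = 3.0
y = 4

lam = np.linspace(1e-6, 60.0, 600_000)
# Log of the unnormalized posterior: log f(y/lambda) + log f(lambda).
log_post = (y * np.log(lam) - lam - gammaln(y + 1)) + ((alpha - 1.0) * np.log(lam) - lam - gammaln(alpha))
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, lam)

lam_map = lam[np.argmax(post)]                       # compare with part (a)
theta_mmse = np.trapz(np.exp(-lam) * post, lam)      # E[e^{-Lambda} | Y = y], compare with part (b)
print(lam_map, theta_mmse)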

Problem 3 Let Yi ∼ Unif[1, θ], i = 1, 2, · · · , N be i.i.d. observations, and let the parameter θ ∼ Unif[1, 10].
(a) Find θ̂MAP(y).
(b) Show that θ̂MAP(y) converges in distribution to the true value of θ as N → ∞, that is,

lim_{N→∞} Pr(θ̂MAP(y) ≤ z) = Pr(Θ ≤ z) for all z ∈ R.
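
A Monte Carlo sketch you may use to check part (b) numerically (the number of trials, N and the grid resolution are arbitrary choices): it obtains the MAP estimate by a grid search over the posterior and compares the empirical CDFs of the estimate and of the true parameter.

import numpy as np

rng = np.random.default_rng(0)
num_trials = 2000
N = 200                                              # try increasing N

theta_grid = np.linspace(1.0 + 1e-6, 10.0, 5000)
theta_true = rng.uniform(1.0, 10.0, size=num_trials)
theta_hat = np.empty(num_trials)

for t in range(num_trials):
    y = rng.uniform(1.0, theta_true[t], size=N)
    # Log posterior on the grid (up to a constant): flat prior on [1, 10] plus the
    # Unif[1, theta] log-likelihood, which is -inf wherever theta_grid < max(y).
    log_post = np.where(theta_grid >= y.max(), -N * np.log(theta_grid - 1.0), -np.inf)
    theta_hat[t] = theta_grid[np.argmax(log_post)]   # numerical MAP, compare with part (a)

# Empirical CDFs of the MAP estimate and of the true parameter at a few points.
for z in (2.0, 5.0, 8.0):
    print(z, np.mean(theta_hat <= z), np.mean(theta_true <= z))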

Problem 4 Let yi = a + wi for i = 1, 2, · · · , N, where the wi are i.i.d. and independent of a, with
wi ∼ N(0, σw²). Further, a ∼ N(0, σa²).
(a) Compute âMAP(y), âMMSE(y) and their associated error covariances.
(b) Consider an alternate sequential procedure to compute the MMSE estimate.
• Suppose you have obtained j measurements y1, y2, · · · , yj. Compute the MMSE estimate of a based
on these j measurements and denote it by âj = âMMSE(y1, · · · , yj). Denote the corresponding error
covariance as σj².
• Express âj as a function of âj−1, σj−1² and yj.
• Show that 1/σj² = 1/σa² + j/σw².
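
A sanity-check sketch for this problem (the variances σa² = 2 and σw² = 0.5 are hypothetical): it computes E[a | y1, · · · , yj] and the corresponding error covariance by brute-force joint-Gaussian conditioning for each j, so you can compare against your batch answers, your recursion and the stated identity for 1/σj².

import numpy as np

rng = np.random.default_rng(1)
sigma_a2, sigma_w2 = 2.0, 0.5                        # hypothetical variances
N = 6

a = rng.normal(0.0, np.sqrt(sigma_a2))
y = a + rng.normal(0.0, np.sqrt(sigma_w2), size=N)

for j in range(1, N + 1):
    yj = y[:j]
    # Joint-Gaussian conditioning: Cov(a, y_1..j) = sigma_a2 * 1, Cov(y_1..j) = sigma_a2*11^T + sigma_w2*I.
    C_ay = sigma_a2 * np.ones(j)
    C_yy = sigma_a2 * np.ones((j, j)) + sigma_w2 * np.eye(j)
    a_hat = C_ay @ np.linalg.solve(C_yy, yj)                  # E[a | y_1, ..., y_j]
    err_var = sigma_a2 - C_ay @ np.linalg.solve(C_yy, C_ay)   # error covariance sigma_j^2
    print(j, a_hat, err_var, 1.0 / (1.0 / sigma_a2 + j / sigma_w2))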

 
Problem 5 Let x = [x1, x2]ᵀ ∼ N(0, Rx).
(a) Compute the MMSE estimate x̂2(x1) of x2 based on x1.
(b) Compute the associated MMSE cost in part (a) and prove that it is zero if and only if Rx is singular.
(c) Extend the result to show that if x ∼ N(0, Rx) is an N-dimensional random vector and Rx is not positive
definite, then at least one component of x can be perfectly estimated by a linear combination of the other components.
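
A sampling-based check with a hypothetical covariance Rx: since x is zero-mean and jointly Gaussian, the sample least-squares coefficient and the residual power approximate the quantities asked for in parts (a) and (b).

import numpy as np

rng = np.random.default_rng(2)
Rx = np.array([[2.0, 1.2],
               [1.2, 1.0]])                          # hypothetical covariance; also try a (nearly) singular one
x = rng.multivariate_normal(np.zeros(2), Rx, size=500_000)
x1, x2 = x[:, 0], x[:, 1]

c = np.dot(x1, x2) / np.dot(x1, x1)                  # empirical coefficient of the linear estimate of x2 from x1
mmse_cost = np.mean((x2 - c * x1) ** 2)              # empirical MMSE cost
print(c, mmse_cost, np.linalg.det(Rx))               # the cost should shrink to 0 as Rx approaches singularity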

Problem 6
(a) Let vector parameters β, θ1 and θ2 be related as β = A1 θ1 + A2 θ2 + c, where A1 , A2 and c are known.
Show that β̂LLS = A1 θ̂1,LLS + A2 θ̂2,LLS + c.
(b) Show that if y1 and y2 are orthogonal, then x̂LLS(y1, y2) = x̂LLS(y1) + x̂LLS(y2).
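
A numerical check of part (b) under a hypothetical zero-mean model (x = 2y1 − y2 + w, with y1, y2, w uncorrelated and of unit variance, so that y1 and y2 are orthogonal), assuming the usual zero-mean LLS form x̂ = Rxy Ryy^{-1} y.

import numpy as np

# Hypothetical second-order statistics of the model x = 2*y1 - y2 + w described above.
Ryy = np.array([[1.0, 0.0],
                [0.0, 1.0]])                         # E[y1 y2] = 0: y1 and y2 are orthogonal
Rxy = np.array([2.0, -1.0])                          # [E[x y1], E[x y2]]

coef_joint = Rxy @ np.linalg.inv(Ryy)                # coefficients of x_hat_LLS(y1, y2)
coef_separate = np.array([Rxy[0] / Ryy[0, 0], Rxy[1] / Ryy[1, 1]])  # x_hat_LLS(y1) and x_hat_LLS(y2)
print(coef_joint, coef_separate)                     # equal whenever the off-diagonal of Ryy is zero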
