
Assignment 3

1. Suppose that X_i ∼ N(µ, σ²), i = 1, . . . , n, and Z_i ∼ N(0, 1), i = 1, . . . , k, and all random variables are independent.
Let X̄ = (1/n) Σ_{i=1}^n X_i, Z̄ = (1/k) Σ_{i=1}^k Z_i, S_X² = (1/(n − 1)) Σ_{i=1}^n (X_i − X̄)², and
S_Z² = (1/(k − 1)) Σ_{i=1}^k (Z_i − Z̄)². State the distribution of each of the following variables and justify your reasoning.

[4] (a) √(nk) (X̄ − µ) / (σ √(Σ_{i=1}^k Z_i²)).
[4] (b) Σ_{i=1}^n (X_i − µ)² / σ² + Σ_{i=1}^k (Z_i − Z̄)².
[4] (c) (k − 1) Σ_{i=1}^n (X_i − X̄)² / ((n − 1) σ² Σ_{i=1}^k (Z_i − Z̄)²).
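A quick Monte Carlo check is one way to sanity-test whatever distribution you conjecture. The sketch below simulates the statistic in 1(c) and compares its empirical quantiles with those of an F(n − 1, k − 1) candidate; the candidate distribution, sample sizes, and seed are chosen only for illustration.

```python
# Sketch: simulate the statistic in 1(c) and compare with a candidate
# F(n-1, k-1) distribution. The candidate is an assumption made for this
# check, not a stated answer.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, k, mu, sigma = 8, 5, 2.0, 1.5
reps = 100_000

X = rng.normal(mu, sigma, size=(reps, n))
Z = rng.normal(0.0, 1.0, size=(reps, k))
ssx = ((X - X.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)   # sum (X_i - Xbar)^2
ssz = ((Z - Z.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)   # sum (Z_i - Zbar)^2
stat = (k - 1) * ssx / ((n - 1) * sigma**2 * ssz)

probs = [0.1, 0.25, 0.5, 0.75, 0.9]
print("empirical quantiles:  ", np.quantile(stat, probs))
print("F(n-1, k-1) quantiles:", stats.f.ppf(probs, n - 1, k - 1))
```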

2. Suppose X follows a χ² distribution with 2n degrees of freedom.

[5] (a) If X = Y + Z, where Y ∼ χ²(n) and Y is independent of Z, what is the distribution of Z? Justify your answer.
[3] (b) Find the limiting distribution of X/n as n → ∞.
[3] (c) Find the distribution of cX, where c > 0 is a constant.
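For 2(b), a short simulation gives an empirical feel for how X/n behaves as n grows: the sketch below draws X ∼ χ²(2n) for several n and summarizes X/n. The values of n, the number of replications, and the seed are arbitrary choices.

```python
# Sketch: draw X ~ chi-square(2n) for increasing n and summarize X/n,
# to get an empirical feel for its limiting behaviour in 2(b).
import numpy as np

rng = np.random.default_rng(1)
for n in (5, 50, 500, 5000):
    ratio = rng.chisquare(df=2 * n, size=50_000) / n
    print(f"n={n:5d}  mean of X/n = {ratio.mean():.4f}  sd of X/n = {ratio.std():.4f}")
```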

3. Answer the following questions.

[5] (a) We assume that X_1, . . . , X_n are i.i.d. random variables with probability function given by
P(X_i = 1) = P(X_i = −1) = 0.5. Let Y_n = Σ_{i=1}^n X_i / 2^i. Prove that Y_n converges in distribution to the
uniform distribution on [−1, 1].
[5] (b) We assume that {ξ_k}_{k≥1} are i.i.d. following N(0, 1). Let η_n = √n ξ_{n+1} / √(Σ_{k=1}^n ξ_k²). What is the exact distribution
of η_n? Please also find the limiting distribution of η_n as n → ∞.
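For 3(a), the claimed limit can also be checked numerically: the sketch below simulates Y_n = Σ_{i=1}^n X_i / 2^i for a moderate n and compares a few empirical quantiles with those of Uniform(−1, 1). The choices of n, replication count, and seed are arbitrary.

```python
# Sketch: simulate Y_n = sum_{i=1}^n X_i / 2^i with P(X_i = ±1) = 1/2 and
# compare a few empirical quantiles with those of Uniform(-1, 1), the
# limit claimed in 3(a).
import numpy as np

rng = np.random.default_rng(2)
n, reps = 30, 100_000
X = rng.choice([-1.0, 1.0], size=(reps, n))
weights = 0.5 ** np.arange(1, n + 1)          # 1/2^i for i = 1, ..., n
Y = X @ weights

probs = [0.1, 0.25, 0.5, 0.75, 0.9]
print("empirical quantiles of Y_n:", np.quantile(Y, probs))
print("Uniform(-1, 1) quantiles:  ", -1 + 2 * np.array(probs))
```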

4. Suppose X_1, . . . , X_n are independently and identically distributed N(µ, σ²) random variables. Let
S² = (1/(n − 1)) Σ_{i=1}^n (X_i − X̄)² be the sample variance, where X̄ = (1/n) Σ_{i=1}^n X_i is the sample mean.

[5] (a) Show that S² converges in probability to σ².


[5] (b) Use the central limit theorem (and the definition of the χ² distribution with n − 1 degrees of freedom) to
prove that
√n (S² − σ²) →_d N(0, ψ²)
and give a formula for ψ².
[5] (c) Prove that
√n log(S/σ) →_d N(0, τ²)
and give a formula for τ².
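A simulation sketch for 4(b): it draws the sampling distribution of √n (S² − σ²) for one n and compares its empirical variance with a candidate value of ψ², set inside the code purely as an assumption to compare against. Sample sizes, parameters, and seed are arbitrary.

```python
# Sketch: sampling distribution of sqrt(n) * (S^2 - sigma^2) for one n,
# compared against a candidate psi^2 (an assumption made only to have
# something to compare with).
import numpy as np

rng = np.random.default_rng(3)
n, mu, sigma, reps = 200, 1.0, 2.0, 50_000

X = rng.normal(mu, sigma, size=(reps, n))
S2 = X.var(axis=1, ddof=1)                      # sample variance with n-1 divisor
T = np.sqrt(n) * (S2 - sigma**2)

psi2_candidate = 2 * sigma**4                   # assumed candidate for psi^2
print("empirical variance of T:", T.var())
print("candidate psi^2        :", psi2_candidate)
```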

5. Let X_1, . . . , X_n be a random sample from a distribution with pdf f(x) = 2θ² x⁻³ for x ≥ θ and 0 otherwise,
where θ is a positive parameter.

[3] (a) Find the method of moments estimator of θ.


[3] (b) Find the maximum likelihood estimator of θ, denoted by θ̂.
[5] (c) Is θ̂ an unbiased estimator of θ? Does θ̂ converge in probability to θ? Justify your reasoning.
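Since the CDF here is F(x) = 1 − θ²/x² for x ≥ θ, one can sample from this density by inverse transform and check the first moment E[X] = 2θ numerically, which is the moment relevant to 5(a). The sketch below does this with an arbitrary θ and seed.

```python
# Sketch: draw from f(x) = 2*theta^2 * x^(-3), x >= theta, by inverting the
# CDF F(x) = 1 - theta^2 / x^2, and check the first moment numerically.
import numpy as np

rng = np.random.default_rng(4)
theta, reps = 1.5, 200_000

u = rng.uniform(size=reps)                   # u in [0, 1)
x = theta / np.sqrt(1.0 - u)                 # inverse-CDF sample, x >= theta

# Note: E[X^2] is infinite for this density, so the sample mean is noisy.
print("sample mean      :", x.mean())
print("2 * theta (E[X]) :", 2 * theta)
print("sample minimum   :", x.min())         # relevant when thinking about the MLE
```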

6. Suppose that X_1, X_2, . . . , X_n are n independent random variables that follow an exponential distribution with
mean 1/λ.
[5] (a) Find the probability density function of Σ_{i=1}^n X_i.
[3] (b) Find the maximum likelihood estimator (MLE) of λ.
[5] (c) Let λ̂ denote the MLE of λ. Find appropriate normalization constants a_n and b_n such that a_n(λ̂ − b_n)
converges in distribution to a normal random variable. Justify your answer.
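A simulation sketch related to 6(c): it takes λ̂ = 1/X̄ as a candidate for the MLE (an assumption here, since part (b) asks you to derive it), uses a_n = √n and b_n = λ as one possible normalization, and inspects the resulting empirical distribution; λ, n, and the seed are arbitrary.

```python
# Sketch: treat lambda_hat = 1 / X_bar as a candidate MLE (an assumption for
# this sketch) and look at sqrt(n) * (lambda_hat - lambda) empirically.
import numpy as np

rng = np.random.default_rng(5)
lam, n, reps = 2.0, 400, 50_000

X = rng.exponential(scale=1.0 / lam, size=(reps, n))   # exponential with mean 1/lambda
lam_hat = 1.0 / X.mean(axis=1)
T = np.sqrt(n) * (lam_hat - lam)

print("empirical mean of T    :", T.mean())
print("empirical variance of T:", T.var())
print("lambda^2 (candidate)   :", lam**2)
```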
