1. Let X1 , . . . , Xn be a random sample from a Bernoulli distribution with parameter p ∈ (0, 1), that is, $P(X_i = 1) = p = 1 - P(X_i = 0)$.
and so,
$$
E\!\left[\mathbf{1}(X_1 = 1,\, X_2 = 0) \,\middle|\, \sum_{i=1}^{n} X_i\right]
= P\!\left(X_1 = 1,\, X_2 = 0 \,\middle|\, \sum_{i=1}^{n} X_i\right)
= \frac{\sum_{i=1}^{n} X_i \left(n - \sum_{i=1}^{n} X_i\right)}{n(n-1)}.
$$
Hence,
$$
\frac{n}{n-1}\,\frac{\sum_{i=1}^{n} X_i}{n}\left(1 - \frac{\sum_{i=1}^{n} X_i}{n}\right)
$$
is a UMVUE of q(p). Now suppose, for contradiction, that S∗(X) is an unbiased estimator of p/(1 − p). Conditioning on the sufficient statistic $T(X) = \sum_{i=1}^{n} X_i$ if necessary (which preserves unbiasedness), S∗(X) can be written as a function of T(X), say S∗(X) = g(T(X)).
Then,
$$
\frac{p}{1-p} = E\,S^*(X) = \sum_{k=0}^{n} g(k) \binom{n}{k} p^k (1-p)^{n-k}.
$$
Note that the RHS is a polynomial in p of degree at most n, while p/(1 − p) cannot be written
as a polynomial, which is a contradiction. Hence, no unbiased estimator of p/(1 − p) exists.
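As a numerical sanity check on the estimator found in part a) (this check is not part of the original solution; the helper name and test values below are mine), one can compute the exact expectation of $S(T) = \frac{n}{n-1}\frac{T}{n}\left(1 - \frac{T}{n}\right)$ over the Binomial(n, p) distribution of $T = \sum X_i$ and confirm it equals p(1 − p):

```python
# Sanity check (illustrative, not from the text): verify that
# S(T) = (n/(n-1)) * (T/n) * (1 - T/n), with T ~ Binomial(n, p),
# is exactly unbiased for q(p) = p(1 - p), by summing over the binomial pmf.
from math import comb

def umvue_expectation(n: int, p: float) -> float:
    """Exact E[S(T)] computed from the Binomial(n, p) pmf."""
    total = 0.0
    for k in range(n + 1):
        pmf = comb(n, k) * p**k * (1 - p) ** (n - k)
        s = (n / (n - 1)) * (k / n) * (1 - k / n)
        total += pmf * s
    return total

for n in (2, 5, 10):
    for p in (0.1, 0.5, 0.9):
        assert abs(umvue_expectation(n, p) - p * (1 - p)) < 1e-12
print("E[S(T)] = p(1-p) for all tested (n, p)")
```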
c) Determine necessary and sufficient conditions on q(p) such that the UMVUE of q(p) exists.
Solution. If q(p) is a polynomial in p of degree at most n, then a UMVUE of q(p) exists. Is this
condition also necessary?...
2. Let Y1 , . . . , Yn be independent with distribution Yi ∼ N (θ0 + θ1 xi , 1), where x1 , . . . , xn are known real
numbers.
a) Determine the maximum likelihood estimator (θ̂0 , θ̂1 ) of (θ0 , θ1 ).
Solution. The likelihood function is
$$
L(\theta_0, \theta_1; y) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}} \exp\!\left(-\frac{(y_i - \theta_0 - \theta_1 x_i)^2}{2}\right)
= (2\pi)^{-n/2} \exp\!\left(-\frac{1}{2} \sum_{i=1}^{n} (y_i - \theta_0 - \theta_1 x_i)^2\right).
$$
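Since the exponent is −1/2 times a sum of squares, maximizing the likelihood is equivalent to ordinary least squares, so the MLE solves the 2×2 normal equations. A minimal numerical sketch (the function name and data values are illustrative, not from the text):

```python
# MLE of (theta0, theta1) under Y_i ~ N(theta0 + theta1 * x_i, 1):
# maximizing the likelihood minimizes sum (y_i - theta0 - theta1 * x_i)^2,
# i.e. ordinary least squares. Solve the 2x2 normal equations directly.
def mle_line(x, y):
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    det = n * sxx - sx * sx  # nonzero as long as the x_i are not all equal
    theta1 = (n * sxy - sx * sy) / det
    theta0 = (sy - theta1 * sx) / n
    return theta0, theta1

# Exact recovery on noiseless data y = 2 + 3x (illustrative values):
x = [0.0, 1.0, 2.0, 3.0]
y = [2.0 + 3.0 * xi for xi in x]
print(mle_line(x, y))  # prints (2.0, 3.0)
```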
Hence, the Cramér-Rao lower bound for the estimation of θ1 when θ0 is unknown is
$$
\frac{n}{\,n \sum_{i=1}^{n} x_i^2 - \left(\sum_{i=1}^{n} x_i\right)^2\,},
$$
and so,
$$
\frac{d}{d\theta_1} \log L = \sum_{i=1}^{n} x_i y_i - \theta_0 \sum_{i=1}^{n} x_i - \theta_1 \sum_{i=1}^{n} x_i^2.
$$
Hence,
$$
\frac{d^2}{d\theta_1^2} \log L = -\sum_{i=1}^{n} x_i^2,
$$
and so, the Fisher information is
$$
I(\theta_1) = \sum_{i=1}^{n} x_i^2,
$$
and thus, the Cramér-Rao lower bound for the estimation of θ1 when θ0 is known is
$$
\frac{1}{\sum_{i=1}^{n} x_i^2}.
$$
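Since $(\sum x_i)^2 \ge 0$, the denominator $n\sum x_i^2 - (\sum x_i)^2$ is at most $n\sum x_i^2$, so the bound with θ0 unknown is never smaller than the bound $1/\sum x_i^2$ with θ0 known, with equality exactly when $\sum x_i = 0$. A small numerical illustration (the design points are mine, not from the text):

```python
# Compare the Cramér-Rao bounds for theta1: 1/sum(x^2) when theta0 is known
# versus n / (n*sum(x^2) - (sum x)^2) when theta0 is unknown. The second is
# never smaller, with equality exactly when sum(x) = 0.
def crlb_theta1(x, theta0_known: bool) -> float:
    n = len(x)
    sxx = sum(xi * xi for xi in x)
    if theta0_known:
        return 1.0 / sxx
    return n / (n * sxx - sum(x) ** 2)

x = [0.5, 1.0, 1.5, 2.0]  # illustrative design points
assert crlb_theta1(x, False) >= crlb_theta1(x, True)  # unknown theta0 can only hurt

x_centered = [-1.0, -0.5, 0.5, 1.0]  # sum is 0: the two bounds coincide
assert crlb_theta1(x_centered, False) == crlb_theta1(x_centered, True)
```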
d) With both parameters unknown, find a simple necessary condition on a sequence of real numbers
x1 , x2 , . . . such that (θ̂0 , θ̂1 ) is consistent for (θ0 , θ1 ).