
Statistics 131 Worksheet 10

1. Let $X_1, \dots, X_n \overset{\text{rs}}{\sim} U(0, \theta)$, $\theta > 0$. Find unbiased estimators of $\theta$.

$\theta = (\theta)$, $\Omega_\theta = \{\theta : \theta > 0\}$, $\tau(\theta) = \theta$, $S_X = \{(x_1, \dots, x_n) : x_i \in (0,\theta)\ \forall i\}$

We have
\[
E(X_1) = \frac{\theta}{2}
\;\Rightarrow\;
E(\bar{X}) = \frac{\theta}{2}
\;\Rightarrow\;
T_1 = 2\bar{X} \text{ is unbiased for } \theta.
\]
Also, since $E[X_{(n)}] = \dfrac{n\theta}{n+1}$, then $T_2 = \dfrac{n+1}{n}\, X_{(n)}$ is unbiased for $\theta$. Observe that $T_3 = X_{(n)}$ is biased, with
\[
b(T_3) = \theta - E(T_3)
       = \theta - \frac{n\theta}{n+1}
       = \theta\left(1 - \frac{n}{n+1}\right)
       = \frac{\theta}{n+1} \;\to\; 0 \quad \text{as } n \to \infty,
\]
so that $T_3$ is asymptotically unbiased for $\theta$. ∎
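
As a quick numerical sanity check (an illustration, not part of the original solution), the Python sketch below simulates the three estimators; $\theta = 3$, $n = 10$, and the replication count are arbitrary choices. It reports $E(T) - \theta$, the negative of the bias $b(T)$ as defined above.

```python
# Monte Carlo sketch for Problem 1 (illustrative only).
# theta = 3, n = 10, and the replication count are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 3.0, 10, 200_000

x = rng.uniform(0.0, theta, size=(reps, n))
t1 = 2 * x.mean(axis=1)          # T1 = 2 * Xbar
t3 = x.max(axis=1)               # T3 = X_(n), the sample maximum
t2 = (n + 1) / n * t3            # T2 = (n+1)/n * X_(n)

print("E[T1] - theta ~", t1.mean() - theta)   # ~ 0
print("E[T2] - theta ~", t2.mean() - theta)   # ~ 0
print("E[T3] - theta ~", t3.mean() - theta)   # ~ -theta/(n+1)
print("-theta/(n+1) =", -theta / (n + 1))
```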

2. Let $X \sim \mathrm{Bin}(1, p)$, $p \in [0,1]$. Show that there exists one and only one unbiased estimator of $p$.

$\theta = (p)$, $\Omega_\theta = \{p : p \in [0,1]\}$, $\tau(\theta) = p$, $S_X = \{x : x \in \{0,1\}\}$

We have $E(X) = p$, so that $X$ is unbiased for $p$. Now, any estimator $T = T(X)$ of $p$ based on $X$ is of the form
\[
T = t_0 I_{\{0\}}(X) + t_1 I_{\{1\}}(X) =
\begin{cases}
t_0 & \text{if } X = 0\\
t_1 & \text{if } X = 1
\end{cases},
\]
so that
\[
E(T) = p(t_1 - t_0) + t_0 = p \ \text{ for all } p \in [0,1]
\;\Longleftrightarrow\;
t_0 = 0,\; t_1 = 1,
\]
by matching coefficients of the linear polynomial in $p$, so that $T \equiv X$. That is, $X$ is the only unbiased estimator of $p$, and hence is the UMVUE of $p$. Observe that $X$, while being the so-called “best” estimator, estimates $p$ as either $0$ (success is impossible) or $1$ (success is certain). ∎
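
The coefficient-matching step can also be checked symbolically. The sketch below (illustrative only, not part of the worksheet) uses SymPy to impose $E(T) = p$ identically in $p$ and recovers $t_0 = 0$, $t_1 = 1$.

```python
# Symbolic check for Problem 2 (illustrative only): any estimator T(X) is
# determined by (t0, t1) = (T(0), T(1)), and E[T] = (1 - p) t0 + p t1.
# Unbiasedness for every p in [0, 1] forces t0 = 0 and t1 = 1, i.e. T = X.
import sympy as sp

p, t0, t1 = sp.symbols("p t0 t1")
ET = (1 - p) * t0 + p * t1                  # E[T] as a function of p
eqs = sp.Poly(ET - p, p).all_coeffs()       # coefficients of E[T] - p as a polynomial in p
print(sp.solve(eqs, [t0, t1]))              # {t0: 0, t1: 1}
```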
3. Let $X_1, \dots, X_n \overset{\text{rs}}{\sim} \mathrm{Bin}(1, p)$, $p \in (0,1)$. Show that no unbiased estimator exists for $h(p) = \dfrac{p}{1-p}$, the odds.
$\theta = (p)$, $\Omega_\theta = \{p : p \in (0,1)\}$, $\tau(\theta) = \dfrac{p}{1-p}$, $S_X = \{(x_1, \dots, x_n) : x_i \in \{0,1\}\ \forall i\}$
Let T = T (X1 , · · · , Xn ) be an unbiased estimator of h(p). By LOTUS, we have
p ?
X
h(p) = = E(T ) = tj pnj (1 − p)n−nj ,
1−p t j

where nj is the number of Xi ’s equal to 1 such that T = tj . The above equation


p
implies that a polynomial in p (right-hand side) equals the ratio h(p) = (left-
1−p
hand side), which is not a polynomial in p. Therefore, no unbiased estimator exists
for h(p), and hence, no UMVUE exists for the odds. 
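
To see the polynomial obstruction concretely, the sketch below (illustrative only; $n = 3$ and the truncated plug-in estimator are arbitrary choices, not from the worksheet) computes $E(T)$ exactly by summing over $\{0,1\}^n$ and compares it with the odds near $p = 1$.

```python
# Illustration for Problem 3 (not part of the worksheet): for any estimator T,
# E[T] = sum over x in {0,1}^n of T(x) p^(sum x) (1-p)^(n - sum x), a polynomial
# in p, which stays bounded as p -> 1 while the odds p/(1-p) do not.
import itertools

n = 3

def candidate_T(x):
    # hypothetical truncated plug-in estimator of the odds (arbitrary choice)
    phat = sum(x) / n
    return phat / (1 - phat) if phat < 1 else float(n)

def exact_ET(p):
    # exact expectation of T by enumerating all 2^n Bernoulli outcomes
    return sum(candidate_T(x) * p ** sum(x) * (1 - p) ** (n - sum(x))
               for x in itertools.product([0, 1], repeat=n))

for p in (0.5, 0.9, 0.99):
    print(p, exact_ET(p), p / (1 - p))   # E[T] stays bounded; the odds blow up
```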

4. Let $X_1, \dots, X_n \overset{\text{rs}}{\sim} U(0, \theta)$, $\theta > 0$. Find a sufficient statistic for $\theta$.

$\theta = (\theta)$, $\Omega_\theta = \{\theta : \theta > 0\}$, $\tau(\theta) = \theta$, $S_X = \{(x_1, \dots, x_n) : x_i \in (0,\theta)\ \forall i\}$
\[
\prod_{i=1}^{n} f_X(x_i;\theta)
= \prod_{i=1}^{n} \frac{1}{\theta}\, I_{(0,\theta)}(x_i)
= \frac{1}{\theta^{n}}\, I_{\{(x_1,\dots,x_n)\,:\,x_i < \theta\ \forall i\}}(x_1, \dots, x_n)
= \underbrace{\frac{1}{\theta^{n}}\, I_{(0,\theta)}(y_n)}_{g(y_n;\,\theta)},
\qquad y_n = \max(x_1, \dots, x_n),
\]
with $h(x_1, \dots, x_n) \equiv 1$. Thus, by the factorization theorem, $Y_n = \max(X_1, \dots, X_n)$ is sufficient for $\theta$. ∎
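
One consequence of sufficiency that can be checked by simulation: conditional on $Y_n$, the rest of the sample carries no further information about $\theta$. The rough Monte Carlo sketch below is illustrative only; the $\theta$ values, the conditioning window, and the replication count are arbitrary choices.

```python
# Rough Monte Carlo sketch for Problem 4 (illustrative only): conditional on
# Y_n = max(X_1, ..., X_n) being near a fixed value, the remaining observations
# look like U(0, y_n) draws whatever theta is, so their mean is ~ y_n / 2.
import numpy as np

rng = np.random.default_rng(1)
n, reps, target = 5, 400_000, 2.0

for theta in (2.5, 4.0):
    x = rng.uniform(0.0, theta, size=(reps, n))
    yn = x.max(axis=1)
    keep = np.abs(yn - target) < 0.02                    # crude conditioning on Y_n ~ 2
    others = np.sort(x[keep], axis=1)[:, :-1].ravel()    # drop the maximum in each kept sample
    print(theta, others.mean(), target / 2)              # both means ~ 1 regardless of theta
```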

5. Let $X_1, \dots, X_n \overset{\text{rs}}{\sim} U(\theta - a,\, \theta + a)$, where $\theta \in \mathbb{R}$ and $a$ is known. Find jointly sufficient statistics for $\theta$.

$\theta = (\theta)$, $\Omega_\theta = \{\theta : \theta \in \mathbb{R}\}$, $\tau(\theta) = \theta$, $S_X = \{(x_1, \dots, x_n) : x_i \in (\theta - a, \theta + a)\ \forall i\}$
\[
\prod_{i=1}^{n} f_X(x_i;\theta)
= \prod_{i=1}^{n} \frac{1}{2a}\, I_{(\theta-a,\,\theta+a)}(x_i)
= \underbrace{\left(\frac{1}{2a}\right)^{\!n} I_{(\theta-a,\,y_n)}(y_1)\, I_{(\theta-a,\,\theta+a)}(y_n)}_{g(y_1,\,y_n;\,\theta)},
\]
with $h(x_1, \dots, x_n) \equiv 1$, $y_1 = \min(x_1, \dots, x_n)$, and $y_n = \max(x_1, \dots, x_n)$. Thus, by the factorization theorem, $S = (Y_1, Y_n)^{\top}$ is sufficient for $\theta$ (that is, $Y_1 = \min(X_1, \dots, X_n)$ and $Y_n = \max(X_1, \dots, X_n)$ are jointly sufficient for $\theta$). ∎
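
A companion sketch for this problem (illustrative only; $a = 1$, the $\theta$ values, and the conditioning windows are arbitrary choices): conditional on $(Y_1, Y_n)$, the interior observations should look like $U(y_1, y_n)$ draws regardless of $\theta$, which is the intuition behind joint sufficiency of the pair.

```python
# Rough Monte Carlo sketch for Problem 5 (illustrative only): conditioning on
# Y_1 ~ -0.5 and Y_n ~ 0.5 leaves the interior observations looking like
# U(-0.5, 0.5) draws (mean ~ 0) for either theta tried.
import numpy as np

rng = np.random.default_rng(2)
n, reps, a = 5, 400_000, 1.0

for theta in (0.0, 0.3):
    x = rng.uniform(theta - a, theta + a, size=(reps, n))
    y1, yn = x.min(axis=1), x.max(axis=1)
    keep = (np.abs(y1 + 0.5) < 0.03) & (np.abs(yn - 0.5) < 0.03)   # crude conditioning
    interior = np.sort(x[keep], axis=1)[:, 1:-1].ravel()           # drop the min and the max
    print(theta, interior.mean())                                  # ~ 0 for both thetas
```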
