L(x, y|θ) := f(x|θ)/f(y|θ) = e^{−(1/2) Σ_{l=1}^n (x_l² − y_l²) + θn(x̄ − ȳ)}.
We see that L(x, y|θ) ≡ K(x, y) for all θ iff x̄ = ȳ. Thus T = X̄ is the
minimal sufficient statistic (MSS).
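As a quick numerical sanity check of this criterion (a sketch only: the helper name and the sample values below are illustrative, not from the text), one can verify that the log of the ratio above is constant in θ exactly when x̄ = ȳ:

```python
def log_ratio(x, y, theta):
    """log L(x, y | theta) for the N(theta, 1) model."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    return (-0.5 * (sum(v * v for v in x) - sum(v * v for v in y))
            + theta * n * (xbar - ybar))

x = [1.0, 2.0, 3.0]
y = [0.5, 2.5, 3.0]   # same mean as x
z = [0.0, 1.0, 2.0]   # smaller mean

# Constant in theta when the means agree, varying otherwise.
vals_same = {round(log_ratio(x, y, t), 10) for t in (-2.0, 0.0, 5.0)}
vals_diff = {round(log_ratio(x, z, t), 10) for t in (-2.0, 0.0, 5.0)}
print(len(vals_same), len(vals_diff))   # 1 distinct value vs. 3
```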
b) Here f(x|θ) = e^{−(x−θ)}I(x > θ) and θ ∈ (−∞, ∞). Write,
L(x, y|θ) := f(x|θ)/f(y|θ) = [e^{−Σ_{l=1}^n x_l} I(x(1) > θ)] / [e^{−Σ_{l=1}^n y_l} I(y(1) > θ)].    (1)
(i) Suppose that L(x, y|θ) ≡ K(x, y) for all θ ∈ (−∞, ∞). This yields that x(1) = y(1).
(ii) Suppose that x(1) = y(1) . Then (1) implies that L(x, y|θ) does not depend on θ.
The properties (i) and (ii) establish, according to Theorem 6.2.13, that T = X(1) is the
minimal sufficient statistic.
e) Here f(x|θ) = (1/2)e^{−|x−θ|}, x ∈ (−∞, ∞), for θ ∈ (−∞, ∞). Write for the ordered
observations:
L(x, y|θ) := f(x|θ)/f(y|θ)
           = e^{−Σ_{l: x(l)≤θ} (θ−x(l)) − Σ_{l: x(l)>θ} (x(l)−θ)} / e^{−Σ_{l: y(l)≤θ} (θ−y(l)) − Σ_{l: y(l)>θ} (y(l)−θ)}.
Now note that if at least one pair of ordered observations is such that x(l) ≠ y(l), then
L(x, y|θ) depends on θ. Observing this, Theorem 6.2.13 and some straightforward steps
(similar to the previous case) show that T = (X(1), . . . , X(n)) is the MSS.
2. Exerc. 6.11. (a) All the considered distributions (random variables) are from a
so-called location family where
X =ᴰ Z + θ (equality in distribution), Z ∼ fX(x|θ = 0). (2)
You can also think about (2) in the following way: there is a random variable Z with
distribution corresponding to θ = 0, and then any other RV from the same distribution with
parameter θ is generated as shown in (2).
Let us prove (2) using the moment generating function approach (another approach is to
calculate fX (x|θ) using our techniques in Chapter 2). Recall that MX (t) = E(etX ) and X
has the same distribution as Y iff MX (t) ≡ MY (t) for all t in some vicinity of t = 0, say
t ∈ (−δ, δ), δ > 0.
Thus, to check (2), we need to show that
MX(t) = e^{tθ}MZ(t), t ∈ (−δ, δ), (3)
for the case of a location family f(x|θ) =: ψ(x − θ). Here fZ(z|θ) = ψ(z) is the pdf of Z.
To prove (3) we write
MX(t) = E(e^{tX}) = ∫_{−∞}^{∞} ψ(x − θ)e^{tx} dx = e^{tθ} ∫_{−∞}^{∞} ψ(z)e^{tz} dz = e^{tθ}MZ(t),
where we used the substitution z = x − θ. This establishes (3).
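The identity (3) can also be checked numerically. A minimal sketch (assuming a standard normal Z, though any Z would do): since e^{t(z+θ)} = e^{tθ}e^{tz} term by term, the empirical MGF of the shifted sample satisfies (3) up to floating-point error.

```python
import math
import random

random.seed(0)
theta, t = 1.5, 0.3
z = [random.gauss(0.0, 1.0) for _ in range(10_000)]
x = [zi + theta for zi in z]             # X = Z + theta, as in (2)

mx = sum(math.exp(t * xi) for xi in x) / len(x)   # empirical MX(t)
mz = sum(math.exp(t * zi) for zi in z) / len(z)   # empirical MZ(t)

# (3): MX(t) = e^{t*theta} * MZ(t)
print(abs(mx - math.exp(t * theta) * mz) < 1e-9)  # True
```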
(i) For the case (a) - normal distribution - the MSS is X̄, and it is complete because
normal distribution belongs to an exponential family. Indeed, we can write
f(x|θ) = C(θ)h(x)e^{θnx̄}
and then use Theorem 6.2.25. Then Basu’s Theorem yields that X̄ and Yi are independent.
(ii) For the case (b) - location exponential family - the MSS is X(1). Is it complete? Let
us check. Suppose that for a function g
Eθ{g(X(1))} ≡ 0, θ ∈ (−∞, ∞). (4)
This is equivalent to
∫_θ^∞ fX(1)(t|θ)g(t)dt ≡ 0, θ ∈ (−∞, ∞). (5)
Well, now is the time to remember the density for X(1). While I know that you remember the
formula, let us one more time deduce it. We begin with the corresponding cdf (cumulative
distribution function):
FX(1)(t|θ) = 1 − P(X(1) > t) = 1 − [P(X1 > t)]^n = 1 − e^{−n(t−θ)}, t > θ,
so that fX(1)(t|θ) = ne^{−n(t−θ)}I(t > θ). Plugging this density into (5) yields
ne^{nθ} ∫_θ^∞ e^{−nt}g(t)dt ≡ 0, that is, q(θ) := ∫_θ^∞ e^{−nt}g(t)dt ≡ 0.
From here we get dq(θ)/dθ ≡ 0 as well. In its turn, this yields (take the derivative)
dq(θ)/dθ = −e^{−nθ}g(θ) ≡ 0,
which finally yields g(θ) ≡ 0 for all θ ∈ (−∞, ∞). We established that X(1) is complete!
Conclusion: By Basu’s Theorem the CMSS X(1) is independent of the ancillary statistic
Y := {X(n) − X(i) , i = 1, 2, . . . , n − 1}.
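A small simulation illustrates this independence (a sketch; the sample size, seed, and the choice of the single coordinate X(n) − X(1) of Y are arbitrary): the sample correlation between X(1) and the range should be near zero.

```python
import random

random.seed(1)
theta, n, reps = 2.0, 5, 20_000
mins, ranges = [], []
for _ in range(reps):
    x = sorted(theta + random.expovariate(1.0) for _ in range(n))
    mins.append(x[0])            # the CMSS X_(1)
    ranges.append(x[-1] - x[0])  # one coordinate of the ancillary Y

def corr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = sum((u - ma) ** 2 for u in a)
    vb = sum((v - mb) ** 2 for v in b)
    return cov / (va * vb) ** 0.5

print(abs(corr(mins, ranges)) < 0.05)  # True: no linear association
```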
REMARK: Do you see that this exponential distribution is not from the exponential
family? This is the reason for the direct proof!
(iii) For the double-exponential distribution the statistic T = (X(1) , X(2) , . . . , X(n) ) is
MSS. Clearly it is not independent of Y := {X(n) − X(i) , i = 1, 2, . . . , n − 1} because if you
know T then you know Y .
3. Exerc. 6.19. Let us begin with distribution 1. Let ψ(X) be such that Ep(ψ(X)) ≡ 0
for all p ∈ (0, 1/4). This means that
pψ(0) + 3pψ(1) + (1 − 4p)ψ(2) ≡ 0, p ∈ (0, 1/4),
or equivalently
p[ψ(0) + 3ψ(1) − 4ψ(2)] ≡ −ψ(2), p ∈ (0, 1/4).
This is possible for ψ(2) = 0 and ψ(0) = −3ψ(1) ≠ 0. We conclude that ψ(k) is not
necessarily equal to 0, so the family of distributions of X is not complete.
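This counterexample is easy to check numerically. A sketch, reading the probabilities P(X = 0) = p, P(X = 1) = 3p, P(X = 2) = 1 − 4p off the identity above and taking the illustrative choice ψ(0) = 3, ψ(1) = −1, ψ(2) = 0:

```python
psi = {0: 3.0, 1: -1.0, 2: 0.0}   # psi(0) = -3*psi(1), psi(2) = 0

def expect_psi(p):
    # E_p psi(X) with P(X=0)=p, P(X=1)=3p, P(X=2)=1-4p
    return p * psi[0] + 3 * p * psi[1] + (1 - 4 * p) * psi[2]

checks = [abs(expect_psi(p)) < 1e-12 for p in (0.01, 0.1, 0.2, 0.24)]
print(all(checks))  # True: a nonzero psi with zero expectation for all p
```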
For the second distribution, similarly assume that Ep {ψ(X)} ≡ 0 for p ∈ (0, 1/2). This
yields
pψ(0) + p²ψ(1) + (1 − p − p²)ψ(2) ≡ 0, p ∈ (0, 1/2).
The last relation yields
ψ(2) + p[ψ(0) − ψ(2)] + p²[ψ(1) − ψ(2)] ≡ 0, p ∈ (0, 1/2).
A polynomial is identically equal to zero on an interval iff all its coefficients are zero. This
yields that ψ(k) ≡ 0 for k = 0, 1, 2.
Conclusion: The distribution is complete.
4. Exerc. 6.20.
(b) We have
f(x|θ) = θ/(1 + x)^{1+θ}, x ∈ (0, ∞), θ > 0.
Note that here Ω = (0, ∞). Write,
f(x|θ) = θ^n ∏_{l=1}^n (1 + x_l)^{−(1+θ)} = θ^n e^{−(1+θ) Σ_{l=1}^n ln(1+x_l)}.
We see that the distribution is from an exponential family and Ω := (0, ∞) contains an open
set, say (1, 2). Thus, by Theorem 6.2.25 the statistic T := Σ_{l=1}^n ln(1 + X_l) is CSS.
(c) Here
f(x|θ) = [log(θ)/(θ − 1)]θ^x, x ∈ (0, 1), θ ∈ Ω := (1, ∞).
Write,
f(x|θ) = [log(θ)/(θ − 1)]^n ∏_{l=1}^n θ^{x_l} = [log(θ)/(θ − 1)]^n e^{(log θ) Σ_{l=1}^n x_l}.
This is the distribution from an exponential family and Ω contains an open set, say (2, 5).
Thus, by Theorem 6.2.25 the statistic T = Σ_{l=1}^n X_l (or X̄) is CSS.
For the next case - the Binomial(2, θ) distribution - write,
f(x|θ) = ∏_{l=1}^n (2 choose x_l) θ^{x_l}(1 − θ)^{2−x_l} = [∏_{l=1}^n (2 choose x_l)](1 − θ)^{2n} e^{ln(θ/(1−θ)) Σ_{l=1}^n x_l}.
This is again the distribution from an exponential class with Ω containing an open interval.
Thus X̄ is CSS by Theorem 6.2.25.
5. Exerc. 6.21. X is distributed according to
P(X = −1) = θ/2, P(X = 0) = 1 − θ, P(X = 1) = θ/2, θ ∈ [0, 1].
(a) X is sufficient (of course - this is the observation) but not complete. Let us show this.
Suppose that Eθ(ψ(X)) ≡ 0, θ ∈ [0, 1]. This is equivalent to
(θ/2)ψ(−1) + (1 − θ)ψ(0) + (θ/2)ψ(1) ≡ 0, θ ∈ [0, 1],
or
θ[(1/2)ψ(−1) − ψ(0) + (1/2)ψ(1)] + ψ(0) ≡ 0, θ ∈ [0, 1].
A polynomial is zero over an interval iff all its coefficients are zero. This implies that
ψ(0) = 0,
(1/2)[ψ(−1) + ψ(1)] − ψ(0) = 0.
Solving this system yields ψ(0) = 0 and ψ(−1) = −ψ(1). As a result, ψ(k) is not necessarily
identically zero. For instance, one may choose ψ(−1) = 5, ψ(0) = 0 and ψ(1) = −5.
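The choice ψ(−1) = 5, ψ(0) = 0, ψ(1) = −5 can be verified directly (a sketch, taking P(X = −1) = P(X = 1) = θ/2 and P(X = 0) = 1 − θ, as implied by the identity above):

```python
psi = {-1: 5.0, 0: 0.0, 1: -5.0}

def expect_psi(theta):
    # E_theta psi(X) with P(X=-1)=theta/2, P(X=0)=1-theta, P(X=1)=theta/2
    return (theta / 2) * psi[-1] + (1 - theta) * psi[0] + (theta / 2) * psi[1]

checks = [abs(expect_psi(t)) < 1e-12 for t in (0.0, 0.3, 0.7, 1.0)]
print(all(checks))  # True: zero expectation for every theta
```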
b), c) Let us begin with c). Write
f(x|θ) = (1 − θ)e^{|x| ln(θ/(2(1−θ)))}, x ∈ {−1, 0, 1}.
This is the distribution from an exponential class. Then by Theorem 6.2.25, the statistic |X|
is CSS. Clearly it is minimal as well (what can you reduce here?), or if unclear use Theorem
6.2.28.
Now, the last identity implies that ψ(µ) ≡ 0. To prove this, note that we have
∫_µ^∞ e^{−nx}ψ(x)dx ≡ 0
and then take the derivative of the integral with respect to µ. Thus X(1) is complete.
(b) Using notation in the text, S² := [1/(n − 1)] Σ_{l=1}^n (X_l − X̄)².
We are dealing with the location parameter here, so we can again use the same trick as
before. Set X = Z + µ where Z is distributed with pdf fZ(z) = e^{−z}I(z > 0). Then
S² = (n − 1)^{−1} Σ_{l=1}^n (Z_l + µ − (Z̄ + µ))² = (n − 1)^{−1} Σ_{l=1}^n (Z_l − Z̄)².
This shows us that S 2 is ancillary with respect to µ. Then Basu’s Theorem implies that X(1)
and S 2 are independent.
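This conclusion, too, can be illustrated by simulation (a sketch; the values of µ, n, and the seed are arbitrary choices): the sample correlation between X(1) and S² should be near zero.

```python
import random

random.seed(2)
mu, n, reps = 1.0, 5, 20_000
mins, s2s = [], []
for _ in range(reps):
    x = [mu + random.expovariate(1.0) for _ in range(n)]
    xbar = sum(x) / n
    mins.append(min(x))                                      # the CMSS X_(1)
    s2s.append(sum((xi - xbar) ** 2 for xi in x) / (n - 1))  # ancillary S^2

def corr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = sum((u - ma) ** 2 for u in a)
    vb = sum((v - mb) ** 2 for v in b)
    return cov / (va * vb) ** 0.5

print(abs(corr(mins, s2s)) < 0.05)  # True: no linear association
```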