
Mathematical Statistics (MA212M)

Lecture Slides
Lecture 25
Sufficient Statistics
Def: A statistic T = T(X) is called a sufficient statistic for the unknown
parameter θ if the conditional distribution of X given T = t does not
depend on θ, for all t in the support of T.
Example 117: Let X1, X2, ..., Xn be i.i.d. Bernoulli(p), p ∈ (0, 1). Take
T = Σ_{i=1}^n Xi. We know that T ~ Bin(n, p). Now for t = 0, 1, ..., n,

P(X1 = x1, ..., Xn = xn | T = t) = P(X1 = x1, ..., Xn = xn, T = t) / P(T = t)

  = { P(X1 = x1, ..., Xn = xn) / P(T = t)   if Σ_{i=1}^n xi = t
    { 0                                     otherwise

  = { [Π_{i=1}^n p^{xi} (1 − p)^{1−xi}] / [C(n, t) p^t (1 − p)^{n−t}]   if Σ_{i=1}^n xi = t
    { 0                                                                 otherwise

  = { 1/C(n, t)   if Σ_{i=1}^n xi = t
    { 0           otherwise,

where C(n, t) is the binomial coefficient. This does not depend on p.
Hence T = Σ_{i=1}^n Xi is a sufficient statistic for p.
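The computation above is easy to verify numerically. A minimal sketch (function names are my own), evaluating the conditional probability from the joint PMF and checking that it equals 1/C(n, t) for two different values of p:

```python
from itertools import product
from math import comb, isclose

def conditional_prob(x, t, p):
    """P(X1=x1, ..., Xn=xn | T=t) for i.i.d. Bernoulli(p), from the joint PMF."""
    n = len(x)
    if sum(x) != t:
        return 0.0
    joint = p ** sum(x) * (1 - p) ** (n - sum(x))   # P(X1=x1, ..., Xn=xn)
    p_t = comb(n, t) * p ** t * (1 - p) ** (n - t)  # P(T=t), since T ~ Bin(n, p)
    return joint / p_t

n, t = 4, 2
for x in product([0, 1], repeat=n):
    if sum(x) == t:
        # the conditional probability is 1/C(n, t) for every p
        assert isclose(conditional_prob(x, t, 0.3), 1 / comb(n, t))
        assert isclose(conditional_prob(x, t, 0.8), 1 / comb(n, t))
print("conditional distribution of X given T = t is free of p")
```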
Neyman-Fisher Factorization Theorem

Theorem (without proof): Let X1, ..., Xn be a RS with JPMF/JPDF
f_X(x, θ), θ ∈ Θ. Then T = T(X1, ..., Xn) is sufficient for θ if and
only if

f_X(x, θ) = h(x) g_θ(T(x)),

where h(x) does not involve θ, and g_θ(·) depends on θ and on x only
through T(x).
Examples

Example 118: Let X1, X2, ..., Xn be i.i.d. P(λ), λ > 0. Here the
likelihood function is

L(λ, x) = e^{−nλ} λ^{n x̄} / Π_{i=1}^n (xi!) = h(x) g_λ(T(x)),

where h(x) = [Π_{i=1}^n (xi!)]^{−1}, g_λ(t) = e^{−nλ} λ^{nt}, and T(x) = x̄. This
shows that T = X̄ is a sufficient statistic for λ.
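The Poisson factorization can be checked numerically; a short sketch (names are illustrative) comparing the likelihood with h(x)·g_λ(x̄) over several values of λ:

```python
from math import exp, factorial, isclose, prod

def likelihood(lam, x):
    """Joint Poisson PMF: product of e^{-lam} lam^{x_i} / x_i!."""
    return prod(exp(-lam) * lam ** xi / factorial(xi) for xi in x)

def h(x):
    return 1 / prod(factorial(xi) for xi in x)

def g(lam, t, n):
    """g_lam depends on the data only through t = xbar."""
    return exp(-n * lam) * lam ** (n * t)

x = [2, 0, 3, 1, 4]
n, xbar = len(x), sum(x) / len(x)
for lam in (0.5, 1.7, 3.2):
    assert isclose(likelihood(lam, x), h(x) * g(lam, xbar, n))
print("L(lam, x) = h(x) * g_lam(xbar) for all lam tested")
```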
Example 119: Let X1, X2, ..., Xn be i.i.d. N(µ, σ²), µ ∈ R and σ > 0.
A sufficient statistic for (µ, σ²) is (Σ_{i=1}^n Xi, Σ_{i=1}^n Xi²).
Example 120: Let X1, X2, ..., Xn be i.i.d. U(0, θ), θ > 0. Here the
likelihood function is

L(θ, x) = (1/θⁿ) I_(0, ∞)(x_(1)) I_(0, θ)(x_(n)) = h(x) g_θ(T(x)),

where h(x) = I_(0, ∞)(x_(1)), g_θ(t) = (1/θⁿ) I_(0, θ)(t), and T(x) = x_(n). Hence
T = X_(n) is a sufficient statistic for θ.
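Here too the factorization can be verified on a small data set; a sketch (names are my own) checking that the likelihood splits into a θ-free factor and a factor that sees the data only through x_(n):

```python
def likelihood(theta, x):
    """U(0, theta) likelihood: theta^{-n} when every observation lies in (0, theta)."""
    return theta ** (-len(x)) if all(0 < xi < theta for xi in x) else 0.0

def h(x):
    return 1.0 if min(x) > 0 else 0.0                        # I_(0, inf)(x_(1))

def g(theta, t, n):
    return theta ** (-n) * (1.0 if 0 < t < theta else 0.0)   # depends on x only via t = x_(n)

x = [0.8, 2.3, 1.1, 3.6]
for theta in (2.0, 3.6, 5.0):
    assert likelihood(theta, x) == h(x) * g(theta, max(x), len(x))
print("factorization holds; T = X_(n) is sufficient")
```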
Example 121: Let X1, X2, ..., Xn be i.i.d. U(θ − 1/2, θ + 1/2), θ ∈ R.
Here the likelihood function is

L(θ, x) = h(x) g_θ(T(x)),

where h(x) = 1, g_θ(t1, t2) = I_(θ−1/2, θ+1/2)(t1) I_(θ−1/2, θ+1/2)(t2), and
T(x) = (x_(1), x_(n)). Hence T = (X_(1), X_(n)) is sufficient for θ.
Remarks

- We can use the definition of a sufficient statistic only if we can
  guess one. The theorem, however, gives a necessary and sufficient
  condition, which can be used to find a sufficient statistic.
- The RS is always sufficient for the unknown parameters. However, in
  most cases we will not talk about this trivial sufficient statistic,
  as it does not provide any dimension reduction.
- Sometimes the RS is the only sufficient statistic. For example,
  consider a Cauchy(0, θ) distribution. (The proof is not needed; this
  is only for information.)
- If T is sufficient for θ, then any one-to-one function of T is also
  sufficient for θ. (This can be proved easily using the Factorization
  Theorem.) For example, (X̄, S²) is sufficient for the parameters of
  N(µ, σ²), where S² = (1/(n−1)) Σ_{i=1}^n (Xi − X̄)².
- Not every function of a sufficient statistic is sufficient. (If that
  were the case, every statistic would be sufficient.)
- A one-dimensional parameter may have a multidimensional sufficient
  statistic. (Consider the last example.)
- If T and θ have the same dimension and T is sufficient for θ, it does
  not follow that the jth component of T is sufficient for the jth
  component of θ. It only says that T is jointly sufficient for θ.
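The first remark above can be made concrete for the normal case: (Σ Xi, Σ Xi²) and (X̄, S²) determine each other (for fixed n), so each is sufficient exactly when the other is. A sketch of the two maps (helper names are my own):

```python
def to_mean_var(s1, s2, n):
    """(sum Xi, sum Xi^2) -> (xbar, S^2); one-to-one for fixed n."""
    xbar = s1 / n
    s_sq = (s2 - n * xbar ** 2) / (n - 1)
    return xbar, s_sq

def to_sums(xbar, s_sq, n):
    """Inverse map: (xbar, S^2) -> (sum Xi, sum Xi^2)."""
    return n * xbar, (n - 1) * s_sq + n * xbar ** 2

x = [1.2, 3.4, 2.2, 5.0, 0.7]
n = len(x)
s1, s2 = sum(x), sum(xi ** 2 for xi in x)
xbar, s_sq = to_mean_var(s1, s2, n)
# round-trip recovers the original pair, so the map is invertible
assert all(abs(a - b) < 1e-9 for a, b in zip(to_sums(xbar, s_sq, n), (s1, s2)))
print("the two statistics determine each other")
```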
Relation between MLE and Sufficient Statistics

Theorem: Let T be a sufficient statistic for θ. If a unique MLE exists
for θ, then it is a function of T. If an MLE of θ exists but is not
unique, then one can find an MLE that is a function of T.
Proof: Easy using the Factorization Theorem.

Example 122: Let X1, X2, ..., Xn be i.i.d. U(0, θ), θ > 0. We know that
the MLE is unique and equals X_(n), which is also sufficient.

Example 123: Let X1, X2, ..., Xn be i.i.d. U(θ − 1/2, θ + 1/2), θ ∈ R.
Here a sufficient statistic is T = (X_(1), X_(n)). Also, the MLE is not
unique, and any point in the interval [X_(n) − 1/2, X_(1) + 1/2] is an MLE
of θ. Hence (1/2)(X_(1) + X_(n)) is an MLE, and it is also a function of
T. On the other hand, Q = sin²(X1)(X_(n) − 1/2) + (1 − sin²(X1))(X_(1) + 1/2)
is an MLE but not a function of T only.
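Example 123 is easy to simulate; a sketch (the value of θ is chosen arbitrarily for the simulation) checking that both the midrange and Q land in the interval of MLEs:

```python
import random
from math import sin

random.seed(0)
theta = 4.2   # hypothetical "true" value, used only to generate data
x = [random.uniform(theta - 0.5, theta + 0.5) for _ in range(50)]
lo, hi = min(x), max(x)   # T = (X_(1), X_(n))

# The likelihood is 1 on [X_(n) - 1/2, X_(1) + 1/2] and 0 elsewhere,
# so every point of that interval is an MLE; it is nonempty since hi - lo < 1.
assert hi - 0.5 <= lo + 0.5

midrange = 0.5 * (lo + hi)        # an MLE that is a function of T
w = sin(x[0]) ** 2                # weight in [0, 1] depending on X1, not just on T
Q = w * (hi - 0.5) + (1 - w) * (lo + 0.5)

for est in (midrange, Q):
    assert hi - 0.5 <= est <= lo + 0.5   # both lie in the interval of MLEs
print(f"MLE interval = [{hi - 0.5:.3f}, {lo + 0.5:.3f}]")
```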
