
Theorem: [Rao-Blackwell Theorem]: Let $X$ and $Y$ be random variables such that $E(Y) = \mu$ and $\operatorname{Var}(Y) = \sigma_Y^2 > 0$. Let $E(Y \mid X = x) = \phi(x)$. Then (i) $E[\phi(X)] = \mu$ and (ii) $\operatorname{Var}[\phi(X)] \le \operatorname{Var}(Y)$.
Proof: We prove the theorem for the continuous case only; the discrete case can be proved in the same way, with the integral signs replaced by summation signs.
Let $f(x, y)$ be the joint pdf of the random variables $X$ and $Y$, $f_1(x)$, $f_2(y)$ the marginal pdfs, and $h(y \mid x)$ the conditional pdf of $Y$ given $X = x$, so that
$$h(y \mid x) = \frac{f(x, y)}{f_1(x)}$$
$$E(Y \mid X = x) = \int_{-\infty}^{\infty} y\, h(y \mid x)\, dy = \int_{-\infty}^{\infty} y\, \frac{f(x, y)}{f_1(x)}\, dy = \frac{1}{f_1(x)} \int_{-\infty}^{\infty} y\, f(x, y)\, dy = \phi(x) \tag{2}$$
$$\Rightarrow \int_{-\infty}^{\infty} y\, f(x, y)\, dy = \phi(x)\, f_1(x)$$

Since the conditional distribution of $(Y \mid X = x)$ in (2) does not depend on the parameter $\mu$, $X$ is a sufficient statistic for $\mu$. Also, by the law of total expectation,
$$E[\phi(X)] = E[E(Y \mid X)] = E(Y) = \mu,$$
which proves (i).
Consider
$$\operatorname{Var}(Y) = E[Y - E(Y)]^2 = E(Y - \mu)^2 = E[(Y - \phi(X)) + (\phi(X) - \mu)]^2$$
$$= E[Y - \phi(X)]^2 + E[\phi(X) - \mu]^2 + 2E[(Y - \phi(X))(\phi(X) - \mu)] \tag{3}$$
Now,
$$E[(Y - \phi(X))(\phi(X) - \mu)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (y - \phi(x))(\phi(x) - \mu)\, f(x, y)\, dx\, dy$$
$$= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (y - \phi(x))(\phi(x) - \mu)\, f_1(x)\, h(y \mid x)\, dx\, dy$$
$$= \int_{-\infty}^{\infty} (\phi(x) - \mu)\, f_1(x) \left[ \int_{-\infty}^{\infty} (y - \phi(x))\, h(y \mid x)\, dy \right] dx = 0,$$
since by (2) the inner integral equals $\phi(x) - \phi(x) = 0$.

Putting this value in (3), we get
$$\operatorname{Var}(Y) = E[Y - \phi(X)]^2 + E[\phi(X) - \mu]^2 = E[Y - \phi(X)]^2 + \operatorname{Var}[\phi(X)]$$
Since $E[Y - \phi(X)]^2 \ge 0$, it follows that
$$\operatorname{Var}(Y) \ge \operatorname{Var}[\phi(X)],$$
which proves (ii).
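The variance reduction in (ii) is easy to see numerically. The sketch below (not part of the original proof; NumPy-based, with illustrative parameter values) takes a Poisson($\theta$) sample, uses the crude unbiased estimator $Y = X_1$, and conditions on the sufficient statistic $\sum X_i$, for which $E(X_1 \mid \sum X_i) = \bar{X}$ by symmetry:

```python
import numpy as np

# Monte Carlo illustration of the Rao-Blackwell theorem for a Poisson sample:
# Y = X_1 is unbiased for theta; phi(X) = E(X_1 | sum X_i) = X-bar by symmetry.
rng = np.random.default_rng(0)
theta, n, reps = 3.0, 10, 100_000

samples = rng.poisson(theta, size=(reps, n))
y = samples[:, 0]            # crude unbiased estimator Y = X_1
phi = samples.mean(axis=1)   # Rao-Blackwellized estimator phi(X) = X-bar

print(y.mean(), phi.mean())  # both approx. theta = 3.0: unbiasedness is preserved
print(y.var(), phi.var())    # approx. theta = 3.0 vs. theta/n = 0.3: variance shrinks
```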
Theorem 4.5: [Lehmann-Scheffé Theorem]: Let $X_1, X_2, \dots, X_n$ be a random sample of size $n$ from a distribution with p.d.f. $f(x, \theta)$, $\theta \in \Theta$. Let $T = T(X_1, X_2, \dots, X_n)$ be a sufficient statistic for $\theta$ such that the family $F = \{g(t, \theta) : \theta \in \Theta\}$ of p.d.f.'s of $T$ is complete. If there exists a function $\phi(T)$ of $T$ such that $E\{\phi(T)\} = \theta$, then $\phi(T)$ is the unique best estimator of $\theta$ (best in the sense that $\phi(T)$ has minimum variance among unbiased estimators).
Proof: Suppose, if possible, that $\lambda(T)$ is another function of $T$ such that $\lambda(T) \ne \phi(T)$ and $E\{\lambda(T)\} = \theta = E\{\phi(T)\}$, so that $E\{\lambda(T) - \phi(T)\} = 0$ for all $\theta \in \Theta$. Let $\phi_1(T) = \lambda(T) - \phi(T)$; then $E\{\phi_1(T)\} = 0$. Since the family $F = \{g(t, \theta) : \theta \in \Theta\}$ of p.d.f.'s of $T$ is complete,
$$E\{\phi_1(T)\} = 0 \Rightarrow \phi_1(T) = 0 \Rightarrow \lambda(T) - \phi(T) = 0 \Rightarrow \lambda(T) = \phi(T),$$
which is a contradiction. Hence $\phi(T)$ is unique. By the Rao-Blackwell theorem, $\phi(T)$ has variance no larger than that of any other unbiased estimator of $\theta$.
Remark:
1. Since $\phi(T)$ is unique and has the smallest variance among all unbiased estimators of $\theta$, $\phi(T)$ is known as the Unique Minimum Variance Unbiased Estimator (UMVUE) of $\theta$.
2. If $g(\theta)$ is a function of $\theta$ and $\lambda(T)$ is such that $E\{\lambda(T)\} = g(\theta)$, then $\lambda(T)$ is known as the Unique Minimum Variance Unbiased Estimator (UMVUE) of $g(\theta)$.

Example: Let $X_1, X_2, \dots, X_n$ be a random sample of size $n$ from the Poisson distribution with parameter $\theta$. Obtain the UMVUE for (i) $\theta$ (ii) $\theta^2$ (iii) $P(X = 0) = e^{-\theta}$ (iv) $e^{-3\theta}$.
Solution: (i) The p.m.f. of $X$ is given by
$$f(x, \theta) = \begin{cases} \dfrac{e^{-\theta} \theta^x}{x!}, & \text{if } x = 0, 1, 2, 3, \dots;\ \theta > 0 \\ 0, & \text{otherwise} \end{cases}$$
In this case, the p.m.f. can be written in the form
$$f(x, \theta) = \exp[Q(\theta) T(x) + S(x) + K(\theta)], \quad x = 0, 1, 2, 3, \dots;\ \theta > 0,$$
with $Q(\theta) = \ln \theta$, $T(x) = x$, $S(x) = -\ln x!$, $K(\theta) = -\theta$.
Hence $T = \sum_{i=1}^{n} X_i$ is a sufficient statistic for $\theta$. Since $T \sim P(n\theta)$ and the family $F = \{g(t, \theta) : \theta > 0\}$ of p.m.f.'s of $T$ is complete, $T$ is a complete sufficient statistic for $\theta$. Since $E(T) = n\theta$, we have $E\left(\frac{T}{n}\right) = \theta$. Hence, by virtue of the Rao-Blackwell and Lehmann-Scheffé theorems (RBLST), $\phi(T) = \frac{T}{n} = \bar{X}$ is the UMVUE for $\theta$.

(ii) Since $\operatorname{Var}(T) = n\theta$, we have
$$E(T^2) - [E(T)]^2 = n\theta$$
$$\Rightarrow E(T^2) - n^2\theta^2 = n\theta$$
$$\Rightarrow E(T^2) - n\theta = n^2\theta^2$$
$$\Rightarrow E(T^2) - E(T) = n^2\theta^2$$
$$\Rightarrow E\left[\frac{T^2 - T}{n^2}\right] = \theta^2$$
So, taking $\phi(T) = \dfrac{T^2 - T}{n^2}$, by virtue of RBLST, $\phi(T)$ is the UMVUE for $\theta^2$.
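As a quick numerical sanity check (a sketch with illustrative values, not part of the solution), the estimators from parts (i) and (ii) can both be verified to be unbiased:

```python
import numpy as np

# Check that T/n and (T**2 - T)/n**2 are unbiased for theta and theta**2.
rng = np.random.default_rng(1)
theta, n, reps = 2.0, 8, 200_000

t = rng.poisson(theta, size=(reps, n)).sum(axis=1)  # T = sum X_i ~ Poisson(n*theta)
print((t / n).mean())              # approx. theta   = 2.0
print(((t**2 - t) / n**2).mean())  # approx. theta^2 = 4.0
```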

(iii) Let $U(X_1)$ be a random variable defined as follows:
$$U(X_1) = \begin{cases} 1, & \text{if } X_1 = 0 \\ 0, & \text{otherwise} \end{cases}$$
Then $E[U(X_1)] = 1 \cdot P\{U(X_1) = 1\} + 0 \cdot P\{U(X_1) = 0\} = P(X_1 = 0) = e^{-\theta}$.
Hence $U(X_1)$ is an unbiased estimator of $e^{-\theta}$. By virtue of RBLST, $\phi(T) = E\{U(X_1) \mid T\}$ is the UMVUE for $e^{-\theta}$.
Now,
$$\phi(t) = E\{U(X_1) \mid T = t\} = 1 \cdot P\{U(X_1) = 1 \mid T = t\} + 0 \cdot P\{U(X_1) = 0 \mid T = t\}$$
$$= P\{U(X_1) = 1 \mid T = t\} = \frac{P\{X_1 = 0, \sum_{i=2}^{n} X_i = t\}}{P\{\sum_{i=1}^{n} X_i = t\}} = \frac{P\{X_1 = 0\}\, P\{\sum_{i=2}^{n} X_i = t\}}{P\{\sum_{i=1}^{n} X_i = t\}}$$
$$= \frac{e^{-\theta} \cdot e^{-(n-1)\theta} [(n-1)\theta]^t / t!}{e^{-n\theta} (n\theta)^t / t!} = \left(1 - \frac{1}{n}\right)^t$$
Thus, $\phi(T) = \left(1 - \dfrac{1}{n}\right)^T$ is the UMVUE for $e^{-\theta}$.
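A minimal numerical check (illustrative values) that $(1 - 1/n)^T$ is unbiased for $e^{-\theta}$:

```python
import numpy as np

# Check that (1 - 1/n)**T is unbiased for exp(-theta).
rng = np.random.default_rng(2)
theta, n, reps = 1.5, 6, 200_000

t = rng.poisson(theta, size=(reps, n)).sum(axis=1)
print(((1 - 1 / n) ** t).mean(), np.exp(-theta))  # both approx. 0.223
```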
(iv) Let $U(X_1) = (-2)^{X_1}$. Then,
$$E\{U(X_1)\} = \sum_{x_1 = 0}^{\infty} (-2)^{x_1} \frac{e^{-\theta} \theta^{x_1}}{x_1!} = e^{-\theta} \sum_{x_1 = 0}^{\infty} \frac{(-2\theta)^{x_1}}{x_1!} = e^{-\theta} e^{-2\theta} = e^{-3\theta},$$
on simplification using the exponential series. So $(-2)^{X_1}$ is an unbiased estimator of $e^{-3\theta}$.


Since $T = \sum_{i=1}^{n} X_i$ is a sufficient statistic for $\theta$, $T \sim P(n\theta)$, and the family $\mathcal{F} = \{g(t, \theta) : \theta > 0\}$ of p.m.f.'s of $T$ is complete, $T$ is a complete sufficient statistic for $\theta$. By virtue of RBLST, $\phi(T) = E\{U(X_1) \mid T\}$ is the UMVUE for $e^{-3\theta}$. To obtain an expression for $\phi(T)$, we need the conditional p.m.f. of $X_1$ given $T = t$. Let $h(x_1 \mid t)$ be this conditional p.m.f. Then,
$$h(x_1 \mid t) = P\{X_1 = x_1 \mid T = t\} = \frac{P\{X_1 = x_1, X_1 + X_2 + \cdots + X_n = t\}}{P\{X_1 + X_2 + \cdots + X_n = t\}}$$
$$= \frac{P\{X_1 = x_1\}\, P\{X_2 + \cdots + X_n = t - x_1\}}{P\{X_1 + X_2 + \cdots + X_n = t\}} = \frac{e^{-\theta} \theta^{x_1}}{x_1!} \cdot \frac{e^{-(n-1)\theta} [(n-1)\theta]^{t - x_1}}{(t - x_1)!} \times \frac{t!}{e^{-n\theta} (n\theta)^t}$$
$$h(x_1 \mid t) = \begin{cases} \dbinom{t}{x_1} \left(\dfrac{1}{n}\right)^{x_1} \left(1 - \dfrac{1}{n}\right)^{t - x_1}, & \text{if } x_1 = 0, 1, 2, \dots, t \\ 0, & \text{otherwise} \end{cases}$$
That is, the conditional distribution of $X_1$ given $T = t$ is binomial $B\left(t, \frac{1}{n}\right)$.
Now,
$$\phi(t) = E\{U(X_1) \mid T = t\} = E\{(-2)^{X_1} \mid T = t\} = \sum_{x_1 = 0}^{t} (-2)^{x_1}\, h(x_1 \mid t)$$
$$= \sum_{x_1 = 0}^{t} (-2)^{x_1} \binom{t}{x_1} \left(\frac{1}{n}\right)^{x_1} \left(1 - \frac{1}{n}\right)^{t - x_1} = \left(-\frac{2}{n} + 1 - \frac{1}{n}\right)^t = \left(1 - \frac{3}{n}\right)^t,$$
on simplification by the binomial theorem.
Hence, $\phi(T) = \left(1 - \dfrac{3}{n}\right)^T$ is the UMVUE for $e^{-3\theta}$.

Remark: Part (iv) of the problem can also be solved alternatively as follows. Let $Z$ be a random variable defined by
$$Z = \begin{cases} 1, & \text{if } X_1 + X_2 + X_3 = 0 \\ 0, & \text{otherwise} \end{cases}$$
Since $X_1 + X_2 + X_3 \sim P(3\theta)$, $E(Z) = P(X_1 + X_2 + X_3 = 0) = e^{-3\theta}$, and
$$\phi(T) = E\{Z \mid T\} = \left(1 - \frac{3}{n}\right)^T$$
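A small simulation (illustrative values) confirms both the unbiasedness of $(1 - 3/n)^T$ and the dramatic variance reduction relative to the crude estimator $(-2)^{X_1}$:

```python
import numpy as np

# Compare the crude unbiased estimator (-2)**X_1 with its Rao-Blackwellized
# version (1 - 3/n)**T; both are unbiased for exp(-3*theta).
rng = np.random.default_rng(3)
theta, n, reps = 0.5, 10, 200_000

x = rng.poisson(theta, size=(reps, n))
crude = (-2.0) ** x[:, 0]
smooth = (1 - 3 / n) ** x.sum(axis=1)
print(np.exp(-3 * theta), crude.mean(), smooth.mean())  # all approx. 0.223
print(crude.var(), smooth.var())                        # variance drops sharply
```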

Example: Let $X_1, X_2, \dots, X_n$ be a random sample of size $n$ from the distribution having probability mass function
$$p(x, \theta) = \begin{cases} \theta (1 - \theta)^x, & \text{if } x = 0, 1, 2, \dots;\ 0 < \theta < 1 \\ 0, & \text{otherwise} \end{cases}$$
Obtain the UMVUE of (i) $g(\theta) = \dfrac{1 - \theta}{\theta}$ (ii) $(1 - \theta)^t$.

Solution: (i) The given probability mass function is a regular case of the one-parameter exponential class of probability mass functions, and so $T = \sum_{i=1}^{n} X_i$ is a complete sufficient statistic for $\theta$. Since
$$E(X_i) = \frac{1 - \theta}{\theta}, \qquad E(\bar{X}) = E\left(\frac{T}{n}\right) = \frac{1 - \theta}{\theta}.$$
Hence, by virtue of RBLST, $\phi(T) = \dfrac{T}{n}$ is the UMVUE for $\dfrac{1 - \theta}{\theta}$.
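A quick check of part (i) by simulation (illustrative values; note that NumPy's geometric sampler counts trials rather than failures, hence the subtraction of 1):

```python
import numpy as np

# Check that T/n is unbiased for (1 - theta)/theta under the pmf
# p(x, theta) = theta * (1 - theta)**x, x = 0, 1, 2, ...
rng = np.random.default_rng(4)
theta, n, reps = 0.4, 5, 200_000

x = rng.geometric(theta, size=(reps, n)) - 1  # failures before first success
print(x.mean(axis=1).mean(), (1 - theta) / theta)  # both approx. 1.5
```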

(ii) Since $P(X_1 \ge t) = \sum_{x = t}^{\infty} \theta (1 - \theta)^x = (1 - \theta)^t$ for a fixed non-negative integer $t$, let $U$ be a random variable defined as follows:
$$U = \begin{cases} 1, & \text{if } X_1 \ge t \\ 0, & \text{otherwise} \end{cases}$$
Then $E(U) = 1 \cdot P(X_1 \ge t) = (1 - \theta)^t$, so $U$ is an unbiased estimator of $(1 - \theta)^t$. Hence, by virtue of RBLST, $\phi(T) = E(U \mid T)$ is the UMVUE for $(1 - \theta)^t$. To calculate $E(U \mid T)$, we proceed as follows. The probability mass function of $T$, say $g(t_0, \theta)$, is
$$g(t_0, \theta) = \begin{cases} \dbinom{n + t_0 - 1}{t_0} \theta^n (1 - \theta)^{t_0}, & \text{if } t_0 = 0, 1, 2, \dots;\ 0 < \theta < 1 \\ 0, & \text{otherwise} \end{cases}$$
Let $h(x_1 \mid t_0)$ denote the conditional probability mass function of $X_1$ given $T = t_0$. Then,
$$h(x_1 \mid t_0) = P\{X_1 = x_1 \mid T = t_0\} = \frac{P\{X_1 = x_1, T = t_0\}}{P\{T = t_0\}}$$
$$= \frac{P\{X_1 = x_1\}\, P\{X_2 + \cdots + X_n = t_0 - x_1\}}{P\{X_1 + X_2 + \cdots + X_n = t_0\}} = \frac{\theta (1 - \theta)^{x_1} \dbinom{n + t_0 - x_1 - 2}{t_0 - x_1} \theta^{n-1} (1 - \theta)^{t_0 - x_1}}{\dbinom{n + t_0 - 1}{t_0} \theta^n (1 - \theta)^{t_0}}$$
$$h(x_1 \mid t_0) = \begin{cases} \dfrac{\dbinom{n + t_0 - x_1 - 2}{t_0 - x_1}}{\dbinom{n + t_0 - 1}{t_0}}, & \text{if } x_1 = 0, 1, 2, \dots, t_0 \\ 0, & \text{otherwise} \end{cases}$$
Hence, for $T \ge t$,
$$\phi(T) = E(U \mid T) = P(X_1 \ge t \mid T) = \sum_{s = t}^{T} \frac{\dbinom{n + T - s - 2}{T - s}}{\dbinom{n + T - 1}{T}},$$
and $\phi(T) = 0$ if $T < t$ (the sum is empty, and indeed $X_1 \le T < t$ forces $U = 0$).
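This formula can be sanity-checked numerically (an illustrative sketch; the helper phi below simply implements the sum above):

```python
import math
import numpy as np

# Check that phi(T) = sum_{s=t}^{T} C(n+T-s-2, T-s) / C(n+T-1, T)
# is unbiased for (1 - theta)**t.
rng = np.random.default_rng(5)
theta, n, t, reps = 0.4, 5, 2, 100_000

def phi(T, n, t):
    if T < t:
        return 0.0
    return sum(math.comb(n + T - s - 2, T - s) for s in range(t, T + 1)) / math.comb(n + T - 1, T)

x = rng.geometric(theta, size=(reps, n)) - 1  # failures before first success
vals = [phi(T, n, t) for T in x.sum(axis=1)]
print(np.mean(vals), (1 - theta) ** t)  # both approx. 0.36
```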

Example: Let $X_1, X_2, \dots, X_n$ be a random sample of size $n$ from $U(0, \theta)$, and let $g(\theta)$ be a differentiable function of $\theta$. Obtain the UMVUE of $g(\theta)$.
Solution: Let $T = \max(X_1, X_2, \dots, X_n)$. The pdf of $T$ is given by
$$h(t, \theta) = \begin{cases} \dfrac{n t^{n-1}}{\theta^n}, & \text{if } 0 \le t \le \theta \\ 0, & \text{otherwise} \end{cases}$$
The family $\{h(t, \theta),\ 0 \le t \le \theta;\ \theta > 0\}$ is complete, so $T$ is a complete sufficient statistic for $\theta$.
Let $\phi(T)$ be a function of $T$ such that $E\{\phi(T)\} = g(\theta)$. Then,
$$g(\theta) = \int_0^{\theta} \phi(t)\, h(t, \theta)\, dt$$
or
$$g(\theta) = \int_0^{\theta} \phi(t)\, \frac{n t^{n-1}}{\theta^n}\, dt$$
or
$$\frac{g(\theta)\, \theta^n}{n} = \int_0^{\theta} \phi(t)\, t^{n-1}\, dt$$

Differentiating this equation with respect to $\theta$ (by the fundamental theorem of calculus on the right-hand side), we get
$$\frac{n \theta^{n-1} g(\theta)}{n} + \frac{\theta^n g'(\theta)}{n} = \phi(\theta)\, \theta^{n-1}$$
$$\Rightarrow g(\theta) + \frac{\theta}{n} g'(\theta) = \phi(\theta)$$
Hence, by RBLST, the UMVUE of $g(\theta)$ is given by
$$\phi(t) = g(t) + \frac{t}{n} g'(t)$$
So $\phi(T) = g(T) + \dfrac{T}{n} g'(T)$ is the UMVUE for $g(\theta)$.
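The general formula is easy to test by simulation. The sketch below (illustrative values; not part of the original solution) uses $g(\theta) = \theta^2$, for which the formula gives $\phi(T) = (1 + 2/n) T^2$:

```python
import numpy as np

# Check phi(T) = g(T) + (T/n) * g'(T) for U(0, theta) with g(theta) = theta**2,
# i.e. phi(T) = (1 + 2/n) * T**2, against the target g(theta) = theta**2.
rng = np.random.default_rng(6)
theta, n, reps = 2.0, 7, 200_000

tmax = rng.uniform(0, theta, size=(reps, n)).max(axis=1)  # T = max X_i
g = lambda t: t**2
g_prime = lambda t: 2 * t
phi = g(tmax) + (tmax / n) * g_prime(tmax)
print(phi.mean(), theta**2)  # both approx. 4.0
```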

Special cases:
1. $g(\theta) = \theta$, so that $g'(\theta) = 1$, and therefore
$$\phi(T) = T + \frac{T}{n} = \frac{(n+1)T}{n} \quad \text{and} \quad \operatorname{Var}[\phi(T)] = \frac{\theta^2}{n(n+2)}$$
2. $g(\theta) = \theta e^{-1}$, so that $g'(\theta) = e^{-1}$, and therefore
$$\phi(T) = T e^{-1} + \frac{T}{n} e^{-1} = \frac{(n+1) T e^{-1}}{n} \quad \text{and} \quad \operatorname{Var}[\phi(T)] = \frac{\theta^2 e^{-2}}{n(n+2)}$$
Remark 4.3: According to the Rao-Blackwell theorem, we should restrict our search to functions of a sufficient statistic (whenever one exists). According to the Lehmann-Scheffé theorem, if a complete sufficient statistic $T$ exists, all we need to do is find a function of $T$ that is unbiased. If a complete sufficient statistic does not exist, a UMVUE may still exist.
Exercises: Let $X_1, X_2, \dots, X_n$ be a random sample from the pdf or pmf given below. Find the UMVUE of $g(\theta)$ indicated against each problem:
(i) $f(x; \theta) = N(\theta, 1)$; $g(\theta) = \theta^2$, by finding $f(x_1 \mid t)$, where $t = \bar{x}$
(ii) $f(x; \theta) = N(0, \theta)$; $g(\theta) = \theta$, by finding $f(x_1 \mid t)$, where $t = \sum_{i=1}^{n} x_i^2$
(iii) $f(x; \theta_1, \theta_2) = N(\theta_1, \theta_2)$; $g(\theta_1, \theta_2) = \theta_1 / \theta_2$, where $\theta_1$ and $\theta_2$ are unknown
(iv) $f(x; \theta_1, \theta_2) = N(\theta_1, \theta_2)$; $g(\theta_1, \theta_2) = \theta_1 \theta_2$, where $\theta_1$ and $\theta_2$ are unknown
(v) $f(x; \theta_1, \theta_2) = N(\theta_1, \theta_2)$; $g(\theta_1, \theta_2) = \theta_1^2$, where $\theta_1$ and $\theta_2$ are unknown
(vi) $P_\theta(X = x) = P(\theta)$; $g(\theta) = e^{-2\theta}$
(vii) $P_\theta(X = x) = b(1, \theta)$; $g(\theta) = \operatorname{Var}_\theta(X)$
(viii) $P_\theta(X = x) = b(1, \theta)$; $g(\theta) = \theta^2$
(ix) $P_\theta(X = x) = NB(1, \theta)$; $g(\theta) = E_\theta(X)$
(x) $P_\theta(X = x) = NB(1, \theta)$; $g(\theta) = P_\theta(X = 1)$
(xi) $f(x; \theta) = U(0, \theta)$; $g(\theta) = \theta$, by finding $f(x_1 \mid t)$, where $t = \max_{1 \le i \le n} X_i$
(xii) $f(x; \theta) = C(\theta) \exp[Q(\theta) T(x)]\, h(x)$; $g(\theta) = E_\theta(X)$
(xiii) $f(x; \mu, \sigma^2) = N(\mu, \sigma^2)$; $g(\Theta) = \xi_p$, the $p$th quantile of the distribution, $\Theta = (\mu, \sigma^2)$

Example: Let $X_1, X_2, \dots, X_n$ be a random sample of size $n$ from $N(\theta, \sigma^2)$, $\sigma^2$ being known. To obtain the UMVUE for $\theta$, we proceed as follows.
The p.d.f. of $X$ is given by
$$f(x, \theta) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left[-\frac{1}{2} \left(\frac{x - \theta}{\sigma}\right)^2\right], \quad -\infty < x < \infty;\ \sigma > 0$$

In this case, the p.d.f. can be written in the form
$$f(x, \theta) = \exp[Q(\theta) T(x) + K(\theta) + S(x)]$$
where $Q(\theta) = \dfrac{\theta}{\sigma^2}$, $T(x) = x$, $S(x) = -\dfrac{x^2}{2\sigma^2} - \ln(\sigma\sqrt{2\pi})$, $K(\theta) = -\dfrac{\theta^2}{2\sigma^2}$. So the p.d.f. belongs to the one-parameter exponential class. Hence $T = \sum_{i=1}^{n} X_i$ is a sufficient statistic for $\theta$. Since $T \sim N(n\theta, n\sigma^2)$ and the family $F = \{g(t, \theta) : -\infty < \theta < \infty\}$ of p.d.f.'s of $T$ is complete, $T$ is a complete sufficient statistic for $\theta$. Since $E(T) = n\theta$, we have $E\left(\frac{T}{n}\right) = \theta$. Hence, by virtue of RBLST, $\phi(T) = \dfrac{T}{n} = \bar{X}$ is the UMVUE for $\theta$. Since the sample mean $\bar{X} = T/n$ is a one-to-one function of $T$, $\bar{X}$ is itself a complete sufficient statistic for $\theta$.
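As a closing illustration (a sketch, not part of the original solution), $\bar{X}$ can be compared with another unbiased estimator of $\theta$, such as the sample median; the UMVUE should exhibit the smaller variance:

```python
import numpy as np

# Compare X-bar (the UMVUE) with the sample median, which is also unbiased
# for theta by the symmetry of the normal distribution.
rng = np.random.default_rng(7)
theta, sigma, n, reps = 1.0, 2.0, 9, 100_000

x = rng.normal(theta, sigma, size=(reps, n))
xbar, med = x.mean(axis=1), np.median(x, axis=1)
print(xbar.mean(), med.mean())  # both approx. theta = 1.0
print(xbar.var(), med.var())    # approx. sigma^2/n = 0.444 < median's variance
```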
