$Var(Y) = \sigma_Y^2 > 0$. Let $E(Y \mid X = x) = \phi(x)$. Then (i) $E[\phi(X)] = \mu$ and (ii) $Var[\phi(X)] \le Var(Y)$.
Proof: We prove the theorem for the continuous case only; the discrete case can be proved in the same way by replacing integrals with sums.
Let $f(x, y)$ be the joint pdf of the random variables $X$ and $Y$, $f_1(x)$ and $f_2(y)$ the marginal pdfs, and $h(y \mid x)$ the conditional pdf of $Y$ given $X = x$, so that
$$h(y \mid x) = \frac{f(x, y)}{f_1(x)}$$
Then
$$E_\theta(Y \mid X = x) = \int_{-\infty}^{\infty} y \, h(y \mid x) \, dy = \int_{-\infty}^{\infty} y \, \frac{f(x, y)}{f_1(x)} \, dy = \frac{1}{f_1(x)} \int_{-\infty}^{\infty} y f(x, y) \, dy = \phi(x) \qquad (2)$$
$$\Rightarrow \int_{-\infty}^{\infty} y f(x, y) \, dy = \phi(x) f_1(x)$$
From (2), the conditional distribution of $(Y \mid X = x)$ does not depend on the parameter $\mu$. Hence $X$ is a sufficient statistic for $\mu$. Also,
$$E[\phi(X)] = E[E_\theta(Y \mid X = x)] = E(Y) = \mu.$$
Consider
$$Var(Y) = E[Y - E(Y)]^2 = E(Y - \mu)^2 = E[(Y - \phi(X)) + (\phi(X) - \mu)]^2$$
$$= E[Y - \phi(X)]^2 + E[\phi(X) - \mu]^2 + 2E\{[Y - \phi(X)][\phi(X) - \mu]\} \qquad (3)$$
Now
$$E\{[Y - \phi(X)][\phi(X) - \mu]\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} [y - \phi(x)][\phi(x) - \mu] f(x, y) \, dx \, dy$$
$$= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} [y - \phi(x)][\phi(x) - \mu] f_1(x) h(y \mid x) \, dx \, dy$$
$$= \int_{-\infty}^{\infty} [\phi(x) - \mu] f_1(x) \left\{ \int_{-\infty}^{\infty} [y - \phi(x)] h(y \mid x) \, dy \right\} dx = 0,$$
since the inner integral equals $E(Y \mid X = x) - \phi(x) = 0$ by (2). Hence, from (3),
$$Var(Y) = E[Y - \phi(X)]^2 + Var[\phi(X)] \ge Var[\phi(X)],$$
which proves (ii).
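The two claims of the theorem can be illustrated numerically. The sketch below is a hypothetical setup, not from the text: for an i.i.d. normal sample, the crude unbiased estimator $Y = X_1$ is compared with its Rao-Blackwellized version $\phi = \bar{X}$ (for an i.i.d. sample, conditioning $X_1$ on the data yields the sample mean). Both should average to $\mu$, with the second having the smaller variance. The parameter values are arbitrary choices for the demonstration.

```python
import random
import statistics

# Illustrative parameters (assumptions, not from the text)
random.seed(0)
mu, n, reps = 5.0, 10, 20000

ys, phis = [], []
for _ in range(reps):
    sample = [random.gauss(mu, 1.0) for _ in range(n)]
    ys.append(sample[0])          # crude unbiased estimator Y = X1
    phis.append(sum(sample) / n)  # Rao-Blackwellized estimator: sample mean

# Both estimators should be centered at mu...
print(statistics.mean(ys), statistics.mean(phis))
# ...but the conditioned estimator should have smaller variance
print(statistics.variance(phis) < statistics.variance(ys))
```

With these settings the variance of the first observation is about $1$, while the variance of the mean is about $1/n$, in line with part (ii) of the theorem.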
Example: Let $X_1, X_2, \ldots, X_n$ be a random sample of size $n$ from the Poisson distribution with parameter $\theta$. Obtain the UMVUE for (i) $\theta$, (ii) $\theta^2$, (iii) $P(X = 0) = e^{-\theta}$, (iv) $e^{-3\theta}$.
Solution: (i) The p.m.f. of $X$ is given by
$$f(x, \theta) = \begin{cases} \dfrac{e^{-\theta}\theta^x}{x!}, & \text{if } x = 0, 1, 2, 3, \ldots;\ \theta > 0 \\ 0, & \text{otherwise} \end{cases}$$
In this case, the p.m.f. can be written in the form
$$f(x, \theta) = \exp[Q(\theta)T(x) + S(x) + K(\theta)], \quad x = 0, 1, 2, 3, \ldots;\ \theta > 0$$
with $Q(\theta) = \ln\theta$, $T(x) = x$, $S(x) = -\ln x!$, $K(\theta) = -\theta$.
Hence, $T = \sum_{i=1}^{n} X_i$ is a sufficient statistic for $\theta$. Since $T \sim P(n\theta)$ and the family $F = \{g(t, \theta) : \theta > 0\}$ of p.m.f.s of $T$ is complete, $T$ is a complete sufficient statistic for $\theta$.
Since $E(T) = n\theta$, we have $E\left(\frac{T}{n}\right) = \theta$. Hence, by virtue of RBLST, $\phi(T) = \frac{T}{n} = \bar{X}$ is the UMVUE for $\theta$.
(ii) Since $T \sim P(n\theta)$, $E(T^2) = Var(T) + [E(T)]^2 = n\theta + n^2\theta^2$, so
$$E(T^2) - E(T) = n^2\theta^2 \ \Rightarrow\ E\left[\frac{T^2 - T}{n^2}\right] = \theta^2.$$
So, taking $\phi(T) = \dfrac{T^2 - T}{n^2}$, by virtue of RBLST, $\phi(T)$ is the UMVUE for $\theta^2$.
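As a quick numerical check of parts (i) and (ii), the sketch below simulates Poisson samples and averages the two estimators $T/n$ and $(T^2 - T)/n^2$ over many replications; both should land near $\theta$ and $\theta^2$ respectively. The Poisson sampler (Knuth's method) and the parameter values are illustrative assumptions, not from the text.

```python
import math
import random

def poisson(lam: float, rng: random.Random) -> int:
    """One Poisson(lam) draw via Knuth's multiplication method (helper)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(1)
theta, n, reps = 2.0, 8, 20000  # illustrative values

est_theta, est_theta2 = [], []
for _ in range(reps):
    t = sum(poisson(theta, rng) for _ in range(n))  # T = sum X_i ~ P(n*theta)
    est_theta.append(t / n)                         # UMVUE of theta
    est_theta2.append((t * t - t) / n**2)           # UMVUE of theta^2

print(sum(est_theta) / reps)    # should be near theta = 2
print(sum(est_theta2) / reps)   # should be near theta^2 = 4
```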
(iii) and (iv): The conditional p.m.f. of $X_1$ given $T = t$ is $Binomial\left(t, \frac{1}{n}\right)$:
$$h(x_1 \mid t) = \begin{cases} \dbinom{t}{x_1}\left(\dfrac{1}{n}\right)^{x_1}\left(1 - \dfrac{1}{n}\right)^{t - x_1}, & \text{if } x_1 = 0, 1, 2, \ldots, t \\ 0, & \text{otherwise} \end{cases}$$
(iii) Take $U(X_1) = I(X_1 = 0)$, so that $E[U(X_1)] = P(X_1 = 0) = e^{-\theta}$. Then
$$\phi(t) = E[U(X_1) \mid T = t] = P(X_1 = 0 \mid T = t) = \left(1 - \frac{1}{n}\right)^t,$$
so $\phi(T) = \left(1 - \frac{1}{n}\right)^T$ is the UMVUE for $e^{-\theta}$.
(iv) Take $U(X_1) = (-2)^{X_1}$, so that $E[U(X_1)] = e^{\theta(-2 - 1)} = e^{-3\theta}$. Now,
$$\phi(t) = E[U(X_1) \mid T = t] = E\left[(-2)^{X_1} \mid T = t\right] = \sum_{x_1 = 0}^{t} (-2)^{x_1} h(x_1 \mid t)$$
$$= \sum_{x_1 = 0}^{t} \binom{t}{x_1}\left(\frac{-2}{n}\right)^{x_1}\left(1 - \frac{1}{n}\right)^{t - x_1} = \left(1 - \frac{3}{n}\right)^t,$$
on simplification (binomial theorem).
Hence, $\phi(T) = \left(1 - \frac{3}{n}\right)^T$ is the UMVUE for $e^{-3\theta}$.
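The estimator $(1 - 3/n)^T$ looks unusual, so a simulation check is reassuring: averaged over many samples it should match $e^{-3\theta}$. The sketch below assumes $n > 3$ (so the base is positive) and uses illustrative parameter values and a Knuth-style Poisson sampler, none of which come from the text.

```python
import math
import random

def poisson(lam: float, rng: random.Random) -> int:
    """One Poisson(lam) draw via Knuth's multiplication method (helper)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(2)
theta, n, reps = 0.5, 10, 20000       # illustrative values; need n > 3
target = math.exp(-3 * theta)         # the quantity being estimated

vals = []
for _ in range(reps):
    t = sum(poisson(theta, rng) for _ in range(n))
    vals.append((1 - 3 / n) ** t)     # UMVUE (1 - 3/n)^T

print(sum(vals) / reps, target)       # the two numbers should be close
```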
Remark: Part (iv) of the problem can also be solved alternatively as under:
Let $Z$ be a random variable defined as follows:
$$Z = \begin{cases} 1, & \text{if } X_1 + X_2 + X_3 = 0 \\ 0, & \text{otherwise} \end{cases}$$
Then $E(Z) = P(X_1 + X_2 + X_3 = 0) = e^{-3\theta}$, and
$$\phi(t) = E\{Z \mid T = t\} = \left(1 - \frac{3}{n}\right)^t.$$
Example: Let $X_1, X_2, \ldots, X_n$ be a random sample of size $n$ from the distribution having probability mass function as under:
$$p(x, \theta) = \begin{cases} \theta(1 - \theta)^x, & \text{if } x = 0, 1, 2, \ldots;\ 0 < \theta < 1 \\ 0, & \text{otherwise} \end{cases}$$
Obtain the UMVUE of (i) $g(\theta) = \dfrac{1 - \theta}{\theta}$ (ii) $(1 - \theta)^t$
Solution: (i) The given probability mass function is a regular case of the one-parameter exponential class of probability mass functions, and so $T = \sum_{i=1}^{n} X_i$ is a complete sufficient statistic for $\theta$. Since
$$E(X_i) = \frac{1 - \theta}{\theta}, \qquad E(\bar{X}) = E\left(\frac{T}{n}\right) = \frac{1 - \theta}{\theta}.$$
Hence, by virtue of the Rao-Blackwell Lehmann-Scheffé Theorem (RBLST), $\phi(T) = \dfrac{T}{n}$ is the UMVUE for $\dfrac{1 - \theta}{\theta}$.
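The claim that $T/n$ is unbiased for $(1-\theta)/\theta$ can be checked by simulation. The sketch below samples from the given p.m.f. (the "number of failures before the first success" geometric distribution) by direct Bernoulli trials; the sampler and parameter values are illustrative assumptions.

```python
import random

def geom0(p: float, rng: random.Random) -> int:
    """Number of failures before the first success: P(X = x) = p(1-p)^x (helper)."""
    x = 0
    while rng.random() >= p:
        x += 1
    return x

rng = random.Random(3)
theta, n, reps = 0.4, 6, 20000       # illustrative values
target = (1 - theta) / theta         # g(theta) = (1-theta)/theta = 1.5

means = []
for _ in range(reps):
    sample = [geom0(theta, rng) for _ in range(n)]
    means.append(sum(sample) / n)    # phi(T) = T/n

print(sum(means) / reps, target)     # the two numbers should be close
```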
Example: Let $X_1, X_2, \ldots, X_n$ be a random sample of size $n$ from $U(0, \theta)$ and let $g(\theta)$ be a differentiable function of $\theta$. Obtain the UMVUE of $g(\theta)$.
Solution: Let $T = \max(X_1, X_2, \ldots, X_n)$. The pdf of $T$ is given by
$$f_T(t, \theta) = \begin{cases} \dfrac{n t^{n-1}}{\theta^n}, & \text{if } 0 \le t \le \theta \\ 0, & \text{otherwise} \end{cases}$$
The family $\{f_T(t, \theta) : 0 \le t \le \theta,\ \theta > 0\}$ is complete. So $T$ is a complete sufficient statistic for $\theta$.
Let $\phi(T)$ be a function of $T$ such that $E\{\phi(T)\} = g(\theta)$. Then
$$g(\theta) = \int_0^{\theta} \phi(t) \frac{n t^{n-1}}{\theta^n} \, dt,$$
or
$$\frac{g(\theta)\theta^n}{n} = \int_0^{\theta} \phi(t) \, t^{n-1} \, dt.$$
Differentiating both sides with respect to $\theta$ gives
$$\frac{g'(\theta)\theta^n}{n} + g(\theta)\theta^{n-1} = \phi(\theta)\theta^{n-1} \ \Rightarrow\ \phi(\theta) = g(\theta) + \frac{\theta g'(\theta)}{n},$$
so $\phi(T) = g(T) + \dfrac{T g'(T)}{n}$ is the UMVUE of $g(\theta)$.
Special cases:
1. $g(\theta) = \theta$, so that $g'(\theta) = 1$, and therefore $\phi(T) = T + \dfrac{T}{n} = \dfrac{(n+1)T}{n}$ and $Var[\phi(T)] = \dfrac{\theta^2}{n(n+2)}$.
2. $g(\theta) = \theta e^{-1}$, so that $g'(\theta) = e^{-1}$, and therefore $\phi(T) = Te^{-1} + \dfrac{Te^{-1}}{n} = \dfrac{(n+1)Te^{-1}}{n}$ and $Var[\phi(T)] = \dfrac{\theta^2 e^{-2}}{n(n+2)}$.
Remark 4.3: According to the Rao-Blackwell theorem, we should restrict our search to functions of a sufficient statistic (whenever one exists). According to the Lehmann-Scheffé theorem, if a complete sufficient statistic $T$ exists, all we need to do is find a function of $T$ that is unbiased. If a complete sufficient statistic does not exist, a UMVUE may still exist.
Exercises: Let $X_1, X_2, \ldots, X_n$ be a random sample from the pdf or pmf given below. Find the UMVUE of $g(\theta)$ indicated against each problem:
(i) $f(x; \theta) = N(\theta, 1)$; $g(\theta) = \theta^2$, by finding $f(x_1 \mid t)$, where $t = \bar{x}$
(ii) $f(x; \theta) = N(0, \theta)$; $g(\theta) = \theta$, by finding $f(x_1 \mid t)$, where $t = \sum_{i=1}^{n} x_i^2$
(iii) $f(x; \theta_1, \theta_2) = N(\theta_1, \theta_2)$; $g(\theta_1, \theta_2) = \dfrac{\theta_1}{\theta_2}$, where $\theta_1$ and $\theta_2$ are unknown
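For exercise (i), the standard answer (stated here as a hint, not derived in the text) is $\phi(\bar{X}) = \bar{X}^2 - \frac{1}{n}$, since $E(\bar{X}^2) = \theta^2 + \frac{1}{n}$ when each $X_i \sim N(\theta, 1)$. A quick simulation confirms the unbiasedness; the parameter values are illustrative assumptions.

```python
import random

rng = random.Random(5)
theta, n, reps = 1.5, 10, 40000       # illustrative values

vals = []
for _ in range(reps):
    xbar = sum(rng.gauss(theta, 1.0) for _ in range(n)) / n
    vals.append(xbar * xbar - 1 / n)  # candidate UMVUE of theta^2 (exercise (i))

print(sum(vals) / reps)               # should be near theta^2 = 2.25
```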
Example: Let $X_1, X_2, \ldots, X_n$ be a random sample of size $n$ from $N(\theta, \sigma^2)$, $\sigma^2$ being known. To obtain the UMVUE for $\theta$ we proceed as follows:
The p.d.f. of $X$ is given by
$$f(x, \theta) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left[-\frac{1}{2}\left(\frac{x - \theta}{\sigma}\right)^2\right], \quad -\infty < x < \infty;\ \sigma > 0$$