

(iv) $\phi_X(t)$ and the p.d.f. of $X$.


(v) $\mu_r$ and $\mu_r'$.
(vi) First four cumulants in terms of first four moments about mean.
IV. Let $X_1, X_2, \ldots, X_n$ be $n$ i.i.d. (independent and identically distributed) r.v.'s with m.g.f. $M(t)$. Then prove that
$$M_{\bar{X}}(t) = [M(t/n)]^n, \quad \text{where } \bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i.$$
V. If $X_1, X_2, \ldots, X_n$ are independent r.v.'s, then prove that
$$M_{\sum_{i=1}^{n} c_i X_i}(t) = \prod_{i=1}^{n} M_{X_i}(c_i t).$$
(A numerical sanity check of IV and V is sketched after exercise VII below.)

VI. Fill in the blanks:


(i) If $\int_{-\infty}^{\infty} |\phi_X(t)|\, dt < \infty$, then the p.d.f. of $X$ is given by ... .

(ii) $X_1$ and $X_2$ are independent if and only if ... . (Give the result in terms of characteristic functions.)
(iii) If $X_1$ and $X_2$ are independent, then $\phi_{X_1 - X_2}(t) = \ldots$
(iv) $\phi(t)$ is ... defined and is ... for all $t$ in $(-\infty, \infty)$.
VII. Examine critically the following statements:
(a) Two distributions having the same set of moments are identical.
(b) The characteristic function of a certain non-degenerate distribution is $e^{-t^4}$.
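The m.g.f. identities in IV and V are easy to test numerically. The sketch below is illustrative only (it assumes NumPy and uses Exponential(1) variables, whose m.g.f. $M(t) = 1/(1-t)$ for $t < 1$ is standard; none of these names come from the text): it compares a Monte Carlo estimate of $M_{\bar{X}}(t)$ with $[M(t/n)]^n$.

```python
# Monte Carlo check of M_xbar(t) = [M(t/n)]^n (exercise IV) for n i.i.d.
# Exponential(1) variables, whose m.g.f. is M(t) = 1/(1 - t) for t < 1.
import numpy as np

rng = np.random.default_rng(0)
n, reps, t = 5, 200_000, 0.5

x = rng.exponential(scale=1.0, size=(reps, n))  # reps draws of (X1, ..., Xn)
xbar = x.mean(axis=1)                           # sample means

empirical = np.exp(t * xbar).mean()             # estimate of E[exp(t * xbar)]
theoretical = (1.0 / (1.0 - t / n)) ** n        # [M(t/n)]^n

print(f"empirical   M_xbar({t}) = {empirical:.4f}")
print(f"theoretical [M(t/n)]^n = {theoretical:.4f}")
```

The two printed values should agree up to Monte Carlo error; increasing `reps` tightens the agreement.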
6·13. Chebychev's Inequality. The role of the standard deviation as a parameter characterising variability is precisely interpreted by means of the well-known Chebychev's inequality. The theorem, discovered in 1853, was later discussed in 1856 by Bienayme.
Theorem 6·31. If $X$ is a random variable with mean $\mu$ and variance $\sigma^2$, then for any positive number $k$, we have
$$P\{|X - \mu| \ge k\sigma\} \le 1/k^2 \qquad \ldots(6\cdot73)$$
or
$$P\{|X - \mu| < k\sigma\} \ge 1 - (1/k^2). \qquad \ldots(6\cdot73a)$$
Proof. Case (i). $X$ is a continuous r.v. By definition,
$$\sigma^2 = \sigma_X^2 = E[X - E(X)]^2 = E[X - \mu]^2 = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\, dx,$$
where $f(x)$ is the p.d.f. of $X$.


Splitting the range of integration,
$$\sigma^2 = \int_{-\infty}^{\mu - k\sigma} (x - \mu)^2 f(x)\, dx + \int_{\mu - k\sigma}^{\mu + k\sigma} (x - \mu)^2 f(x)\, dx + \int_{\mu + k\sigma}^{\infty} (x - \mu)^2 f(x)\, dx$$
$$\ge \int_{-\infty}^{\mu - k\sigma} (x - \mu)^2 f(x)\, dx + \int_{\mu + k\sigma}^{\infty} (x - \mu)^2 f(x)\, dx. \qquad \ldots(*)$$
We know that
$$x \le \mu - k\sigma \ \text{ or } \ x \ge \mu + k\sigma \iff |x - \mu| \ge k\sigma, \qquad \ldots(**)$$
so that $(x - \mu)^2 \ge k^2\sigma^2$ over both remaining ranges of integration.
Substituting in (*), we get
$$\sigma^2 \ge k^2\sigma^2 \left[ \int_{-\infty}^{\mu - k\sigma} f(x)\, dx + \int_{\mu + k\sigma}^{\infty} f(x)\, dx \right]$$
$$= k^2\sigma^2 \left[ P(X \le \mu - k\sigma) + P(X \ge \mu + k\sigma) \right] \qquad [\text{From } (**)]$$
$$= k^2\sigma^2\, P(|X - \mu| \ge k\sigma) \qquad [\text{From } (**)]$$
$$\Rightarrow \quad P(|X - \mu| \ge k\sigma) \le 1/k^2, \qquad \ldots(***)$$
which establishes (6·73).
Also, since
$$P\{|X - \mu| \ge k\sigma\} + P\{|X - \mu| < k\sigma\} = 1,$$
we get
$$P\{|X - \mu| < k\sigma\} = 1 - P\{|X - \mu| \ge k\sigma\} \ge 1 - (1/k^2), \qquad [\text{From } (***)]$$
which establishes (6·73a).
Case (ii). In the case of a discrete random variable, the proof follows exactly similarly, on replacing integration by summation.
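Chebychev's bound can also be checked empirically. The following sketch (assuming NumPy; Exponential(1), with $\mu = \sigma = 1$, is merely a convenient test distribution, not one singled out by the text) estimates the tail probability for a few values of $k$ and compares it with $1/k^2$:

```python
# Empirical check of Chebychev's inequality (6.73):
# P{|X - mu| >= k*sigma} <= 1/k^2, illustrated with Exponential(1),
# for which mu = 1 and sigma = 1.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)
mu, sigma = 1.0, 1.0

for k in (1.5, 2.0, 3.0):
    tail = np.mean(np.abs(x - mu) >= k * sigma)  # empirical tail probability
    print(f"k={k}: P(|X-mu| >= k*sigma) = {tail:.4f} <= 1/k^2 = {1/k**2:.4f}")
```

The bound is typically far from tight; Chebychev's inequality trades sharpness for complete generality.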
Remark. In particular, if we take $k\sigma = c > 0$, then (6·73) and (6·73a) give respectively
$$P\{|X - \mu| \ge c\} \le \frac{\sigma^2}{c^2} \quad \text{and} \quad P\{|X - \mu| < c\} \ge 1 - \frac{\sigma^2}{c^2},$$
i.e.,
$$P\{|X - E(X)| \ge c\} \le \frac{\operatorname{Var}(X)}{c^2} \quad \text{and} \quad P\{|X - E(X)| < c\} \ge 1 - \frac{\operatorname{Var}(X)}{c^2}. \qquad \ldots(6\cdot73b)$$
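For a concrete illustration (the numbers are hypothetical, not from the text): if $E(X) = 50$ and $\operatorname{Var}(X) = 25$, then taking $c = 10$ in (6·73b) gives $P\{|X - 50| \ge 10\} \le 25/100 = 1/4$, whatever the distribution of $X$ may be.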

6·13·1. Generalised Form of Bienayme-Chebychev's Inequality. Let $g(X)$ be a non-negative function of a random variable $X$. Then for every $k > 0$, we have
$$P\{g(X) \ge k\} \le \frac{E[g(X)]}{k}. \qquad \ldots(6\cdot74)$$
[Bangalore Univ. B.Sc., 1992]
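(A standard observation, added here for clarity rather than taken from the text: (6·74) contains Chebychev's inequality as a special case. Taking $g(X) = (X - \mu)^2$ and replacing $k$ by $k^2\sigma^2$ gives
$$P\{(X - \mu)^2 \ge k^2\sigma^2\} \le \frac{E(X - \mu)^2}{k^2\sigma^2} = \frac{\sigma^2}{k^2\sigma^2} = \frac{1}{k^2},$$
and the event $(X - \mu)^2 \ge k^2\sigma^2$ is precisely $|X - \mu| \ge k\sigma$.)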
Proof. Here we shall prove the theorem for a continuous random variable; the proof can be adapted to the case of a discrete random variable on replacing integration by summation over the given range of the variable.
Let $S$ be the set of all $x$ where $g(x) \ge k$, i.e.,
