
LECTURE: Moments of a Distribution, Skewness and Kurtosis

$$\frac{1}{2^n}\sum f x^2 = \frac{1}{2^n}\left[1^2\cdot{}^nC_1 + 2^2\cdot{}^nC_2 + 3^2\cdot{}^nC_3 + \dots + n^2\cdot{}^nC_n\right]$$
$$= \frac{n}{2^n}\left[1\cdot{}^{n-1}C_0 + 2\cdot{}^{n-1}C_1 + 3\cdot{}^{n-1}C_2 + \dots + n\cdot{}^{n-1}C_{n-1}\right]$$
$$= \frac{n}{2^n}\left[\left\{{}^{n-1}C_0 + {}^{n-1}C_1 + \dots + {}^{n-1}C_{n-1}\right\} + \left\{1\cdot{}^{n-1}C_1 + 2\cdot{}^{n-1}C_2 + \dots + (n-1)\cdot{}^{n-1}C_{n-1}\right\}\right]$$
$$= \frac{n}{2^n}\left[\left\{{}^{n-1}C_0 + {}^{n-1}C_1 + \dots + {}^{n-1}C_{n-1}\right\} + (n-1)\left\{{}^{n-2}C_0 + {}^{n-2}C_1 + \dots + {}^{n-2}C_{n-2}\right\}\right]$$
$$= \frac{n}{2^n}\left[(1+1)^{n-1} + (n-1)(1+1)^{n-2}\right] = \frac{n(n+1)}{4}$$
$$\therefore\quad \sigma^2 = \frac{n(n+1)}{4} - \frac{n^2}{4} = \frac{n}{4}.$$
25. (a) Let r be the range and s be the standard deviation of a set of observations $x_1, x_2, \ldots, x_n$; then prove, by general reasoning or otherwise, that $s \le r$.
Hint. Since $|x_i - \bar{x}| \le r$, $i = 1, 2, \ldots, n$, we have
$$s^2 = \frac{1}{N}\sum_{i=1}^{n} f_i (x_i - \bar{x})^2 \le \frac{1}{N}\sum_{i=1}^{n} f_i\, r^2 = r^2 \qquad\left(\because\ \sum_{i=1}^{n} f_i = N\right)$$
$$\Rightarrow\quad s^2 \le r^2 \quad\Rightarrow\quad s \le r.$$
(b) Let r be the range and
$$s = \left[\frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2\right]^{1/2}$$
be the standard deviation of a set of observations $x_1, x_2, \ldots, x_n$; then prove that
$$s \le r\left[\frac{n}{2(n-1)}\right]^{1/2}.$$
[Punjab Univ. B.Sc. (Stat. Hons.), 1993]

3·9. Moments. The rth moment of a variable about any point $x = A$, usually denoted by $\mu_r'$, is given by
$$\mu_r' = \frac{1}{N}\sum_i f_i (x_i - A)^r, \qquad \sum_i f_i = N \qquad\ldots(3\cdot14)$$
$$= \frac{1}{N}\sum_i f_i d_i^{\,r} \qquad\ldots(3\cdot14a)$$
where $d_i = x_i - A$.
The rth moment of a variable about the mean $\bar{x}$, usually denoted by $\mu_r$, is given by
$$\mu_r = \frac{1}{N}\sum_i f_i (x_i - \bar{x})^r = \frac{1}{N}\sum_i f_i z_i^{\,r} \qquad\ldots(3\cdot15)$$
where $z_i = x_i - \bar{x}$.
In particular,
$$\mu_0 = \frac{1}{N}\sum_i f_i (x_i - \bar{x})^0 = \frac{1}{N}\sum_i f_i = 1$$
and
$$\mu_1 = \frac{1}{N}\sum_i f_i (x_i - \bar{x}) = 0,$$
being the algebraic sum of deviations from the mean. Also
$$\mu_2 = \frac{1}{N}\sum_i f_i (x_i - \bar{x})^2 = \sigma^2 \qquad\ldots(3\cdot16)$$
These results, viz., $\mu_0 = 1$, $\mu_1 = 0$ and $\mu_2 = \sigma^2$, are of fundamental importance and should be committed to memory.
We know that if $d_i = x_i - A$, then
$$\bar{x} = A + \frac{1}{N}\sum_i f_i d_i = A + \mu_1' \qquad\ldots(3\cdot17)$$
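The definitions (3·14), (3·15) and (3·17) translate directly into a few lines of code. The following Python sketch is not part of the text; the function names and the use of the Example 3·9 data (given later in this chapter) as a check are illustrative assumptions only.

```python
# Sketch (not from the text): mu_r' about a point A and mu_r about the mean,
# computed directly from (3.14) and (3.15) for a frequency distribution.
def raw_moment(x, f, r, A=0.0):
    """r-th moment of the distribution {x_i : f_i} about the point A."""
    N = sum(f)
    return sum(fi * (xi - A) ** r for xi, fi in zip(x, f)) / N

def central_moment(x, f, r):
    """r-th moment about the mean (mu_r)."""
    mean = raw_moment(x, f, 1)           # mean = first moment about the origin, cf. (3.17)
    return raw_moment(x, f, r, A=mean)

x = [0, 1, 2, 3, 4, 5, 6, 7, 8]          # values (Example 3.9 data, used as a check)
f = [1, 8, 28, 56, 70, 56, 28, 8, 1]     # frequencies
print(central_moment(x, f, 1))           # 0.0  (mu_1 is always 0)
print(central_moment(x, f, 2))           # 2.0  (= sigma^2, see (3.16))
```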
3·9·1. Relation between moments about the mean and moments about any point, and vice versa. We have
$$\mu_r = \frac{1}{N}\sum_i f_i (x_i - \bar{x})^r = \frac{1}{N}\sum_i f_i (x_i - A + A - \bar{x})^r = \frac{1}{N}\sum_i f_i (d_i + A - \bar{x})^r, \quad\text{where } d_i = x_i - A.$$
Using (3·17), $A - \bar{x} = -\mu_1'$, and we get
$$\mu_r = \frac{1}{N}\sum_i f_i (d_i - \mu_1')^r$$
$$= \frac{1}{N}\sum_i f_i \left[d_i^{\,r} - {}^rC_1\, d_i^{\,r-1}\mu_1' + {}^rC_2\, d_i^{\,r-2}\mu_1'^2 - {}^rC_3\, d_i^{\,r-3}\mu_1'^3 + \dots + (-1)^r \mu_1'^r\right]$$
$$= \mu_r' - {}^rC_1\,\mu_{r-1}'\,\mu_1' + {}^rC_2\,\mu_{r-2}'\,\mu_1'^2 - \dots + (-1)^r \mu_1'^r \qquad\text{[on using (3·14a)]} \qquad\ldots(3\cdot18)$$
In particular, on putting r = 2, 3 and 4 in (3·18), we get
$$\mu_2 = \mu_2' - \mu_1'^2$$
$$\mu_3 = \mu_3' - 3\mu_2'\mu_1' + 2\mu_1'^3 \qquad\ldots(3\cdot19)$$
$$\mu_4 = \mu_4' - 4\mu_3'\mu_1' + 6\mu_2'\mu_1'^2 - 3\mu_1'^4$$
Conversely,
$$\mu_r' = \frac{1}{N}\sum_i f_i (x_i - A)^r = \frac{1}{N}\sum_i f_i (x_i - \bar{x} + \bar{x} - A)^r = \frac{1}{N}\sum_i f_i (z_i + \mu_1')^r,$$
where $z_i = x_i - \bar{x}$ and $\bar{x} = A + \mu_1'$. Thus
$$\mu_r' = \frac{1}{N}\sum_i f_i \left(z_i^{\,r} + {}^rC_1\, z_i^{\,r-1}\mu_1' + {}^rC_2\, z_i^{\,r-2}\mu_1'^2 + \dots + \mu_1'^r\right)$$
$$= \mu_r + {}^rC_1\,\mu_{r-1}\,\mu_1' + {}^rC_2\,\mu_{r-2}\,\mu_1'^2 + \dots + \mu_1'^r. \qquad\text{[From (3·15)]}$$
In particular, putting r = 2, 3 and 4 and noting that $\mu_1 = 0$, we get
$$\mu_2' = \mu_2 + \mu_1'^2$$
$$\mu_3' = \mu_3 + 3\mu_2\mu_1' + \mu_1'^3 \qquad\ldots(3\cdot20)$$
$$\mu_4' = \mu_4 + 4\mu_3\mu_1' + 6\mu_2\mu_1'^2 + \mu_1'^4$$
These formulae enable us to find the moments about any point, once the mean and the moments about the mean are known.
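As a numerical illustration of (3·19) and (3·20), the short Python sketch below (not from the text; the function names are hypothetical) converts raw moments to central moments and back, using the figures of Example 3·8 as a check.

```python
# Sketch (not from the text): the conversions (3.19) and (3.20) for r = 2, 3, 4,
# checked on the moments of Example 3.8 (taken about A = 4).
def central_from_raw(m1, m2, m3, m4):
    """Central moments mu_2..mu_4 from raw moments mu_1'..mu_4' about any point A."""
    mu2 = m2 - m1**2
    mu3 = m3 - 3*m2*m1 + 2*m1**3
    mu4 = m4 - 4*m3*m1 + 6*m2*m1**2 - 3*m1**4
    return mu2, mu3, mu4

def raw_from_central(mean_minus_A, mu2, mu3, mu4):
    """Raw moments about A, given mu_1' = x_bar - A and the central moments."""
    m1 = mean_minus_A
    return (mu2 + m1**2,
            mu3 + 3*mu2*m1 + m1**3,
            mu4 + 4*mu3*m1 + 6*mu2*m1**2 + m1**4)

print(central_from_raw(-1.5, 17, -30, 108))   # (14.75, 39.75, 142.3125)
```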
3·9·2. Effect of Change of Origin and Scale on Moments. Let $u = \dfrac{x - A}{h}$, so that $x = A + hu$, $\bar{x} = A + h\bar{u}$ and $x - \bar{x} = h(u - \bar{u})$.
Thus the rth moment of x about any point $x = A$ is given by
$$\mu_r' = \frac{1}{N}\sum_i f_i (x_i - A)^r = \frac{1}{N}\sum_i f_i (hu_i)^r = h^r\cdot\frac{1}{N}\sum_i f_i u_i^{\,r}$$
And the rth moment of x about the mean is
$$\mu_r = \frac{1}{N}\sum_i f_i (x_i - \bar{x})^r = \frac{1}{N}\sum_i f_i \left[h(u_i - \bar{u})\right]^r = h^r\cdot\frac{1}{N}\sum_i f_i (u_i - \bar{u})^r$$
Thus the rth moment of the variable x about the mean is $h^r$ times the rth moment of the variable u about its mean.
3·9·3. Sheppard's Corrections for Moments. In the case of a grouped frequency distribution, while calculating moments we assume that the frequencies are concentrated at the middle points of the class intervals. If the distribution is symmetrical or slightly asymmetrical and the class intervals are not greater than one-twentieth of the range, this assumption is very nearly true. But since the assumption is not in general true, some error, called the 'grouping error', creeps into the calculation of the moments. W.F. Sheppard proved that if
(i) the frequency distribution is continuous, and
(ii) the frequency tapers off to zero in both directions,
the effect due to grouping at the mid-points of the intervals can be corrected by the following formulae, known as Sheppard's corrections:
$$\mu_2\ (\text{corrected}) = \mu_2 - \frac{h^2}{12} \qquad\ldots(3\cdot21)$$
$$\mu_3\ (\text{corrected}) = \mu_3$$
$$\mu_4\ (\text{corrected}) = \mu_4 - \frac{1}{2}h^2\mu_2 + \frac{7}{240}h^4$$
where h is the width of the class interval.
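A minimal sketch of Sheppard's corrections (3·21) in Python follows; it is not from the text, and the sample values (the central moments of Example 3·8 with an assumed class width h = 1) are purely illustrative.

```python
# Sketch (not from the text): applying Sheppard's corrections (3.21) to central
# moments computed from a grouped table with class width h.
def sheppard_correct(mu2, mu3, mu4, h):
    return (mu2 - h**2 / 12.0,
            mu3,
            mu4 - 0.5 * h**2 * mu2 + (7.0 / 240.0) * h**4)

# Hypothetical inputs: Example 3.8 central moments, assumed width h = 1.
print(sheppard_correct(14.75, 39.75, 142.3125, h=1))
```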


3·9·4. Charlier's Checks. The following identities
$$\sum f(x+1) = \sum fx + N; \qquad \sum f(x+1)^2 = \sum fx^2 + 2\sum fx + N$$
$$\sum f(x+1)^3 = \sum fx^3 + 3\sum fx^2 + 3\sum fx + N$$
$$\sum f(x+1)^4 = \sum fx^4 + 4\sum fx^3 + 6\sum fx^2 + 4\sum fx + N$$
are often used in checking the accuracy in the calculation of the first four moments and are known as Charlier's checks.
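Charlier's checks are just the binomial expansion of $(x+1)^k$ under the summation sign, so they can be verified mechanically. The Python sketch below is not from the text; it reuses the Example 3·9 frequency table purely as test data.

```python
# Sketch (not from the text): verifying Charlier's identities on a frequency table.
x = [0, 1, 2, 3, 4, 5, 6, 7, 8]
f = [1, 8, 28, 56, 70, 56, 28, 8, 1]
N = sum(f)
S = lambda k, shift=0: sum(fi * (xi + shift) ** k for xi, fi in zip(x, f))

assert S(1, 1) == S(1) + N
assert S(2, 1) == S(2) + 2 * S(1) + N
assert S(3, 1) == S(3) + 3 * S(2) + 3 * S(1) + N
assert S(4, 1) == S(4) + 4 * S(3) + 6 * S(2) + 4 * S(1) + N
```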
3·10. Pearson's β and γ Coefficients. Karl Pearson defined the following four coefficients, based upon the first four moments about the mean:
$$\beta_1 = \frac{\mu_3^2}{\mu_2^3}, \quad \gamma_1 = +\sqrt{\beta_1}; \qquad \beta_2 = \frac{\mu_4}{\mu_2^2}, \quad \gamma_2 = \beta_2 - 3 \qquad\ldots(3\cdot22)$$
It may be pointed out that these coefficients are pure numbers independent of the units of measurement. The practical utility of these coefficients is discussed in § 3·13 and § 3·14.
Remark. Sometimes another set of coefficients based on moments, viz., the alpha (α) coefficients, is used. The alpha coefficients are defined as:
$$\alpha_1 = \frac{\mu_1}{\sigma} = 0, \quad \alpha_2 = \frac{\mu_2}{\sigma^2} = 1, \quad \alpha_3 = \frac{\mu_3}{\sigma^3} = \sqrt{\beta_1} = \gamma_1, \quad \alpha_4 = \frac{\mu_4}{\sigma^4} = \beta_2$$
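The β and γ coefficients are simple ratios of central moments, as the short Python sketch below shows; it is not from the text, and the check values come from Example 3·8.

```python
# Sketch (not from the text): Pearson's coefficients (3.22) from central moments.
import math

def pearson_coefficients(mu2, mu3, mu4):
    beta1 = mu3**2 / mu2**3
    beta2 = mu4 / mu2**2
    gamma1 = math.sqrt(beta1)   # gamma_1 = +sqrt(beta_1); conventionally given the sign of mu_3
    gamma2 = beta2 - 3
    return beta1, gamma1, beta2, gamma2

# Check against Example 3.8: beta_1 ~ 0.4924, beta_2 ~ 0.6541
print(pearson_coefficients(14.75, 39.75, 142.3125))
```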

3·11. Factorial Moments. The factorial moment of order r about the origin of the frequency distribution $x_i \mid f_i$, $(i = 1, 2, \ldots, n)$, is defined as
$$\mu_{(r)}' = \frac{1}{N}\sum_{i=1}^{n} f_i\, x_i^{(r)} \qquad\ldots(3\cdot23)$$
where $x^{(r)} = x(x-1)(x-2)\ldots(x-r+1)$ and $N = \sum_{i=1}^{n} f_i$.
Thus the factorial moment of order r about any point $x = a$ is given by
$$\mu_{(r)}' = \frac{1}{N}\sum_i f_i\,(x_i - a)^{(r)} \qquad\ldots(3\cdot24)$$
where $(x-a)^{(r)} = (x-a)(x-a-1)\ldots(x-a-r+1)$.
In particular, from (3·23) we have
$$\mu_{(1)}' = \frac{1}{N}\sum_i f_i x_i = \mu_1'\ (\text{about origin}) = \text{Mean }(\bar{x})$$
$$\mu_{(2)}' = \frac{1}{N}\sum_i f_i x_i^{(2)} = \frac{1}{N}\sum_i f_i x_i (x_i - 1) = \mu_2' - \mu_1'$$
$$\mu_{(3)}' = \frac{1}{N}\sum_i f_i x_i^{(3)} = \frac{1}{N}\sum_i f_i x_i (x_i - 1)(x_i - 2)$$
$$= \frac{1}{N}\sum_i f_i x_i^3 - \frac{3}{N}\sum_i f_i x_i^2 + \frac{2}{N}\sum_i f_i x_i = \mu_3' - 3\mu_2' + 2\mu_1'$$
$$\mu_{(4)}' = \frac{1}{N}\sum_i f_i x_i^{(4)} = \frac{1}{N}\sum_i f_i x_i (x_i - 1)(x_i - 2)(x_i - 3)$$
$$= \frac{1}{N}\sum_i f_i x_i (x_i^3 - 6x_i^2 + 11x_i - 6)$$
$$= \frac{1}{N}\sum_i f_i x_i^4 - \frac{6}{N}\sum_i f_i x_i^3 + \frac{11}{N}\sum_i f_i x_i^2 - \frac{6}{N}\sum_i f_i x_i = \mu_4' - 6\mu_3' + 11\mu_2' - 6\mu_1'$$
Conversely, we get
$$\mu_1' = \mu_{(1)}'$$
$$\mu_2' = \mu_{(2)}' + \mu_{(1)}'$$
$$\mu_3' = \mu_{(3)}' + 3\mu_{(2)}' + \mu_{(1)}' \qquad\ldots(3\cdot25)$$
$$\mu_4' = \mu_{(4)}' + 6\mu_{(3)}' + 7\mu_{(2)}' + \mu_{(1)}'$$
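The relations (3·25) are easy to verify numerically. The Python sketch below is not from the text; the helper names are hypothetical and the Example 3·9 frequencies are used only as test data.

```python
# Sketch (not from the text): factorial moments (3.23) and the relations (3.25).
def falling(x, r):
    """x^(r) = x(x-1)...(x-r+1)."""
    out = 1
    for k in range(r):
        out *= (x - k)
    return out

x = [0, 1, 2, 3, 4, 5, 6, 7, 8]
f = [1, 8, 28, 56, 70, 56, 28, 8, 1]
N = sum(f)
fact = lambda r: sum(fi * falling(xi, r) for xi, fi in zip(x, f)) / N   # mu_(r)'
raw = lambda r: sum(fi * xi ** r for xi, fi in zip(x, f)) / N           # mu_r'

assert abs(raw(2) - (fact(2) + fact(1))) < 1e-9
assert abs(raw(3) - (fact(3) + 3 * fact(2) + fact(1))) < 1e-9
assert abs(raw(4) - (fact(4) + 6 * fact(3) + 7 * fact(2) + fact(1))) < 1e-9
```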
3·12. Absolute Moments. For the frequency distribution $x_i \mid f_i$, $i = 1, 2, \ldots, n$, the rth absolute moment of the variable about the origin is given by
$$\frac{1}{N}\sum_{i=1}^{n} f_i\, |x_i|^r, \qquad N = \sum_i f_i \qquad\ldots(3\cdot26)$$
where $|x_i|$ represents the absolute or modulus value of $x_i$.
The rth absolute moment of the variable about the mean $\bar{x}$ is given by
$$\frac{1}{N}\sum_{i=1}^{n} f_i\, |x_i - \bar{x}|^r \qquad\ldots(3\cdot26a)$$
Example 3·8. The first four moments of a distribution about the value 4 of the variable are -1.5, 17, -30 and 108. Find the moments about the mean, β₁ and β₂.
Find also the moments about (i) the origin, and (ii) the point x = 2.
Solution. In the usual notations, we are given A = 4 and
$$\mu_1' = -1.5, \quad \mu_2' = 17, \quad \mu_3' = -30 \quad\text{and}\quad \mu_4' = 108.$$
Moments about the mean: $\mu_1 = 0$
$$\mu_2 = \mu_2' - \mu_1'^2 = 17 - (-1.5)^2 = 17 - 2.25 = 14.75$$
$$\mu_3 = \mu_3' - 3\mu_2'\mu_1' + 2\mu_1'^3 = -30 - 3\times17\times(-1.5) + 2(-1.5)^3 = -30 + 76.5 - 6.75 = 39.75$$
$$\mu_4 = \mu_4' - 4\mu_3'\mu_1' + 6\mu_2'\mu_1'^2 - 3\mu_1'^4 = 108 - 4(-30)(-1.5) + 6(17)(-1.5)^2 - 3(-1.5)^4$$
$$= 108 - 180 + 229.5 - 15.1875 = 142.3125$$
Hence
$$\beta_1 = \frac{\mu_3^2}{\mu_2^3} = \frac{(39.75)^2}{(14.75)^3} = 0.4924, \qquad \beta_2 = \frac{\mu_4}{\mu_2^2} = \frac{142.3125}{(14.75)^2} = 0.6541$$
Also $\bar{x} = A + \mu_1' = 4 + (-1.5) = 2.5$.


Moments about the origin. We have
$$\bar{x} = 2.5, \quad \mu_2 = 14.75, \quad \mu_3 = 39.75 \quad\text{and}\quad \mu_4 = 142.31\ (\text{approx.}).$$
We know $\bar{x} = A + \mu_1'$, where $\mu_1'$ is the first moment about the point $x = A$. Taking A = 0, we get the first moment about the origin as $\mu_1' = \text{mean} = 2.5$.
Using (3·20), we get
$$\mu_2' = \mu_2 + \mu_1'^2 = 14.75 + (2.5)^2 = 14.75 + 6.25 = 21$$
$$\mu_3' = \mu_3 + 3\mu_2\mu_1' + \mu_1'^3 = 39.75 + 3(14.75)(2.5) + (2.5)^3 = 39.75 + 110.625 + 15.625 = 166$$
$$\mu_4' = \mu_4 + 4\mu_3\mu_1' + 6\mu_2\mu_1'^2 + \mu_1'^4 = 142.3125 + 4(39.75)(2.5) + 6(14.75)(2.5)^2 + (2.5)^4$$
$$= 142.3125 + 397.5 + 553.125 + 39.0625 = 1132.$$
Moments about the point x = 2. We have $\bar{x} = A + \mu_1'$. Taking A = 2, the first moment about the point x = 2 is
$$\mu_1' = \bar{x} - 2 = 2.5 - 2 = 0.5$$
Hence
$$\mu_2' = \mu_2 + \mu_1'^2 = 14.75 + 0.25 = 15$$
$$\mu_3' = \mu_3 + 3\mu_2\mu_1' + \mu_1'^3 = 39.75 + 3(14.75)(0.5) + (0.5)^3 = 39.75 + 22.125 + 0.125 = 62$$
$$\mu_4' = \mu_4 + 4\mu_3\mu_1' + 6\mu_2\mu_1'^2 + \mu_1'^4 = 142.3125 + 4(39.75)(0.5) + 6(14.75)(0.5)^2 + (0.5)^4$$
$$= 142.3125 + 79.5 + 22.125 + 0.0625 = 244$$
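For readers who want to re-run the arithmetic of Example 3·8, the following Python sketch (not from the text; variable names are illustrative) applies (3·19) to obtain the central moments and then (3·20) for the moments about the origin and about x = 2.

```python
# Sketch (not from the text): reproducing Example 3.8 with (3.19) and (3.20).
m1, m2, m3, m4 = -1.5, 17.0, -30.0, 108.0      # given moments about A = 4
mu2 = m2 - m1**2                               # 14.75
mu3 = m3 - 3*m2*m1 + 2*m1**3                   # 39.75
mu4 = m4 - 4*m3*m1 + 6*m2*m1**2 - 3*m1**4      # 142.3125
mean = 4 + m1                                  # 2.5

for A in (0.0, 2.0):                           # moments about the origin and about x = 2
    d = mean - A                               # mu_1' for that choice of A
    print(A, mu2 + d**2, mu3 + 3*mu2*d + d**3, mu4 + 4*mu3*d + 6*mu2*d**2 + d**4)
# A = 0: 21, 166, 1132 ;  A = 2: 15, 62, 244
```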
Example 3·9. Calculate the first four moments of the following distribution about the mean and hence find β₁ and β₂:
x: 0  1  2  3  4  5  6  7  8
f: 1  8  28  56  70  56  28  8  1
Solution. CALCULATION OF MOMENTS

   x      f    d = x - 4     fd    fd^2    fd^3    fd^4
   0      1       -4         -4     16     -64     256
   1      8       -3        -24     72    -216     648
   2     28       -2        -56    112    -224     448
   3     56       -1        -56     56     -56      56
   4     70        0          0      0       0       0
   5     56        1         56     56      56      56
   6     28        2         56    112     224     448
   7      8        3         24     72     216     648
   8      1        4          4     16      64     256
 Total  256                   0    512       0    2816

Moments about the point x = 4 are
$$\mu_1' = \frac{1}{N}\sum fd = 0, \qquad \mu_2' = \frac{1}{N}\sum fd^2 = \frac{512}{256} = 2,$$
$$\mu_3' = \frac{1}{N}\sum fd^3 = 0 \qquad\text{and}\qquad \mu_4' = \frac{1}{N}\sum fd^4 = \frac{2816}{256} = 11$$
Moments about the mean are:
$$\mu_1 = 0, \qquad \mu_2 = \mu_2' - \mu_1'^2 = 2$$
$$\mu_3 = \mu_3' - 3\mu_2'\mu_1' + 2\mu_1'^3 = 0$$
$$\mu_4 = \mu_4' - 4\mu_3'\mu_1' + 6\mu_2'\mu_1'^2 - 3\mu_1'^4 = 11$$
$$\beta_1 = \frac{\mu_3^2}{\mu_2^3} = 0, \qquad \beta_2 = \frac{\mu_4}{\mu_2^2} = \frac{11}{4} = 2.75$$
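The table of Example 3·9 can be reproduced directly, as in the Python sketch below (not from the text; variable names are illustrative).

```python
# Sketch (not from the text): the moment table of Example 3.9, computed directly.
x = list(range(9))
f = [1, 8, 28, 56, 70, 56, 28, 8, 1]
N = sum(f)                                    # 256
d = [xi - 4 for xi in x]                      # deviations from the point x = 4
m = [sum(fi * di**r for fi, di in zip(f, d)) / N for r in (1, 2, 3, 4)]
print(m)                                      # [0.0, 2.0, 0.0, 11.0]
# Since mu_1' = 0, the point x = 4 is the mean, so these are already the central
# moments and beta_2 = mu_4 / mu_2^2:
print(m[3] / m[1]**2)                         # 2.75
```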
Example 3·10. For a distribution the mean is 10, the variance is 16, γ₁ is +1 and β₂ is 4. Obtain the first four moments about the origin, i.e., zero. Comment upon the nature of the distribution.
Solution. We are given
$$\text{Mean} = 10, \quad \mu_2 = 16, \quad \gamma_1 = +1, \quad \beta_2 = 4$$
First four moments about the origin ($\mu_1', \mu_2', \mu_3', \mu_4'$):
$$\mu_1' = \text{First moment about origin} = \text{Mean} = 10$$
$$\mu_2' = \mu_2 + \mu_1'^2 = 16 + 100 = 116$$
We have
$$\gamma_1 = \frac{\mu_3}{\mu_2^{3/2}} = +1 \quad\Rightarrow\quad \mu_3 = \mu_2^{3/2} = (16)^{3/2} = 4^3 = 64$$
Also $\mu_3 = \mu_3' - 3\mu_2'\mu_1' + 2\mu_1'^3$, so
$$\mu_3' = \mu_3 + 3\mu_2'\mu_1' - 2\mu_1'^3 = 64 + 3\times116\times10 - 2\times1000 = 3544 - 2000 = 1544$$
Now
$$\beta_2 = \frac{\mu_4}{\mu_2^2} = 4 \quad\Rightarrow\quad \mu_4 = 4\times16^2 = 1024$$
and $\mu_4 = \mu_4' - 4\mu_3'\mu_1' + 6\mu_2'\mu_1'^2 - 3\mu_1'^4$, so
$$\mu_4' = \mu_4 + 4\mu_3'\mu_1' - 6\mu_2'\mu_1'^2 + 3\mu_1'^4 = 1024 + 4\times1544\times10 - 6\times116\times100 + 3\times10000$$
$$= 92784 - 69600 = 23184.$$
Comments on the nature of the distribution: [c.f. § 3·13 and § 3·14]
Since γ₁ = +1, the distribution is moderately positively skewed, i.e., if we draw the curve for the given distribution, it will have a longer tail towards the right.
Further, since β₂ = 4 > 3, the distribution is leptokurtic, i.e., it will be more peaked than the normal curve.
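A short Python sketch (not from the text; it uses (3·20) with the central moments, which gives the same results as the rearrangement used above) retracing Example 3·10 from the given summary measures:

```python
# Sketch (not from the text): Example 3.10 worked from mean, variance, gamma_1, beta_2.
mean, mu2, gamma1, beta2 = 10.0, 16.0, 1.0, 4.0
mu3 = gamma1 * mu2**1.5                        # 64
mu4 = beta2 * mu2**2                           # 1024
m1 = mean                                      # first moment about the origin
m2 = mu2 + m1**2                               # 116,   by (3.20)
m3 = mu3 + 3*mu2*m1 + m1**3                    # 1544
m4 = mu4 + 4*mu3*m1 + 6*mu2*m1**2 + m1**4      # 23184
print(m1, m2, m3, m4)
```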
Example 3·11. If for a random variable X the absolute moment of order k exists for k = 1, 2, ..., n-1, then the following inequalities
(i) $\beta_k^2 \le \beta_{k-1}\,\beta_{k+1}$,  (ii) $\beta_k^{\,k+1} \le \beta_{k+1}^{\,k}$
hold for k = 1, 2, ..., n-1, where $\beta_k$ is the kth absolute moment about the origin.
[Delhi Univ. B.Sc. (Stat. Hons.), 1989]
Solution. If $x_i \mid f_i$, i = 1, 2, ..., n is the given frequency distribution, then
$$\beta_k = \frac{1}{N}\sum_i f_i\,|x_i|^k \qquad\ldots(1)$$
Let u and v be arbitrary real numbers; then the expression
$$\sum_{i=1}^{n} f_i \left[u\,|x_i|^{(k-1)/2} + v\,|x_i|^{(k+1)/2}\right]^2$$
is non-negative, i.e.,
$$u^2\sum_i f_i |x_i|^{k-1} + v^2\sum_i f_i |x_i|^{k+1} + 2uv\sum_i f_i |x_i|^k \ge 0$$
Dividing throughout by N and using relation (1), we get
$$u^2\beta_{k-1} + 2uv\,\beta_k + v^2\beta_{k+1} \ge 0 \qquad\ldots(2)$$
We know that the condition for the expression $ax^2 + 2hxy + by^2$ to be non-negative for all values of x and y is that
$$\begin{vmatrix} a & h \\ h & b \end{vmatrix} \ge 0$$
Using this result, we get from (2)
$$\begin{vmatrix} \beta_{k-1} & \beta_k \\ \beta_k & \beta_{k+1} \end{vmatrix} \ge 0 \quad\Rightarrow\quad \beta_{k-1}\,\beta_{k+1} - \beta_k^2 \ge 0 \qquad\ldots(3)$$
Raising both sides of (3) to the power k, we get
$$\beta_k^{2k} \le \beta_{k-1}^{\,k}\,\beta_{k+1}^{\,k} \qquad\ldots(4)$$
Putting k = 1, 2, ..., k-1, k successively in (4), we get
Lecture 14

CHAPTER SEVEN
Theoretical Discrete Probability Distributions

7·0. Introduction. In the previous chapters we have discussed in detail frequency distributions. In the present chapter we will discuss theoretical discrete distributions in which variables are distributed according to some definite probability law which can be expressed mathematically. The present study will also enable us to fit a mathematical model or a function of the form y = p(x) to the observed data.
We have already defined the distribution function, mathematical expectation, m.g.f., characteristic function and moments. This prepares us for a study of theoretical distributions. This chapter is devoted to the study of univariate (except for the multinomial) distributions like the Binomial, Poisson, Negative binomial, Geometric, Hypergeometric, Multinomial and Power-series distributions.
7·1. Bernoulli Distribution. A random variable X which takes two values 0 and 1, with probabilities q and p respectively, i.e., P(X = 1) = p, P(X = 0) = q, q = 1 - p, is called a Bernoulli variate and is said to have a Bernoulli distribution.
Remark. Sometimes the two values are +1, -1 instead of 1 and 0.
7·1·1. Moments of the Bernoulli Distribution. The rth moment about the origin is
$$\mu_r' = E(X^r) = 0^r\cdot q + 1^r\cdot p = p; \quad r = 1, 2, \ldots \qquad\ldots(7\cdot1)$$
$$\mu_1' = E(X) = p, \qquad \mu_2' = E(X^2) = p$$
$$\mu_2 = \operatorname{Var}(X) = p - p^2 = pq.$$
The m.g.f. of a Bernoulli variate is given by
$$M_X(t) = e^{0\cdot t}\,P(X = 0) + e^{1\cdot t}\,P(X = 1) = q + pe^t \qquad\ldots(7\cdot1a)$$
Remark. Degenerate Random Variable. Sometimes we may come across a variate X which is degenerate at a point c, say, so that P(X = c) = 1 and = 0 otherwise, i.e., the whole mass of the variable is concentrated at a single point 'c'.
Since P(X = c) = 1, Var(X) = 0.
Thus a degenerate r.v. X is characterised by Var(X) = 0.
The m.g.f. of a degenerate r.v. is given by
$$M_X(t) = E(e^{tX}) = e^{tc}\,P(X = c) = e^{ct} \qquad\ldots(7\cdot1b)$$
7·2. Binomial Distribution. The binomial distribution was discovered by James Bernoulli (1654-1705) in the year 1700 and was first published posthumously in 1713 (eight years after his death). Let a random experiment be performed repeatedly and let the occurrence of an event in a trial be called a success and its non-occurrence a failure. Consider a set of n independent Bernoullian trials (n being finite), in which the probability 'p' of success in any trial is constant for each trial. Then q = 1 - p is the probability of failure in any trial.
The probability of x successes and consequently (n - x) failures in n independent trials, in a specified order (say) SSFSFFFS...FSF (where S represents success and F failure), is given by the compound probability theorem by the expression:
$$P(\text{SSFSFFFS}\ldots\text{FSF}) = P(S)P(S)P(F)P(S)P(F)P(F)P(F)P(S)\times\ldots\times P(F)P(S)P(F)$$
$$= p\cdot p\cdot q\cdot p\cdot q\cdot q\cdot q\cdot p\ldots q\cdot p\cdot q = \underbrace{p\cdot p\ldots p}_{x\ \text{factors}}\ \underbrace{q\cdot q\ldots q}_{(n-x)\ \text{factors}} = p^x\,q^{n-x}$$
But x successes in n trials can occur in $\binom{n}{x}$ ways and the probability for each of these ways is $p^x q^{n-x}$. Hence the probability of x successes in n trials in any order whatsoever is given by the addition theorem of probability by the expression:
$$\binom{n}{x} p^x\,q^{n-x}$$
The probability distribution of the number of successes so obtained is called the binomial probability distribution, for the obvious reason that the probabilities of 0, 1, 2, ..., n successes, viz.,
$$q^n,\ \binom{n}{1} q^{n-1}p,\ \binom{n}{2} q^{n-2}p^2,\ \ldots,\ p^n,$$
are the successive terms of the binomial expansion $(q + p)^n$.


Definition. A random variable X is said to follow the binomial distribution if it assumes only non-negative values and its probability mass function is given by
$$P(X = x) = p(x) = \begin{cases} \binom{n}{x} p^x\,q^{n-x}, & x = 0, 1, 2, \ldots, n;\ q = 1 - p \\ 0, & \text{otherwise.} \end{cases} \qquad\ldots(7\cdot2)$$
The two independent constants n and p in the distribution are known as the parameters of the distribution. 'n' is also, sometimes, known as the degree of the binomial distribution.
The binomial distribution is a discrete distribution, as X can take only the integral values 0, 1, 2, ..., n. Any variable which follows the binomial distribution is known as a binomial variate.
We shall use the notation X ~ B(n, p) to denote that the random variable X follows the binomial distribution with parameters n and p.
The probability p(x) in (7·2) is also sometimes denoted by b(x, n, p).
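The p.m.f. (7·2) translates directly into code. The Python sketch below is not from the text; the function name binom_pmf is an illustrative choice, and math.comb supplies the binomial coefficient.

```python
# Sketch (not from the text): the binomial p.m.f. (7.2), written from the definition.
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) for X ~ B(n, p)."""
    if not 0 <= x <= n:
        return 0.0
    return comb(n, x) * p**x * (1 - p)**(n - x)

# The probabilities sum to (q + p)^n = 1, as noted in Remark 1 below.
assert abs(sum(binom_pmf(x, 10, 0.3) for x in range(11)) - 1.0) < 1e-12
```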
Remarks 1. This assignment of probabilities is permissible because
$$\sum_{x=0}^{n} p(x) = \sum_{x=0}^{n} \binom{n}{x} p^x\,q^{n-x} = (q + p)^n = 1$$
2. Let us suppose that n trials constitute an experiment. Then if this experiment is repeated N times, the frequency function of the binomial distribution is given by
$$f(x) = N\,p(x) = N\binom{n}{x} p^x\,q^{n-x}; \quad x = 0, 1, 2, \ldots, n \qquad\ldots(7\cdot3)$$
and the expected frequencies of 0, 1, 2, ..., n successes are the successive terms of the binomial expansion $N(q + p)^n$, q + p = 1.
3. The binomial distribution is important not only because of its wide applicability, but because it gives rise to many other probability distributions. Tables for p(x) are available for various values of n and p.
4. Physical conditions for the Binomial Distribution. We get the binomial distribution under the following experimental conditions:
(i) Each trial results in two mutually disjoint outcomes, termed as success and failure.
(ii) The number of trials 'n' is finite.
(iii) The trials are independent of each other.
(iv) The probability of success 'p' is constant for each trial.
Problems relating to the tossing of a coin or throwing of dice or drawing cards from a pack of cards with replacement lead to the binomial probability distribution.
Example 7·1. Ten coins are thrown simultaneously. Find the probability of getting at least seven heads.
Solution. p = Probability of getting a head = 1/2
q = Probability of not getting a head = 1/2
The probability of getting x heads in a random throw of 10 coins is
$$p(x) = \binom{10}{x}\left(\frac{1}{2}\right)^x\left(\frac{1}{2}\right)^{10-x} = \binom{10}{x}\left(\frac{1}{2}\right)^{10}; \quad x = 0, 1, 2, \ldots, 10$$
∴ The probability of getting at least seven heads is given by
$$P(X \ge 7) = p(7) + p(8) + p(9) + p(10) = \left(\frac{1}{2}\right)^{10}\left[\binom{10}{7} + \binom{10}{8} + \binom{10}{9} + \binom{10}{10}\right]$$
$$= \frac{120 + 45 + 10 + 1}{1024} = \frac{176}{1024}$$
Example 7·2. A and B play a game in which their chances of winning are in the ratio 3 : 2. Find A's chance of winning at least three games out of the five games played. [Burdwan Univ. B.Sc. (Hons.), 199-]
Solution. Let p be the probability that 'A' wins the game. Then we are given p = 3/5, q = 1 - p = 2/5.
Hence, by the binomial probability law, the probability that out of 5 games played, A wins r games is given by
$$P(X = r) = p(r) = \binom{5}{r}(3/5)^r\,(2/5)^{5-r}; \quad r = 0, 1, 2, \ldots, 5$$
The required probability that 'A' wins at least three games is given by
$$P(X \ge 3) = \sum_{r=3}^{5}\binom{5}{r}\frac{3^r\,2^{5-r}}{5^5} = \frac{3^3}{5^5}\left[\binom{5}{3}\cdot 2^2 + \binom{5}{4}\cdot 3\times 2 + 1\cdot 3^2\right] = \frac{27\times(40 + 30 + 9)}{3125} = 0.68$$

Example 7·3. If m things are distributed among 'a' men and 'b' women, show that the probability that the number of things received by men is odd is
$$\frac{1}{2}\left[\frac{(b + a)^m - (b - a)^m}{(b + a)^m}\right]$$
(Nagpur Univ. B.Sc., 1989, '93)
Solution. p = Probability that a thing is received by a man = a/(a + b); then
q = 1 - p = 1 - a/(a + b) = b/(a + b) is the probability that a thing is received by a woman.
The probability that out of m things exactly x are received by men and the rest by women is given by
$$p(x) = {}^mC_x\,p^x\,q^{m-x}; \quad x = 0, 1, 2, \ldots, m$$
The probability P that the number of things received by men is odd is given by
$$P = p(1) + p(3) + p(5) + \ldots = {}^mC_1\,q^{m-1}p + {}^mC_3\,q^{m-3}p^3 + {}^mC_5\,q^{m-5}p^5 + \ldots$$
Now
$$(q + p)^m = q^m + {}^mC_1\,q^{m-1}p + {}^mC_2\,q^{m-2}p^2 + {}^mC_3\,q^{m-3}p^3 + {}^mC_4\,q^{m-4}p^4 + \ldots$$
and
$$(q - p)^m = q^m - {}^mC_1\,q^{m-1}p + {}^mC_2\,q^{m-2}p^2 - {}^mC_3\,q^{m-3}p^3 + {}^mC_4\,q^{m-4}p^4 - \ldots$$
$$\therefore\quad (q + p)^m - (q - p)^m = 2\left[{}^mC_1\,q^{m-1}p + {}^mC_3\,q^{m-3}p^3 + \ldots\right] = 2P$$
But q + p = 1 and q - p = (b - a)/(b + a), so
$$1 - \frac{(b - a)^m}{(b + a)^m} = 2P \quad\Rightarrow\quad P = \frac{1}{2}\left[\frac{(b + a)^m - (b - a)^m}{(b + a)^m}\right]$$
Example 7·4. An irregular six-faced die is thrown, and the expectation that in 10 throws it will give five even numbers is twice the expectation that it will give four even numbers. How many times in 10,000 sets of 10 throws each would you expect it to give no even number? (Gujarat Univ. B.Sc., 1988)
Solution. Let p be the probability of getting an even number in a throw of the die. Then the probability of getting x even numbers in ten throws of the die is
$$P(X = x) = \binom{10}{x} p^x\,q^{10-x}; \quad x = 0, 1, 2, \ldots, 10$$
We are given that
$$P(X = 5) = 2\,P(X = 4)$$
i.e.,
$$\binom{10}{5} p^5 q^5 = 2\binom{10}{4} p^4 q^6 \quad\Rightarrow\quad \frac{10!}{5!\,5!}\,p = 2\cdot\frac{10!}{4!\,6!}\,q$$
$$\therefore\quad 3p = 5q = 5(1 - p) \quad\Rightarrow\quad 8p = 5 \quad\Rightarrow\quad p = 5/8 \text{ and } q = 3/8$$
$$\therefore\quad P(X = x) = \binom{10}{x}\left(\frac{5}{8}\right)^x\left(\frac{3}{8}\right)^{10-x}$$
Hence the required number of times that in 10,000 sets of 10 throws each we get no even number is
$$10{,}000\times P(X = 0) = 10{,}000\times\left(\frac{3}{8}\right)^{10} = 1\ \text{(approx.)}$$
Example 7·5. In a precision bombing attack there is a 50% chance that any one bomb will strike the target. Two direct hits are required to destroy the target completely. How many bombs must be dropped to give a 99% chance or better of completely destroying the target? [Gauhati Univ. M.A., 1992]
Solution. We have:
p = Probability that a bomb strikes the target = 50% = 1/2. Let n be the number of bombs which should be dropped to ensure a 99% chance or better of completely destroying the target. This implies that the "probability that out of n bombs at least two strike the target is greater than 0.99".
Let X be a r.v. representing the number of bombs striking the target. Then X ~ B(n, p = 1/2) with
$$p(x) = P(X = x) = \binom{n}{x}\left(\frac{1}{2}\right)^x\left(\frac{1}{2}\right)^{n-x} = \binom{n}{x}\left(\frac{1}{2}\right)^n; \quad x = 0, 1, \ldots, n$$
We should have:
$$P(X \ge 2) \ge 0.99$$
$$\Rightarrow\quad 1 - P(X \le 1) \ge 0.99 \quad\Rightarrow\quad 1 - [p(0) + p(1)] \ge 0.99$$
$$\Rightarrow\quad 1 - \left[\binom{n}{0} + \binom{n}{1}\right]\left(\frac{1}{2}\right)^n \ge 0.99$$
$$\Rightarrow\quad 0.01 \ge \frac{1 + n}{2^n} \quad\Rightarrow\quad 2^n\times 0.01 \ge 1 + n \quad\Rightarrow\quad 2^n \ge 100 + 100\,n \qquad\ldots(*)$$
By the trial method, we find that the inequality (*) is satisfied by n = 11. Hence the minimum number of bombs needed to destroy the target completely is 11.
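The "trial method" of Example 7·5 can be automated, as in the Python sketch below (not from the text); it searches for the smallest n with P(X ≥ 2) ≥ 0.99.

```python
# Sketch (not from the text): smallest n with P(X >= 2) >= 0.99 for X ~ B(n, 1/2),
# i.e. the smallest n satisfying 2^n >= 100(n + 1).
from math import comb

def p_at_least_two(n, p=0.5):
    q = 1 - p
    return 1 - (comb(n, 0) * q**n + comb(n, 1) * p * q**(n - 1))

n = 1
while p_at_least_two(n) < 0.99:
    n += 1
print(n)   # 11
```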
Example 7·6. A department in a works has 10 machines which may need adjustment from time to time during the day. Three of these machines are old, each having a probability of 1/11 of needing adjustment during the day, and 7 are new, having corresponding probabilities of 1/21.
Assuming that no machine needs adjustment twice on the same day, determine the probability that on a particular day
(i) just 2 old and no new machines need adjustment,
(ii) if just 2 machines need adjustment, they are of the same type.
(Nagpur Univ. B.E., 1989)
Solution. Let p₁ = Probability that an old machine needs adjustment = 1/11,
q₁ = 1 - p₁ = 10/11
and p₂ = Probability that a new machine needs adjustment = 1/21,
q₂ = 1 - p₂ = 20/21.
Then P₁(r) = Probability that 'r' old machines need adjustment
$$= {}^3C_r\,p_1^r\,q_1^{3-r} = {}^3C_r\,(1/11)^r\,(10/11)^{3-r}$$
and P₂(r) = Probability that 'r' new machines need adjustment
$$= {}^7C_r\,p_2^r\,q_2^{7-r} = {}^7C_r\,(1/21)^r\,(20/21)^{7-r}$$
(i) The probability that just two old machines and no new machine need adjustment is given (by the compound probability theorem) by the expression:
$$P_1(2)\cdot P_2(0) = {}^3C_2\,(1/11)^2\,(10/11)\cdot(20/21)^7 = 0.016$$
(ii) Similarly, the probability that just 2 new machines and no old machine need adjustment is
$$P_1(0)\cdot P_2(2) = (10/11)^3\cdot{}^7C_2\,(1/21)^2\,(20/21)^5 = 0.028$$
The probability that "if just two machines need adjustment, they are of the same type" is the same as the probability that "either just 2 old and no new, or just 2 new and no old machines need adjustment".
∴ Required probability = 0.016 + 0.028 = 0.044
7·2·1. Moments. The first four moments about the origin of the binomial distribution are obtained as follows:
$$\mu_1' = E(X) = \sum_{x=0}^{n} x\binom{n}{x} p^x q^{n-x} = np\sum_{x=1}^{n}\binom{n-1}{x-1} p^{x-1} q^{n-x} = np\,(q + p)^{n-1} = np \qquad(\because\ q + p = 1)$$
Thus the mean of the binomial distribution is np.
In the following derivations we repeatedly use the identities
$$\binom{n}{x} = \frac{n}{x}\binom{n-1}{x-1} = \frac{n}{x}\cdot\frac{n-1}{x-1}\binom{n-2}{x-2} = \frac{n}{x}\cdot\frac{n-1}{x-1}\cdot\frac{n-2}{x-2}\binom{n-3}{x-3}, \text{ and so on.}$$
$$\mu_2' = E(X^2) = \sum_{x=0}^{n} x^2\binom{n}{x} p^x q^{n-x} = \sum_{x=0}^{n}\left[x(x-1) + x\right]\binom{n}{x} p^x q^{n-x}$$
$$= n(n-1)p^2\sum_{x=2}^{n}\binom{n-2}{x-2} p^{x-2} q^{n-x} + np = n(n-1)p^2(q+p)^{n-2} + np = n(n-1)p^2 + np$$
$$\mu_3' = E(X^3) = \sum_{x=0}^{n} x^3\binom{n}{x} p^x q^{n-x} = \sum_{x=0}^{n}\left[x(x-1)(x-2) + 3x(x-1) + x\right]\binom{n}{x} p^x q^{n-x}$$
$$= n(n-1)(n-2)p^3\sum_{x=3}^{n}\binom{n-3}{x-3} p^{x-3} q^{n-x} + 3n(n-1)p^2\sum_{x=2}^{n}\binom{n-2}{x-2} p^{x-2} q^{n-x} + np$$
$$= n(n-1)(n-2)p^3(q+p)^{n-3} + 3n(n-1)p^2(q+p)^{n-2} + np = n(n-1)(n-2)p^3 + 3n(n-1)p^2 + np$$
Similarly, writing
$$x^4 = x(x-1)(x-2)(x-3) + 6x(x-1)(x-2) + 7x(x-1) + x$$
(set $x^4 = A\,x(x-1)(x-2)(x-3) + B\,x(x-1)(x-2) + C\,x(x-1) + x$ and find the arbitrary constants A, B and C by comparing coefficients or giving x suitable values), we get
$$\mu_4' = E(X^4) = \sum_{x=0}^{n} x^4\binom{n}{x} p^x q^{n-x} = n(n-1)(n-2)(n-3)p^4 + 6n(n-1)(n-2)p^3 + 7n(n-1)p^2 + np$$
Central Moments of the Binomial Distribution:
$$\mu_2 = \mu_2' - \mu_1'^2 = n(n-1)p^2 + np - n^2p^2 = np(1 - p) = npq$$
$$\mu_3 = \mu_3' - 3\mu_2'\mu_1' + 2\mu_1'^3 = \left[n(n-1)(n-2)p^3 + 3n(n-1)p^2 + np\right] - 3\left[n(n-1)p^2 + np\right]np + 2(np)^3$$
$$= np\left[2p^2 - 3p + 1\right] = np(1-p)(1-2p) = npq(1 - 2p) = npq(q + p - 2p) = npq(q - p)$$
$$\mu_4 = \mu_4' - 4\mu_3'\mu_1' + 6\mu_2'\mu_1'^2 - 3\mu_1'^4 = npq\left[1 + 3(n-2)pq\right] \qquad\text{[on simplification]}$$
Hence
$$\beta_1 = \frac{\mu_3^2}{\mu_2^3} = \frac{n^2p^2q^2(q-p)^2}{n^3p^3q^3} = \frac{(q-p)^2}{npq} = \frac{(1-2p)^2}{npq} \qquad\ldots(7\cdot4)$$
$$\beta_2 = \frac{\mu_4}{\mu_2^2} = \frac{npq\left[1 + 3(n-2)pq\right]}{n^2p^2q^2} = \frac{1 + 3(n-2)pq}{npq} = 3 + \frac{1 - 6pq}{npq} \qquad\ldots(7\cdot5)$$
$$\gamma_1 = \sqrt{\beta_1} = \frac{q - p}{\sqrt{npq}}, \qquad \gamma_2 = \beta_2 - 3 = \frac{1 - 6pq}{npq} \qquad\ldots(7\cdot5a)$$
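The moment formulae of § 7·2·1 can be spot-checked numerically against the p.m.f. The Python sketch below is not from the text; the choice n = 10, p = 0.3 is arbitrary and purely illustrative.

```python
# Sketch (not from the text): numerical check of the binomial moment formulae.
from math import comb, sqrt

n, p = 10, 0.3                                         # illustrative values only
q = 1 - p
pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]
mean = sum(x * px for x, px in enumerate(pmf))
mu = lambda r: sum((x - mean)**r * px for x, px in enumerate(pmf))

assert abs(mean - n * p) < 1e-12                       # mu_1' = np
assert abs(mu(2) - n * p * q) < 1e-12                  # mu_2 = npq
assert abs(mu(3) - n * p * q * (q - p)) < 1e-12        # mu_3 = npq(q - p)
assert abs(mu(4) - n * p * q * (1 + 3 * (n - 2) * p * q)) < 1e-10
print(mu(3) / mu(2)**1.5, (q - p) / sqrt(n * p * q))   # gamma_1 computed two ways
```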
Example 7·7. Comment on the following:
"The mean of a binomial distribution is 3 and the variance is 4."
Solution. If the given binomial distribution has parameters n and p, then we are given
$$\text{Mean} = np = 3 \qquad\ldots(*)$$
$$\text{Variance} = npq = 4 \qquad\ldots(**)$$
Dividing (**) by (*), we get q = 4/3, which is impossible, since a probability cannot exceed unity. Hence the given statement is wrong.
Example 7·8. The mean and variance of a binomial distribution are 4 and 4/3 respectively. Find P(X ≥ 1). [Sardar Patel Univ. B.Sc., 1993]
Solution. Let X ~ B(n, p). Then we are given
$$\text{Mean} = E(X) = np = 4 \qquad\ldots(*)$$
$$\operatorname{Var}(X) = npq = \frac{4}{3} \qquad\ldots(**)$$
Dividing, we get
$$q = \frac{1}{3}, \qquad p = \frac{2}{3}$$
Substituting in (*), we get
$$n = \frac{4}{p} = \frac{4\times 3}{2} = 6.$$
$$P(X \ge 1) = 1 - P(X = 0) = 1 - q^n = 1 - (1/3)^6 = 1 - \frac{1}{729} = 1 - 0.00137 = 0.99863$$
Example 7·9. If X ~ B(n, p), show that
$$E\left(\frac{X}{n} - p\right)^2 = \frac{pq}{n}, \qquad \operatorname{Cov}\left(\frac{X}{n},\ \frac{n - X}{n}\right) = -\frac{pq}{n}$$
(Delhi Univ. B.Sc., 1989)
