
CHAPTER TWO

PROBABILITY, MOMENT GENERATING AND CHARACTERISTIC FUNCTIONS FOR PROBABILITY DISTRIBUTIONS

2.1: INTRODUCTION
2.2 PROBABILITY GENERATING FUNCTION (P.G.F) FOR DISCRETE RANDOM
VARIABLE
The probability generating function (PGF) is a useful tool for dealing with discrete random variables taking values 0, 1, 2, …. Its particular strength is that it gives us an easy way of characterizing the distribution of X + Y when X and Y are independent. In general it is difficult to find the distribution of a sum using the traditional probability function; the PGF transforms a sum into a product, which is much easier to handle.
Sums of random variables are particularly important in the study of stochastic processes, because many stochastic processes are formed from the sum of a sequence of repeating steps: for example, the Gambler's Ruin.
The name probability generating function also gives us another clue to the role of the PGF: the PGF can be used to generate all the probabilities of the distribution. This is generally tedious and is not often an efficient way of calculating probabilities, but the fact that it can be done demonstrates that the PGF tells us everything there is to know about the distribution.
Definition 2.1: Let u(X) = s^X and suppose E(s^X) exists. The probability generating function u(s) or G_X(s) for a random variable X is defined to be E(s^X). We say that the probability generating function for X exists if there is a positive constant h such that G_X(s) is finite for |s| ≤ h.

We use s as a dummy variable (in place of t) and write:

G_X(s) = Σ_{x=0}^{∞} s^x p(x)    … (1)   if X is a discrete random variable

G_X(s) = ∫ s^x p(x) dx    … (2)   if X is a continuous random variable

Conditions for a probability generating function

i. p_k ≥ 0 for all k

ii. Σ_k p_k = 1 (discrete case)

iii. ∫_{−∞}^{∞} p(x) dx = 1 (continuous case)
Common sums for probability generating functions

1. Geometric series:

1 + r + r² + r³ + ⋯ = Σ_{x=0}^{∞} r^x = 1/(1 − r), when |r| < 1.

2. Geometric progression with common ratio r:

a + ar + ar² + ⋯ + ar^{n−1} = a(1 − r^n)/(1 − r) = first term × (1 − r^{number of terms})/(1 − r)

3. Binomial theorem: for any p, q ∈ ℝ and integer n,

(p + q)^n = Σ_{x=0}^{n} C(n, x) p^x q^{n−x}; note that C(n, x) = n!/((n − x)! x!)

4. Exponential power series: for any λ ∈ ℝ,

Σ_{x=0}^{∞} λ^x/x! = e^λ and lim_{n→∞} (1 + λ/n)^n = e^λ

Definition 2.2: The radius of convergence of a probability generating function is a number R > 0 such that the sum G_X(s) = Σ_{x=0}^{∞} s^x ℙ(X = x) converges if |s| < R and diverges if |s| > R.

(No general statement is made about what happens when |s| = R.)

Fact: For any PGF, the radius of convergence exists.

It is always ≥ 1: every PGF converges for at least 𝑠 ∈ (−1, 1).

The radius of convergence could be anything from 𝑅 = 1 𝑡𝑜 𝑅 = ∞.


Note: This gives us the surprising result that the set of s for which the PGF G_X(s) converges is symmetric about 0: the PGF converges for all s ∈ (−R, R), and for no s < −R or s > R.

This is surprising because the PGF itself is not usually symmetric about 0: i.e. G_X(−s) ≠ G_X(s) in general.

Q24. (a) If P(s) is the probability generating function for X, find the generating function for (X − a)/b. (b) Let X be a random variable with generating function P(s); find the generating function of (i) X + 1, (ii) 2X.

ANSWER

(a) E(s^{(X−a)/b}) = s^{−a/b} · E(s^{X/b}) = s^{−a/b} · E[(s^{1/b})^X] = s^{−a/b} · P(s^{1/b})

(b) (i) E(s^{X+1}) = Σ_{x=0}^{∞} p_x s^{x+1} = s Σ_{x=0}^{∞} p_x s^x = s · P(s)

(ii) E(s^{2X}) = E[(s²)^X] = P(s²)

Q25. Calculate the probability generating function of a discrete uniform random variable with support {1/n, 2/n, …, n/n}:

P(X = k/n) = 1/n; k = 1, 2, …, n

ANSWER

Before anything, recall that for a geometric progression with common ratio r,

a + ar + ar² + ⋯ + ar^{n−1} = a(1 − r^n)/(1 − r) = first term × (1 − r^{number of terms})/(1 − r)

P_X(s) = E[s^X] = Σ_{k=1}^{n} s^{k/n} P(X = k/n) = (1/n) Σ_{k=1}^{n} s^{k/n}

= (1/n) Σ_{k=1}^{n} (s^{1/n})^k = (1/n)[(s^{1/n})¹ + (s^{1/n})² + ⋯ + (s^{1/n})^n]

= (1/n) · s^{1/n}[1 − (s^{1/n})^n]/(1 − s^{1/n}) = (1/n) · s^{1/n}(1 − s)/(1 − s^{1/n})

Q26. The probability generating function of the (fair die) uniform distribution is given as

G_X(s) = 0s⁰ + (1/6)s + (1/6)s² + (1/6)s³ + (1/6)s⁴ + (1/6)s⁵ + (1/6)s⁶

Derive the expectation and variance of the distribution.

ANSWER

G_X(s) = 0s⁰ + (1/6)s + (1/6)s² + (1/6)s³ + (1/6)s⁴ + (1/6)s⁵ + (1/6)s⁶

G′_X(s) = (1/6)(1 + 2s + 3s² + 4s³ + 5s⁴ + 6s⁵)

G″_X(s) = (1/6)(2 + 6s + 12s² + 20s³ + 30s⁴)

Accordingly:

G_X(1) = (1 + 1 + 1 + 1 + 1 + 1)/6 = 1

G′_X(1) = (1 + 2 + 3 + 4 + 5 + 6)/6 = 7/2

G″_X(1) = (2 + 6 + 12 + 20 + 30)/6 = 70/6 = 35/3

The variance is obtained as

Var(X) = G″_X(1) + G′_X(1) − (G′_X(1))²

= 35/3 + 7/2 − (7/2)² = 35/3 + 7/2 − 49/4

= (140 + 42 − 147)/12 = 35/12
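As a quick cross-check of this calculation, here is a minimal sympy sketch (the variable names are ours) that recovers the same mean and variance from the PGF derivatives:

```python
# A minimal sympy sketch verifying Q26: mean and variance from PGF derivatives.
import sympy as sp

s = sp.symbols('s')
G = sum(sp.Rational(1, 6) * s**k for k in range(1, 7))   # PGF of a fair six-sided die

mean = sp.diff(G, s).subs(s, 1)                          # G'(1) = E[X]
var = sp.diff(G, s, 2).subs(s, 1) + mean - mean**2       # G''(1) + G'(1) - G'(1)^2

print(mean, var)   # 7/2  35/12
```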

Q27. The two dice were identical, so that G_X(s) = G_Y(s) = G(s), and the square, G², gives the generating function for X + Y:

G_X(s) = G_Y(s) = G(s) = 0s⁰ + (1/6)s + (1/6)s² + (1/6)s³ + (1/6)s⁴ + (1/6)s⁵ + (1/6)s⁶

and

G_{X+Y}(s) = (G(s))² = (1/36)s² + (2/36)s³ + (3/36)s⁴ + (4/36)s⁵ + (5/36)s⁶ + (6/36)s⁷ + (5/36)s⁸ + (4/36)s⁹ + (3/36)s¹⁰ + (2/36)s¹¹ + (1/36)s¹²

Obtain the expectation and variance of the distribution.

ANSWER

G′_{X+Y}(s) = (1/36)(2s + 6s² + 12s³ + 20s⁴ + 30s⁵ + 42s⁶ + 40s⁷ + 36s⁸ + 30s⁹ + 22s¹⁰ + 12s¹¹)

G″_{X+Y}(s) = (1/36)(2 + 12s + 36s² + 80s³ + 150s⁴ + 252s⁵ + 280s⁶ + 288s⁷ + 270s⁸ + 220s⁹ + 132s¹⁰)

Accordingly:

G_{X+Y}(1) = (1 + 2 + 3 + 4 + 5 + 6 + 5 + 4 + 3 + 2 + 1)/36 = 36/36 = 1

G′_{X+Y}(1) = (2 + 6 + 12 + 20 + 30 + 42 + 40 + 36 + 30 + 22 + 12)/36 = 252/36 = 7

G″_{X+Y}(1) = (2 + 12 + 36 + 80 + 150 + 252 + 280 + 288 + 270 + 220 + 132)/36 = 1722/36 = 287/6

The variance is obtained as

Var(X + Y) = G″_{X+Y}(1) + G′_{X+Y}(1) − (G′_{X+Y}(1))²

= 287/6 + 7 − 49 = (287 + 42 − 294)/6 = 35/6
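Because multiplying PGFs corresponds to convolving probability vectors, the distribution of X + Y can also be checked numerically; a small sketch (assuming numpy is available):

```python
# Convolving the coefficient vector of G(s) with itself gives the PGF coefficients of X + Y.
import numpy as np

die = np.array([0.0] + [1/6] * 6)      # P(X = 0), P(X = 1), ..., P(X = 6)
two_dice = np.convolve(die, die)       # coefficients of G(s)^2

for total, prob in enumerate(two_dice):
    if prob > 0:
        print(total, round(prob * 36))  # totals 2..12 with weights 1,2,3,4,5,6,5,4,3,2,1
```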

Q28. Let X be a discrete random variable with probability generating function (PGF)

G_X(s) = (s/5)(2 + 3s²). Find the distribution of X.

ANSWER

G_X(s) = (2/5)s + (3/5)s³: G_X(0) = P(X = 0) = 0

G′_X(s) = 2/5 + (9/5)s²: G′_X(0) = P(X = 1) = 2/5

G″_X(s) = (18/5)s: (1/2!)G″_X(0) = P(X = 2) = 0

G‴_X(s) = 18/5: (1/3!)G‴_X(0) = P(X = 3) = (1/6)(18/5) = 3/5

G^{(r)}_X(s) = 0 for all r ≥ 4: (1/r!)G^{(r)}_X(0) = P(X = r) = 0 for all r ≥ 4

Thus,

X = 1 with probability 2/5, and X = 3 with probability 3/5.

Q29. Let X ~ Bern(p). Find the probability generating function (PGF) of X and hence find the mean and variance of X.

ANSWER

p(X = x) = p^x (1 − p)^{1−x}; x = 0, 1

G_X(s) = Σ_{x=0}^{1} s^x p(X = x) = Σ_{x=0}^{1} s^x p^x (1 − p)^{1−x}

= Σ_{x=0}^{1} (sp)^x (1 − p)^{1−x} = Σ_{x=0}^{1} (sp)^x q^{1−x}

= (sp)⁰q^{1−0} + (sp)¹q^{1−1} = q + sp

To obtain the mean and variance

By taking first and second derivatives of 𝐺𝑥 (𝑠)

𝐺𝑋 (𝑆) = 𝑆𝑝 + 𝑞

G′_x(s) = d/ds (sp + q) = p

G″_x(s) = d²/ds² [G_x(s)] = d/ds [G′_x(s)] = d/ds (p) = 0

To obtain the mean and variance

𝐺𝑋 (𝑆 = 1) = 1. 𝑝 + 𝑞 = 𝑝 + 𝑞 = 1

𝜇1′ = 𝐺𝑥′ (𝑆 = 1) = 𝑝

𝜇1′ = 𝑝 = 𝑚𝑒𝑎𝑛

𝜇2′ = 𝐺𝑋′′ (𝑆 = 1) = 0

𝜇2 = 𝑣𝑎𝑟 (𝑥 ) = 𝐺𝑋′′ (𝑠 = 1) + 𝐺𝑥′ (𝑠 = 1) − [𝐺𝑥′ (𝑠 = 1)]2 = 0 + 𝑝 − (𝑝)2

= 𝑝 − 𝑝2 = 𝑝(1 − 𝑝)

= 𝑝𝑞
Q30. Derive the probability generating function of a random variable having the Binomial distribution. Hence, or otherwise, obtain the mean and variance of X.

ANSWER

If X ~ Bin(n, p) with p.m.f.

P(x) = C(n, x) p^x (1 − p)^{n−x}; x = 0, 1, 2, …, n

We define the probability generating function as

G_x(s) = Σ_{x=0}^{n} s^x P(x) = Σ_{x=0}^{n} s^x C(n, x) p^x (1 − p)^{n−x}

= Σ_{x=0}^{n} C(n, x)(sp)^x (1 − p)^{n−x}    (∗)

Recall the binomial expansion

(a + b)^n = a^n + C(n, 1)a^{n−1}b + C(n, 2)a^{n−2}b² + ⋯ + b^n = Σ_x C(n, x) a^x b^{n−x}    (∗∗)

Comparing eqn (∗) and (∗∗), we deduce

G_x(s) = (sp + (1 − p))^n = (sp + q)^n

To obtain the mean and variance

By taking first and second derivatives of 𝐺𝑥 (𝑠)

G′_x(s) = d/ds ((sp + q)^n) = np(sp + q)^{n−1}

G″_x(s) = d²/ds² [G_x(s)] = d/ds [G′_x(s)] = d/ds [np(sp + q)^{n−1}] = n(n − 1)p²(sp + q)^{n−2}

To obtain the mean and variance

𝜇1′ = 𝐺𝑥′ (𝑠 = 1) = 𝑛𝑝(𝑝 + 𝑞 )𝑛−1 𝑤ℎ𝑒𝑟𝑒 𝑝 + 𝑞 = 1


𝜇1′ = 𝑛𝑝(1) = 𝑛𝑝 = 𝑚𝑒𝑎𝑛

𝜇2′ = 𝐺𝑋′′ (𝑠 = 1) = 𝑛 (𝑛 − 1)𝑝2 [(𝑝 + 𝑞 )𝑛−2 ]

= 𝑛 (𝑛 − 1)𝑝2 (1) = 𝑛 2 𝑝2 − 𝑛𝑝2

𝜇2 = 𝑣𝑎𝑟 (𝑥 ) = 𝐺𝑋′′ (𝑠 = 1) + 𝐺𝑥′ (𝑠 = 1) − [𝐺𝑥′ (𝑠 = 1)]2

= 𝑛 2 𝑝2 − 𝑛𝑝2 + 𝑛𝑝 − (𝑛𝑝) 2 = 𝑛 2 𝑝2 − 𝑛𝑝2 + 𝑛𝑝 − 𝑛 2 𝑝2

= 𝑛𝑝 − 𝑛𝑝2 = 𝑛𝑝(1 − 𝑝)

= 𝑛𝑝𝑞
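A symbolic sketch of the same derivation (sympy, with n and p kept symbolic; the variable names are ours):

```python
# Verifying the binomial PGF results symbolically.
import sympy as sp

s, p, n = sp.symbols('s p n', positive=True)
q = 1 - p
G = (q + p*s)**n                                   # binomial PGF

mean = sp.simplify(sp.diff(G, s).subs(s, 1))       # n*p
var = sp.simplify(sp.diff(G, s, 2).subs(s, 1) + mean - mean**2)
print(mean, sp.factor(var))                        # n*p and n*p*(1 - p), up to sign arrangement
```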

Q31. We have a pile of identical biased coins, each of which has a probability 𝑝 of landing heads
when tossed. Suppose we take 3 of these coins for ourselves and hand 5 others to a friend. We then
toss coins and count the number of heads, clearly a number in the range 0 𝑡𝑜 3. Our friend
(independently) tosses the other coins and again counts the number of heads, a number in the range
0 𝑡𝑜 5. We then add our head count to our friend’s head count. Obtain the probability generating
function of two experiments. Hence, otherwise, find the expected value and variance of the
distribution.

ANSWER

Let 𝑋 be the random number whose value 𝑟 is the number of heads we obtain and 𝑌 be the
random number whose value 𝑡 is the number of heads our friend obtains. Clearly, 𝑋 is distributed
Binomial (3, 𝑝) and 𝑌 is distributed Binomial(5, 𝑝). It will be shown that 𝑋 + 𝑌 is distributed
Binomial (8, 𝑝). Given that 𝑋 𝑎𝑛𝑑 𝑌 are binomially distributed:

P(X = r) = C(3, r) p^r q^{3−r}; r = 0, 1, 2, 3 and P(Y = t) = C(5, t) p^t q^{5−t}; t = 0, 1, 2, 3, 4, 5

The generating functions associated with X and Y are:

G_X(s) = Σ_{r=0}^{3} s^r C(3, r) p^r q^{3−r} = Σ_{r=0}^{3} C(3, r)(sp)^r q^{3−r} = (q + sp)³

and

G_Y(s) = Σ_{t=0}^{5} s^t C(5, t) p^t q^{5−t} = Σ_{t=0}^{5} C(5, t)(sp)^t q^{5−t} = (q + sp)⁵

Then,

G_{X+Y}(s) = G_X(s) · G_Y(s) = (q + sp)³ · (q + sp)⁵ = (q + sp)⁸

To obtain the mean and variance, take the first and second derivatives of G_{X+Y}(s):

G′_{X+Y}(s) = d/ds ((sp + q)⁸) = 8p(sp + q)⁷

G″_{X+Y}(s) = d/ds [8p(sp + q)⁷] = 8(7)p²(sp + q)⁶ = 56p²(sp + q)⁶

Hence, where p + q = 1:

μ′₁ = G′_{X+Y}(1) = 8p(p + q)⁷ = 8p = mean

μ′₂ = G″_{X+Y}(1) = 56p²(p + q)⁶ = 56p²

μ₂ = Var(X + Y) = G″(1) + G′(1) − [G′(1)]²

= 56p² + 8p − (8p)² = 56p² + 8p − 64p²

= 8p − 8p² = 8p(1 − p) = 8pq

Q32. Let 𝑋~𝐵𝑖𝑛𝑜𝑚𝑖𝑎𝑙(𝑛, 𝑝). What is the radius of convergence of 𝐺𝑋 (𝑆)?


ANSWER

P(x) = C(n, x) p^x (1 − p)^{n−x}; x = 0, 1, 2, …, n

We define the probability generating function as

G_x(s) = Σ_{x=0}^{n} s^x P(x) = Σ_{x=0}^{n} C(n, x)(sp)^x (1 − p)^{n−x}    (∗)

Recall the binomial expansion

(a + b)^n = Σ_x C(n, x) a^x b^{n−x}    (∗∗)

Comparing eqn (∗) and (∗∗), we deduce

G_x(s) = (sp + (1 − p))^n = (sp + q)^n, by the Binomial theorem: true for all s.

This is a polynomial in s, so it is finite for all −∞ < s < ∞, and the radius of convergence is R = ∞.

Q33. Let𝑋~𝐺𝑒𝑜𝑚𝑒𝑡𝑟𝑖𝑐(𝑝), so that 𝑃 (𝑋 = 𝑥 ) = 𝑝(1 − 𝑝) 𝑥 = 𝑝𝑞 𝑥 𝑓𝑜𝑟 𝑥 = 0,1, …,

Obtain the probability generating function, 𝐺𝑋 (𝑆), hence otherwise, find the expected and
variance of 𝑋.

ANSWER

The set of probabilities for the Geometric distribution can be defined as:

𝑃(𝑋 = 𝑥 ) = 𝑝𝑞 𝑥 𝑤ℎ𝑒𝑟𝑒 𝑥 = 0,1,2, …,

Remember, this represents 𝑥 successive failures (each of probability 𝑞) before a single success
(probability 𝑝). Note that 𝑥 is unbounded; there can be an indefinite number of failures before
the first success.

The generating function is:

G_X(s) = Σ_{x=0}^{∞} s^x P(X = x) = Σ_{x=0}^{∞} s^x pq^x = p Σ_{x=0}^{∞} (qs)^x

= p[(qs)⁰ + (qs)¹ + (qs)² + (qs)³ + ⋯]

The item in brackets is easily recognized as an infinite geometric progression, the expansion of

1 + qs + q²s² + q³s³ + ⋯ = 1/(1 − qs)

So the generating function and its first two derivatives are:

G_X(s) = p · 1/(1 − qs) = p/(1 − qs) = p(1 − qs)^{−1} for |s| < 1/q

G′_X(s) = p d/ds [(1 − qs)^{−1}] = pq(1 − qs)^{−2}

G″_X(s) = d/ds [G′_X(s)] = pq d/ds [(1 − qs)^{−2}] = 2pq²(1 − qs)^{−3}

To obtain the mean and variance:

G_X(1) = p/(1 − q) = p/p = 1

μ′₁ = G′_X(1) = pq(1 − q)^{−2} = pq/p² = q/p = (1 − p)/p = mean

μ′₂ = G″_X(1) = 2pq²(1 − q)^{−3} = 2pq²/p³ = 2q²/p²

μ₂ = Var(X) = G″_X(1) + G′_X(1) − [G′_X(1)]²

= 2q²/p² + q/p − q²/p² = q²/p² + q/p = (q² + pq)/p² = q(q + p)/p², where p + q = 1

= q/p² = (1 − p)/p²

Note: both the mean and variance of the Geometric distribution are difficult to derive without using
the generating function.

Q34. Let 𝑋~𝐺𝑒𝑜𝑚𝑒𝑡𝑟𝑖𝑐 (𝑝 = 0.8). What is the radius of convergence of 𝐺𝑋 (𝑆)?

ANSWER
G_X(s) = Σ_{x=0}^{∞} s^x p(1 − p)^x = Σ_{x=0}^{∞} s^x (0.8)(0.2)^x

= 0.8 Σ_{x=0}^{∞} (0.2s)^x = 0.8[(0.2s)⁰ + (0.2s)¹ + (0.2s)² + (0.2s)³ + ⋯]

= 0.8[1 + 0.2s + 0.04s² + 0.008s³ + ⋯]

= 0.8/(1 − 0.2s) for all s such that |0.2s| < 1.

This is valid for all s with |0.2s| < 1, i.e. for all s with |s| < 1/0.2 = 5 (i.e. −5 < s < 5).

The radius of convergence is R = 5.

Q35. My aunt was asked by her neighbours to feed the prize goldfish in their garden pond while
they were on holiday. Although my aunt dutifully went and fed them every day, she never saw a
single fish for the whole three weeks. It turned out that all the fish had been eaten by a heron when
she wasn’t looking!

Let N be the number of times the heron visits the pond during the neighbours' absence. Suppose that N ~ Geometric(1 − λ), so P(N = n) = (1 − λ)λ^n, for n = 0, 1, 2, …. When the heron visits the pond it has probability p of catching a prize goldfish, independently of what happens on any other visit. (This assumes that there are infinitely many goldfish to be caught!) Find the distribution of T = total number of goldfish caught.

ANSWER

Let X_i = 1 if the heron catches a fish on visit i, and X_i = 0 otherwise.

Then 𝑇 = 𝑋1 + 𝑋2 + . . . + 𝑋𝑁 (randomly stopped sum),

so 𝐺𝑇 (𝑆) = 𝐺𝑁 (𝐺𝑋 (𝑆)).

Now

G_X(s) = E(s^X) = Σ_x s^x P(X = x) = s⁰ × P(X = 0) + s¹ × P(X = 1) = q + ps = 1 − p + ps.

Also,

G_N(r) = Σ_{n=0}^{∞} r^n ℙ(N = n) = Σ_{n=0}^{∞} r^n (1 − λ)λ^n

= (1 − λ) Σ_{n=0}^{∞} (λr)^n = (1 − λ)[1 + λr + λ²r² + λ³r³ + ⋯]

= (1 − λ)/(1 − λr)    (for r < 1/λ)

So

G_T(s) = (1 − λ)/(1 − λG_X(s))    (putting r = G_X(s))

Giving:

G_T(s) = (1 − λ)/(1 − λ(1 − p + ps)), where G_X(s) = 1 − p + ps

= (1 − λ)/(1 − λ + λp − λps)

[Could this be geometric? i.e. G_T(s) = (1 − π)/(1 − πs) for some π?]

G_T(s) = (1 − λ)/((1 − λ + λp) − λps). Dividing both numerator and denominator by 1 − λ + λp:

G_T(s) = [(1 − λ)/(1 − λ + λp)] / [1 − (λp/(1 − λ + λp))s]

Noting that (1 − λ)/(1 − λ + λp) = (1 − λ + λp − λp)/(1 − λ + λp) = 1 − λp/(1 − λ + λp),

G_T(s) = [1 − λp/(1 − λ + λp)] / [1 − (λp/(1 − λ + λp))s]

This is the PGF of the Geometric(1 − λp/(1 − λ + λp)) distribution, so by uniqueness of PGFs, we have:

T ~ Geometric((1 − λ)/(1 − λ + λp))
Note: we could have solved this problem without using the PGF, but it is much more difficult. PGFs are very useful for dealing with sums of random variables, which are difficult to tackle using the standard probability function.
ALTERNATIVE METHOD
Here are the first few steps of solving the heron problem without the PGF. Recall the problem:
 Let 𝑁~𝐺𝑒𝑜𝑚𝑒𝑡𝑟𝑖𝑐 (1 − 𝜆), so ℙ(𝑁 = 𝑛 ) = (1 − 𝜆)𝜆𝑛 ;
 Let 𝑋1 , 𝑋2 , …. Be independent of each other and of 𝑁, with 𝑋𝑖 ~𝑏𝑖𝑛(1, 𝑝)
(remember 𝑋𝑖 = 1 with probability 𝑝, and 0 otherwise);
 Let 𝑇 = 𝑋1 + 𝑋2 + … + 𝑋𝑁 be the randomly stopped sum
 Find the distribution of 𝑇
Without using the PGF, we would tackle this by looking for an expression for 𝑃(𝑇 = 𝑡) for any
𝑡. Once we have obtained that expression, we might be able to see that T has a distribution we
recognize (e.g. Geometric), or otherwise we would just state that 𝑇 is defined by the probability
function we have obtained.
To find P(T = t), we have to partition over different values of N:

P(T = t) = Σ_{n=0}^{∞} P(T = t | N = n) ℙ(N = n)    (∗)

Here, we are lucky that we can write down the distribution of 𝑇 | 𝑁 = 𝑛:


 if 𝑁 = 𝑛 is fixed, then 𝑇 = 𝑋1 + . . . + 𝑋𝑛 is a sum of 𝑛 independent 𝐵𝑖𝑛𝑜𝑚𝑖𝑎𝑙(1, 𝑝)
random variables, so (𝑻 | 𝑵 = 𝒏) ∼ 𝑩𝒊𝒏𝒐𝒎𝒊𝒂𝒍(𝒏, 𝒑).
For most distributions of 𝑋, it would be difficult or impossible to write down the distribution of
𝑿𝟏 + . . . + 𝑿𝒏 :
we would have to use an expression like

P(X₁ + … + X_N = t | N = n) = Σ_{x₁=0}^{t} Σ_{x₂=0}^{t−x₁} ⋯ Σ_{x_{n−1}=0}^{t−(x₁+⋯+x_{n−2})} {ℙ(X₁ = x₁) × ℙ(X₂ = x₂) × ⋯ × ℙ(X_{n−1} = x_{n−1}) × ℙ[X_n = t − (x₁ + ⋯ + x_{n−1})]}
Back to the heron problem: we are lucky in this case that we know the distribution of (𝑇 | 𝑁 =
𝑛) 𝑖𝑠 𝐵𝑖𝑛𝑜𝑚𝑖𝑎𝑙(𝑁 = 𝑛, 𝑝), so
P(T = t | N = n) = C(n, t) p^t (1 − p)^{n−t} for t = 0, 1, …, n.
Continuing from (∗):

P(T = t) = Σ_{n=0}^{∞} P(T = t | N = n) P(N = n)

= Σ_{n=0}^{∞} C(n, t) p^t (1 − p)^{n−t} (1 − λ)λ^n

= (1 − λ) Σ_{n=0}^{∞} C(n, t) p^t (1 − p)^n × (1/(1 − p))^t λ^n

= (1 − λ)(p/(1 − p))^t Σ_{n=0}^{∞} C(n, t)[λ(1 − p)]^n    (∗∗)

We can evaluate the sum in (⋆⋆) using the fact that Negative Binomial probabilities sum to 1.
We can try this if we like, but it is quite tricky.
[Hint: use the Negative Binomial (𝑡 + 1, 1 − 𝜆(1 − 𝑝)) distribution.].
Overall, we obtain the same answer, T ~ Geometric((1 − λ)/(1 − λ + λp)), but hopefully we can see why the PGF is so useful.
Without the PGF, we have two major difficulties:
1. Writing down 𝑃(𝑇 = 𝑡| 𝑁 = 𝑛);
2. Evaluating the sum over 𝑛 𝑖𝑛 (⋆⋆).
For a general problem, both of these steps might be too difficult to do without a computer. The
PGF has none of these difficulties, and even if 𝐺𝑇 (𝑆) does not simplify readily, it still tells us
everything there is to know about the distribution of 𝑇.
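As a quick numerical cross-check of the heron result, here is a Monte Carlo sketch (λ = 0.7 and p = 0.4 are our own illustrative choices):

```python
# Simulating the randomly stopped sum T and comparing with Geometric((1-lam)/(1-lam+lam*p)).
import numpy as np

rng = np.random.default_rng(0)
lam, p, reps = 0.7, 0.4, 200_000

N = rng.geometric(1 - lam, reps) - 1       # shift: numpy's geometric starts at 1, ours at 0
T = rng.binomial(N, p)                     # given N visits, total catches are Binomial(N, p)

pi = (1 - lam) / (1 - lam + lam * p)       # claimed success parameter of T
k = np.arange(6)
print(np.round([np.mean(T == x) for x in k], 4))   # empirical pmf of T on 0..5
print(np.round(pi * (1 - pi)**k, 4))               # Geometric(pi) pmf on 0, 1, 2, ...
```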
Q36. The set of probabilities for the Poisson distribution (i.e. X ~ Po(λ)) can be defined as:

P(X = x) = (λ^x/x!)e^{−λ} where x = 0, 1, …

Obtain the probability generating function. Hence, or otherwise, find E(X) and Var(X).
ANSWER
G_X(s) = E(s^X) = Σ_{x=0}^{∞} s^x (λ^x/x!)e^{−λ} = e^{−λ} Σ_{x=0}^{∞} s^x λ^x/x!

= e^{−λ} Σ_{x=0}^{∞} (λs)^x/x! = e^{−λ}[(λs)⁰/0! + (λs)¹/1! + (λs)²/2! + (λs)³/3! + ⋯]

The item in brackets is easily recognized as an exponential series, the expansion of e^{λs}:

1 + λs/1! + λ²s²/2! + λ³s³/3! + ⋯ = e^{λs}

So the generating function and its first two derivatives are:

𝐺𝑋 (𝑆) = 𝑒 −𝜆 . 𝑒 𝜆𝑆 = 𝑒 −𝜆(1−𝑆)

G′_X(s) = d/ds [e^{−λ(1−s)}] = λe^{−λ(1−s)}

G″_X(s) = d/ds [G′_X(s)] = λ d/ds [e^{−λ(1−s)}] = λ²e^{−λ(1−s)}

To obtain the mean and variance

𝐺𝑋 (𝑆 = 1) = 𝑒 −𝜆(1−1) = 𝑒 0 = 1
𝜇1′ = 𝐺𝑋′ (𝑆 = 1) = 𝜆(𝑒 −𝜆(1−𝑆) ) = 𝜆𝑒 0 = 𝜆. 1 = 𝜆

𝜇1′ = 𝜆 = 𝑚𝑒𝑎𝑛

𝜇2′ = 𝐺𝑋′′ (𝑆 = 1) = 𝜆2 (𝑒 −𝜆(1−1) )

= 𝜆2 𝑒 0 = 𝜆2 . 1

= 𝜆2

𝜇2 = 𝑣𝑎𝑟 (𝑥 ) = 𝐺𝑋′′ (𝑠 = 1) + 𝐺𝑥′ (𝑠 = 1) − [𝐺𝑥′ (𝑠 = 1)]2

= 𝜆2 + 𝜆 − (𝜆)2 = 𝜆2 + 𝜆 − 𝜆2

=𝜆

Q37. Suppose that X₁, …, X_n are independent random variables and let Y = X₁ + ⋯ + X_n. Show that

G_Y(s) = Π_{i=1}^{n} G_{X_i}(s).

ANSWER

G_Y(s) = E(s^Y) = E(s^{X₁+⋯+X_n})

= E(s^{X₁} s^{X₂} s^{X₃} ⋯ s^{X_n})

= E(s^{X₁})E(s^{X₂}) ⋯ E(s^{X_n})    (because X₁, …, X_n are independent)

= Π_{i=1}^{n} G_{X_i}(s), as required.

Q38. Suppose that X and Y are independent with X ~ Poisson(λ) and Y ~ Poisson(θ). Find the distribution of X + Y, using the probability generating function.
ANSWER
If 𝑋 ∼ 𝑃𝑜𝑖𝑠𝑠𝑜𝑛(𝜆), then the probability generating function is obtained as

𝐺𝑋 (𝑆) = 𝑒 𝜆(𝑆−1)
Similarly, if 𝑌 ∼ 𝑃𝑜𝑖𝑠𝑠𝑜𝑛(𝜃), then the probability generating function is obtained as

𝐺𝑌 (𝑆) = 𝑒 𝜃(𝑆−1)
Therefore, the distribution of X + Y is obtained as follows:
𝐺𝑋+𝑌 (𝑆) = 𝐸 (𝑆 𝑋+𝑌 ) = 𝐸 (𝑆 𝑋 . 𝑆 𝑌 ) = 𝐸 (𝑆 𝑋 )𝐸(𝑆 𝑌 )
= 𝐺𝑋 (𝑆). 𝐺𝑌 (𝑆)

= 𝑒 𝜆(𝑆−1) . 𝑒 𝜃(𝑆−1) = 𝑒 𝜆(𝑆−1)+𝜃(𝑆−1)

= 𝑒 (𝜆+𝜃)(𝑆−1)
This is the probability generating function (PGF) of the Poisson(𝜆 + 𝜃) distribution. So, by the
uniqueness of PGFs, 𝑋 + 𝑌~𝑃𝑜𝑖𝑠𝑠𝑜𝑛(𝜆 + 𝜃).
Q39. Let 𝑋1 , 𝑋2 , … be a sequence of independent and identically distributed random variables with
common PGF 𝐺𝑋 . Let 𝑁 be a random variable, independent of the 𝑋𝑖 ’𝑠 , with PGF 𝐺𝑁 , and let

T_N = X₁ + … + X_N = Σ_{i=1}^{N} X_i.

Show that the PGF of T_N is: G_{T_N}(s) = G_N(G_X(s)). Find the mean of T_N.

ANSWER
G_{T_N}(s) = E(s^{T_N}) = E(s^{X₁+⋯+X_N})

= E_N{E(s^{X₁+⋯+X_N} | N)}    (conditional expectation)

= E_N{E(s^{X₁} s^{X₂} ⋯ s^{X_N} | N)}

= E_N{E(s^{X₁} s^{X₂} ⋯ s^{X_N})}    (the X_i's are independent of N)

= E_N{E(s^{X₁})E(s^{X₂}) ⋯ E(s^{X_N})}    (the X_i's are independent of each other)

= E_N{(G_X(s))^N}

= G_N(G_X(s))    (by definition of G_N)

To find the mean:

E(T_N) = G′_{T_N}(1) = d/ds G_N(G_X(s))|_{s=1}

= G′_N(G_X(s)) · G′_X(s)|_{s=1}

= G′_N(1) · G′_X(1)    (note: G_X(1) = 1 for any r.v. X)

= E(N) · E(X₁)

Q40. Cash machine model. N customers arrive during the day. Customer i withdraws amount X_i. The total amount withdrawn during the day is T_N = X₁ + … + X_N. Use the laws of total expectation and variance to show that E(T_N) = μE(N) and Var(T_N) = σ²E(N) + μ²Var(N), where μ = E(X_i) and σ² = Var(X_i).
ANSWER
E(T_N) = G′_{T_N}(1) = d/ds G_N(G_{X_i}(s))|_{s=1} = G′_N(G_{X_i}(s)) · G′_{X_i}(s)|_{s=1}

= G′_N(1) · G′_{X_i}(1)    (note: G_{X_i}(1) = 1 for any r.v. X_i)

= E(N) · E(X_i) = μE(N), where μ = E(X_i)

Similarly, by the law of total variance:

Var(T_N) = E_N{Var(X₁ + … + X_N | N)} + Var_N{E(X₁ + … + X_N | N)}

Given N, the conditional sums behave as sums of a fixed number of terms (because all X_i's are independent of N), so

Var(T_N) = E_N{Var(X₁) + ⋯ + Var(X_N)} + Var_N{E(X₁) + … + E(X_N)}

= E_N{Nσ²} + Var_N{Nμ}

= σ²E(N) + μ²Var(N)

Note: general results for randomly stopped sums. Suppose X₁, X₂, … each have the same mean μ and variance σ², and X₁, X₂, …, and N are mutually independent. Let T_N = X₁ + ⋯ + X_N be the randomly stopped sum. Then

E(T_N) = μE(N) and Var(T_N) = σ²E(N) + μ²Var(N)
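A simulation sketch of the cash machine model (the Poisson arrival rate and the withdrawal distribution are our own illustrative choices):

```python
# Checking E(T_N) = mu*E(N) and Var(T_N) = sigma^2*E(N) + mu^2*Var(N) by simulation.
import numpy as np

rng = np.random.default_rng(1)
reps = 200_000
N = rng.poisson(10, reps)              # customers per day (illustrative)
mu, sigma = 50.0, 20.0                 # withdrawal mean and s.d. (illustrative)

# Given N = n, a sum of n iid Normal(mu, sigma^2) withdrawals is Normal(n*mu, n*sigma^2).
T = rng.normal(N * mu, sigma * np.sqrt(N))

print(T.mean(), mu * N.mean())                            # both approx. 500
print(T.var(), sigma**2 * N.mean() + mu**2 * N.var())     # both approx. 29000
```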

Q41. Let X ~ NB(r, p). Obtain the probability generating function. Hence, or otherwise, find the expectation and variance of the distribution of X.
ANSWER
If X has the negative binomial(r, p) distribution, the probability mass function is

f(X = x) = C(x − 1, r − 1) p^r (1 − p)^{x−r}; x = r, r + 1, r + 2, …
To obtain the probability generating function:

G_X(s) = E(s^X) = Σ_{x=r}^{∞} s^x f(X = x) = Σ_{x=r}^{∞} s^x C(x − 1, r − 1) p^r (1 − p)^{x−r}

= Σ_{x=r}^{∞} s^{x−r} s^r C(x − 1, r − 1) p^r q^{x−r}, where q = 1 − p

= Σ_{x=r}^{∞} C(x − 1, r − 1)(ps)^r (qs)^{x−r}

= (ps)^r Σ_{x=r}^{∞} C(x − 1, r − 1)(qs)^{x−r} = (ps)^r (1 − qs)^{−r},

where Σ_{x=r}^{∞} C(x − 1, r − 1)(qs)^{x−r} = (1 − qs)^{−r} is the sum of the negative binomial (infinite) series.

G_X(s) = (ps/(1 − qs))^r
So the generating function and its first two derivatives are:

G_X(s) = (ps/(1 − qs))^r

G′_X(s) = r(ps/(1 − qs))^{r−1} × [(1 − qs)p − ps(−q)]/(1 − qs)²

= r(ps/(1 − qs))^{r−1} × p/(1 − qs)² = rp^r s^{r−1}/(1 − qs)^{r+1}

G″_X(s) = rp^r d/ds [s^{r−1}/(1 − qs)^{r+1}]

= rp^r [(r − 1)s^{r−2}(1 − qs)^{r+1} + q(r + 1)s^{r−1}(1 − qs)^r]/(1 − qs)^{2(r+1)}

= rp^r [(r − 1)(1 − qs)s^{r−2} + q(r + 1)s^{r−1}]/(1 − qs)^{r+2}

To obtain the mean and variance:

G_X(1) = (p/(1 − q))^r = (p/p)^r = 1

μ′₁ = G′_X(1) = rp^r/(1 − q)^{r+1} = rp^r/p^{r+1} = r/p = mean

μ′₂ = G″_X(1) = rp^r [(r − 1)(1 − q) + q(r + 1)]/(1 − q)^{r+2}

= rp^r [(r − 1)p + q(r + 1)]/p^{r+2} = r[rp − p + rq + q]/p²

= (r/p²)[r(p + q) + (q − p)] = r(r + q − p)/p², where p + q = 1

μ₂ = Var(X) = G″_X(1) + G′_X(1) − [G′_X(1)]²

= r(r + q − p)/p² + r/p − (r/p)²

= (r² + rq − rp + rp − r²)/p²

= rq/p² = r(1 − p)/p²

Q42. Calculate the probability generating function (PGF) of the Uniform(𝑎, 𝑏).
ANSWER
If X has the uniform(a, b) distribution, the probability density function is

f(X = x) = 1/(b − a); a < x < b

To obtain the probability generating function:

G_X(s) = E(s^X) = ∫_a^b s^x f(X = x) dx = ∫_a^b s^x (1/(b − a)) dx

= (1/(b − a)) ∫_a^b s^x dx = (1/(b − a)) [s^x/ln s]_{x=a}^{x=b}

= (1/(b − a)) (s^b − s^a)/ln s

Hence, the probability generating function G_X(s) is obtained as

G_X(s) = (s^b − s^a)/((b − a) ln s)
2.3 MOMENT GENERATING FUNCTION (M.G.F) FOR DISCRETE AND CONTINUOUS RANDOM VARIABLES
Let u(x) = e^{tx} and suppose E(e^{tX}) exists. The moment generating function m(t) or M_x(t) for a random variable is defined to be E(e^{tX}). We say that the moment generating function for x exists if there is a positive constant h such that M_x(t) is finite for |t| ≤ h.

e^{tx} = 1 + xt/1! + (xt)²/2! + ⋯ + (xt)^r/r! + ⋯

For a discrete random variable:

M_x(t) = E(e^{tX}) = Σ_x [1 + xt/1! + (xt)²/2! + ⋯ + (xt)^r/r! + ⋯] p(x)

= Σ_x p(x) + t Σ_x xp(x) + (t²/2!) Σ_x x²p(x) + ⋯ + (t^r/r!) Σ_x x^r p(x) + ⋯

= 1 + tμ + (t²/2!)μ′₂ + ⋯ + (t^r/r!)μ′_r + ⋯

M_x(t) = 1 + Σ_{r=1}^{∞} (t^r/r!)μ′_r

For a continuous random variable:

M_x(t) = E(e^{tX}) = ∫ [1 + xt/1! + (xt)²/2! + ⋯ + (xt)^r/r! + ⋯] p(x) dx

= ∫ p(x)dx + t ∫ xp(x)dx + (t²/2!) ∫ x²p(x)dx + ⋯ + (t^r/r!) ∫ x^r p(x)dx + ⋯

= 1 + tμ + (t²/2!)μ′₂ + ⋯ + (t^r/r!)μ′_r + ⋯

so again M_x(t) = 1 + Σ_{r=1}^{∞} (t^r/r!)μ′_r.

Uses of moment generating functions

i. Moment generating functions are used to generate moments.
ii. Moment generating functions are used to determine the probability distribution of many random variables.

Theorem 2.1

If M_x(t) exists, then for any positive integer r,

d^r M_x(t)/dt^r |_{t=0} = E(x^r) = μ′_r

2.3.1 MOMENT GENERATING FUNCTIONS FOR DISCRETE RANDOM VARIABLES

d^r M_x(t)/dt^r |_{t=0} = m^{(r)}(0) = E(x^r) = μ′_r for all r ≥ 0
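A short sympy sketch of Theorem 2.1, using the Bernoulli MGF derived in Q48 below as the test case:

```python
# Theorem 2.1 in practice: raw moments are derivatives of M_X(t) at t = 0.
import sympy as sp

t, p = sp.symbols('t p')
M = 1 - p + p * sp.exp(t)       # Bernoulli(p) MGF (see Q48)

for r in (1, 2, 3):
    print(r, sp.diff(M, t, r).subs(t, 0))   # every raw moment of a Bernoulli equals p
```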
Q43. Suppose X is a discrete random variable and has the MGF

M_X(t) = (1/7)e^{2t} + (3/7)e^{3t} + (2/7)e^{5t} + (1/7)e^{8t}

What is the PMF of X? Find E(x) and Var(x).
ANSWER
This does not match any of the known MGFs directly. Reading off from the MGF we write

(1/7)e^{2t} + (3/7)e^{3t} + (2/7)e^{5t} + (1/7)e^{8t} = Σ_{i=1}^{4} e^{tx_i} p(x_i)

Then, we can take

p(2) = 1/7, p(3) = 3/7, p(5) = 2/7 and p(8) = 1/7

Note that these add up to 1, so this is indeed a PMF.
To find the first and second moments:

M′_X(t) = (2/7)e^{2t} + (9/7)e^{3t} + (10/7)e^{5t} + (8/7)e^{8t}

M″_X(t) = (4/7)e^{2t} + (27/7)e^{3t} + (50/7)e^{5t} + (64/7)e^{8t}

To find the mean and variance:

E(x) = M′_X(0) = (2 + 9 + 10 + 8)/7 = 29/7

E(x²) = M″_X(0) = (4 + 27 + 50 + 64)/7 = 145/7

Var(x) = M″_X(0) − (M′_X(0))² = 145/7 − (29/7)² = (1015 − 841)/49 = 174/49
Q44. Consider a random variable X with three states 0, 1, 2 with probability masses 2/6, 3/6, 1/6 respectively. Find the MGF. Hence, obtain the mean and variance of X.
ANSWER
The moment generating function is obtained as follows:

M_X(t) = E(e^{tX}) = Σ_{x=0}^{2} e^{tx} p(x) = e^{t(0)}p(x = 0) + e^{t(1)}p(x = 1) + e^{t(2)}p(x = 2)

= 1/3 + e^t/2 + e^{2t}/6

To find the first and second moments:

M′_X(t) = e^t/2 + e^{2t}/3

M″_X(t) = e^t/2 + 2e^{2t}/3

To find the mean and variance:

E(x) = M′_X(0) = 1/2 + 1/3 = (3 + 2)/6 = 5/6

E(x²) = M″_X(0) = 1/2 + 2/3 = (3 + 4)/6 = 7/6

Var(x) = M″_X(0) − (M′_X(0))² = 7/6 − 25/36 = (42 − 25)/36 = 17/36
Q45. Suppose X has the moment generating function

M_x(t) = (1 − 2t)^{−1/2}

Find the first and second moments of X. Then obtain the variance of the distribution of X.
ANSWER
To find the first and second moments:

M′_X(t) = d/dt ((1 − 2t)^{−1/2}) = −(1/2)(1 − 2t)^{−3/2}(−2) = (1 − 2t)^{−3/2}

M″_X(t) = d/dt ((1 − 2t)^{−3/2}) = −(3/2)(1 − 2t)^{−5/2}(−2) = 3(1 − 2t)^{−5/2}

To find the mean and variance:

E(x) = M′_X(0) = (1 − 0)^{−3/2} = 1

E(x²) = M″_X(0) = 3(1 − 0)^{−5/2} = 3

Var(x) = M″_X(0) − (M′_X(0))² = 3 − 1 = 2

Q46. Suppose that we have a fair 5-sided die and let 𝑋 be the random variable representing the
value of the number rolled.
(a) Write down the moment generating function for 𝑋
(b) Use this moment generating function to compute first, second moments and variance of 𝑋
ANSWER
(a) The moment generating function is obtained as follows:

M_X(t) = E(e^{tX}) = Σ_{x=1}^{5} e^{tx} p(x) = (1/5)(e^t + e^{2t} + e^{3t} + e^{4t} + e^{5t})

(b) To find the first and second moments:

M′_X(t) = (1/5)(e^t + 2e^{2t} + 3e^{3t} + 4e^{4t} + 5e^{5t})

M″_X(t) = (1/5)(e^t + 4e^{2t} + 9e^{3t} + 16e^{4t} + 25e^{5t})

To find the mean and variance:

E(x) = M′_X(0) = (1 + 2 + 3 + 4 + 5)/5 = 15/5 = 3

E(x²) = M″_X(0) = (1 + 4 + 9 + 16 + 25)/5 = 55/5 = 11

Var(x) = M″_X(0) − (M′_X(0))² = 11 − 9 = 2

Q47. Suppose that an economist determines that the daily income of the UConn Dairy Bar is a random variable X with moment generating function

M_X(t) = 1/(1 − 2000t)⁵

Find the standard deviation of the daily revenue of the UConn Dairy Bar.
ANSWER
To find the first and second moments:

M′_X(t) = d/dt ((1 − 2000t)^{−5}) = 5(2000)(1 − 2000t)^{−6}

M″_X(t) = d/dt (5(2000)(1 − 2000t)^{−6}) = 30(2000)²(1 − 2000t)^{−7}

To find the mean and variance:

E(x) = M′_X(0) = 5(2000) = 10,000

E(x²) = M″_X(0) = 30(2000)² = 30 × 4,000,000 = 120,000,000

Var(x) = M″_X(0) − (M′_X(0))² = 120,000,000 − (10,000)² = 20,000,000

To estimate the standard deviation:

STD(X) = √Var(x) = √20,000,000 ≈ 4,472.14

2.3.1.1 MOMENT GENERATING FUNCTION OF BERNOULLI DISTRIBUTION


Q48. Let 𝑋~𝐵𝑒𝑟𝑛 (𝑝). Obtain the moment generating function and mean and variance.
ANSWER
The probability mass function is defined as
𝑓(𝑋 = 𝑥 ) = 𝑝𝑥 (1 − 𝑝)1−𝑥 ; 𝑥 = 0,1
The moment generating function
M_X(t) = E(e^{tx}) = Σ_{x=0}^{1} e^{tx} p^x (1 − p)^{1−x} = Σ_{x=0}^{1} (pe^t)^x (1 − p)^{1−x}

= (pe^t)⁰(1 − p)^{1−0} + (pe^t)¹(1 − p)^{1−1}

= 1 − p + pe^t = q + pe^t, where q = 1 − p
Hence,
𝑀𝑋 (𝑡) = 𝑞 + 𝑝𝑒 𝑡
So the generating function and its first two derivatives are:

𝑀𝑋 (𝑡) = 𝑞 + 𝑝𝑒 𝑡
M′_X(t) = d/dt [q + pe^t] = pe^t

M″_X(t) = d/dt [M′_X(t)] = p d/dt [e^t] = pe^t

To obtain the mean and variance

𝑀𝑋 (𝑡 = 0) = 𝑞 + 𝑝𝑒 0 = 𝑝 + 𝑞 = 1
𝜇1′ = 𝑀𝑋′ (𝑡 = 0) = 𝑝𝑒 0 = 𝑝. 1 = 𝑝

𝜇1′ = 𝑝 = 𝑚𝑒𝑎𝑛

𝜇2′ = 𝑀𝑋′′ (𝑡 = 0) = 𝑝𝑒 0 = 𝑝. 1 = 𝑝

𝜇2 = 𝑣𝑎𝑟 (𝑥 ) = 𝑀𝑋′′ (𝑡 = 0) − [𝑀𝑥′ (𝑡 = 0)]2

= 𝑝 − (𝑝)2 = 𝑝(1 − 𝑝)

= 𝑝𝑞

Q49. (a) If the moments of a variate X are defined by E(X^r) = 0.6; r = 1, 2, 3, …, show that P(X = 0) = 0.4, P(X = 1) = 0.6, P(X ≥ 2) = 0.
(b) If M_X(t) of a random variable is (1 − t)^{−1}, t < 1, find E(x^r).

ANSWER
(a) The m.g.f. of the variate X is:

M_X(t) = 1 + Σ_{r=1}^{∞} μ′_r t^r/r! = 1 + Σ_{r=1}^{∞} (0.6) t^r/r!

= 1 + 0.6(e^t − 1) = 0.4 + 0.6e^t    (i)

But

M_X(t) = E(e^{tX}) = Σ_x e^{tx} P(X = x) = P(X = 0) + e^t P(X = 1) + Σ_{x=2}^{∞} e^{tx} P(X = x)    (ii)

From (i) and (ii), we get:

P(X = 0) = 0.4, P(X = 1) = 0.6, P(X ≥ 2) = 0

Remark: in fact (i) is the m.g.f. of a Bernoulli variate X with P(X = 0) = q = 0.4, P(X = 1) = p = 0.6 and P(X ≥ 2) = 0.

(b) M_X(t) = (1 − t)^{−1} = 1 + t + t² + ⋯ + t^r + ⋯ = 1 + (1!)t/1! + (2!)t²/2! + (3!)t³/3! + ⋯ + (r!)t^r/r! + ⋯

Since μ′_r = E(x^r) is the coefficient of t^r/r! in M_X(t), we get E(x^r) = r!.

2.3.1.2 MOMENT GENERATING FUNCTION OF BINOMIAL DISTRIBUTION


Q50. Let 𝑋~𝐵𝑖𝑛(𝑛, 𝑝). Obtain the moment generating function, hence otherwise, find the
moment generating function about mean of the distribution
ANSWER
The probability mass function is defined as

f(X = x) = C(n, x) p^x (1 − p)^{n−x}; x = 0, 1, 2, …, n
The moment generating function:

M_X(t) = E(e^{tx}) = Σ_{x=0}^{n} e^{tx} C(n, x) p^x (1 − p)^{n−x} = Σ_{x=0}^{n} C(n, x)(pe^t)^x (1 − p)^{n−x}

= (q + pe^t)^n    (binomial expansion)
Hence,
𝑀𝑋 (𝑡) = (𝑞 + 𝑝𝑒 𝑡 )𝑛
So the generating function and its first two derivatives are:

𝑀𝑋 (𝑡) = (𝑞 + 𝑝𝑒 𝑡 )𝑛
M′_X(t) = d/dt (q + pe^t)^n = npe^t(q + pe^t)^{n−1}

M″_X(t) = np d/dt [e^t(q + pe^t)^{n−1}]

= np[e^t(q + pe^t)^{n−1} + (n − 1)pe^{2t}(q + pe^t)^{n−2}]

= npe^t[(q + pe^t)^{n−1} + pe^t(n − 1)(q + pe^t)^{n−2}]

To obtain the mean and variance

𝑀𝑋 (𝑡 = 0) = (𝑞 + 𝑝𝑒 0 )𝑛 = (𝑝 + 𝑞 )𝑛 = 1𝑛 = 1
𝜇1′ = 𝑀𝑋′ (𝑡 = 0) = 𝑛𝑝𝑒 0 (𝑞 + 𝑝𝑒 0 )𝑛−1 = 𝑛𝑝 (𝑞 + 𝑝) 𝑛−1 = 𝑛𝑝

𝜇1′ = 𝑛𝑝 = 𝑚𝑒𝑎𝑛

𝜇2′ = 𝑀𝑋′′ (𝑡 = 0) = 𝑛𝑝𝑒 0 [(𝑞 + 𝑝𝑒 0 )𝑛−1 + 𝑝𝑒 0 (𝑛 − 1)(𝑞 + 𝑝𝑒 0 )𝑛−2 ]

= 𝑛𝑝(1 + 𝑛𝑝 − 𝑝)

= 𝑛 2 𝑝2 + 𝑛𝑝 − 𝑛𝑝2

𝜇2 = 𝑣𝑎𝑟 (𝑥 ) = 𝑀𝑋′′ (𝑡 = 0) − [𝑀𝑥′ (𝑡 = 0)]2

= 𝑛 2 𝑝2 + 𝑛𝑝 − 𝑛𝑝2 − (𝑛𝑝) 2

= 𝑛 2 𝑝2 + 𝑛𝑝 − 𝑛𝑝2 − 𝑛 2 𝑝2

= 𝑛𝑝(1 − 𝑝) = 𝑛𝑝𝑞

To obtain the moment generating function about the mean of the Binomial distribution:

E(e^{t(X−np)}) = E(e^{tX} · e^{−tnp}) = e^{−tnp} E(e^{tX}) = e^{−tnp} M_X(t)

= e^{−tnp}(q + pe^t)^n, where M_X(t) = (q + pe^t)^n

= (qe^{−tp} + pe^{tq})^n

= [q{1 − pt + p²t²/2! − p³t³/3! + ⋯} + p{1 + qt + q²t²/2! + q³t³/3! + ⋯}]^n

= [(q + p) + pq(p + q)t²/2! + pq(q² − p²)t³/3! + pq(q³ + p³)t⁴/4! + ⋯]^n; where p + q = 1

= [1 + pq t²/2! + pq(q − p)t³/3! + pq(1 − 3pq)t⁴/4! + ⋯]^n,

where q² − p² = (q − p)(q + p) = q − p and q³ + p³ = (p + q)³ − 3pq(p + q) = 1 − 3pq

= 1 + C(n, 1){pq t²/2! + pq(q − p)t³/3! + pq(1 − 3pq)t⁴/4! + ⋯} + C(n, 2){pq t²/2! + pq(q − p)t³/3! + ⋯}² + ⋯

Now

μ₂ = coefficient of t²/2! = npq

μ₃ = coefficient of t³/3! = npq(q − p)

μ₄ = coefficient of t⁴/4! = npq(1 − 3pq) + 3n(n − 1)p²q²

= npq − 3np²q² + 3n²p²q² − 3np²q² = 3n²p²q² + npq − 6np²q²

= 3n²p²q² + npq(1 − 6pq)

= npq[3npq + (1 − 6pq)]
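The series manipulation above can be checked symbolically; a sketch (sympy, with n and p kept symbolic — sympy handles the symbolic power of the series here):

```python
# Central moments of Bin(n, p) read off from the MGF about the mean, E[e^{t(X - np)}].
import sympy as sp

t, n, p = sp.symbols('t n p')
M_central = sp.exp(-n*p*t) * (1 - p + p*sp.exp(t))**n

series = sp.expand(sp.series(M_central, t, 0, 5).removeO())
mu2 = series.coeff(t, 2) * sp.factorial(2)
mu3 = series.coeff(t, 3) * sp.factorial(3)
mu4 = series.coeff(t, 4) * sp.factorial(4)

print(sp.factor(mu2))   # n*p*(1 - p) = npq
print(sp.factor(mu3))   # equals npq(q - p), up to sign arrangement
# Difference from 3n^2 p^2 q^2 + npq(1 - 6pq) should expand to 0:
print(sp.expand(mu4 - (3*n**2*p**2*(1 - p)**2 + n*p*(1 - p)*(1 - 6*p*(1 - p)))))
```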

Q51. If X is binomially distributed with parameters n and p, what is the distribution of Y = n − X?

ANSWER

X ~ Bin(n, p) represents the number of successes in n independent trials with constant probability p of success for each trial.
∴ Y = n − X represents the number of failures in n independent trials with constant probability q of failure for each trial. Hence Y = n − X ~ Bin(n, q).

Since X ~ Bin(n, p), M_X(t) = E(e^{tX}) = (q + pe^t)^n

∴ M_Y(t) = E(e^{tY}) = E(e^{t(n−X)}) = e^{nt}E(e^{−tX}) = e^{nt} · M_X(−t)

= e^{nt}(q + pe^{−t})^n = [e^t(q + pe^{−t})]^n

= (p + qe^t)^n

Hence by uniqueness theorem of m.g.f , 𝑌 = 𝑛 − 𝑋~𝐵(𝑛, 𝑞)

Q52. The moment generating function of a r.v. X is (2/5 + (3/5)e^t)¹⁰. Show that:

P(μ − σ < X < μ + σ) = Σ_{x=5}^{7} C(10, x)(3/5)^x(2/5)^{10−x}

ANSWER

Since

M_X(t) = (2/5 + (3/5)e^t)¹⁰ = (q + pe^t)^n,

by the uniqueness theorem of m.g.f.s, X ~ Bin(n = 10, p = 3/5).

Hence, E(X) = μ_x = np = 10 × 3/5 = 6; σ²_X = npq = 10 × 3/5 × 2/5 = 2.4

μ ± σ = 6 ± √2.4 ≈ 6 ± 1.55, i.e. (4.45, 7.55)

∴ P(μ − σ < X < μ + σ) = P(4.45 < X < 7.55) = P(5 ≤ X ≤ 7)

= Σ_{x=5}^{7} p(x) = Σ_{x=5}^{7} C(n, x) p^x q^{n−x} = Σ_{x=5}^{7} C(10, x)(3/5)^x(2/5)^{10−x}

Q53. Find the m.g.f. of standard binomial variate (𝑋 − 𝑛𝑝)/√𝑛𝑝𝑞 and obtain its limiting form as
𝑛 → ∞. Also interpret the result.
ANSWER

We know that if 𝑋~𝐵(𝑛, 𝑝), then 𝑀𝑋 (𝑡 ) = (𝑞 + 𝑝𝑒 𝑡 )𝑛

The moment generating function of standard binomial variate.

𝑋 − 𝑛𝑝 𝑋−𝜇
𝑍= = ; 𝑤ℎ𝑒𝑟𝑒 𝜇 = 𝑛𝑝 𝑎𝑛𝑑 𝜎 2 = 𝑛𝑝𝑞, 𝑖𝑠 𝑔𝑖𝑣𝑒𝑛 𝑏𝑦
√𝑛𝑝𝑞 𝜎

𝑛𝑝𝑡 𝑡 𝑛
𝜇𝑡
−𝜎 𝑡 −
𝑛𝑝𝑞 𝑛𝑝𝑞
( )
𝑀𝑍 𝑡 = 𝑒 . 𝑀𝑋 ( ) = 𝑒 √ . (𝑞 + 𝑝𝑒 √ )
𝜎

𝑛𝑝𝑡 𝑡 𝑛

=𝑒 √𝑛𝑝𝑞 . (𝑞 + 𝑝𝑒 √ 𝑛𝑝𝑞 )

𝑝𝑡 𝑡 𝑛

= [𝑒 √𝑛𝑝𝑞 . (𝑞 + 𝑛𝑝𝑞
𝑝𝑒 √ )]

𝑝𝑡 𝑞𝑡 𝑛

= [(𝑞𝑒 √𝑛𝑝𝑞 + 𝑛𝑝𝑞
𝑝𝑒 √ )]

𝑛
𝑝2 𝑡 2
𝑝𝑡 3 𝑞𝑡 𝑞2𝑡 2 3
= [𝑞 {1 − + + 𝜂′ (𝑛 −2 )} + 𝑝 {1 + + + 𝜂′′ (𝑛 −2 )}]
√𝑛𝑝𝑞 2𝑛𝑝𝑞 √𝑛𝑝𝑞 2𝑛𝑝𝑞

3 3
Where 𝜂′ (𝑛 −2 ) 𝑎𝑛𝑑 𝜂′′ (𝑛 −2 ) involve terms containing 𝑛 3/2 and higher powers of n in

denominator.
𝑛
𝑡 2 𝑝𝑞 3
∴ 𝑀𝑍 (𝑡) = [(𝑞 + 𝑝) + (𝑝 + 𝑞 ) + 𝜂 (𝑛 −2 )] 𝑤ℎ𝑒𝑟𝑒 𝑝 + 𝑞 = 1
2𝑛𝑝𝑞

𝑛
𝑡2 3
= [1 + + 𝜂 (𝑛 −2 )]
2𝑛

3 3
Where, 𝜂 (𝑛 −2 ) involves terms with 𝑛 −2 and higher powers of n in the denominator.

𝑡2 3
∴ 𝑙𝑜𝑔𝑀𝑍 (𝑡) = 𝑛𝑙𝑜𝑔 [1 + + 𝜂 (𝑛 −2 )]
2𝑛
2
𝑡2 3 1 𝑡2 3
= 𝑛 [{ + 𝜂 (𝑛 −2 )} − { + 𝜂 (𝑛 −2 )} + ⋯ ]
2𝑛 2 2𝑛

𝑡2 1
= + 𝜂′′′ (𝑛 −2 )
2
where η‴(n^{−1/2}) involves terms with n^{1/2} and higher powers of n in the denominator. Proceeding to the limit as n → ∞, we get

lim_{n→∞} log M_Z(t) = t²/2

⇒ lim_{n→∞} M_Z(t) = exp(t²/2)    (∗)

Interpretation. (*) is the moment generating function of standard normal variate. Hence by
uniqueness theorem of moment generating functions, standard binomial variate tends to standard
normal variate as 𝑛 → ∞.
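A quick numeric illustration of this limit (the values p = 0.3 and t = 1 are our own choices):

```python
# The MGF of Z = (X - np)/sqrt(npq) approaches exp(t^2/2) as n grows.
import numpy as np

p, t = 0.3, 1.0
q = 1 - p
for n in (10, 100, 10_000):
    sd = np.sqrt(n * p * q)
    mz = np.exp(-n * p * t / sd) * (q + p * np.exp(t / sd))**n   # M_Z(t)
    print(n, mz)

print(np.exp(t**2 / 2))   # the limiting value, approx. 1.6487
```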

Q54. If a random variable 𝑋 has the following moment generating function:

M_X(t) = (3/4 + (1/4)e^t)²⁰ for all t; then find the probability mass function of X.
ANSWER
We previously determined that the moment generating function of a binomial random variable is:
𝑀𝑋 (𝑡) = [(1 − 𝑝) + 𝑝𝑒 𝑡 ]𝑛 𝑓𝑜𝑟 − ∞ < 𝑡 < ∞
Comparing the given moment generating function with that of a binomial random variable, we see that X must be a binomial random variable with n = 20 and p = 1/4.

Therefore, the p.m.f. of X is:

f(x) = C(20, x)(1/4)^x(3/4)^{20−x} for x = 0, 1, 2, …, 20
2.3.1.3 MOMENT GENERATING FUNCTION OF THE POISSON DISTRIBUTION
Q55. Let 𝑋~𝑃𝑜𝑖(𝜆). Obtain the moment generating function and the mean and variance of the
distribution
ANSWER
The probability mass function is defined as

f(X = x) = λ^x e^{−λ}/x!; x = 0, 1, 2, …

The moment generating function:

M_X(t) = E(e^{tx}) = Σ_{x=0}^{∞} e^{tx} λ^x e^{−λ}/x! = e^{−λ} Σ_{x=0}^{∞} (λe^t)^x/x!

= e^{−λ}{1 + λe^t + (λe^t)²/2! + ⋯}, where 1 + λe^t + (λe^t)²/2! + ⋯ = e^{λe^t}

= e^{−λ} · e^{λe^t} = e^{λ(e^t−1)}    (exponential series)

Hence,

M_X(t) = e^{λ(e^t−1)}
So the generating function and its first two derivatives are:

M_X(t) = e^{λ(e^t−1)}

M′_X(t) = d/dt [e^{λ(e^t−1)}] = λe^t e^{λ(e^t−1)}

M″_X(t) = λ d/dt [e^t e^{λ(e^t−1)}] = λ[e^t e^{λ(e^t−1)} + λe^{2t} e^{λ(e^t−1)}]

= λe^t e^{λ(e^t−1)}[1 + λe^t]

To obtain the mean and variance:

M_X(0) = e^{λ(e⁰−1)} = e⁰ = 1

μ′₁ = M′_X(0) = λe⁰e^{λ(e⁰−1)} = λ = mean

μ′₂ = M″_X(0) = λ(1 + λ) = λ² + λ

μ₂ = Var(x) = M″_X(0) − [M′_x(0)]² = λ² + λ − λ² = λ

Q56. (a) If X is a Poisson distribution with mean λ, show that the expectation of e^{−kX} is exp[−λ(1 − e^{−k})]. Hence, show that, if X̄ is the arithmetic mean of n independent random variables X₁, X₂, …, X_n, each having a Poisson distribution with parameter λ, then e^{−X̄} as an estimate of e^{−λ} is biased, although X̄ is an unbiased estimate of λ.

(b) If X is a Poisson variate with mean λ, what would be the expectation of kXe^{−kX}, k a constant?
ANSWER

The probability mass function is defined as

f(X = x) = λ^x e^{−λ}/x!; x = 0, 1, 2, …

E(e^{−kX}) = Σ_{x=0}^{∞} e^{−kx} p(x) = e^{−λ} Σ_{x=0}^{∞} (λe^{−k})^x/x!

= e^{−λ}[1 + λe^{−k} + (λe^{−k})²/2! + (λe^{−k})³/3! + ⋯]

= e^{−λ} · e^{λe^{−k}}

E(e^{−kX}) = e^{−λ(1−e^{−k})}    (∗)
We have

E(X̄) = E((1/n) Σ_{i=1}^{n} X_i) = (1/n) Σ_{i=1}^{n} E(X_i)

Since X_i; i = 1, 2, …, n is a Poisson variate with parameter λ, E(X_i) = λ.

∴ E(X̄) = (1/n) Σ_{i=1}^{n} λ = (1/n) · nλ = λ

Hence 𝑋̅ is an unbiased estimate of 𝜆.


Now

E(e^{−X̄}) = E[exp(−(1/n) Σ_{i=1}^{n} X_i)]

= E(e^{−X₁/n} · e^{−X₂/n} · e^{−X₃/n} ⋯ e^{−X_n/n})

= E(e^{−X₁/n})E(e^{−X₂/n}) ⋯ E(e^{−X_n/n})    (since X₁, X₂, …, X_n are independent)

∴ E(e^{−X̄}) = Π_{i=1}^{n} E(e^{−X_i/n})    (∗∗)

Using (∗) with k = 1/n, we get

E(e^{−X_i/n}) = e^{−λ(1−e^{−1/n})}    (since X_i is a Poisson variate with parameter λ)

∴ E(e^{−X̄}) = Π_{i=1}^{n} exp{−λ(1 − e^{−1/n})} = exp{−nλ(1 − e^{−1/n})} ≠ e^{−λ}

Hence, e^{−X̄} is not an unbiased estimate of e^{−λ}, though X̄ is an unbiased estimate of λ.


(b) E(kXe^{−kX}) = Σ_{x=0}^{∞} kxe^{−kx} p(x) = k Σ_{x=0}^{∞} xe^{−kx} e^{−λ}λ^x/x!

= ke^{−λ} Σ_{x=1}^{∞} (λe^{−k})^x/(x − 1)! = ke^{−λ}λe^{−k} Σ_{x=1}^{∞} (λe^{−k})^{x−1}/(x − 1)!

= kλe^{−λ−k}[1 + λe^{−k} + (λe^{−k})²/2! + (λe^{−k})³/3! + ⋯]

= kλe^{−λ−k} · e^{λe^{−k}} = λk exp[λ(e^{−k} − 1) − k]
Q57. If X is a Poisson variate with mean λ, show that (X − λ)/√λ is a variable with mean zero and variance unity. Find the moment generating function for this variable and show that it approaches e^{t²/2} as λ → ∞. Also interpret the result.
ANSWER
Let Y = (X − λ)/√λ.

∴ E(Y) = (1/√λ)E(X − λ) = (1/√λ)(λ − λ) = 0, where E(x) = λ

Var(Y) = (1/λ)E(X − λ)² = (1/λ)μ₂ = λ/λ = 1

M.G.F. of Y: M_Y(t) = E(e^{tY}) = E(e^{t(X−λ)/√λ})

= e^{−t√λ} E(e^{tX/√λ})

= e^{−t√λ} Σ_{x=0}^{∞} (e^{−λ}λ^x/x!)e^{tx/√λ}

= e^{−λ−t√λ} Σ_{x=0}^{∞} (λe^{t/√λ})^x/x!

= e^{−λ−t√λ} · e^{λe^{t/√λ}} = exp[−λ − t√λ + λe^{t/√λ}]

= exp[−λ − t√λ + λ(1 + t/√λ + t²/(2!λ) + t³/(3!λ^{3/2}) + ⋯)]

= exp[t²/2 + (1/3!)t³/√λ + ⋯]

Now proceeding to limit as 𝜆 → ∞, we get


lim_{λ→∞} M_Y(t) = e^{t²/2}    … (∗)

Interpretation. (*) is the 𝑚. 𝑔. 𝑓 of standard Normal variate. Hence by uniqueness theorem of


𝑚. 𝑔. 𝑓. ’𝑠, standard Poisson variate tends to standard normal variate as 𝜆 → ∞. Hence Poisson
distribution tends to Normal distribution for large values of parameter 𝜆.
Q58. Suppose that the moment generating function of X is given by M_X(t) = e^{3(e^t−1)}. Find (i) P(X = 0) (ii) P(X = 3) (iii) the first and second moments and the variance.
ANSWER
We can match this MGF to a known MGF of one of the distributions we considered and then apply it. Observe that

M_X(t) = e^{3(e^t−1)} = e^{λ(e^t−1)}, where λ = 3; thus, X ~ Poisson(3)

∴ P(X = x) = e^{−3} · 3^x/x!

(i) P(X = 0) = e^{−3} · 3⁰/0! = e^{−3}

(ii) P(X = 3) = e^{−3} · 3³/3! = e^{−3} · 27/6 = (9/2)e^{−3}

(iii) M′_X(t) = d/dt (e^{3(e^t−1)}) = 3e^t e^{3(e^t−1)} and

M″_X(t) = d/dt (3e^t e^{3(e^t−1)}) = 3e^t e^{3(e^t−1)} · 3e^t + e^{3(e^t−1)} · 3e^t = 3e^t e^{3(e^t−1)}[3e^t + 1]

M′_X(0) = 3 · e⁰ · e^{3(1−1)} = 3

M″_X(0) = 3e⁰e^{3(e⁰−1)}[3e⁰ + 1] = 3(3 + 1) = 12

(iv) Var(x) = 12 − 3² = 12 − 9 = 3

2.3.1.4 MOMENT GENERATING FUNCTION OF THE GEOMETRIC DISTRIBUTION


Q59. Let 𝑋 be geometric with parameter 𝑝 (𝑖. 𝑒. 𝑋~𝐺𝑒𝑜 (𝑝)). Compute the moment generating
function 𝔼(𝑒 𝑡𝑋 ). Hence otherwise obtain the mean and variance of the distribution.

ANSWER

The probability mass function is defined as


𝑓(𝑋 = 𝑥 ) = 𝑝(1 − 𝑝) 𝑥−1 = 𝑝𝑞 𝑥−1 ; 𝑥 = 1,2, …,
The moment generating function:

M_X(t) = E(e^{tx}) = Σ_{x=1}^{∞} e^{tx} pq^{x−1} = pe^t Σ_{x=1}^{∞} (qe^t)^{x−1}

= pe^t[1 + qe^t + (qe^t)² + (qe^t)³ + ⋯]

(1 + r + r² + r³ + ⋯ = 1/(1 − r); sum of an infinite series)

M_X(t) = pe^t × 1/(1 − qe^t) = pe^t/(1 − qe^t)

Hence,

M_X(t) = pe^t/(1 − qe^t)
So the generating function and its first two derivatives are:

M_X(t) = pe^t/(1 − qe^t)

M′_X(t) = [(1 − qe^t)pe^t − pe^t(−qe^t)]/(1 − qe^t)² = (pe^t − pe^tqe^t + pe^tqe^t)/(1 − qe^t)² = pe^t/(1 − qe^t)²

M″_X(t) = p d/dt [e^t/(1 − qe^t)²]

= p[e^t(1 − qe^t)² + 2qe^{2t}(1 − qe^t)]/(1 − qe^t)⁴

= pe^t(1 − qe^t)[(1 − qe^t) + 2qe^t]/(1 − qe^t)⁴

= pe^t[(1 − qe^t) + 2qe^t]/(1 − qe^t)³ = pe^t(1 + qe^t)/(1 − qe^t)³

To obtain the mean and variance:

M_X(0) = p/(1 − q) = p/p = 1

μ′₁ = M′_X(0) = p/(1 − q)² = p/p² = 1/p = mean

μ′₂ = M″_X(0) = p(1 + q)/(1 − q)³ = p(1 + q)/p³ = (1 + q)/p²

μ₂ = Var(x) = M″_X(0) − [M′_x(0)]²

= (1 + q)/p² − (1/p)² = (1 + q − 1)/p² = q/p²

Q60. Suppose that Y has the following moment generating function:

M_Y(t) = e^t/(4 − 3e^t), t < −ln(0.75)

Find the 𝐸(𝑌), 𝐸 (𝑌 2 ) 𝑎𝑛𝑑 𝑣𝑎𝑟 (𝑌)

ANSWER

There are two methods to solve this:

Method 1: obtain the first and second derivatives.

M_Y(t) = e^t/(4 − 3e^t)

M′_Y(t) = [(4 − 3e^t)e^t − e^t(−3e^t)]/(4 − 3e^t)² = (4e^t − 3e^{2t} + 3e^{2t})/(4 − 3e^t)² = 4e^t/(4 − 3e^t)²

M″_Y(t) = [(4 − 3e^t)² · 4e^t − 4e^t · 2(4 − 3e^t)(−3e^t)]/(4 − 3e^t)⁴

= 4e^t(4 − 3e^t)[4 − 3e^t + 6e^t]/(4 − 3e^t)⁴ = 4e^t(4 + 3e^t)/(4 − 3e^t)³

To obtain E(Y), E(Y²) and Var(Y):

E(Y) = M′_Y(0) = 4/(4 − 3)² = 4

E(Y²) = M″_Y(0) = 4(4 + 3)/(4 − 3)³ = 28

Var(Y) = M″_Y(0) − [M′_Y(0)]² = 28 − 16 = 12

Method 2: we can recognize that this is the moment generating function of a geometric distribution. We can match this MGF to a known MGF of one of the distributions we considered and then apply it. Observe that:

M_Y(t) = (1/4)e^t/(1 − (3/4)e^t) = pe^t/(1 − qe^t), where p = 1/4 and q = 1 − 1/4 = 3/4

Since the expected value and variance of a geometric random variable are given as

E(Y) = 1/p = 1 ÷ 1/4 = 4

Var(Y) = q/p² = (3/4) ÷ (1/4)² = (3/4) × 16 = 12
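Method 1 can also be reproduced mechanically; a sympy sketch:

```python
# Differentiating M_Y(t) = e^t/(4 - 3 e^t) at t = 0 recovers E(Y), E(Y^2) and Var(Y).
import sympy as sp

t = sp.symbols('t')
M = sp.exp(t) / (4 - 3*sp.exp(t))

m1 = sp.diff(M, t).subs(t, 0)        # E(Y) = 4
m2 = sp.diff(M, t, 2).subs(t, 0)     # E(Y^2) = 28
print(m1, m2, m2 - m1**2)            # 4 28 12
```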

2.3.1.5 MOMENT GENERATING FUNCTION OF THE NEGATIVE BINOMIAL DISTRIBUTION
Q61. Let X ~ NBin(r, p). Obtain the moment generating function; hence, or otherwise, find the mean and variance of the distribution.
ANSWER
The probability mass function is defined as

f(X = x) = C(x − 1, r − 1) p^r q^{x−r}; x = r, r + 1, r + 2, …, where q = 1 − p

The moment generating function:

M_X(t) = E(e^{tx}) = Σ_{x=r}^{∞} e^{tx} C(x − 1, r − 1) p^r q^{x−r} = p^r e^{rt} Σ_{x=r}^{∞} C(x − 1, r − 1) e^{t(x−r)} q^{x−r}

= p^r e^{rt} Σ_{x=r}^{∞} C(x − 1, r − 1)(qe^t)^{x−r}

Let x − r = k ⇒ x = k + r:

= p^r e^{rt} Σ_{k=0}^{∞} C(k + r − 1, r − 1)(qe^t)^k    (∗)

Recall the sum of the negative binomial series:

(1 − x)^{−n} = Σ_{k=0}^{∞} C(k + n − 1, n − 1) x^k, |x| < 1    (∗∗)

Comparing equations (∗) and (∗∗), we deduce that

M_X(t) = (pe^t)^r (1 − qe^t)^{−r}, provided |qe^t| < 1 ⇒ t < ln(1/q)

Hence,

M_X(t) = (pe^t/(1 − qe^t))^r
So the generating function and its first two derivatives are:

M_X(t) = (pe^t/(1 − qe^t))^r

M′_X(t) = r(pe^t/(1 − qe^t))^{r−1} × [(1 − qe^t)pe^t − pe^t(−qe^t)]/(1 − qe^t)²

= r(pe^t/(1 − qe^t))^{r−1} × pe^t/(1 − qe^t)² = r(pe^t)^r/(1 − qe^t)^{r+1}

M″_X(t) = rp^r d/dt [e^{rt}/(1 − qe^t)^{r+1}]

= rp^r [re^{rt}(1 − qe^t)^{r+1} + (r + 1)qe^t e^{rt}(1 − qe^t)^r]/(1 − qe^t)^{2(r+1)}

= rp^r e^{rt}[r(1 − qe^t) + (r + 1)qe^t]/(1 − qe^t)^{r+2}

= rp^r e^{rt}(r + qe^t)/(1 − qe^t)^{r+2}

To obtain the mean and variance:

M_X(0) = (p/(1 − q))^r = (p/p)^r = 1

μ′₁ = M′_X(0) = rp^r/(1 − q)^{r+1} = rp^r/(p^r × p) = r/p = mean

μ′₂ = M″_X(0) = rp^r(r + q)/(1 − q)^{r+2} = rp^r(r + q)/(p^r × p²) = r(r + q)/p²

μ₂ = Var(x) = M″_X(0) − [M′_x(0)]²

= r(r + q)/p² − (r/p)² = (r² + rq − r²)/p² = rq/p²
2.3.2 MOMENT GENERATING FUNCTIONS FOR CONTINUOUS RANDOM VARIABLES
Q62. Let X be a random variable whose probability density function is given by

f_X(x) = e^{−2x} + (1/2)e^{−x} for x > 0, and 0 otherwise.

(a) Write down the moment generating function for X.

(b) Use this moment generating function to compute the first and second moments and the variance of X.

ANSWER

The moment generating function:

M_X(t) = E(e^{tx}) = ∫₀^∞ e^{tx} f(x) dx = ∫₀^∞ e^{tx}(e^{−2x} + (1/2)e^{−x}) dx

= ∫₀^∞ (e^{−x(2−t)} + (1/2)e^{−x(1−t)}) dx

= [−e^{−x(2−t)}/(2 − t)]₀^∞ + [−e^{−x(1−t)}/(2(1 − t))]₀^∞

= −(0 − 1)/(2 − t) − (0 − 1)/(2(1 − t))

= 1/(2 − t) + 1/(2(1 − t)) = (2 − 2t + 2 − t)/(2(2 − t)(1 − t))

= (4 − 3t)/(2(2 − t)(1 − t))    (for t < 1)

So the generating function and its first two derivatives are:

M_X(t) = 1/(2 − t) + 1/(2(1 − t))

M′_X(t) = 1/(2 − t)² + 1/(2(1 − t)²)

M″_X(t) = 2/(2 − t)³ + 1/(1 − t)³

To obtain the mean and variance:

M_X(0) = (4 − 0)/(2(2)(1)) = 1

μ′₁ = M′_X(0) = 1/4 + 1/2 = 3/4 = mean

μ′₂ = M″_X(0) = 2/8 + 1 = 5/4

μ₂ = Var(x) = M″_X(0) − [M′_x(0)]² = 5/4 − (3/4)² = (20 − 9)/16 = 11/16
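The integral above can be verified symbolically; a sketch (sympy):

```python
# Verifying the MGF and moments of Q62 by symbolic integration.
import sympy as sp

x, t = sp.symbols('x t')
f = sp.exp(-2*x) + sp.Rational(1, 2)*sp.exp(-x)

M = sp.simplify(sp.integrate(sp.exp(t*x)*f, (x, 0, sp.oo), conds='none'))
print(M)                                  # equivalent to 1/(2 - t) + 1/(2*(1 - t))
print(sp.diff(M, t).subs(t, 0))           # mean 3/4
print(sp.diff(M, t, 2).subs(t, 0))        # second moment 5/4
```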

Q63. Let X be a continuous random variable with values in [0, 2] and probability density function f_X:

(i) f_X(x) = (1/2)x    (ii) f_X(x) = 1 − (1/2)x

(a) Find the moment generating function M_X(t).

ANSWER

(i) The probability density function is given as f_X(x) = (1/2)x.

The moment generating function:

M_X(t) = E(e^{tx}) = ∫₀² e^{tx}(1/2)x dx = (1/2) ∫₀² xe^{tx} dx

By integration by parts:

= (1/2)[(xe^{tx}/t)₀² − ∫₀² (e^{tx}/t) dx] = (1/2)[2e^{2t}/t − (e^{2t}/t² − 1/t²)]

= e^{2t}/t − e^{2t}/(2t²) + 1/(2t²) = (2te^{2t} − e^{2t} + 1)/(2t²)

(ii) The probability density function is given as f_X(x) = 1 − (1/2)x.

The moment generating function:

M_X(t) = ∫₀² e^{tx}(1 − (1/2)x) dx = ∫₀² e^{tx} dx − (1/2) ∫₀² xe^{tx} dx

= (e^{2t} − 1)/t − (1/2)[2e^{2t}/t − (e^{2t} − 1)/t²]    (by integration by parts)

= (e^{2t} − 1)/t − e^{2t}/t + (e^{2t} − 1)/(2t²)

= −1/t + (e^{2t} − 1)/(2t²) = (e^{2t} − 1 − 2t)/(2t²)

2.3.2.1 MOMENT GENERATING FUNCTION OF THE UNIFORM (OR RECTANGULAR)


DISTRIBUTION

Q64. A random variable X is said to have a continuous uniform distribution over an interval (a, b) if the probability density function of X is

f_X(x) = 1/(b − a) for a < x < b, and 0 otherwise.

(a) Find the moment generating function of X.

ANSWER

The moment generating function is obtained as

M_X(t) = E(e^{tx}) = ∫_a^b e^{tx} (1/(b − a)) dx = (1/(b − a))[e^{tx}/t]_a^b

= (1/(b − a))(e^{bt} − e^{at})/t

= (e^{bt} − e^{at})/(t(b − a))

If t = 0:

M_X(0) = (e⁰ − e⁰)/(0 · (b − a)) = 0/0    (indeterminate)

As t → 0, using L'Hôpital's rule:

lim_{t→0} (e^{bt} − e^{at})/(t(b − a)) = lim_{t→0} (be^{bt} − ae^{at})/(b − a) = (b − a)/(b − a) = 1

So M(0) = 1, and

M_x(t) = 1 for t = 0; (e^{bt} − e^{at})/(t(b − a)) for t ≠ 0
Q65. Show that for the rectangular distribution

f(x) = 1/(2a), −a < x < a,

the m.g.f. about the origin is (1/(at)) sinh(at). Also show that the moments of even order are given by

μ_{2n} = a^{2n}/(2n + 1)

ANSWER

The probability density function is given as

f(x) = 1/(2a), −a < x < a

The moment generating function is obtained as follows:

M_X(t) = E(e^{tx}) = ∫_{−a}^{a} e^{tx} (1/(2a)) dx = (1/(2a))[e^{tx}/t]_{−a}^{a}

= (1/(2a))(e^{at} − e^{−at})/t = (e^{at} − e^{−at})/(2at) = sinh(at)/(at), where sinh(at) = (e^{at} − e^{−at})/2

= (1/(at))[at + (at)³/3! + (at)⁵/5! + ⋯] = 1 + a²t²/3! + a⁴t⁴/5! + ⋯

Since there are no terms with odd powers of 𝑡 in 𝑀𝑥 (𝑡), all moments of odd order about origin
vanish, i.e.,


𝜇2𝑛+1 (𝑎𝑏𝑜𝑢𝑡 𝑜𝑟𝑖𝑔𝑖𝑛) = 0

In particular 𝜇1′ (𝑎𝑏𝑜𝑢𝑡 𝑜𝑟𝑖𝑔𝑖𝑛) = 0 𝑖. 𝑒. 𝑚𝑒𝑎𝑛 = 0

Thus, 𝜇𝑟′ (𝑎𝑏𝑜𝑢𝑡 𝑜𝑟𝑖𝑔𝑖𝑛) = 𝜇𝑟 (𝑠𝑖𝑛𝑐𝑒 𝑚𝑒𝑎𝑛 𝑖𝑠 𝑜𝑟𝑖𝑔𝑖𝑛)


i.e., all moments of odd order about mean vanish. The moments of even order are given by

μ_{2n} = coefficient of t^{2n}/(2n)! in M_X(t) = a^{2n}(2n)!/(2n + 1)! = a^{2n}/(2n + 1)
2.3.2.2 MOMENT GENERATING FUNCTION OF THE TRIANGULAR DISTRIBUTION

Q66. (a) A random variable X is said to have a triangular distribution on the interval (a, b), with peak at c, if its probability density function is given by:

f(x) = 2(x − a)/((b − a)(c − a)) for a < x ≤ c; f(x) = 2(b − x)/((b − a)(b − c)) for c < x < b    (∗)

Obtain the moment generating function of the distribution of X.

(b) In particular, taking a = 0, c = 1 and b = 2 in (∗), the probability density function of the Trg(0, 2) variate with peak at x = 1 is given by

f(x) = x for 0 ≤ x ≤ 1; f(x) = 2 − x for 1 ≤ x ≤ 2; f(x) = 0 otherwise    (∗∗)

Show that the moment generating function of the distribution of X is

M_x(t) = (e^t − 1)²/t²

(c) In equation (∗) above, replacing a by −2a, b by 2a and c by 0, the probability density function of the triangular distribution on the interval (−2a, 2a) with peak at x = 0 is given by:

f(x) = (2a + x)/(4a²) for −2a < x ≤ 0; f(x) = (2a − x)/(4a²) for 0 < x < 2a    (∗∗∗)

Obtain the moment generating function in this case.
ANSWER
(a) The moment generating function is obtained as follows:

M_X(t) = E(e^{tx}) = ∫_a^b e^{tx} f(x) dx = (∫_a^c + ∫_c^b) e^{tx} f(x) dx

= (2/((b − a)(c − a))) ∫_a^c e^{tx}(x − a) dx + (2/((b − a)(b − c))) ∫_c^b e^{tx}(b − x) dx

Evaluating ∫_a^c e^{tx}(x − a) dx and ∫_c^b e^{tx}(b − x) dx by integration by parts:

= (2/((b − a)(c − a)))[(x − a)e^{tx}/t − e^{tx}/t²]_a^c + (2/((b − a)(b − c)))[(b − x)e^{tx}/t + e^{tx}/t²]_c^b

= (2/((b − a)(c − a)))[(c − a)e^{ct}/t − e^{ct}/t² + e^{at}/t²] + (2/((b − a)(b − c)))[e^{bt}/t² − (b − c)e^{ct}/t − e^{ct}/t²]

The e^{ct}/t terms cancel, leaving

M_X(t) = (2/t²){e^{at}/((b − a)(c − a)) − e^{ct}/((b − a)(c − a)) − e^{ct}/((b − a)(b − c)) + e^{bt}/((b − a)(b − c))}

Combining the two e^{ct} terms,

M_X(t) = (2/t²){e^{at}/((a − b)(a − c)) + e^{ct}/((c − a)(c − b)) + e^{bt}/((b − a)(b − c))}; a < x < b
(b) The probability density function is given as

f(x) = x for 0 ≤ x ≤ 1; f(x) = 2 − x for 1 ≤ x ≤ 2; f(x) = 0 otherwise

The moment generating function is obtained as follows:

M_X(t) = E(e^{tx}) = ∫₀² e^{tx} f(x) dx = ∫₀¹ xe^{tx} dx + ∫₁² e^{tx}(2 − x) dx

Using integration by parts:

M_X(t) = [xe^{tx}/t − e^{tx}/t²]₀¹ + 2[e^{tx}/t]₁² − [xe^{tx}/t − e^{tx}/t²]₁²

= (e^t/t − e^t/t² + 1/t²) + (2e^{2t}/t − 2e^t/t) − (2e^{2t}/t − e^{2t}/t² − e^t/t + e^t/t²)

Collecting the like terms:

= (e^t/t + e^t/t − 2e^t/t) + (2e^{2t}/t − 2e^{2t}/t) + (e^{2t}/t² − e^t/t² − e^t/t² + 1/t²)

= (1/t²)(e^{2t} − 2e^t + 1)    (factorize the quadratic expression)

= (1/t²)(e^t − 1)(e^t − 1) = (e^t − 1)²/t², as required.
𝑡2
(c) The probability density function is given as

f(x) = (2a + x)/(4a²) for −2a < x ≤ 0; f(x) = (2a − x)/(4a²) for 0 < x < 2a

The moment generating function is obtained as follows:

M_X(t) = E(e^{tx}) = ∫_{−2a}^{0} e^{tx}(2a + x)/(4a²) dx + ∫₀^{2a} e^{tx}(2a − x)/(4a²) dx

= (1/(4a²))[2a ∫_{−2a}^{0} e^{tx} dx + ∫_{−2a}^{0} xe^{tx} dx + 2a ∫₀^{2a} e^{tx} dx − ∫₀^{2a} xe^{tx} dx]

By integration by parts:

= (1/(4a²))[2a(e^{tx}/t)_{−2a}^{0} + (xe^{tx}/t − e^{tx}/t²)_{−2a}^{0} + 2a(e^{tx}/t)₀^{2a} − (xe^{tx}/t − e^{tx}/t²)₀^{2a}]

= (1/(4a²))[2a(1/t − e^{−2at}/t) + (−1/t² + 2ae^{−2at}/t + e^{−2at}/t²) + 2a(e^{2at}/t − 1/t) − (2ae^{2at}/t − e^{2at}/t² + 1/t²)]

Collecting the like terms, all the 1/t terms cancel, leaving

M_X(t) = (1/(4a²))[e^{−2at}/t² + e^{2at}/t² − 2/t²]

= (1/(4a²t²))(e^{2at} + e^{−2at} − 2)

= (1/(4a²t²))(e^{at} − e^{−at})²    (since e^{2at} + e^{−2at} − 2 = (e^{at} − e^{−at})²)

M_X(t) = ((e^{at} − e^{−at})/(2at))²

2.3.2.3 MOMENT GENERATING FUNCTION OF THE NORMAL DISTRIBUTION

Q67. Find the moment- generating function of the normal random variable. Hence, find the mean
and the variance.

ANSWER

The probability density function is given as

f(x) = (1/√(2πσ²)) e^{−(x−μ)²/(2σ²)} for −∞ < x < ∞

The moment generating function is obtained as follows:

M_X(t) = E(e^{tx}) = ∫_{−∞}^{∞} e^{tx} f(x) dx = (1/√(2πσ²)) ∫_{−∞}^{∞} e^{−(x²−2μx+μ²−2σ²tx)/(2σ²)} dx

where, completing the square,

x² − 2μx + μ² − 2σ²tx = x² − 2(μ + σ²t)x + μ²

= x² − 2(μ + σ²t)x + (μ + σ²t)² − (μ + σ²t)² + μ²

= (x − (μ + σ²t))² − μ² − 2μσ²t − σ⁴t² + μ²

= (x − (μ + σ²t))² − 2μσ²t − σ⁴t²

Hence

M_X(t) = [(1/√(2πσ²)) ∫_{−∞}^{∞} e^{−(x−(μ+σ²t))²/(2σ²)} dx] × e^{(2μσ²t+σ⁴t²)/(2σ²)}

The integral equals 1 (it is a normal density), so

M_X(t) = exp(μt + (1/2)σ²t²)

So the generating function and its first four derivatives are:

M_X(t) = exp(μt + (1/2)σ²t²)

The first derivative of the MGF of X is

M′_X(t) = (μ + σ²t) exp(μt + (1/2)σ²t²)

The second derivative of the MGF of X is

M″_X(t) = σ² exp(μt + (1/2)σ²t²) + (μ + σ²t)² exp(μt + (1/2)σ²t²)

= (σ² + (μ + σ²t)²) exp(μt + (1/2)σ²t²)

The third derivative of the MGF of X is

M‴_X(t) = 2σ²(μ + σ²t) exp(μt + (1/2)σ²t²) + (σ² + (μ + σ²t)²)(μ + σ²t) exp(μt + (1/2)σ²t²)

= (3σ² + (μ + σ²t)²)(μ + σ²t) exp(μt + (1/2)σ²t²)

= [3σ²(μ + σ²t) + (μ + σ²t)³] exp(μt + (1/2)σ²t²)

The fourth derivative of the MGF of X is

M⁗_X(t) = [3σ⁴ + 3σ²(μ + σ²t)²] exp(μt + (1/2)σ²t²) + [3σ²(μ + σ²t) + (μ + σ²t)³](μ + σ²t) exp(μt + (1/2)σ²t²)

= [3σ⁴ + 6σ²(μ + σ²t)² + (μ + σ²t)⁴] exp(μt + (1/2)σ²t²)
The first four moments about the origin, at t = 0, are

M′_X(0) = μ′₁ = E(X) = μ

M″_X(0) = μ′₂ = E(X²) = μ² + σ²

M‴_X(0) = μ′₃ = 3μσ² + μ³

M⁗_X(0) = μ′₄ = 3σ⁴ + 6μ²σ² + μ⁴

To obtain the central moments, skewness and kurtosis:

μ₂ = μ′₂ − μ′₁² = μ² + σ² − μ² = σ²

μ₃ = μ′₃ − 3μ′₂μ′₁ + 2μ′₁³ = 3μσ² + μ³ − 3(μ² + σ²)μ + 2μ³ = 0

μ₄ = μ′₄ − 4μ′₃μ′₁ + 6μ′₂μ′₁² − 3μ′₁⁴

= 3σ⁴ + 6μ²σ² + μ⁴ − 4(3μσ² + μ³)μ + 6(μ² + σ²)μ² − 3μ⁴

= 3σ⁴ + (6μ²σ² + 6μ²σ² − 12μ²σ²) + (μ⁴ + 6μ⁴ − 4μ⁴ − 3μ⁴)

= 3σ⁴

Skewness β₁, kurtosis β₂, coefficient of skewness γ₁ and coefficient of kurtosis γ₂:

β₁ = μ₃²/μ₂³ = 0/σ⁶ = 0; β₂ = μ₄/μ₂² = 3σ⁴/σ⁴ = 3

γ₁ = √β₁ = 0; γ₂ = β₂ − 3 = 0

So the normal distribution is symmetric (zero skewness) and mesokurtic.
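These moments can be confirmed directly from the MGF; a closing sympy sketch:

```python
# Raw moments of N(mu, sigma^2) from M_X(t) = exp(mu*t + sigma^2*t^2/2).
import sympy as sp

t, mu, sig = sp.symbols('t mu sigma', positive=True)
M = sp.exp(mu*t + sig**2*t**2/2)

for r in (1, 2, 3, 4):
    print(r, sp.expand(sp.diff(M, t, r).subs(t, 0)))
# 1: mu
# 2: mu**2 + sigma**2
# 3: mu**3 + 3*mu*sigma**2
# 4: mu**4 + 6*mu**2*sigma**2 + 3*sigma**4
```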
