
STAT 4510 - Moment-Generating Functions (MGFs)
Elliot Outland
December 3, 2014

Explanation

The moment-generating function M_Y(t) = E(e^{tY}) of a random variable Y is the real function whose k-th derivative at zero equals the expected value of the k-th power of Y (the k-th moment of Y). That is,

    d^k M_Y(t) / dt^k |_{t=0} = M_Y^{(k)}(0) = E(Y^k)
We can show this using the Taylor expansion of e^x:

    e^x = 1 + x/1! + x^2/2! + x^3/3! + ...

Substituting x = tY gives

    e^{tY} = 1 + (t/1!) Y + (t^2/2!) Y^2 + (t^3/3!) Y^3 + ...

and taking expectations,

    M_Y(t) = E(e^{tY}) = 1 + (t/1!) E(Y) + (t^2/2!) E(Y^2) + (t^3/3!) E(Y^3) + ...

Differentiating term by term:

    M_Y^{(1)}(t) = 0 + E(Y) + E(Y^2) t + E(Y^3) t^2/2! + ...,  so  M_Y^{(1)}(0) = E(Y)

    M_Y^{(2)}(t) = E(Y^2) + E(Y^3) t/1! + ...,                 so  M_Y^{(2)}(0) = E(Y^2)

    ...

    M_Y^{(k)}(t) = E(Y^k) + E(Y^{k+1}) t/1! + ...,             so  M_Y^{(k)}(0) = E(Y^k)
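The derivative-at-zero property above can be checked numerically for any MGF in closed form. The sketch below is not part of the handout: it uses the Exponential(lam) distribution, whose MGF is M(t) = lam/(lam - t) for t < lam and whose k-th moment is k!/lam^k, and approximates M'(0) and M''(0) with central differences.

```python
import math

# Illustrative check (assumed example, not from the handout):
# Y ~ Exponential(lam) has MGF M(t) = lam / (lam - t), so
# M'(0) = 1/lam = E(Y) and M''(0) = 2/lam**2 = E(Y^2).
lam = 2.0
M = lambda t: lam / (lam - t)

h = 1e-4
# central-difference approximations of the first two derivatives at t = 0
m1 = (M(h) - M(-h)) / (2 * h)               # ~ E(Y)   = 1/lam  = 0.5
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2     # ~ E(Y^2) = 2/lam**2 = 0.5

print(m1, m2)
```

The same two-line check works for any distribution whose MGF you can write down, which makes it a handy way to verify a hand-derived MGF.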

As stated above, part of the usefulness of moment-generating functions is the resulting ease with which one can calculate the moments of random variables. Perhaps more importantly, however, they allow us to compare the probability distributions of random variables. If a pair of random variables X and Y have the same moment-generating function, then they have the same probability distribution. That is, M_X(t) = M_Y(t) for all t implies F_X(x) = F_Y(x).

Example

Suppose Y is a discrete random variable having a Bernoulli distribution. Find E(Y), E(Y^2), and V(Y) by the method of moment-generating functions.

Since Y is a Bernoulli random variable, its probability mass function p_Y(y) is:

    p_Y(y) = p,      if y = 1
             1 - p,  if y = 0
             0,      otherwise

Then:

    M_Y(t) = E(e^{tY})
           = sum_{y=0}^{1} e^{ty} p_Y(y)
           = e^t p_Y(1) + e^0 p_Y(0)
           = p e^t + (1 - p)
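As a quick sanity check, the derived MGF can be compared against the definition E(e^{tY}) computed by direct summation over the support {0, 1}. The sketch below is not part of the handout and assumes an example value p = 0.3.

```python
import math

# Sketch: check the derived Bernoulli MGF M(t) = p*e^t + (1 - p)
# against the definition E(e^{tY}) = sum over y of e^{t*y} * p_Y(y).
p = 0.3  # assumed example value
for t in (-1.0, 0.0, 0.5, 2.0):
    mgf_formula = p * math.exp(t) + (1 - p)
    mgf_def = math.exp(t * 1) * p + math.exp(t * 0) * (1 - p)
    assert abs(mgf_formula - mgf_def) < 1e-12
```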
Given the moment-generating function, we have:

    E(Y)   = M_Y^{(1)}(0) = p e^t |_{t=0} = p e^0 = p

    E(Y^2) = M_Y^{(2)}(0) = p e^t |_{t=0} = p e^0 = p

    V(Y)   = E(Y^2) - [E(Y)]^2 = p - p^2 = p(1 - p)


So for a Bernoulli random variable, E(Y) = p, E(Y^2) = p, and V(Y) = p(1 - p).
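These three results can also be recovered numerically by differentiating the Bernoulli MGF at t = 0, mirroring the derivation above. This sketch is an illustration, not part of the handout, and again assumes p = 0.3.

```python
import math

# Sketch: recover the Bernoulli moments from M(t) = p*e^t + (1 - p)
# by numerical differentiation at t = 0 (p = 0.3 is an assumed value).
p = 0.3
M = lambda t: p * math.exp(t) + (1 - p)

h = 1e-4
EY  = (M(h) - M(-h)) / (2 * h)              # ~ E(Y)   = p
EY2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2    # ~ E(Y^2) = p
var = EY2 - EY**2                           # ~ V(Y)   = p*(1 - p)

print(EY, EY2, var)
```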
