Moment Generating Functions

Here, we will introduce and discuss moment generating functions (MGFs). Moment
generating functions are useful for several reasons, one of which is their
application to the analysis of sums of random variables. Before discussing MGFs,
let's define moments.

Definition. The $n$th moment of a random variable $X$ is defined to be $E[X^n]$. The
$n$th central moment of $X$ is defined to be $E[(X-EX)^n]$.

For example, the first moment is the expected value $E[X]$. The second central moment
is the variance of $X$. Similar to the mean and variance, other moments give useful
information about random variables.

The moment generating function (MGF) of a random variable $X$ is a function $M_X(s)$
defined as
$$M_X(s) = E[e^{sX}].$$
We say that the MGF of $X$ exists if there exists a positive constant $a$ such that
$M_X(s)$ is finite for all $s \in [-a,a]$.

Before going any further, let's look at an example.


Example
For each of the following random variables, find the MGF.
$X$ is a discrete random variable with PMF
$$P_X(k) = \begin{cases} \frac{1}{3} & k = 1 \\[4pt] \frac{2}{3} & k = 2 \end{cases}$$
$Y$ is a $Uniform(0,1)$ random variable.
Solution
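A sketch of the computation, working directly from the definition: for $X$, the
expectation is a finite sum over the PMF,
$$M_X(s) = E[e^{sX}] = \frac{1}{3} e^{s} + \frac{2}{3} e^{2s}, \quad \textrm{for all } s \in \mathbb{R}.$$
For $Y$, we integrate against the $Uniform(0,1)$ density:
$$M_Y(s) = E[e^{sY}] = \int_0^1 e^{sy}\, dy = \frac{e^s - 1}{s} \quad \textrm{for } s \neq 0, \qquad M_Y(0) = 1.$$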

Why is the MGF useful? There are basically two reasons for this. First, the MGF of
X gives us all moments of X. That is why it is called the moment generating
function. Second, the MGF (if it exists) uniquely determines the distribution. That
is, if two random variables have the same MGF, then they must have the same
distribution. Thus, if you find the MGF of a random variable, you have indeed
determined its distribution. We will see that this method is very useful when we
work on sums of several independent random variables. Let's discuss these in
detail.
Finding Moments from MGF:
Remember the Taylor series for $e^x$: for all $x \in \mathbb{R}$, we have
$$e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \dots = \sum_{k=0}^{\infty} \frac{x^k}{k!}.$$
Now, we can write
$$e^{sX} = \sum_{k=0}^{\infty} \frac{(sX)^k}{k!} = \sum_{k=0}^{\infty} X^k \frac{s^k}{k!}.$$
Thus, we have
$$M_X(s) = E[e^{sX}] = \sum_{k=0}^{\infty} E[X^k] \frac{s^k}{k!}.$$
We conclude that the $k$th moment of $X$ is the coefficient of $\frac{s^k}{k!}$ in the
Taylor series of $M_X(s)$. Thus, if we have the Taylor series of $M_X(s)$, we can
obtain all moments of $X$.
Example
If $Y \sim Uniform(0,1)$, find $E[Y^k]$ using $M_Y(s)$.
Solution
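A sketch of the computation: from the previous example, $M_Y(s) = \frac{e^s - 1}{s}$,
and expanding the numerator as a Taylor series gives
$$M_Y(s) = \frac{1}{s} \sum_{j=1}^{\infty} \frac{s^j}{j!} = \sum_{k=0}^{\infty} \frac{s^k}{(k+1)!} = \sum_{k=0}^{\infty} \frac{1}{k+1} \cdot \frac{s^k}{k!}.$$
Reading off the coefficient of $\frac{s^k}{k!}$, we get $E[Y^k] = \frac{1}{k+1}$.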

We remember from calculus that the coefficient of $\frac{s^k}{k!}$ in the Taylor
series of $M_X(s)$ is obtained by taking the $k$th derivative of $M_X(s)$ and
evaluating it at $s = 0$. Thus, we can write
$$E[X^k] = \left. \frac{d^k}{ds^k} M_X(s) \right|_{s=0}.$$
We can obtain all moments of $X$ from its MGF:
$$M_X(s) = \sum_{k=0}^{\infty} E[X^k] \frac{s^k}{k!},$$
$$E[X^k] = \left. \frac{d^k}{ds^k} M_X(s) \right|_{s=0}.$$
Example
Let $X \sim Exponential(\lambda)$. Find the MGF of $X$, $M_X(s)$, and all of its moments, $E[X^k]$.
Solution
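A sketch of the computation: for $s < \lambda$, the defining integral converges, and
$$M_X(s) = \int_0^{\infty} e^{sx} \lambda e^{-\lambda x}\, dx = \lambda \int_0^{\infty} e^{-(\lambda - s)x}\, dx = \frac{\lambda}{\lambda - s}.$$
In particular, $M_X(s)$ is finite on $[-a, a]$ for any $0 < a < \lambda$, so the MGF
exists. For $|s| < \lambda$, we can expand it as a geometric series:
$$M_X(s) = \frac{1}{1 - \frac{s}{\lambda}} = \sum_{k=0}^{\infty} \frac{s^k}{\lambda^k} = \sum_{k=0}^{\infty} \frac{k!}{\lambda^k} \cdot \frac{s^k}{k!}.$$
Reading off the coefficient of $\frac{s^k}{k!}$ gives $E[X^k] = \frac{k!}{\lambda^k}$.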

Example
Let $X \sim Poisson(\lambda)$. Find the MGF of $X$, $M_X(s)$.
Solution
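A sketch of the computation: summing over the Poisson PMF,
$$M_X(s) = \sum_{k=0}^{\infty} e^{sk} \frac{e^{-\lambda} \lambda^k}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(\lambda e^s)^k}{k!} = e^{-\lambda} e^{\lambda e^s} = e^{\lambda (e^s - 1)}, \quad \textrm{for all } s \in \mathbb{R}.$$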

As we discussed previously, the MGF uniquely determines the distribution. This is a
very useful fact. We will see examples of how we use it shortly. Right now, let's
state this fact more precisely as a theorem. We omit the proof here.
Theorem. Consider two random variables $X$ and $Y$. Suppose that there exists a
positive constant $c$ such that the MGFs of $X$ and $Y$ are finite and identical for
all values of $s$ in $[-c,c]$. Then,
$$F_X(t) = F_Y(t), \quad \textrm{for all } t \in \mathbb{R}.$$
Example
For a random variable $X$, we know that
$$M_X(s) = \frac{2}{2-s}, \quad \textrm{for } s \in (-2, 2).$$
Find the distribution of $X$.
Solution
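A sketch of the reasoning: comparing with the $Exponential(\lambda)$ MGF found above,
$$\frac{\lambda}{\lambda - s} = \frac{2}{2 - s} \quad \textrm{when } \lambda = 2.$$
Both MGFs are finite and identical on an interval around zero, for example $[-1, 1]$,
so by the theorem, $X \sim Exponential(2)$.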

Sum of Independent Random Variables:


Suppose $X_1$, $X_2$, ..., $X_n$ are $n$ independent random variables, and the random
variable $Y$ is defined as
$$Y = X_1 + X_2 + \cdots + X_n.$$
Then,
$$
\begin{aligned}
M_Y(s) &= E[e^{sY}] \\
&= E\left[e^{s(X_1 + X_2 + \cdots + X_n)}\right] \\
&= E\left[e^{sX_1} e^{sX_2} \cdots e^{sX_n}\right] \\
&= E[e^{sX_1}]\, E[e^{sX_2}] \cdots E[e^{sX_n}] \quad \textrm{(since the } X_i \textrm{'s are independent)} \\
&= M_{X_1}(s)\, M_{X_2}(s) \cdots M_{X_n}(s).
\end{aligned}
$$
If $X_1$, $X_2$, ..., $X_n$ are $n$ independent random variables, then
$$M_{X_1 + X_2 + \cdots + X_n}(s) = M_{X_1}(s)\, M_{X_2}(s) \cdots M_{X_n}(s).$$

Example
If $X \sim Binomial(n, p)$, find the MGF of $X$.
Solution
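A sketch of one standard approach: write $X = X_1 + X_2 + \cdots + X_n$, where the
$X_i$'s are independent $Bernoulli(p)$ random variables. Each Bernoulli MGF is
$$M_{X_i}(s) = E[e^{sX_i}] = (1-p) + p e^s,$$
so by the independence property above,
$$M_X(s) = \left(1 - p + p e^s\right)^n, \quad \textrm{for all } s \in \mathbb{R}.$$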

Example
Using MGFs, prove that if $X \sim Binomial(m, p)$ and $Y \sim Binomial(n, p)$ are
independent, then $X + Y \sim Binomial(m+n, p)$.
Solution
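A sketch of the argument: by independence and the binomial MGF found above,
$$M_{X+Y}(s) = M_X(s)\, M_Y(s) = \left(1 - p + p e^s\right)^m \left(1 - p + p e^s\right)^n = \left(1 - p + p e^s\right)^{m+n},$$
which is the MGF of a $Binomial(m+n, p)$ random variable. Since the MGF uniquely
determines the distribution, $X + Y \sim Binomial(m+n, p)$.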
