moment


Ernie Croot

October 23, 2008

1 Introduction

Given a random variable $X$, let $f(x)$ be its pdf. The quantity (in the continuous case; the discrete case is defined analogously)

$$E(X^k) = \int_{-\infty}^{\infty} x^k f(x)\, dx$$

is called the $k$th moment of $X$.

The moment generating function gives us a nice way of collecting together all the moments of a random variable $X$ into a single power series (i.e. Maclaurin series) in the variable $t$. It is defined to be

$$M_X(t) := E(e^{Xt}) = E\left(\sum_{k=0}^{\infty} \frac{X^k t^k}{k!}\right).$$

Thinking here of the $t$ as a constant, at least from the perspective of taking expectations, we use the linearity of expectation to conclude that

$$E(e^{Xt}) = \sum_{k=0}^{\infty} \frac{E(X^k)\, t^k}{k!}.$$

So, the coefficients of the powers of $t$ give us the moments divided by $k!$.

It is fairly easy to see that if we take the $k$th derivative of the moment generating function, and set $t = 0$, the result will be the $k$th moment. In symbols, this is

$$\left.\frac{d^k}{dt^k} M_X(t)\right|_{t=0} = k\text{th moment of } X.$$


Caution! It may be that the moment generating function does not exist, because some of the moments may be infinite (or may not have a definite value, due to integrability issues). Also, even if the moments are all finite and have definite values, the generating function may not converge for any value of $t$ other than $0$. All that said, these convergence issues and infinities rarely come up in the sort of problems we will consider in this course (and rarely come up in real-world problems).
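The derivative formula above is easy to sanity-check numerically. The Python sketch below recovers the first two moments of an exponential random variable from finite-difference derivatives of its MGF at $t = 0$; the exponential MGF $\lambda/(\lambda - t)$ is a standard fact that is not derived in these notes, and the rate $\lambda = 2$ is an arbitrary test value.

```python
import math

# MGF of an exponential random variable with rate lam (a standard fact,
# valid for t < lam): M(t) = lam / (lam - t).
# With lam = 2: E(X) = 1/lam = 0.5 and E(X^2) = 2/lam**2 = 0.5.
lam = 2.0

def M(t):
    return lam / (lam - t)

h = 1e-3
# First derivative of M at t = 0 via a central difference: gives E(X).
m1 = (M(h) - M(-h)) / (2 * h)
# Second derivative of M at t = 0: gives E(X^2).
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2

print(m1, m2)  # both close to 0.5
```

The step size $h$ trades truncation error against round-off; $h = 10^{-3}$ is plenty accurate here.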

Take home message. I expect you to know everything from this section.

2 Some examples

Example 1. We saw in class that if $X$ is a Bernoulli random variable with parameter $p$, then

$$M_X(t) = E(e^{Xt}) = e^{0\cdot t}(1-p) + e^{1\cdot t}p = pe^t + 1 - p.$$

Also, because $X^k = X$, it is clear that the $k$th moment of $X$, for $k \ge 1$, is the same as the first moment, which is just $p$; indeed, taking the $k$th derivative of $M_X(t)$ and setting $t = 0$ we find that

$$k\text{th moment} = \left. pe^t \right|_{t=0} = p.$$
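A quick way to convince yourself of this formula is to compare it against a Monte Carlo estimate of $E(e^{Xt})$. The Python sketch below does this; the values $p = 0.3$ and $t = 0.7$ are arbitrary.

```python
import math
import random

random.seed(0)
p, t = 0.3, 0.7   # arbitrary test values
n = 200_000

# Monte Carlo estimate of E(e^{Xt}) for X ~ Bernoulli(p):
# (random.random() < p) is True (i.e. 1) with probability p.
est = sum(math.exp(t * (random.random() < p)) for _ in range(n)) / n

# The closed form derived above.
exact = p * math.exp(t) + 1 - p
print(est, exact)  # the two agree to a few decimal places
```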

Example 2. Let us compute the moment generating function for a normal random variable having variance $\sigma^2$ and mean $\mu = 0$. Note that the pdf for such a random variable is just

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-x^2/2\sigma^2}.$$

So, we have that

$$M_X(t) = E(e^{Xt}) = \int_{-\infty}^{\infty} e^{xt}\, \frac{1}{\sigma\sqrt{2\pi}}\, e^{-x^2/2\sigma^2}\, dx = \int_{-\infty}^{\infty} \frac{e^{-(1/2\sigma^2)(x^2 - 2\sigma^2 tx)}}{\sigma\sqrt{2\pi}}\, dx.$$


Now we complete $x^2 - 2\sigma^2 tx$ to a square by adding to it (and then subtracting) $\sigma^4 t^2$, and so

$$M_X(t) = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(1/2\sigma^2)\left((x - \sigma^2 t)^2 - \sigma^4 t^2\right)}\, dx = e^{\sigma^2 t^2/2} \int_{-\infty}^{\infty} \frac{e^{-(x - \sigma^2 t)^2/2\sigma^2}}{\sigma\sqrt{2\pi}}\, dx.$$

Now we recognize that the last integral here equals $1$, since it is the integral of the pdf for the normal random variable $N(\sigma^2 t, \sigma^2)$ over the full interval $(-\infty, \infty)$. So,

$$M_X(t) = e^{\sigma^2 t^2/2}.$$
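This closed form can also be checked by simulation: averaging $e^{Xt}$ over many normal samples should reproduce $e^{\sigma^2 t^2/2}$. A Python sketch, with $\sigma = 1.5$ and $t = 0.4$ as arbitrary test values:

```python
import math
import random

random.seed(1)
sigma, t = 1.5, 0.4   # arbitrary test values
n = 200_000

# Monte Carlo estimate of E(e^{Xt}) for X ~ N(0, sigma^2).
est = sum(math.exp(t * random.gauss(0.0, sigma)) for _ in range(n)) / n

# The closed form e^{sigma^2 t^2 / 2} derived above.
exact = math.exp(sigma**2 * t**2 / 2)
print(est, exact)
```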

Just to make sure you understand how moment generating functions work,

try the following two example problems.

Problem 1. Compute the moment generating function for the random variable $X$ having uniform distribution on the interval $[0, 1]$.

Problem 2. Based on your answer in Problem 1, compute the fourth moment of $X$, i.e. $E(X^4)$.

Take home message. I expect you to know how to compute the moment generating function of some basic random variables, like those with Bernoulli and uniform distribution. I do not expect you to know how to derive the MGF for normal random variables for the purposes of solving a problem on an exam. Though, I may give you the MGF of some random variable on an exam, and then ask you to compute moments of that r.v.

3 From moment generating functions to distributions

A key and profoundly useful fact about moment generating functions is the

following.

Key Fact. Suppose that $X$ and $Y$ are two random variables having moment generating functions $M_X(t)$ and $M_Y(t)$ that exist for all $t$ in some interval $[-\delta, \delta]$. Then, if

$$M_X(t) = M_Y(t), \quad \text{for all } t \in [-\delta, \delta],$$

we must have that $X$ and $Y$ have the same cumulative distribution; that is,

$$P(X \le a) = P(Y \le a), \quad \text{for all } a \in \mathbb{R}.$$

This beautiful fact can be exploited to pin down the distribution of the sum of various types of independent random variables, due to the fact that the exponential in the definition of MGFs allows us to transform sums of random variables into products of expectations. More specifically, we have the following immensely useful reproductive property of the moment generating function.

Claim. Suppose that $X_1, \ldots, X_k$ are independent random variables. Then,

$$
\begin{aligned}
M_{X_1 + \cdots + X_k}(t) &= E\left(e^{(X_1 + \cdots + X_k)t}\right) \\
&= E\left(e^{X_1 t} e^{X_2 t} \cdots e^{X_k t}\right) \\
&= E\left(e^{X_1 t}\right) \cdots E\left(e^{X_k t}\right) \\
&= M_{X_1}(t) \cdots M_{X_k}(t).
\end{aligned}
$$

We note that we used the fact that the $X_i$s are independent when we rewrote the expectation of a product of exponentials as a product of expectations. This is just the familiar fact that if $Z_1, \ldots, Z_k$ are independent, then

$$E(Z_1 \cdots Z_k) = E(Z_1) \cdots E(Z_k).$$
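For a concrete instance of the reproductive property, take $X_1, X_2$ i.i.d. Bernoulli($p$): enumerating the three possible values of $S = X_1 + X_2$ gives its MGF directly, and it matches the product $(pe^t + 1 - p)^2$ exactly. A Python sketch, with $p$ and $t$ chosen arbitrarily:

```python
import math

p, t = 0.3, 0.5   # arbitrary test values

# MGF of S = X1 + X2 (X1, X2 i.i.d. Bernoulli(p)) by direct enumeration:
# P(S=0) = (1-p)^2, P(S=1) = 2p(1-p), P(S=2) = p^2.
lhs = (1 - p)**2 + 2 * p * (1 - p) * math.exp(t) + p**2 * math.exp(2 * t)

# Product of the individual MGFs, as the claim predicts.
rhs = (p * math.exp(t) + 1 - p) ** 2
print(lhs, rhs)  # identical up to floating-point rounding
```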

Take home message. Know everything from this section.

4 An application to sums of independent normal random variables

Using the moment generating function we computed for $N(0, \sigma^2)$ in the previous section, we now show the following.

Claim. Suppose that $X_1, \ldots, X_k$ are independent normal random variables with means $\mu_1, \ldots, \mu_k$ and variances $\sigma_1^2, \ldots, \sigma_k^2$, respectively. Then, $Y = X_1 + \cdots + X_k$ is normal with mean $\mu_1 + \cdots + \mu_k$ and variance $\sigma_1^2 + \cdots + \sigma_k^2$.

Basically, to show that this is the case, we just compute the moment generating function for $Y$, and show that it is the same as that of

$$N(\mu_1 + \cdots + \mu_k,\ \sigma_1^2 + \cdots + \sigma_k^2).$$

We will do this only for the case $\mu_1 = \cdots = \mu_k = 0$ (the more general case is really no more difficult!).

Combining together various facts from previous sections, we find that

$$M_Y(t) = M_{X_1}(t) \cdots M_{X_k}(t) = e^{\sigma_1^2 t^2/2} \cdots e^{\sigma_k^2 t^2/2} = e^{(\sigma_1^2 + \cdots + \sigma_k^2)t^2/2},$$

which is clearly the moment generating function for

$$N(0,\ \sigma_1^2 + \cdots + \sigma_k^2),$$

so we are done (note that the moment generating function exists for all $t$, not merely for $t \in [-\delta, \delta]$).
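The conclusion can be checked by simulation as well: a Monte Carlo average of $e^{Yt}$ for $Y = X_1 + X_2$ should match the MGF of $N(0, \sigma_1^2 + \sigma_2^2)$. A Python sketch, with arbitrary test values for the standard deviations and $t$:

```python
import math
import random

random.seed(2)
s1, s2, t = 1.0, 2.0, 0.3   # arbitrary standard deviations and t
n = 200_000

# Monte Carlo estimate of the MGF of Y = X1 + X2, where
# X1 ~ N(0, s1^2) and X2 ~ N(0, s2^2) are independent.
est = sum(math.exp(t * (random.gauss(0.0, s1) + random.gauss(0.0, s2)))
          for _ in range(n)) / n

# MGF of N(0, s1^2 + s2^2), as the claim predicts.
exact = math.exp((s1**2 + s2**2) * t**2 / 2)
print(est, exact)
```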

Problem 3. See if you can figure out how to handle the more general case where the $X_i$s have means $\mu_i$ that may or may not equal $0$.

Take home message. Know everything from this section.
