
600: Lecture 22

Sums of independent random variables

Scott Sheffield

MIT

Summing two random variables

- Say we have independent random variables $X$ and $Y$ and know their density functions $f_X$ and $f_Y$.
- Now let's try to find $F_{X+Y}(a) = P\{X + Y \le a\}$.
- This is the integral over $\{(x, y) : x + y \le a\}$ of $f(x, y) = f_X(x) f_Y(y)$. Thus,
  $$P\{X + Y \le a\} = \int_{-\infty}^{\infty} \int_{-\infty}^{a-y} f_X(x) f_Y(y)\,dx\,dy = \int_{-\infty}^{\infty} F_X(a - y) f_Y(y)\,dy.$$
- Differentiating both sides gives
  $$f_{X+Y}(a) = \frac{d}{da} \int_{-\infty}^{\infty} F_X(a - y) f_Y(y)\,dy = \int_{-\infty}^{\infty} f_X(a - y) f_Y(y)\,dy.$$
- The latter formula makes some intuitive sense: we're integrating over the set of $(x, y)$ pairs that add up to $a$.
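The convolution formula can be checked numerically. Below is a minimal sketch (not from the lecture; names are illustrative) that approximates $\int f_X(a - y) f_Y(y)\,dy$ with a midpoint Riemann sum for two independent Exponential(1) variables, whose sum should have the gamma density $a e^{-a}$:

```python
import math

def convolve_density(f_x, f_y, a, lo=0.0, hi=20.0, n=20000):
    """Midpoint Riemann sum for f_{X+Y}(a) = integral of f_X(a - y) f_Y(y) dy."""
    dy = (hi - lo) / n
    total = 0.0
    for i in range(n):
        y = lo + (i + 0.5) * dy
        total += f_x(a - y) * f_y(y) * dy
    return total

# Two independent Exponential(1) variables: f(x) = e^{-x} for x >= 0, else 0.
def f(x):
    return math.exp(-x) if x >= 0 else 0.0

# Their sum should have the Gamma(1, 2) density a * e^{-a}.
a = 1.5
approx = convolve_density(f, f, a)
exact = a * math.exp(-a)
print(approx, exact)  # the two values agree closely
```

The integration window $[0, 20]$ is an arbitrary truncation; it works here because both densities vanish for negative arguments and decay exponentially.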

Independent identically distributed (i.i.d.)

- The abbreviation "i.i.d." stands for independent and identically distributed.
- It is actually one of the most important abbreviations in probability theory.
- Worth memorizing.

Summing i.i.d. uniform random variables

- Suppose $X$ and $Y$ are i.i.d. and uniform on $[0, 1]$, so that $f_X = f_Y = 1$ on $[0, 1]$.
- What is the probability density function of $X + Y$?
- $f_{X+Y}(a) = \int_{-\infty}^{\infty} f_X(a - y) f_Y(y)\,dy = \int_0^1 f_X(a - y)\,dy$, which is the length of $[0, 1] \cap [a - 1, a]$.
- That's $a$ when $a \in [0, 1]$, $2 - a$ when $a \in [1, 2]$, and $0$ otherwise.
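The interval-overlap computation can be written out directly. This small helper (illustrative, not from the slides) returns the length of $[0, 1] \cap [a - 1, a]$ and reproduces the triangular piecewise formula:

```python
def triangle_density(a):
    """Density of X + Y for i.i.d. Uniform[0,1]: length of [0,1] ∩ [a-1, a]."""
    lo = max(0.0, a - 1.0)
    hi = min(1.0, a)
    return max(0.0, hi - lo)

# Matches the piecewise formula: a on [0,1], 2 - a on [1,2], 0 otherwise.
print(triangle_density(0.3), triangle_density(1.6), triangle_density(2.5))
```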

Review: summing i.i.d. geometric random variables

- Suppose $X$ is geometric with parameter $p$, so $P\{X = k\} = (1-p)^{k-1} p$ for $k \ge 1$.
- What is the law of the sum $Z$ of $n$ independent copies of $X$?
- We can interpret $Z$ as the time slot where the $n$th head occurs in an i.i.d. sequence of $p$-coin tosses.
- So $Z$ is negative binomial $(n, p)$:
  $$P\{Z = k\} = \binom{k-1}{n-1} p^{n-1} (1-p)^{k-n} p.$$
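A quick simulation makes the coin-toss interpretation concrete. This sketch (function names are illustrative) sums $n$ geometric samples and compares the empirical frequency of one value of $k$ against the negative binomial formula:

```python
import math
import random

random.seed(0)

def geometric(p):
    """Number of p-coin tosses up to and including the first head."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

n, p, trials = 3, 0.5, 100_000
# Z = sum of n i.i.d. geometrics = the slot of the n-th head.
samples = [sum(geometric(p) for _ in range(n)) for _ in range(trials)]

def neg_binomial_pmf(k, n, p):
    """P{Z = k} = C(k-1, n-1) p^{n-1} (1-p)^{k-n} p."""
    return math.comb(k - 1, n - 1) * p ** (n - 1) * (1 - p) ** (k - n) * p

k = 5
empirical = samples.count(k) / trials
print(empirical, neg_binomial_pmf(k, n, p))  # both near 0.1875
```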

Summing i.i.d. exponential random variables

- Suppose $X_1, \ldots, X_n$ are i.i.d. exponential random variables with parameter $\lambda$. So $f_{X_i}(x) = \lambda e^{-\lambda x}$ on $[0, \infty)$ for all $1 \le i \le n$.
- What is the law of $Z = \sum_{i=1}^n X_i$?
- We claimed in an earlier lecture that $Z$ has a gamma distribution with parameters $(\lambda, n)$.
- So $f_Z(y) = \frac{\lambda e^{-\lambda y} (\lambda y)^{n-1}}{\Gamma(n)}$.
- We argued this point by taking limits of negative binomial distributions. Can we check it directly?
- By induction, it would suffice to show that a gamma $(\lambda, 1)$ plus an independent gamma $(\lambda, n)$ is a gamma $(\lambda, n + 1)$.
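A direct simulation check (a moment check only, not a proof — parameters chosen arbitrarily): the first two moments of $Z = \sum X_i$ should match those of a gamma $(\lambda, n)$ distribution, namely mean $n/\lambda$ and variance $n/\lambda^2$:

```python
import random

random.seed(1)
lam, n, trials = 2.0, 4, 100_000

# Z = sum of n i.i.d. Exponential(lam) variables; claimed law: Gamma(lam, n).
samples = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]

mean = sum(samples) / trials
var = sum((z - mean) ** 2 for z in samples) / trials

# Gamma(lam, n) has mean n/lam = 2.0 and variance n/lam^2 = 1.0.
print(mean, var)
```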

Summing independent gamma random variables

- Say $X$ is gamma $(\lambda, s)$, $Y$ is gamma $(\lambda, t)$, and $X$ and $Y$ are independent.
- Intuitively, $X$ is the amount of time till we see $s$ events, and $Y$ is the amount of subsequent time till we see $t$ more events.
- So $f_X(x) = \frac{\lambda e^{-\lambda x} (\lambda x)^{s-1}}{\Gamma(s)}$ and $f_Y(y) = \frac{\lambda e^{-\lambda y} (\lambda y)^{t-1}}{\Gamma(t)}$.
- Now $f_{X+Y}(a) = \int_{-\infty}^{\infty} f_X(a - y) f_Y(y)\,dy$.
- Up to an $a$-independent multiplicative constant, this is
  $$\int_0^a e^{-\lambda(a-y)} (a-y)^{s-1} e^{-\lambda y} y^{t-1}\,dy = e^{-\lambda a} \int_0^a (a-y)^{s-1} y^{t-1}\,dy.$$
- Substituting $y = ax$ turns the right side into $e^{-\lambda a} a^{s+t-1} \int_0^1 (1-x)^{s-1} x^{t-1}\,dx$.
- This is (up to a multiplicative constant) $e^{-\lambda a} a^{s+t-1}$. The constant must be such that the integral from $-\infty$ to $\infty$ is $1$. Conclude that $X + Y$ is gamma $(\lambda, s + t)$.
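The conclusion can be sanity-checked by simulation (again only a moment check): if $X + Y$ is gamma $(\lambda, s + t)$, its mean should be $(s+t)/\lambda$ and its variance $(s+t)/\lambda^2$. Note that Python's `random.gammavariate` takes a shape and a scale, so the lecture's rate parameter $\lambda$ enters as scale $1/\lambda$:

```python
import random

random.seed(2)
lam, s, t, trials = 1.5, 2.0, 3.0, 100_000

# gammavariate(shape, scale): the lecture's Gamma(lam, s) has scale 1/lam.
sums = [random.gammavariate(s, 1 / lam) + random.gammavariate(t, 1 / lam)
        for _ in range(trials)]

mean = sum(sums) / trials
var = sum((z - mean) ** 2 for z in sums) / trials

# If X + Y is Gamma(lam, s + t): mean (s+t)/lam ≈ 3.33, variance (s+t)/lam^2 ≈ 2.22.
print(mean, var)
```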

Summing two normal variables

- $X$ is normal with mean zero, variance $\sigma_1^2$; $Y$ is normal with mean zero, variance $\sigma_2^2$.
- $f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma_1} e^{-\frac{x^2}{2\sigma_1^2}}$ and $f_Y(y) = \frac{1}{\sqrt{2\pi}\,\sigma_2} e^{-\frac{y^2}{2\sigma_2^2}}$.
- We just need to compute $f_{X+Y}(a) = \int_{-\infty}^{\infty} f_X(a - y) f_Y(y)\,dy$.
- We could compute this directly.
- Or we could argue with a multi-dimensional bell curve picture that if $X$ and $Y$ have variance $1$ then $f_{\sigma_1 X + \sigma_2 Y}$ is the density of a normal random variable (and note that variances and expectations are additive).
- Or use the fact that if $A_i \in \{-1, 1\}$ are i.i.d. coin tosses then $\frac{1}{\sqrt{N}} \sum_{i=1}^{\sigma^2 N} A_i$ is approximately normal with variance $\sigma^2$ when $N$ is large.
- Generally: if independent random variables $X_j$ are normal $(\mu_j, \sigma_j^2)$ then $\sum_{j=1}^n X_j$ is normal $(\sum_{j=1}^n \mu_j, \sum_{j=1}^n \sigma_j^2)$.
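The additivity of means and variances is easy to observe numerically. This sketch (parameters chosen arbitrarily) sums two independent normals and checks that the sample mean and variance land near $\mu_1 + \mu_2$ and $\sigma_1^2 + \sigma_2^2$:

```python
import random

random.seed(3)
mu1, s1, mu2, s2, trials = 1.0, 2.0, -0.5, 1.5, 100_000

# random.gauss takes (mean, standard deviation); the variances are s1**2, s2**2.
sums = [random.gauss(mu1, s1) + random.gauss(mu2, s2) for _ in range(trials)]

mean = sum(sums) / trials
var = sum((z - mean) ** 2 for z in sums) / trials

# Expect mean mu1 + mu2 = 0.5 and variance s1^2 + s2^2 = 4 + 2.25 = 6.25.
print(mean, var)
```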

Other sums

- Sum of an independent binomial $(m, p)$ and binomial $(n, p)$?
- Yes, binomial $(m + n, p)$. Can be seen from the coin toss interpretation.
- Sum of independent Poisson $\lambda_1$ and Poisson $\lambda_2$?
- Yes, Poisson $\lambda_1 + \lambda_2$. Can be seen from the Poisson point process interpretation.
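The Poisson case can likewise be checked by simulation. The sampler below uses Knuth's multiplication method (an implementation choice, not from the lecture) and compares the empirical distribution of a sum of two independent Poissons with the Poisson $\lambda_1 + \lambda_2$ mass function:

```python
import math
import random

random.seed(4)

def poisson(lam):
    """Knuth's method: multiply uniforms until the product drops below e^{-lam}."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

lam1, lam2, trials = 2.0, 3.0, 50_000
sums = [poisson(lam1) + poisson(lam2) for _ in range(trials)]

k = 5
empirical = sums.count(k) / trials
exact = math.exp(-(lam1 + lam2)) * (lam1 + lam2) ** k / math.factorial(k)
print(empirical, exact)  # both near 0.175
```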
