
MIE1605 Lecture 1

Course Overview

SYLLABUS

Office Hours:

◦ Monday: 04:30-06:00 PM

◦ By Appointment

Course HomePage:

◦ blackboard

◦ I will post outlines of lecture slides (if any) there before class.

◦ Homework assignments and grades will be posted there

Textbook: Adventures in Stochastic Processes, S. Resnick

Probability and Random Processes, G. Grimmett, D. Stirzaker

Grading:

◦ 5× Homework 20%

◦ 1× Midterm 40%

◦ 1× Final 40%

M Cevik MIE1605 - Probability Review 2 / 96

TENTATIVE SCHEDULE

Dates Topic

12/09 - 19/09 Probability Review
26/09 - 10/10 Discrete Time Markov Chains
17/10 - 24/10 Poisson Processes
30/10 Midterm Exam, 5.15 - 7.30 PM
31/10 - 07/11 Continuous Time Markov Chains
14/11 - 21/11 Renewal Processes
28/11 - 05/12 Brownian Motion and Martingales
12/12 Final Exam, 9.00 - 11.30 PM

OUTLINE

1 Course Overview

2 Probability Basics

Introduction

Random variables

Sum of Independent RVs

Functions of RVs

3 Limit theorems

4 Generating Functions

Moment Generating Functions

Probability Generating Functions

5 Random Sums

6 Simple Branching Process

7 Simple Random Walk

Probability Basics Introduction

PROBABILITY SPACE

A function µ defined on a σ-field Σ of subsets of a set X and taking values in the extended real number line is called a measure if it satisfies the following properties:

(1) Non-negativity: For all E ∈ Σ: µ(E) ≥ 0.
(2) Null empty set: µ(∅) = 0.
(3) Countable additivity: For all countable collections {E_i}_{i=1}^∞ of pairwise disjoint sets in Σ: µ(∪_{k=1}^∞ E_k) = Σ_{k=1}^∞ µ(E_k).

A probability measure is a measure with total measure one, i.e., µ(X) = 1. A probability space consists of:

(1) A sample space, Ω, which is the set of all possible outcomes.
(2) A set of events, F, where each event is a set containing zero or more outcomes.
(3) A probability measure, P, specifying the assignment of probabilities to the events.

Ex: Flip a coin until the first head appears. A typical outcome of this experiment is ω_i = T...TH (i − 1 tails followed by a head), so

Ω = {H, TH, TTH, TTTH, ...} = {ω_1, ω_2, ...}

A collection F of subsets of Ω is called a σ-field (denoted by F ⊆ 2^Ω) if

◦ ∅ = Ω̄ ∈ F, Ω ∈ F
◦ if A_1, A_2, ... ∈ F, then ∪_{i=1}^∞ A_i ∈ F ⇒ closed under countable union
◦ if A ∈ F, then Ā = Ω\A ∈ F ⇒ closed under complements

Ex: F = {∅, {1, 2, 3, 4}, {1, 2}, {3, 4}, {1, 3}, {2, 4}} is not a σ-field, because {1, 2} ∪ {1, 3} = {1, 2, 3} is not in F.

A probability measure P on (Ω, F) satisfies:

(a) P(∅) = 0, P(Ω) = 1
(b) if A_1, A_2, ... ∈ F, where A_i ∩ A_j = ∅ ∀ i ≠ j, then P(∪_{i=1}^∞ A_i) = Σ_{i=1}^∞ P(A_i)

RANDOM VARIABLES

A random variable (RV) X is a measurable function from the sample space Ω into a target space T (e.g., T = R). Let X be a (discrete) RV whose range is {0, 1, 2, . . .}. Then,

P(X = k) = p_k, k = 0, 1, 2, . . .

If X is a proper RV, then Σ_{k=0}^∞ p_k = 1, p_k ≥ 0 ∀k

Examples:

◦ Discrete RV
◦ Continuous RV
◦ Mixed RV

DISTRIBUTION FUNCTIONS

A continuous RV X has pdf f_X, where f_X is a non-negative Lebesgue-integrable function with

P(a ≤ X ≤ b) = ∫_a^b f_X(x) dx

A discrete RV X with range A has pmf f_X : A → [0, 1] defined as

f_X(x) = P(X = x) = P({s ∈ S : X(s) = x})

cdf: F_X(x) = P(X ≤ x), P(a < X ≤ b) = F_X(b) − F_X(a)

◦ Continuous RV: F_X(x) = ∫_{−∞}^{x} f_X(t) dt
◦ Discrete RV: F(x) = Σ_{x_i ≤ x} P(X = x_i) = Σ_{x_i ≤ x} p(x_i)

BAYES' THEOREM

P(A|B) = P(A ∩ B)/P(B) = P(B|A)P(A)/P(B)

P(A_i|B) = P(B|A_i)P(A_i) / Σ_j P(B|A_j)P(A_j), where P(B) = Σ_j P(B|A_j)P(A_j)

Example:
In a city, 51% of the adults are males. Also, 9.5% of males smoke cigars, whereas 1.7% of females smoke cigars. If a randomly selected adult smokes cigars, what's the probability that the selected subject is a male?
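The slide leaves the computation to the reader; a quick Bayes' rule check in Python (the 0.853 value is computed here, not taken from the slide):

```python
# P(male | cigar) via Bayes' rule, using the slide's numbers:
# P(M) = 0.51, P(cigar | M) = 0.095, P(cigar | F) = 0.017
p_m, p_f = 0.51, 0.49
p_c_m, p_c_f = 0.095, 0.017

p_c = p_m * p_c_m + p_f * p_c_f      # total probability of smoking cigars
p_m_c = p_m * p_c_m / p_c            # posterior P(male | cigar)
print(round(p_m_c, 3))               # → 0.853
```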

JOINT DISTRIBUTION

Joint pdf: f(x, y), x ∈ A, y ∈ B

F_X(x) = P(X ≤ x)
F(x, y) = P(X ≤ x, Y ≤ y)

⇒ Similar definition for more than two RVs.

P(X ∈ A, Y ∈ B) = ∫_B ∫_A f(x, y) dx dy

MARGINAL DISTRIBUTION

P(X ∈ A) = P(X ∈ A, Y ∈ R) = ∫_A ( ∫_{−∞}^{∞} f(x, y) dy ) dx

⇒ f_X(x) = ∫_{−∞}^{∞} f(x, y) dy

Discrete case: p_X(x) = Σ_{y∈R} p(x, y), p_Y(y) = Σ_{x∈R} p(x, y)

Ex: Find the marginal densities of

f(x, y) = e^{−(x+y)}, if x ≥ 0, y ≥ 0; 0, otherwise

CONDITIONAL DISTRIBUTION

Discrete: p_{X|Y}(x|y) = P(X = x|Y = y) = P(X = x, Y = y)/P(Y = y) = p(x, y)/p_Y(y)

Continuous: f_{X|Y}(x|y) = f(x, y)/f_Y(y)

Ex: Suppose p(1, 1) = 0.5, p(1, 2) = 0.1, p(2, 1) = 0.1, p(2, 2) = 0.3.
Find the pmf of X given Y = 1.
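A short check of the conditional-pmf example (the answer is computed here: P(X=1|Y=1) = 5/6, P(X=2|Y=1) = 1/6):

```python
# Joint pmf from the example
p = {(1, 1): 0.5, (1, 2): 0.1, (2, 1): 0.1, (2, 2): 0.3}

p_y1 = sum(v for (x, y), v in p.items() if y == 1)    # p_Y(1) = 0.6
p_x_given_y1 = {x: p[(x, 1)] / p_y1 for x in (1, 2)}  # conditional pmf of X | Y=1
print(p_x_given_y1)   # {1: 0.833..., 2: 0.166...}
```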

INDEPENDENCE

Events A and B are independent if P(A ∩ B) = P(A)P(B). If A ⊥ B ⇒ A ⊥ B̄.

RVs X and Y are independent iff:

◦ F_{X,Y}(x, y) = F_X(x)F_Y(y)
◦ f_{X,Y}(x, y) = f_X(x)f_Y(y)

Ex: Let X = # successes and Y = # failures in n trials. X and Y are dependent, since X + Y = n.

Ex: Flip 2 fair coins.

X = sum of the 2 flips
Y = difference of the 2 flips

Outcome X Y Prob
00 0 0 0.25
01 1 −1 0.25
10 1 1 0.25
11 2 0 0.25

p(2, 1) = 0 ≠ p_X(2) p_Y(1) = 0.25 · 0.25 ⇒ X and Y are dependent.
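Enumerating the four outcomes confirms both claims made about this pair on these slides: X and Y are dependent (p(2, 1) = 0 ≠ p_X(2)p_Y(1)) yet uncorrelated (Cov(X, Y) = 0):

```python
from itertools import product

outcomes = list(product([0, 1], repeat=2))      # two fair coin flips, each prob 0.25
X = {o: o[0] + o[1] for o in outcomes}          # sum of the flips
Y = {o: o[0] - o[1] for o in outcomes}          # difference of the flips

pX2 = sum(0.25 for o in outcomes if X[o] == 2)                  # 0.25
pY1 = sum(0.25 for o in outcomes if Y[o] == 1)                  # 0.25
pXY = sum(0.25 for o in outcomes if X[o] == 2 and Y[o] == 1)    # 0.0 → dependent

EX = sum(0.25 * X[o] for o in outcomes)
EY = sum(0.25 * Y[o] for o in outcomes)
EXY = sum(0.25 * X[o] * Y[o] for o in outcomes)
cov = EXY - EX * EY                             # 0.0 → uncorrelated
print(pXY, pX2 * pY1, cov)
```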

CONDITIONAL INDEPENDENCE

Events R and B are conditionally independent given Y
⇐⇒ P(R ∩ B | Y) = P(R | Y)P(B | Y)
or equivalently ⇐⇒ P(R | B ∩ Y) = P(R | Y)

R and B being independent does not imply that R and B are conditionally independent. Likewise, conditional independence of R and B does not imply that they are independent.

Ex:
◦ A: First coin toss is H
◦ B: Second coin toss is H
◦ C: Coin is biased

⇒ A and B are dependent.
⇒ A and B are conditionally independent given C.

P(A ∩ B ∩ C) = ?

MEAN (EXPECTATION)

◦ E[X] = Σ_{x∈R} x p(x), E[g(X)] = Σ_{x∈R} g(x) p(x) (discrete RV)
◦ E[X] = ∫_R x f(x) dx, E[g(X)] = ∫_R g(x) f(x) dx (continuous RV)

Joint distributions:

E[aX] = ∫_R ∫_R a x f(x, y) dy dx = a ∫_R x f_X(x) dx = a E[X]

E[aX + bY] = ∫_R ∫_R (ax + by) f(x, y) dy dx = a E[X] + b E[Y]

(with double sums Σ Σ in place of the integrals in the discrete case)

Moments of a RV:

The rth moment of X is E[X^r]
The rth central moment of X is E[(X − E[X])^r]

Tail-sum formulas for the expectation of a non-negative RV:

◦ Discrete RV: E[X] = Σ_{k=0}^∞ P(X > k)
◦ Continuous RV: E[X] = ∫_0^∞ P(X > k) dk

CONDITIONAL EXPECTATION

◦ E[X|Y = y] = Σ_{x∈R} x p_{X|Y}(x|y) (discrete RV)
◦ E[X|Y = y] = ∫_{x∈R} x f_{X|Y}(x|y) dx (continuous RV)

E[X] = ∫_{−∞}^{∞} E[X|Y = y] f_Y(y) dy = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x f_{X|Y}(x|y) f_Y(y) dx dy

Ex: Consider a post office with two clerks. You enter at the moment when both are busy, but there is no one else waiting in line. You will enter service when either clerk becomes free. If service times for clerk i (for i = 1, 2) are exponential with rate λi, find E[T], where T is the total amount of time that you will spend in the post office (including the time you spend in service). Note that the two servers are not identical.

P.S. X ∼ Expo(λ) ⇒ f_X(x) = λe^{−λx}, x ≥ 0, F_X(x) = 1 − e^{−λx}, x ≥ 0

Solution:

E[T] = E[T|R1 < R2]P(R1 < R2) + E[T|R2 < R1]P(R2 < R1) = · · · = 3/(λ1 + λ2)
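A Monte Carlo sanity check of E[T] = 3/(λ1 + λ2) (a sketch; the rates λ1 = 1, λ2 = 2 are arbitrary choices, giving E[T] = 1):

```python
import random

def mean_time_in_post_office(lam1, lam2, n=200_000, seed=42):
    """Simulate: wait for the first clerk to free up, then get served by that clerk."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        r1 = rng.expovariate(lam1)       # remaining service time of clerk 1
        r2 = rng.expovariate(lam2)       # remaining service time of clerk 2
        if r1 < r2:                      # clerk 1 frees up first → served by clerk 1
            total += r1 + rng.expovariate(lam1)
        else:                            # clerk 2 frees up first → served by clerk 2
            total += r2 + rng.expovariate(lam2)
    return total / n

est = mean_time_in_post_office(1.0, 2.0)
print(est, 3 / (1.0 + 2.0))              # both ≈ 1.0
```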

COVARIANCE

Cov(X, Y) = E[(X − E[X])(Y − E[Y])]

Expanding the product yields Cov(X, Y) = ... = E[XY] − E[X]E[Y]

Var(X) = Cov(X, X) = E[X²] − (E[X])²

If X and Y are independent:

E[XY] = ∫_R ∫_R xy f(x, y) dy dx = ∫_R ∫_R xy f_X(x) f_Y(y) dy dx = E[X]E[Y]

⇒ Cov(X, Y) = 0.

COVARIANCE PROPERTIES

Cov(cX, Y) = c Cov(X, Y)

Var(cX) = c² Var(X)

Cov(Σ_i X_i, Σ_j Y_j) = Σ_i Σ_j Cov(X_i, Y_j)

Var(Σ_{i=1}^n X_i) = Σ_{i=1}^n Var(X_i) + 2 Σ_{i=1}^n Σ_{j<i} Cov(X_i, X_j)

Var(Σ_{i=1}^n X_i) = Σ_{i=1}^n Var(X_i) → if X_i's independent

Ex: Flip a fair coin 3 times. Let X be the number of heads in the first 2 flips and let Y be the number of heads in the last 2 flips (so there is overlap on the middle flip). Compute Cov(X, Y).

Solution: 1/4
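Exact enumeration over the 8 equally likely outcomes reproduces the 1/4:

```python
from itertools import product

flips = list(product([0, 1], repeat=3))                # 8 equally likely outcomes
EX  = sum(a + b for a, b, c in flips) / 8              # heads in first 2 flips
EY  = sum(b + c for a, b, c in flips) / 8              # heads in last 2 flips
EXY = sum((a + b) * (b + c) for a, b, c in flips) / 8
cov = EXY - EX * EY
print(cov)   # → 0.25
```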

STANDARD DEVIATION / CV

Standard deviation of the RV X is σ(X) = √Var(X)

⇒ Standard deviation and variance are measures of dispersion about the mean

Coefficient of variation: CV(X) = σ(X)/E[X]

⇒ This is a way of normalizing the description of dispersion (it removes a scale factor)
⇒ Not always well-defined (what if E[X] = 0?)

CORRELATION

ρ(X, Y) = Cov(X, Y) / (σ(X)σ(Y))

Ex: A box contains red, white and black balls, with p1 and p2 the probabilities of drawing red and white. We draw balls from the box n times where at each draw we note the ball color and then replace it in the box. Let X1 and X2 be the number of red balls and white balls drawn, respectively. Find ρ(X1, X2).

Solution: −np1p2 / (√(np1(1 − p1)) √(np2(1 − p2)))

If Cov(X, Y) = 0, then X, Y are uncorrelated.

Beware! If X and Y are uncorrelated, this does not imply they are independent.

Ex:
◦ X = sum of 2 coin flips, and Y = difference of the 2 flips

Probability Basics Random variables

BINOMIAL RV

Bernoulli trials: Let X1, X2, ..., Xn be independent and identically distributed (iid) RVs with

X_i = 1 with prob. p, and 0 with prob. (1 − p).

⇒ X1, X2, ..., Xn are called Bernoulli trials
⇒ Σ_{i=1}^n X_i ∼ Bin(n, p).

◦ P(X = k) = C(n, k) p^k (1 − p)^{n−k}, k = 0, 1, 2, ..., n
◦ E[X] = np, Var(X) = Var(Σ_{i=1}^n X_i) = ... = np(1 − p)

GEOMETRIC RV

Let X be the number of failures in successive Bernoulli trials until the first success occurs.

◦ P(X = k) = (1 − p)^k p, k = 0, 1, 2, ...
◦ P(X ≤ k) = 1 − P(X > k) = 1 − (1 − p)^{k+1}, k = 0, 1, 2, ...
◦ E[X] = (1 − p)/p, Var(X) = (1 − p)/p²

Alternatively, let X be the trial number of the first success:

◦ P(X = k) = (1 − p)^{k−1} p, k = 1, 2, ...
◦ E[X] = 1/p, Var(X) = (1 − p)/p²

Exercises (for independent geometric X and Y):
◦ P(X < Y)?
◦ What's the distribution of min(X, Y)?

NEGATIVE BINOMIAL RV

Let Nr be the trial number of the rth success.

⇒ Nr is the sum of r geometric inter-arrival times (the rth arrival time).

◦ P(rth success on nth trial)
  = P(r − 1 successes in n − 1 trials) P(success on nth trial)
  = C(n − 1, r − 1) p^{r−1} (1 − p)^{n−r} p, n = r, r + 1, ...
◦ E[Nr] = r/p, Var(Nr) = r(1 − p)/p²

Let Kr be the number of failures before the rth success.

◦ P(k failures before rth success)
  = P(k failures in k + r − 1 trials) P(success on (k + r)th trial)
  = C(k + r − 1, k) p^{r−1} (1 − p)^k p, k = 0, 1, 2, ...
◦ E[Kr] = r/p − r, Var(Kr) = r(1 − p)/p²

POISSON RV

Let X be the number of events occurring in a fixed interval.

◦ P(X = k) = e^{−λ} λ^k / k!, k = 0, 1, 2, ...
◦ E[X] = λ
◦ Var(X) = λ

⇒ Poisson(np) approximates Binomial(n, p) when n is large and p is small.

UNIFORM RV (DISCRETE)

Parameters a, b ∈ Z, b ≥ a.

pmf: P(X = k) = 1/(b − a + 1), k ∈ {a, a + 1, ..., b − 1, b}

cdf: F(k; a, b) = (⌊k⌋ − a + 1)/(b − a + 1)

E[X] = (a + b)/2, Var(X) = ((b − a + 1)² − 1)/12

Ex: Roll a die. E[X] = 7/2, Var(X) = 35/12.

NORMAL (GAUSSIAN) RV

f(x) = (1/√(2πσ²)) exp{−(x − µ)²/(2σ²)}, x ∈ R

F(x) = (1/2)[1 + erf((x − µ)/(σ√2))], x ∈ R

⇒ If X ∼ N(µ, σ²), then X = µ + σZ, where Z ∼ N(0, 1)

pdf of standard normal distribution: φ(x) = (1/√(2π)) e^{−x²/2}
⇒ pdf of X ∼ N(µ, σ²): f_X(x) = (1/σ) φ((x − µ)/σ)

cdf of standard normal distribution: Φ(x) = (1/√(2π)) ∫_{−∞}^x e^{−t²/2} dt
⇒ cdf of X ∼ N(µ, σ²): F_X(x) = Φ((x − µ)/σ)

EXPONENTIAL RV

Let X ∼ Expo(λ1) and Y ∼ Expo(λ2) be independent.

P(X < Y) = ∫_0^∞ P(X < k | Y = k) f_Y(k) dk = ∫_0^∞ (1 − e^{−λ1 k}) λ2 e^{−λ2 k} dk = λ1/(λ1 + λ2)

P(min(X, Y) ≤ k) = 1 − P(X > k)P(Y > k) = 1 − e^{−(λ1+λ2)k}

⇒ min(X, Y) ∼ Expo(λ1 + λ2)
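Both facts are easy to check by simulation (λ1 = 1 and λ2 = 2 are arbitrary choices): P(X < Y) should approach λ1/(λ1 + λ2) = 1/3, and E[min(X, Y)] should approach 1/(λ1 + λ2) = 1/3:

```python
import random

rng = random.Random(0)
lam1, lam2, n = 1.0, 2.0, 200_000
wins, total_min = 0, 0.0
for _ in range(n):
    x = rng.expovariate(lam1)
    y = rng.expovariate(lam2)
    wins += x < y
    total_min += min(x, y)

print(wins / n)        # ≈ 1/3  (= λ1/(λ1+λ2))
print(total_min / n)   # ≈ 1/3  (mean of Expo(λ1+λ2))
```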

Ex: Patients A and B both need a kidney transplant. If she does not receive a new kidney, then A will die after an exponential time with rate µA, and B after an exponential time with rate µB. New kidneys arrive in accordance with a Poisson process having rate λ. It has been decided that the first kidney will go to A (or to B if B is alive and A is not at that time) and the next one to B (if still living).

(a) What is the probability that A obtains a new kidney?
(b) What is the probability that B obtains a new kidney?

Soln:
(a) λ/(λ + µA)
(b) λ(λ + µA) / ((λ + µB)(λ + µA + µB))
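A simulation sketch of the kidney example; with the arbitrary choice λ = µA = µB = 1 the formulas give P(A) = 1/2 and P(B) = 1 · 2/(2 · 3) = 1/3:

```python
import random

def kidney_probs(lam, muA, muB, n=200_000, seed=7):
    rng = random.Random(seed)
    a_gets = b_gets = 0
    for _ in range(n):
        t_k1 = rng.expovariate(lam)             # arrival time of the first kidney
        t_a  = rng.expovariate(muA)             # A's death time without a kidney
        t_b  = rng.expovariate(muB)             # B's death time without a kidney
        if t_k1 < t_a:                          # A still alive: first kidney to A
            a_gets += 1
            t_k2 = t_k1 + rng.expovariate(lam)  # second kidney is designated for B
            b_gets += t_k2 < t_b
        else:                                   # A died first: first kidney to B
            b_gets += t_k1 < t_b
    return a_gets / n, b_gets / n

pa, pb = kidney_probs(1.0, 1.0, 1.0)
print(pa, pb)   # ≈ 0.5, ≈ 0.333
```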

ERLANG RV

f(x; k, λ) = λ^k x^{k−1} e^{−λx} / (k − 1)!, 0 ≤ x < ∞

F(x) = 1 − e^{−λx} Σ_{n=0}^{k−1} (λx)^n / n!, 0 ≤ x < ∞

E[X] = k/λ, Var(X) = k/λ²

If X ∼ Erlang(k, λ) ⇒ aX ∼ Erlang(k, λ/a)

If X ∼ Erlang(k1, λ) and Y ∼ Erlang(k2, λ) are independent
⇒ X + Y ∼ Erlang(k1 + k2, λ)

GAMMA RV

f(x; α, β) = (β^α / Γ(α)) x^{α−1} e^{−βx}, 0 < x < ∞

F(x) = (1/Γ(α)) ∫_0^{βx} t^{α−1} e^{−t} dt, 0 < x < ∞

Γ(α) = ∫_0^∞ t^{α−1} e^{−t} dt

E[X] = α/β, Var(X) = α/β²

If X_i ∼ Gamma(α_i, β), i = 1, 2, ..., N, are independent
⇒ Σ_{i=1}^N X_i ∼ Gamma(Σ_{i=1}^N α_i, β)

If X ∼ Gamma(1, β) ⇒ X ∼ Expo(β)

If α is a positive integer, the gamma distribution reduces to the Erlang distribution. That is, if X ∼ Γ(α, λ) ⇒ X ∼ Erlang(α, λ)

BETA RV

f(x; α, β) = x^{α−1}(1 − x)^{β−1} / B(α, β), 0 ≤ x ≤ 1

F(x) = B(x; α, β) / B(α, β), 0 ≤ x ≤ 1

Beta functions: B(a, b) = ∫_0^1 t^{a−1}(1 − t)^{b−1} dt, B(x; a, b) = ∫_0^x t^{a−1}(1 − t)^{b−1} dt

E[X] = α/(α + β), Var(X) = αβ / ((α + β)²(α + β + 1))

If X ∼ Gamma(α, λ) and Y ∼ Gamma(β, λ) are independent
⇒ X/(X + Y) has a beta distribution with params. α and β.

Bayes' rule lets us update the distribution of a parameter θ given the evidence X:

p(θ|X) = p(θ) p(X|θ) / p(X)

p(θ|X): posterior probability
p(X|θ): likelihood function
p(θ): prior probability
p(X) = ∫ p(X|θ)p(θ) dθ: normalization constant

Ex: P(λ = …) = 0.3, P(λ = 0.03) = 0.2, P(λ = 0.05) = 0.4, P(λ = 0.1) = 0.1. Assume we observed 3 failures in 100 time units. Determine the posterior distribution of λ given the evidence using a Poisson likelihood function.

Ex: Let the prior be f(p) = (Γ(a + b)/(Γ(a)Γ(b))) p^{a−1}(1 − p)^{b−1}, 0 < p < 1. If we observe evidence of k failures in n trials, show that the posterior distn. is P ∼ Beta(a + k, b + n − k).
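The conjugacy claim can be verified numerically: the unnormalized product prior × likelihood equals the Beta(a + k, b + n − k) kernel on a grid, matching the slide's statement. The values a = 2, b = 3, n = 10, k = 4 are arbitrary:

```python
# Kernel of a Beta(a, b) density, with the normalizing constant dropped
def beta_kernel(p, a, b):
    return p ** (a - 1) * (1 - p) ** (b - 1)

a, b, n, k = 2, 3, 10, 4

grid = [i / 100 for i in range(1, 100)]
# posterior ∝ prior kernel × binomial likelihood kernel p^k (1-p)^(n-k)
posterior = [beta_kernel(p, a, b) * p ** k * (1 - p) ** (n - k) for p in grid]
target    = [beta_kernel(p, a + k, b + n - k) for p in grid]

ratios = [u / t for u, t in zip(posterior, target)]
print(max(ratios) - min(ratios))   # ≈ 0: the kernels coincide up to rounding
```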

UNIFORM RV (CONTINUOUS)

pdf:
f(x) = 1/(b − a), if x ∈ [a, b]; 0, otw

cdf:
F(x) = 0, if x < a; (x − a)/(b − a), if x ∈ [a, b]; 1, if x ≥ b

MEMORYLESS PROPERTY

P(X > m + n | X ≥ n) = P(X > m + n, X ≥ n) / P(X ≥ n)
                     = P(X > m + n) / P(X ≥ n)
                     = P(X > m)

Probability Basics Sum of Independent RVs

SUM OF INDEPENDENT DISCRETE RVS

If X1, ..., Xn are independent, what is the distribution of Σ_{i=1}^n X_i?

◦ P(X = i) = a_i, P(Y = i) = b_i, i = 0, 1, 2, ...
◦ {X + Y = n} = ∪_{i=0}^n {X = i, Y = n − i}
◦ For i ≠ j, {X = i, Y = n − i} and {X = j, Y = n − j} are mutually exclusive (or disjoint); they cannot happen at the same time.

P(X + Y = n) = Σ_{i=0}^n P(X = i, Y = n − i)

⇒ C_n = P(X + Y = n) = Σ_{i=0}^n a_i b_{n−i}

P(X + Y + Z = k) = Σ_{i=0}^k P(X + Y = i) P(Z = k − i), where P(X + Y = i) = c_i, P(Z = k − i) = d_{k−i}

For general Z = X + Y (X and Y need not be independent):

⇒ f_Z(z) = Σ_x f_X(x) f_{Y|X}(z − x | x) = Σ_y f_Y(y) f_{X|Y}(z − y | y)

Ex: If X ∼ Poisson(λ) and Y ∼ Poisson(µ) are independent, what's the distribution of X + Y?

Soln: X + Y ∼ Poisson(λ + µ)

Ex: If X ∼ Binom(n, p) and Y ∼ Binom(m, p) are independent, what is the distribution of X + Y?

Soln: X + Y ∼ Binom(n + m, p)

SUM OF INDEPENDENT CONTINUOUS RVS

P(X + Y ≤ a) = ∫_0^∞ P(X ≤ a − y) f_Y(y) dy : cdf of X + Y

f_{X+Y}(a) = (d/da) F_{X+Y}(a) = ∫_0^∞ f_X(a − y) f_Y(y) dy : pdf of X + Y

Ex: Let X and Y be independent Uniform(0, 1) RVs and Z = X + Y. Find f_Z(z).

Soln:
f_Z(z) = z, if 0 ≤ z ≤ 1; 2 − z, if 1 < z < 2; 0, otw

⇒ pdf of a triangular RV
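A quick simulation check of the triangular shape: its cdf is z²/2 for 0 ≤ z ≤ 1, so P(Z ≤ 0.5) = 0.5²/2 = 0.125:

```python
import random

rng = random.Random(3)
n = 200_000
# Z = X + Y for independent Uniform(0,1) X and Y
hits = sum((rng.random() + rng.random()) <= 0.5 for _ in range(n))
print(hits / n)   # ≈ 0.125 (triangular cdf: z²/2 on [0, 1])
```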

Probability Basics Functions of RVs

FUNCTIONS OF RVS

Given RVs X1, X2 with known joint density and functions g, h mapping R² to R, what is the joint density function of the pair Y1 = g(X1, X2), Y2 = h(X1, X2)?

First, the one-dimensional case. Let X be a continuous RV with pdf f_X and support I, where I = [a, b]. Let g : I → R be a continuous monotonic function with inverse function h : J → I, where J = g(I). Let Y = g(X). Then the pdf f_Y of Y satisfies

f_Y(y) = f_X(h(y)) · |h′(y)|, if y ∈ J; 0, otw.

Ex: Let X have density
f_X(x) = θ / x^{θ+1}, x > 1, θ > 0.
Find the density of Y = ln(X).
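Applying the formula above with g(x) = ln x, h(y) = e^y gives f_Y(y) = f_X(e^y) · e^y = θe^{−θy}, y > 0, i.e. Y ∼ Expo(θ). A simulation sketch (θ = 2 is an arbitrary choice), sampling X by inverting F_X(x) = 1 − x^{−θ}:

```python
import math
import random

rng = random.Random(5)
theta, n = 2.0, 200_000
# Inverse-cdf sampling: X = (1 - U)^(-1/θ), then Y = ln X
ys = [math.log((1 - rng.random()) ** (-1 / theta)) for _ in range(n)]
print(sum(ys) / n)   # ≈ 1/θ = 0.5, the Expo(θ) mean
```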

Let X1, X2 have joint density f_{X1,X2}(x1, x2) with support A = {(x1, x2) : f(x1, x2) > 0}. We are interested in RVs Y1 = g1(X1, X2) and Y2 = g2(X1, X2). The transformation y1 = g1(x1, x2), y2 = g2(x1, x2) is a one-to-one transformation of A onto B. The inverse transformation is x1 = g1⁻¹(y1, y2), x2 = g2⁻¹(y1, y2).

J = det | ∂x1/∂y1  ∂x2/∂y1 |
        | ∂x1/∂y2  ∂x2/∂y2 |
  = (∂x1/∂y1)(∂x2/∂y2) − (∂x2/∂y1)(∂x1/∂y2)

f_{Y1,Y2}(y1, y2) = |J| f_{X1,X2}(g1⁻¹(y1, y2), g2⁻¹(y1, y2)).

If (y1, y2) is not in the range of g, f_{Y1,Y2}(y1, y2) = 0.

Ex: If X and Y have joint density function f, find the density function of U = XY.

Ex: Find the joint density of Y1 = X1 + X2, Y2 = X1/X2. Show they are indep.

Limit theorems

MODES OF CONVERGENCE

Definition: A sequence of real numbers {α_n} is said to converge to a real number α if, for any ε > 0 there exists an integer N such that for all n > N:

|α_n − α| < ε

⇒ We express the convergence as α_n → α as n → ∞ or as lim_{n→∞} α_n = α.

Ex: For α_n = 1 + 1/n and any ε > 0, there exists N such that for all n > N: |α_n − 1| < ε, so lim_{n→∞} α_n = 1. (E.g., consider N = ⌈1/ε⌉.)

CONVERGENCE IN PROBABILITY

Definition: A sequence of RVs {X1, X2, ...} is said to converge in probability (or weakly) to a real number µ if for any ε > 0 and γ > 0 there exists an integer N such that for all n > N:

P(|X_n − µ| < ε) > 1 − γ.

⇒ We express the convergence as X_n →p µ or X_n − µ →p 0 as n → ∞.

Equivalent definition: {X_n} converges in probability (or weakly) to a real number µ if for any ε > 0:

lim_{n→∞} P(|X_n − µ| < ε) = 1

ALMOST SURE CONVERGENCE

Definition: A sequence of RVs {X1, X2, ...} is said to converge almost surely (or strongly) to a real number µ if for any ε > 0:

lim_{N→∞} P(sup_{n>N} |X_n − µ| < ε) = 1.

⇒ We express the convergence as X_n − µ →a.s. 0 as n → ∞.

Equivalent definition: {X_n} converges almost surely (or strongly) to µ if

P(lim_{n→∞} X_n = µ) = 1.

Alternative terminology:
X_n → X almost everywhere, X_n →a.e. X
X_n → X with probability 1, X_n →w.p.1 X

CONVERGENCE IN DISTRIBUTION

Let {X_n} be a sequence of RVs with the corresponding sequence of cdfs F_{X1}, F_{X2}, ..., so that for n = 1, 2, ..., F_{Xn}(x) = P(X_n ≤ x). Suppose that there exists a cdf F_X such that for all x at which F_X is continuous,

lim_{n→∞} F_{Xn}(x) = F_X(x).

Then {X_n} converges in distribution to the RV X with cdf F_X, denoted

X_n →d X

and F_X is the limiting distribution.

CONVERGENCE IN rTH MEAN

Definition: {X_n} converges in rth mean to RV X (or a real number µ), denoted X_n →r X, if lim_{n→∞} E[|X_n − X|^r] = 0.

r = 2: If lim_{n→∞} E[(X_n − X)²] = 0, then we write X_n →r=2 X.
That is, {X_n} converges to X in mean-square or in quadratic mean.

For r1 ≥ r2 ≥ 1: X_n →r=r1 X ⇒ X_n →r=r2 X.

Between the modes of convergence the following relationships hold: almost sure convergence implies convergence in probability, which implies convergence in distribution; convergence in rth mean also implies convergence in probability.

MARKOV'S INEQUALITY

For a non-negative RV X with density f and t > 0:

E[X] = ∫ x f(x) dx
     = ∫_{x≥t} x f(x) dx + ∫_{0≤x<t} x f(x) dx
     ≥ ∫_{x≥t} x f(x) dx
     ≥ ∫_{x≥t} t f(x) dx = t P(X ≥ t)

⇒ If t > 0, P(X ≥ t) ≤ E[X]/t

Scaling Markov's inequality: For t > 0

P(X ≥ t E[X]) ≤ (t E[X])^{−1} E[X] = 1/t

CHEBYSHEV'S INEQUALITY

Applying Markov's inequality to (X − E[X])²:

P((X − E[X])² ≥ ε²) ≤ Var(X)/ε², i.e., P(|X − E[X]| ≥ ε) ≤ Var(X)/ε²

Remember, Var(X) = E[(X − E[X])²]

One-sided version: P(X ≥ E[X] + t) ≤ σ²/(σ² + t²)

Ex: Suppose X has an unknown mean and unknown variance, but a known range, e.g., 10 ≤ X ≤ 30. For random samples of size 1000, give a lower bound for P(|X̄ − E[X]| ≤ 1).

Soln: P(|X̄ − E[X]| ≤ 1) ≥ 0.9
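The 0.9 comes from two facts: a RV confined to [10, 30] has Var(X) ≤ ((30 − 10)/2)² = 100 (Popoviciu's inequality), and Var(X̄) = Var(X)/1000; Chebyshev with ε = 1 then gives the bound:

```python
var_bound = ((30 - 10) / 2) ** 2   # Popoviciu: Var(X) ≤ (b - a)²/4 = 100
var_mean  = var_bound / 1000       # variance of the sample mean, n = 1000
lower     = 1 - var_mean / 1 ** 2  # Chebyshev: P(|X̄ - µ| ≤ 1) ≥ 1 - Var(X̄)/ε²
print(lower)                       # → 0.9
```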

Ex: Suppose E[X] = 10 and Var(X) = 15. What can we say about P(5 < X < 15)?

Soln: P(5 < X < 15) ≥ 2/5

STRONG LAW OF LARGE NUMBERS

Let X1, X2, X3, ... be a sequence of independent and identically distributed (i.i.d.) RVs, with E[X_i] = µ. Then, with probability 1,

X̄_n := (1/n) Σ_{i=1}^n X_i → µ as n → ∞

CENTRAL LIMIT THEOREM

Let X1, X2, X3, ... be a sequence of i.i.d. RVs, with mean µ and variance σ². Then,

Z_n := (X̄_n − µ)/(σ/√n) →d N(0, 1) as n → ∞

That is,

P(Z_n ≤ z) → Φ(z) as n → ∞, ∀z

Remarks:
◦ If n is large, then X̄_n ≈ Nor(µ, σ²/n)
◦ X_i's need not be normally distributed
◦ Usually n ≥ 30 for better approximations (fewer observations needed when X_i's are from a symmetric distribution)

LIMIT THEOREMS

The CLT implies the weak law of large numbers. Namely, if X1, X2, ... are iid with mean µ and variance σ², then, for any ε > 0,

P(|(X1 + X2 + ... + Xn)/n − µ| > ε) → 0 as n → ∞

Ex: Let X1, ..., X10 be iid RVs, each being uniformly distributed over (0, 1). Estimate P(Σ_{i=1}^{10} X_i > 7).

Soln: P(Σ_{i=1}^{10} X_i > 7) ≈ 1 − Φ(2.2) = 0.0139
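The quality of the normal approximation here can be checked against the exact Irwin–Hall tail (P(S > 7) = P(S < 3) by the symmetry of the sum about 5):

```python
from math import comb, erf, factorial, sqrt

# Exact tail of the sum S of 10 iid Uniform(0,1) RVs (Irwin-Hall distribution)
exact = sum((-1) ** j * comb(10, j) * (3 - j) ** 10 for j in range(3)) / factorial(10)

# CLT approximation: S ≈ N(5, 10/12)
z = (7 - 5) / sqrt(10 / 12)
approx = 1 - 0.5 * (1 + erf(z / sqrt(2)))   # 1 - Φ(z)

print(exact, approx)   # ≈ 0.0135 vs ≈ 0.0142
```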

Ex: A team plays 100 independent baseball games, each of which they have probability 0.8 of winning. What's the probability that they win at least 90?

Soln: Let Y be the number of wins; E[Y] = 80, Var(Y) = 16. With the continuity correction, P(Y ≥ 90) = P(Y > 89.5) ≈ 1 − Φ(2.375) = 0.0088.

Generating Functions

GENERATING FUNCTIONS

Fourier transform: E[e^{iXt}] = ∫_{x∈R} e^{ixt} f_X(x) dx

Laplace transform: E[e^{−Xt}] = ∫_{x∈R} e^{−xt} f_X(x) dx

Moment generating function: E[e^{Xt}] = ∫_{x∈R} e^{xt} f_X(x) dx

Probability generating function (discrete X): E[s^X] = Σ_x s^x f_X(x)

Generating Functions Moment Generating Functions

E[e^{Xt}] = ∫ e^{xt} f_X(x) dx = φ_X(t)

φ_X(0) = 1

∂φ_X(t)/∂t = E[(∂/∂t) e^{Xt}] = E[X e^{Xt}]

⇒ φ′_X(0) = E[X], φ″_X(0) = E[X²], etc.

Sum of RVs:

φ_{Σ_i X_i}(t) = E[e^{Σ_i X_i t}] = E[Π_i e^{X_i t}].

If X_i's are independent ⇒ E[Π_i e^{X_i t}] = Π_i E[e^{X_i t}] = Π_i φ_{X_i}(t)

Bernoulli distribution:
φ_X(t) = E[e^{Xt}] = pe^t + (1 − p)

Binomial distribution:
φ_X(t) = E[e^{Xt}] = E[e^{Σ_i X_i t}] = (E[e^{X_i t}])^n
       = (pe^t + (1 − p))^n → since X_i's are iid Bernoulli RVs.

Note that φ_X(t) gives a hint about the distribution of a RV. If we recognize something like (pe^t + (1 − p))^n, then we can say that it's a Binomial(n, p) RV.

Geometric distribution (P(X = x) = p(1 − p)^{x−1}, x = 1, 2, ...):
φ_X(t) = E[e^{Xt}] = Σ_{x=1}^∞ e^{xt} p(1 − p)^{x−1} = pe^t / (1 − e^t(1 − p))

Negative binomial (Nr = sum of r iid geometrics):
φ_{Nr}(t) = [φ_X(t)]^r = (pe^t / (1 − e^t(1 − p)))^r

Exponential distribution:
φ_X(t) = E[e^{Xt}] = λ/(λ − t), t < λ

Generating Functions Probability Generating Functions

E[s^X] = Σ_{k=0}^∞ s^k P(X = k) = P(s)

Ex: Poisson RV
P(s) = Σ_{k=0}^∞ s^k e^{−λ} λ^k / k! = e^{−λ} Σ_{k=0}^∞ (λs)^k / k! = e^{λ(s−1)}

Ex: Geometric RV (number of failures before the first success)
P(s) = Σ_{k=0}^∞ s^k (1 − p)^k p = p / (1 − s(1 − p)), (s < 1/(1 − p))

Recovering the pmf from the pgf:

P(0) = 0⁰p_0 + 0¹p_1 + 0²p_2 + ... = p_0

P′(s) = Σ_{k=1}^∞ k s^{k−1} p_k ⇒ P′(0) = p_1

P″(s) = Σ_{k=2}^∞ k(k − 1) s^{k−2} p_k ⇒ P″(0) = 2p_2

P⁽ⁿ⁾(s) = Σ_{k=n}^∞ k(k − 1)...(k − n + 1) s^{k−n} p_k ⇒ P⁽ⁿ⁾(0) = n! p_n

Then, p_k = P⁽ᵏ⁾(0)/k!, k = 0, 1, 2, ...

Moments from the pgf:

P′(s) = Σ_{k=1}^∞ k s^{k−1} p_k ⇒ P′(1) = Σ_{k=0}^∞ k p_k = E[X]

P″(s) = Σ_{k=2}^∞ k(k − 1) s^{k−2} p_k
⇒ P″(1) = Σ_{k=0}^∞ k(k − 1) p_k = E[X(X − 1)] = E[X²] − E[X]

P⁽ⁿ⁾(1) = E[X(X − 1)...(X − n + 1)]

Var(X) = E[X²] − (E[X])² = P″(1) + P′(1) − (P′(1))²

Ex: Geometric RV (q = 1 − p):

◦ P(s) = p/(1 − qs)
◦ P′(s) = qp/(1 − qs)²
⇒ P′(0) = (1 − p)p = P(X = 1)
⇒ P′(1) = q/p = E[X]
⇒ P″(s) = 2q²p/(1 − qs)³ ⇒ P″(0) = 2q²p
Then, P(X = 2) = (1/2!)P″(0) = (1 − p)²p

There is a one-to-one correspondence between the g.f. of a RV and its probability distribution. We can recover the probability distribution and all moments of a RV from the g.f.

Consider independent RVs X1, X2, ..., Xn, where X_i has g.f. P_{Xi}(s). We are interested in the g.f. of X1 + X2 + ... + Xn:

P_{X1+X2+...+Xn}(s) = E[s^{X1+X2+...+Xn}] = Π_{i=1}^n P_{Xi}(s)

Ex: X1 ∼ Poisson(λ1), X2 ∼ Poisson(λ2), independent:

P_{X1+X2}(s) = e^{−λ1(1−s)} e^{−λ2(1−s)} = e^{−(λ1+λ2)(1−s)}
⇒ X1 + X2 ∼ Poisson(λ1 + λ2)

Ex: Bernoulli RVs with P(X_i = 1) = p = 1 − P(X_i = 0):

◦ P_{Xi}(s) = s⁰q + s¹p = q + sp
◦ P_{X1+X2+...+Xn}(s) = Π_{i=1}^n (q + sp) = (q + sp)^n
◦ X1 + X2 + ... + Xn ∼ Binom(n, p)

Note that if X1, X2, ..., Xn are iid RVs, then
P_{Xi}(s) = P(s)
P_{X1+X2+...+Xn}(s) = (P(s))^n

Random Sums

RANDOM SUMS

Let X1, X2, X3, ... be iid (non-negative integer valued) RVs with P(X_i = k) = p_k, k = 0, 1, 2, ...
Let N be a non-negative integer valued RV which is independent of {X1, X2, ...} where P(N = k) = α_k, k ≥ 0. Define

◦ S_0 = 0
◦ S_n = Σ_{i=1}^n X_i, n = 1, 2, ...

P(S_N = j) = Σ_{k=0}^∞ P(S_N = j, N = k) = Σ_{k=0}^∞ P(S_N = j|N = k)P(N = k) = Σ_{k=0}^∞ P(S_k = j)P(N = k)

Ex: Consider people who are coming to a mall.
N: # of people coming. X_i: money spent by the ith customer.

E[S_N] = E[N]E[X1] — note this is not Σ_{i=1}^{E[N]} E[X_i], since E[N] may not be an integer!

◦ E[S_N] = E_N[E[S_N | N]] = Σ_{k=0}^∞ E[S_N | N = k] P(N = k)
         = Σ_{k=0}^∞ E[Σ_{i=1}^k X_i] α_k = Σ_{k=0}^∞ k E[X1] α_k = E[X1]E[N]

The pgf of S_N:

P_{S_N}(s) = Σ_{j=0}^∞ s^j P(S_N = j) = Σ_{j=0}^∞ s^j Σ_{k=0}^∞ P(S_k = j) α_k
           = Σ_{k=0}^∞ α_k Σ_{j=0}^∞ s^j P(S_k = j) = Σ_{k=0}^∞ α_k (P_{X1}(s))^k

Computing E[S_N] by using the g.f.:

P_{S_N}(s) = P_N(P_{X1}(s)) ⇒ ∂P_{S_N}(s)/∂s |_{s=1} = E[S_N]

∂P_{S_N}(s)/∂s = P′_{X1}(s) · P′_N(P_{X1}(s)).

P′_{X1}(s) |_{s=1} = E[X1],
P_{X1}(1) = 1 ⇒ P′_N(P_{X1}(1)) = P′_N(1) = E[N]
⇒ E[S_N] = E[X1]E[N]

Ex: N ∼ Poisson(λ), X_i ∼ Bernoulli(p). Calculate P_{S_N}(s).

◦ P_{S_N}(s) = P_N(P_{X1}(s)) = P_N(ps + q) = e^{−λ(1−(ps+q))} = e^{−λp(1−s)}
⇒ S_N ∼ Poisson(λp)
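The identity S_N ∼ Poisson(λp) is Poisson thinning; a simulation sketch (λ = 4, p = 0.3 are arbitrary, so λp = 1.2) checks both the mean and the variance, which must coincide for a Poisson RV:

```python
import math
import random

def poisson_sample(rng, lam):
    """Poisson sampling by multiplying uniforms (Knuth's method, for small λ)."""
    limit, k, prod = math.exp(-lam), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

rng = random.Random(11)
lam, p, n = 4.0, 0.3, 100_000
sums = []
for _ in range(n):
    N = poisson_sample(rng, lam)
    sums.append(sum(rng.random() < p for _ in range(N)))   # S_N = sum of N Bernoulli(p)

mean = sum(sums) / n
var = sum((s - mean) ** 2 for s in sums) / n
print(mean, var)   # both ≈ λp = 1.2
```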

Simple Branching Process

A simple branching process starts from a single progenitor, who forms generation zero. The progenitor splits into k offspring with probability p_k, and the offspring constitute the first generation. Each member of the first generation independently splits into a random number of offspring, again with the same mass function {p_k}. This process continues until extinction, if it occurs.

An example could be family generations; or, in queuing, a branch may be the number of arrivals while the 1st job is processed.

Let Z_{n,j} be the number of members of the nth generation that are offspring of the jth member of the (n − 1)st generation.

Z_0 = 1 → generation zero
Z_1 = Z_{1,1}
Z_2 = Z_{2,1} + Z_{2,2} + ... + Z_{2,Z_1}
Z_n = Z_{n,1} + Z_{n,2} + ... + Z_{n,Z_{n−1}} = Σ_{j=1}^{Z_{n−1}} Z_{n,j}

We use the generating functions to determine the pmf of Z_n:

P_{Zn}(s) = E[s^{Zn}]

Let P(s) = Σ_{k=0}^∞ s^k p_k, and E[s^{Z1}] = P_1(s) = P(s).

P_n(s) = P_{n−1}(P(s)) → from random sums of RVs
P_2(s) = P(P(s))
P_3(s) = P(P_2(s)) = P(P(P(s)))
⇒ P_n(s) = P(P_{n−1}(s))

Ex: Suppose each individual has at most one offspring: p_0 = q, p_1 = p, q = 1 − p.

P(s) = P_1(s) = q + ps

P_{n+1}(s) = (q + pq + p²q + ... + pⁿq) s⁰ + p^{n+1} s¹

P(Z_{n+1} = 0) = q + pq + p²q + ... + pⁿq
P(Z_{n+1} = 1) = p^{n+1}
P(Z_{n+1} = 2) = 0

lim_{n→∞} P(Z_{n+1} = 0) = q Σ_{i=0}^∞ pⁱ = q/(1 − p) = 1 ⇒ this family will go extinct!

The mean offspring number m is the key quantity for extinction.

Define m_n = E[Z_n]. Let m_1 = m = E[Z_1] = Σ_{k=0}^∞ k p_k

P′_2(s) = P′(P(s))P′(s) → chain rule for derivatives
P′_2(1) = P′(1)P′(1) = m²
P′_3(s) = P′(P_2(s))P′_2(s) ⇒ P′_3(1) = m³
⇒ P′_n(1) = m_n = mⁿ

Our purpose is to be able to characterize the probability of extinction of a branching process.

π_n = P(Z_n = 0) = P_n(0)
π = P{extinction}
π_n = P{the extinction occurs on generation n or before}

{extinction} = ∪_{n=1}^∞ {Z_n = 0}, and {Z_n = 0} ⊂ {Z_{n+1} = 0}

π = P(∪_{k=1}^∞ {Z_k = 0}) = P(lim_{n→∞} ∪_{k=1}^n {Z_k = 0}) = lim_{n→∞} P(Z_n = 0) = lim_{n→∞} π_n

Note that there are trivial cases such as:
if p_0 = 0 ⇒ π = 0; if p_0 = 1 ⇒ π = 1

Theorem

Suppose 0 < p_0 < 1.

◦ If m = E[Z_1] ≤ 1, then π = 1.
◦ If m > 1, then π < 1 is the smallest non-negative solution of s = P(s).

Ex: Consider an operator of a sales booth at a computer show that takes orders. Each order takes three minutes to fill. While each order is being filled, there is probability p_j that j more customers will arrive and join the line. Assume p_0 = 0.2, p_1 = 0.2, p_2 = 0.6. The operator cannot take a break until a service is completed and no one is waiting in line to order. If present conditions persist, what is the probability that the operator will ever take a break?
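With p_0 = 0.2, p_1 = 0.2, p_2 = 0.6 the mean offspring number is m = 0.2 + 2 · 0.6 = 1.4 > 1, so the break probability π is the smallest non-negative root of s = P(s) = 0.2 + 0.2s + 0.6s². Fixed-point iteration from s = 0 converges to it; the quadratic gives π = 1/3 exactly:

```python
def P(s):
    """Offspring pgf for p0 = 0.2, p1 = 0.2, p2 = 0.6."""
    return 0.2 + 0.2 * s + 0.6 * s ** 2

pi = 0.0
for _ in range(200):    # π_{n+1} = P(π_n) increases to the smallest root of s = P(s)
    pi = P(pi)
print(pi)               # ≈ 1/3
```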

Simple Random Walk

Let X1, X2, ... be iid steps with P(X1 = 1) = 1 − P(X1 = −1) = p.

The simple random walk is {S_n, n ≥ 0} with S_0 = 0 and
S_n = X1 + X2 + ... + X_n = S_{n−1} + X_n.

N = min{n : S_n = 1} → first passage time or hitting time.
N̂ = min{n : S_n = 0}, S_0 = −1 → starting point shifted to level −1.

N and N̂ are identically distributed RVs.
Also, N = 1 ⟺ X1 = 1, which happens w.p. p.

Let φ_n = P(N = n), n ≥ 0:

φ_0 = 0
φ_1 = p
φ_n = q Σ_{j=1}^{n−2} φ_j φ_{n−j−1}, n ≥ 2

Generating function: Φ(s) = Σ_{n=0}^∞ s^n φ_n

E[N] = ?
