
Tieming Ji

Fall 2012

1 / 35

Definition: A random variable X is continuous if it can take an uncountable number of values on the real line. Most often, the support set for a continuous random variable is an interval. For example, X = {x : 0 ≤ x ≤ 1}, X = {x : 0 < x < ∞}, X = {x : −∞ < x < ∞}, etc.

A continuous random variable X can describe height, time, weight, etc., which take continuous real values. For example, suppose the city bus can arrive at any time between 8 and 9 am. Let X denote the time the bus comes; then X can take any real value in X = {x : 8 < x < 9}.

2 / 35

Compared to the discrete case, we have:

Definition: (A first definition of pdf for continuous r.v.s) A function f is a probability density function of a continuous random variable if and only if

f(x) ≥ 0 for all x ∈ ℝ, and
∫_{−∞}^{∞} f(x) dx = 1.

In the discrete case we use summation Σ; since a continuous X can take an uncountable number of values, we use integration ∫.

3 / 35

Example 1. f(x) = x², for −1 ≤ x ≤ 1; 0 otherwise. Is f(x) a density function?

Solution:
1. For all x ∈ ℝ, f(x) ≥ 0;
2. ∫_{−∞}^{∞} f(x) dx = ∫_{−1}^{1} x² dx = (1/3)x³ |_{−1}^{1} = 2/3 ≠ 1.

Since f satisfies the first condition but not the second, f is not a density function.

4 / 35
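The two pdf conditions can also be checked numerically. Below is a minimal sketch (not part of the original notes): it tests nonnegativity on a grid and approximates the integral with the midpoint rule, applied to the f(x) = x² of Example 1. The function name and grid size are our own choices.

```python
from math import isclose

def check_density(f, a, b, n=200_000):
    """Check the two pdf conditions for f on its support [a, b]:
    nonnegativity on a grid, and total integral ~ 1 (midpoint rule)."""
    h = (b - a) / n
    values = [f(a + (i + 0.5) * h) for i in range(n)]
    nonneg = all(v >= 0 for v in values)
    total = sum(values) * h
    return nonneg, total

# f(x) = x^2 on [-1, 1]: nonnegative, but it integrates to 2/3, not 1
nonneg, total = check_density(lambda x: x * x, -1.0, 1.0)
```

As in the slide's solution, the first condition holds but the second fails, so f is not a density.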

Compared to the discrete case, we also have:

Definition: The cumulative distribution function F of a random variable X is defined as

F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt.

Similarly, in the discrete case we use summation; in the continuous case, we use integration.

5 / 35

Given the density function f, we can compute the probability that X falls in any given interval.

Theorem: Given the probability density function f,
P(a ≤ X ≤ b) = ∫_{a}^{b} f(x) dx, for any a, b ∈ ℝ.

Extension: Given the probability density function f,
P(X ∈ A) = ∫_{A} f(x) dx, for any (measurable) set A.

6 / 35

Note: (Very Important)

1. In the discrete case, f(x) = P(X = x). Both f(x) and P(X = x) are the probability that X takes the value x.

2. In the continuous case, f(x) ≠ P(X = x). P(X = x) is the probability that X takes the value x, and P(X = x) = 0 for any x. (Why? What is the probability that the city bus arrives at exactly 8:30 am, given it arrives between 8 and 9 am? It is like picking one particular needle out of an ocean.) So f(x) must have a different meaning; otherwise ∫_{−∞}^{∞} f(x) dx = 0 ≠ 1.

To understand this, we need a second definition of f for the continuous case.

7 / 35

Remember F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt.

Definition: (A second definition of pdf for continuous r.v.s) The pdf of a continuous random variable X is given by

f(x) = F′(x)
     = lim_{Δ→0} [F(x + Δ) − F(x)] / Δ
     = lim_{Δ→0} [P(X ≤ x + Δ) − P(X ≤ x)] / Δ
     = lim_{Δ→0} P(x ≤ X ≤ x + Δ) / Δ.

8 / 35
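The limit definition above can be illustrated numerically: for a known cdf, the difference quotient over a small interval should approach the pdf. A sketch using the Exp(β = 1) distribution, whose cdf F(x) = 1 − e^{−x} and pdf f(x) = e^{−x} are stated later in these notes; the variable names are our own.

```python
from math import exp

# cdf and pdf of Exp(beta = 1): F(x) = 1 - e^(-x), f(x) = e^(-x)
F = lambda x: 1.0 - exp(-x)
f = lambda x: exp(-x)

x, delta = 0.7, 1e-6
# P(x <= X <= x + delta) / delta, the difference quotient in the definition
quotient = (F(x + delta) - F(x)) / delta
# as delta -> 0, the quotient approaches f(x) = F'(x)
```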

Example: X is a random variable with density
f(x) = 1/2, for 0 ≤ x ≤ 2; 0, o.w.

1. Verify that f is a density function.

2. What is P(X = 1)? Since X can take any real value between 0 and 2, the probability of picking exactly the value 1 is 0, i.e. P(X = 1) = 0.

3. What is f(1)? f(1) = 1/2. Note that f(1) ≠ P(X = 1).

4. What is F(1)?
F(1) = P(X ≤ 1) = ∫_{−∞}^{1} f(x) dx = ∫_{0}^{1} (1/2) dx = (1/2)x |_{0}^{1} = 1/2.

5. Compute P(1 ≤ X < 3).

6. Graph f(x) and F(x).

9 / 35
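Parts 4 and 5 of the example can be checked in a few lines. A sketch (function name ours): the cdf of this uniform-type density is piecewise, and since P(X = x) = 0 for a continuous r.v., P(1 ≤ X < 3) = F(3) − F(1).

```python
from math import isclose

def F(x, a=0.0, b=2.0):
    """cdf of the density f(x) = 1/(b-a) on [a, b]: integrate the constant."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

F1 = F(1.0)          # part 4: F(1) = 1/2
p = F(3.0) - F(1.0)  # part 5: P(1 <= X < 3) = 1 - 1/2 = 1/2
```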

Definition: The expectation of a continuous r.v. X, E(X) or μ_X, is given by

E(X) = ∫_{−∞}^{∞} x f(x) dx.

Extension: The expectation of a function h of a continuous r.v. X, E(h(X)) or μ_{h(X)}, is given by

E(h(X)) = ∫_{−∞}^{∞} h(x) f(x) dx,

provided the integral exists, i.e. ∫_{−∞}^{∞} |h(x)| f(x) dx < ∞.

10 / 35

The definitions of the variance and the moment generating function are the same for discrete and continuous variables. That is,

Var(X) = E(X − E(X))² = E(X²) − (EX)², and
m_X(t) = E(e^{tX}).

11 / 35

Example 4.2.1: (Book page 105) Random variable X denotes the lead concentration in gasoline in grams per liter, and it has density
f(x) = 12.5x − 1.25, for 0.1 ≤ x ≤ 0.5; and 0, o.w.

1. E(X) = ?
2. Var(X) = ?
3. σ = ?

12 / 35
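The three quantities in Example 4.2.1 follow directly from the definitions E(X) = ∫ x f(x) dx and Var(X) = E(X²) − (EX)². A numerical sketch (names and grid size ours), approximating the integrals over the support [0.1, 0.5] with the midpoint rule:

```python
def moments(f, a, b, n=400_000):
    """E(X), Var(X), and sigma for density f on [a, b], by midpoint-rule
    integration of x*f(x) and x^2*f(x)."""
    h = (b - a) / n
    xs = [a + (i + 0.5) * h for i in range(n)]
    ex = sum(x * f(x) for x in xs) * h
    ex2 = sum(x * x * f(x) for x in xs) * h
    var = ex2 - ex * ex
    return ex, var, var ** 0.5

f = lambda x: 12.5 * x - 1.25          # density on [0.1, 0.5]
ex, var, sigma = moments(f, 0.1, 0.5)  # roughly 0.367, 0.0089, 0.094
```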

Example 4.2.2: (Book page 106) The spontaneous flipping of a bit stored in computer memory is called a soft fail. Let X denote the time in millions of hours before the first soft fail is observed. The density for X is
f(x) = e^{−x}, x > 0; and 0, o.w.
Find E(X) and Var(X).

13 / 35

Common continuous distributions:

- Uniform Distribution.
- Normal Distribution.
- Exponential Distribution.
- Gamma Distribution.
- χ² Distribution.

14 / 35

Continuous Uniform Distribution

Definition: A random variable follows a continuous uniform distribution on an interval (a, b) if its probability density function is given by

f(x) = 1/(b − a), for any x ∈ (a, b); 0, o.w.

E(X) = (a + b)/2;
Var(X) = (b − a)²/12.

15 / 35

Derive the cdf of X ~ U(a, b).

Solution: We want to compute F(x) = P(X ≤ x) for all x ∈ ℝ.

(1) When x ≤ a,
F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−∞}^{x} 0 dt = 0.

(2) When a < x < b,
F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−∞}^{a} 0 dt + ∫_{a}^{x} 1/(b − a) dt = 0 + t/(b − a) |_{a}^{x} = (x − a)/(b − a).

(3) When b ≤ x,
F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−∞}^{a} 0 dt + ∫_{a}^{b} 1/(b − a) dt + ∫_{b}^{x} 0 dt = 0 + t/(b − a) |_{a}^{b} + 0 = 1.

So, F(x) = 0, for x ≤ a; (x − a)/(b − a), for a < x < b; 1, for b ≤ x.

16 / 35
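The piecewise cdf just derived can be cross-checked against direct numerical integration of the density. A sketch (function names ours), using U(2, 5) as an arbitrary test case:

```python
def uniform_cdf(x, a, b):
    """Piecewise cdf of U(a, b) from the derivation above."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

def cdf_by_integration(x, a, b, n=100_000):
    """F(x) as the midpoint-rule integral of the density 1/(b-a) up to min(x, b)."""
    hi = min(x, b)
    if hi <= a:
        return 0.0
    h = (hi - a) / n
    density = 1.0 / (b - a)
    return sum(density for _ in range(n)) * h

# the closed form and the integral agree on U(2, 5) in all three cases
vals = [(uniform_cdf(x, 2, 5), cdf_by_integration(x, 2, 5)) for x in (1.0, 3.0, 6.0)]
```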

When a = 0 and b = 1, this is the Standard Uniform Distribution.

E(X) = 1/2, Var(X) = 1/12, and
F(x) = 0, for x ≤ 0; x, for 0 < x < 1; 1, for 1 ≤ x.

17 / 35

Normal Distribution

Definition: A random variable X follows a normal distribution with location parameter μ (−∞ < μ < ∞) and scale parameter σ (σ > 0) if its probability density function is given by

f(x) = (1 / (σ√(2π))) e^{−(x−μ)²/(2σ²)}, −∞ < x < ∞.

E(X) = μ;
Var(X) = σ²;
m_X(t) = e^{μt + σ²t²/2}.

18 / 35

(Figure: density curves f(x) of N(0,1) and N(1,2), for x from −4 to 6; vertical axis from 0.0 to 0.4.)

19 / 35

(Figure: cdf curves F(x) of N(0,1) and N(1,2), for x from −4 to 6; vertical axis from 0.0 to 1.0.)

20 / 35

Extension: If X follows a normal distribution with mean μ and variance σ², then Z = (X − μ)/σ follows a normal distribution with mean 0 and variance 1.

If Z ~ N(0, 1), then:
E(Z) = 0;
Var(Z) = 1;
m_Z(t) = e^{t²/2};
F(z) can be read from the table of the cdf of the standard normal distribution.

21 / 35

Example: Let X denote the number of grams of hydrocarbons emitted by an automobile per mile. Assume that X is normal with μ = 1 and σ = 0.25. Find the probability that a randomly selected automobile will emit between 0.9 and 1.54 grams of hydrocarbons per mile.

Solution:
P(0.9 ≤ X ≤ 1.54)
= P((0.9 − 1)/0.25 ≤ Z ≤ (1.54 − 1)/0.25)
= P(−0.4 ≤ Z ≤ 2.16)
= F(2.16) − F(−0.4)
= 0.9846 − 0.3446 (read from table)
= 0.64.

22 / 35
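The table lookups in this example can be reproduced with the standard identity Φ(z) = (1 + erf(z/√2))/2, available in Python's stdlib. A sketch (names ours):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

mu, sigma = 1.0, 0.25
lo = (0.9 - mu) / sigma    # -0.4
hi = (1.54 - mu) / sigma   #  2.16
p = phi(hi) - phi(lo)      # about 0.64, matching the table computation
```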

Example: Let X denote the amount of radiation that can be absorbed by an individual before death ensues. Assume that X is normal with a mean of 500 roentgens and a standard deviation of 150 roentgens. Above what dosage level will only 5% of those exposed survive?

Solution: Suppose P(X ≥ x₀) = 0.05; we want to find x₀.

P(X ≥ x₀) = P((X − 500)/150 ≥ (x₀ − 500)/150)  (transform to standard normal)
= P(Z ≥ (x₀ − 500)/150) = 0.05.

By reading the cdf table for the standard normal, we have (x₀ − 500)/150 = 1.645. Thus, x₀ = 746.75.

23 / 35
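This example goes in the reverse direction: from a probability back to a quantile. With no inverse-cdf table at hand, one can invert Φ numerically, e.g. by bisection. A sketch (the inversion routine and its bracket are our own choices, not from the notes):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def phi_inv(p, lo=-10.0, hi=10.0):
    """Invert the standard normal cdf by bisection; phi is increasing."""
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

z = phi_inv(0.95)    # about 1.645, the table value used in the solution
x0 = 500 + 150 * z   # about 746.7 roentgens, close to the slide's 746.75
```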

Property: Suppose X ~ N(μ, σ²); then

P(μ − σ < X < μ + σ) ≈ 0.68,
P(μ − 2σ < X < μ + 2σ) ≈ 0.95,
P(μ − 3σ < X < μ + 3σ) ≈ 0.997.

This is often called the 3-σ rule.

(Picture courtesy of Wikipedia, Normal distribution.)

24 / 35

Theorem: (Chebyshev's inequality) If X is a random variable with mean μ and variance σ², then

P(|X − μ| < kσ) ≥ 1 − 1/k².

Note:

Both the 3-σ rule and Chebyshev's inequality describe the probability mass within an integer number of standard deviations of the mean.

Chebyshev's inequality does not require that the r.v. follow a normal distribution.

25 / 35

Example: The safety record of an industrial plant is measured in terms of M, the total staffing-hours worked without a serious accident. M has a mean of 2 million and a standard deviation of 0.1 million. A serious accident has just occurred. Using Chebyshev's inequality, what is the strongest conclusion we can make about the probability that the next serious accident occurs within the next 1.6 million staffing-hours?

Solution: We want to draw conclusions about P(M ≤ 1.6). We know μ_M = 2, σ_M = 0.1.

According to Chebyshev's theorem,
P(|M − μ_M| < 4σ_M) = P(1.6 < M < 2.4) ≥ 1 − 1/4² = 0.9375.

So, P(M ≤ 1.6) + P(M ≥ 2.4) = 1 − P(1.6 < M < 2.4) ≤ 0.0625. Since P(M ≥ 2.4) > 0, the strongest conclusion we can make about P(M ≤ 1.6) using Chebyshev's theorem is that P(M ≤ 1.6) < 0.0625.

26 / 35
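The arithmetic in this example is short enough to script. A sketch (names ours): 1.6 is 4 standard deviations below the mean, so k = 4 in Chebyshev's bound.

```python
def chebyshev_lower_bound(k):
    """P(|X - mu| < k*sigma) >= 1 - 1/k^2, for any distribution with finite variance."""
    return 1.0 - 1.0 / (k * k)

mu, sigma = 2.0, 0.1
k = (mu - 1.6) / sigma            # 1.6 million staffing-hours is 4 sigma below mu
bound = chebyshev_lower_bound(k)  # P(1.6 < M < 2.4) >= 0.9375
tail = 1.0 - bound                # P(M <= 1.6) + P(M >= 2.4) <= 0.0625
```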

Exponential Distribution

Definition: A random variable X follows an exponential distribution with parameter β (β > 0) if its density function is given by

f(x) = (1/β) e^{−x/β}, for x > 0; 0, o.w.

X denotes the time to wait for the first event to occur. For example: the survival time of a patient before death occurs; the burning time of a new bulb before it burns out; the time to wait for the first incoming call; the time between two accidents, etc.

E(X) = β. This is the expected waiting time for the event to happen;
Var(X) = β²;
m_X(t) = (1 − βt)^{−1}.

27 / 35

(Figure: density curves f(x) of exp(0.5), exp(1), and exp(2), for x from 0 to 5; vertical axis from 0.0 to 2.0.)

28 / 35

Example 1: Engineers have collected data from compressors on natural gas pipelines and found that the average life of a compressor is 5.75 years. Let X denote the lifetime of a compressor, and suppose it follows the exponential distribution.

(1) What is the probability that a compressor fails during the first year after installation?

Solution: X ~ Exp(β = 5.75).

P(X ≤ 1) = F(1) = ∫_{−∞}^{1} f(x) dx
= ∫_{0}^{1} (1/5.75) e^{−x/5.75} dx
= −e^{−x/5.75} |_{0}^{1}
= −e^{−1/5.75} − (−1)
≈ 0.16.

29 / 35

(2) Compute the probability of failure prior to the average life.

Solution:

P(X ≤ 5.75) = F(5.75) = ∫_{−∞}^{5.75} f(x) dx
= ∫_{0}^{5.75} (1/5.75) e^{−x/5.75} dx
= −e^{−x/5.75} |_{0}^{5.75}
= −e^{−1} − (−1)
≈ 0.632.

30 / 35
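Both integrals above reduce to the closed-form exponential cdf F(x) = 1 − e^{−x/β}, which makes the two answers a one-liner each. A sketch (function name ours):

```python
from math import exp

def exp_cdf(x, beta):
    """F(x) = 1 - e^(-x/beta) for x > 0: the closed form of the integrals above."""
    return 1.0 - exp(-x / beta) if x > 0 else 0.0

beta = 5.75
p_first_year = exp_cdf(1.0, beta)    # part (1): about 0.16
p_before_mean = exp_cdf(beta, beta)  # part (2): 1 - e^(-1), about 0.632
```

Note that part (2) gives 1 − e^{−1} ≈ 0.632 for every β: an exponential variable falls below its mean with the same probability regardless of the parameter.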

Theorem: Relationship between a Poisson variable and an exponential variable.

If X ~ Poisson(λ), then X denotes the number of occurrences in a unit period of time. Furthermore, let Y denote the time to wait for the first event to occur; then

Y ~ Exp(β = 1/λ).

31 / 35
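This relationship can be illustrated by simulation: if inter-arrival gaps are exponential with mean β = 1/λ, then the number of events landing in one unit of time averages λ. A sketch using Python's stdlib (names, seed, and sample size are our own choices):

```python
import random

random.seed(42)
lam = 4.0          # Poisson rate: on average lam events per unit time
beta = 1.0 / lam   # mean of the exponential waiting time between events

def events_in_unit_time():
    """Count events in (0, 1] by summing exponential inter-arrival gaps."""
    t, count = 0.0, 0
    while True:
        t += random.expovariate(lam)  # expovariate takes the rate 1/beta
        if t > 1.0:
            return count
        count += 1

counts = [events_in_unit_time() for _ in range(20_000)]
mean_count = sum(counts) / len(counts)  # should be close to lam = 4
```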

Gamma Distribution

Before introducing the gamma distribution, we need to first introduce the gamma function.

Definition: The function Γ defined as

Γ(α) = ∫_{0}^{∞} z^{α−1} e^{−z} dz

is called the gamma function.

Properties:
Γ(1) = 1;
For any α > 1, Γ(α) = (α − 1)Γ(α − 1).

32 / 35
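Both properties of Γ can be checked directly with the stdlib implementation, which also shows the familiar consequence Γ(n) = (n − 1)! for integers. A brief sketch:

```python
from math import gamma, isclose

# Property 1: Gamma(1) = 1
g1 = gamma(1.0)

# Property 2: the recursion Gamma(a) = (a - 1) * Gamma(a - 1), a > 1
recursion_ok = all(isclose(gamma(a), (a - 1) * gamma(a - 1)) for a in (2.5, 4.0, 7.3))

# Consequence for integers: Gamma(n) = (n - 1)!
g5 = gamma(5.0)  # 4! = 24
```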

Definition: A random variable X follows a gamma distribution with parameters α (α > 0) and β (β > 0) if its density function is given by

f(x) = (1 / (Γ(α) β^α)) x^{α−1} e^{−x/β}, for x > 0; 0, o.w.

Important Note: When α = 1, X ~ Exp(β).

E(X) = αβ;
Var(X) = αβ²;
m_X(t) = (1 − βt)^{−α}.

33 / 35
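The gamma moments E(X) = αβ and Var(X) = αβ² can be checked by simulation with the stdlib's gamma sampler, which uses the same (shape, scale) parameterization as these notes. A sketch (seed and sample size are our own choices):

```python
import random

random.seed(0)
alpha, beta = 3.0, 2.0  # shape and scale; E(X) = 6, Var(X) = 12

# random.gammavariate(alpha, beta) matches the (alpha, beta) density above
sample = [random.gammavariate(alpha, beta) for _ in range(200_000)]
mean = sum(sample) / len(sample)                          # ~ alpha * beta
var = sum((x - mean) ** 2 for x in sample) / len(sample)  # ~ alpha * beta^2
```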

χ² Distribution

Definition: Let X ~ Gamma(α, β). When β = 2, X is said to follow a χ² distribution with γ = 2α degrees of freedom.

E(X) = γ;
Var(X) = 2γ;
m_X(t) = (1 − 2t)^{−γ/2}.

So, we have: X ~ Gamma(α, β) ⇒ when α = 1, X ~ Exp(β); when β = 2, X ~ χ²_{γ=2α}.

34 / 35

Chapter Summary

Note: the chapter summary only lists the important points of the chapter.

- Understand the meanings of f(x), P(X = x), and F(x) for a continuous r.v. X.
- Given X ~ f(x), know how to compute E(X), Var(X), m_X(t), F(x), and P(x₁ < X < x₂), and how to use m_X(t) to compute E(X^k).
- Know how to graph f(x) and F(x).
- In applications, given that X follows a normal or exponential distribution, know what the questions ask and how to compute the answers. For this, see the examples.
- Know how to read the cdf table for the standard normal distribution, even when X is not standardized.
- Know the 3-σ rule and Chebyshev's theorem for approximating probabilities.
- Use the continuous distribution summary table to find the mean, variance, pdf, and mgf of a commonly used distribution.
- Know the relationships among a gamma random variable, an exponential random variable, and a χ² random variable.

35 / 35
