
Lecture 5: Continuous Probability Distributions

(Text Sections 2.3–2.4)


Normal Distribution
A random variable X has a Normal(μ, σ²) distribution if it has pdf

f(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-(x-\mu)^2/(2\sigma^2)}, \qquad -\infty < x, \mu < \infty, \; \sigma > 0.

The normal cdf does not have a closed form.
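Since there is no closed form, normal probabilities are obtained from tables or numerically. A minimal sketch of a numerical evaluation (assuming SciPy is available; the parameter values below are hypothetical):

    from scipy.stats import norm

    mu, sigma = 170.0, 8.0                         # hypothetical mean and standard deviation
    print(norm.cdf(180.0, loc=mu, scale=sigma))    # P(X <= 180)
    print(norm.cdf(180.0, loc=mu, scale=sigma)
          - norm.cdf(160.0, loc=mu, scale=sigma))  # P(160 <= X <= 180)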


Example: Many measurements in nature can be reasonably modelled by a normal distribution, e.g., men's heights and women's heights. Check out the living histogram on the department home page (http://www.stat.sfu.ca)! It's a mixture of two approximately normal distributions (one for men, one for women).
Example: Central Limit Theorems show how the normal distribution plays an important
role in many applications. Specifically, whenever the random variable of interest, X, can be
written as a sum of iid random variables Z_1, . . . , Z_n (with the Z_i having almost any distribution!),
the distribution of X gets closer to normal as n → ∞. The variable Z_i might represent a
fluctuation in a stock price at month i around a deterministic trend, or the error term in
a linear regression model. Central limit theorems (with more stringent conditions) also exist
for sums of dependent and non-identically distributed random variables.
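A minimal simulation sketch of this effect (assuming NumPy is available; the choice of Uniform(0, 1) summands and of n is just for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    n, reps = 50, 10000
    # Each row is Z_1 + ... + Z_n for iid Uniform(0, 1) summands Z_i.
    sums = rng.uniform(0.0, 1.0, size=(reps, n)).sum(axis=1)
    # The sums have mean and variance close to the theoretical n/2 and n/12,
    # and their histogram looks approximately normal.
    print(sums.mean(), n / 2)
    print(sums.var(), n / 12)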
Exponential Distribution
The exponential distribution will be very important in our study of continuous time Markov
chains (Chapter 6).
A random variable X has an Exponential(λ) distribution if its pdf is

f(x) =
\begin{cases}
\lambda e^{-\lambda x}, & x \ge 0 \\
0, & \text{otherwise}
\end{cases}

for λ > 0.
NOTE: Sometimes the Exponential(λ) distribution is specified as

f(x) = \frac{1}{\lambda} \, e^{-x/\lambda}, \qquad x \ge 0.

In the first case, λ is the rate parameter, while in the second, it is the mean parameter. For this reason, it is often better to avoid the term Exponential(λ) and instead use a term such as "exponential with rate λ" or simply specify the density explicitly.
Using the first specification, the exponential cdf is
F(x) = P(X \le x) = 1 - e^{-\lambda x}, \qquad x \ge 0.
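For instance, with a hypothetical rate of λ = 2 failures per year, P(X ≤ 1) = 1 − e^{−2} ≈ 0.865 and P(X > 3) = e^{−6} ≈ 0.0025.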

Example: Many survival time data can be reasonably described by the exponential distribution, such as time till death after diagnosis of a fatal disease, time till diagnosis of AIDS
after acquiring HIV, etc. In a manufacturing context, such data are often called failure time
or reliability data, e.g., time till a machine requires maintenance, lifetime of a car, etc.
NOTE: The exponential distribution is the only memoryless continuous distribution: for
y < x,
P(X > x \mid X > y) = \frac{P(X > x, X > y)}{P(X > y)}
                    = \frac{P(X > x)}{P(X > y)}
                    = \frac{e^{-\lambda x}}{e^{-\lambda y}}
                    = e^{-\lambda (x - y)}
                    = P(X > x - y).
Thus, for example, if the lifetime of a light bulb has an Exponential(λ) distribution and we know that it has already lasted for y hours, its remaining lifetime has an Exponential(λ) distribution as well. I.e., its remaining lifetime is not affected by how long it has already lasted.
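A minimal simulation sketch of the memoryless property (assuming NumPy is available; the rate and thresholds below are hypothetical):

    import numpy as np

    rng = np.random.default_rng(1)
    lam, x, y = 0.5, 4.0, 1.5                         # hypothetical rate and thresholds, y < x
    samples = rng.exponential(scale=1.0 / lam, size=1_000_000)
    cond = (samples > x).sum() / (samples > y).sum()  # estimates P(X > x | X > y)
    print(cond, np.exp(-lam * (x - y)))               # both are approximately P(X > x - y)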
Gamma Distribution
The gamma distribution is important in the study of Poisson processes (Chapter 5).
A random variable X has a Gamma(α, λ) distribution if

f(x) =
\begin{cases}
\dfrac{\lambda e^{-\lambda x} (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)}, & x \ge 0 \\
0, & \text{otherwise}
\end{cases}

for α, λ > 0. The parameter α is called the shape parameter, and the parameter λ is called the rate parameter (as in the exponential distribution). The exponential distribution is a special case of the gamma distribution with α = 1.
NOTE: The gamma distribution also has different specifications, so it's a good idea to specify the density explicitly when you define a Gamma-distributed random variable.
The cdf of the gamma distribution does not have a closed form.
Example: The sum of n independent Exponential(λ) random variables has a Gamma(n, λ) distribution. Thus, if we replace a light bulb immediately after it burns out, the total time that n successive light bulbs last has a Gamma(n, λ) distribution.
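A minimal simulation sketch of this fact (assuming NumPy and SciPy are available; the rate, n, and number of replications are hypothetical):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    lam, n, reps = 0.2, 5, 200_000
    # Total lifetime of n successive bulbs, each Exponential(lam).
    totals = rng.exponential(scale=1.0 / lam, size=(reps, n)).sum(axis=1)
    # Compare with Gamma(n, lam); SciPy parameterizes by shape a = n and scale = 1/lam.
    print(totals.mean(), n / lam)                     # the Gamma(n, lam) mean is n/lam
    print(np.quantile(totals, 0.9), stats.gamma.ppf(0.9, a=n, scale=1.0 / lam))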

Uniform Distribution
A random variable X taking on continuous values in [m, n] has a Uniform[m, n] distribution
if it has pdf
f(x) =
\begin{cases}
1/(n - m), & m \le x \le n \\
0, & \text{otherwise.}
\end{cases}
The cdf of the Uniform[m, n] distribution is
F(x) = P(X \le x) =
\begin{cases}
0, & x < m \\
(x - m)/(n - m), & m \le x \le n \\
1, & x > n.
\end{cases}
Illustration:

Expectation of a Continuous Random Variable


Definition: The expected value, expectation, or mean of a continuous random variable X is
defined by
E[X] = \int_{-\infty}^{\infty} x f(x) \, dx.

Again, E[X] is a weighted average of all the values that X can take on. It is a measure of
central tendency. It is not, however, an indication of the most likely value that a random
variable can take on (since P(X = x) = 0 for all x).
Example:
Example: Let X ∼ Normal(μ, σ²). Then

E[X] = \int_{-\infty}^{\infty} x \, \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-(x-\mu)^2/(2\sigma^2)} \, dx.

Denoting the Normal(μ, σ²) density by f(x),

\frac{df(x)}{dx} = -\frac{x - \mu}{\sigma^2 \sqrt{2\pi\sigma^2}} \, e^{-(x-\mu)^2/(2\sigma^2)}
                 = -\frac{x}{\sigma^2 \sqrt{2\pi\sigma^2}} \, e^{-(x-\mu)^2/(2\sigma^2)} + \frac{\mu}{\sigma^2 \sqrt{2\pi\sigma^2}} \, e^{-(x-\mu)^2/(2\sigma^2)}.

So,

\int_{-\infty}^{+\infty} \frac{df(x)}{dx} \, dx = -\frac{E[X]}{\sigma^2} + \frac{\mu}{\sigma^2}

\lim_{x \to \infty} f(x) - \lim_{x \to -\infty} f(x) = -\frac{E[X]}{\sigma^2} + \frac{\mu}{\sigma^2}

0 = -\frac{E[X]}{\sigma^2} + \frac{\mu}{\sigma^2}

E[X] = \mu.

Thus, the parameter μ of a Normal(μ, σ²) distributed random variable represents its mean.
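A minimal numerical check of this result (assuming SciPy is available; the parameter values are hypothetical):

    import numpy as np
    from scipy.integrate import quad

    mu, sigma = 3.0, 2.0                              # hypothetical parameter values

    def f(x):
        # Normal(mu, sigma^2) density
        return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

    mean, _ = quad(lambda x: x * f(x), -np.inf, np.inf)
    print(mean)                                       # approximately 3.0 = mu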
Expectation of a Function of a Continuous Random Variable
If X is a continuous random variable with density f(x),

E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x) \, dx

for any function g.
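A minimal Monte Carlo sketch of this formula (assuming NumPy is available; the distribution, the choice of g, and the rate are only for illustration):

    import numpy as np

    rng = np.random.default_rng(3)
    lam = 0.5                                # hypothetical rate for X ~ Exponential(lam)
    x = rng.exponential(scale=1.0 / lam, size=1_000_000)
    # Estimate E[g(X)] for g(x) = x^2 by averaging g over simulated values;
    # for the exponential, the exact value is 2 / lam^2.
    print((x ** 2).mean(), 2 / lam ** 2)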


Example: Compute the variance of a random variable X with a Normal(μ, σ²) distribution.

Var[X] = \int_{-\infty}^{\infty} (x - \mu)^2 \, \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-(x-\mu)^2/(2\sigma^2)} \, dx
       = \int_{-\infty}^{\infty} y^2 \, \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-y^2/(2\sigma^2)} \, dy \qquad (\text{substituting } y = x - \mu).

Using integration by parts (\int u \, dv = uv - \int v \, du, with u = y and dv = y \, \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-y^2/(2\sigma^2)} \, dy):

Var[X] = \left[ -y \, \frac{\sigma^2}{\sqrt{2\pi\sigma^2}} \, e^{-y^2/(2\sigma^2)} \right]_{-\infty}^{\infty} + \int_{-\infty}^{\infty} \frac{\sigma^2}{\sqrt{2\pi\sigma^2}} \, e^{-y^2/(2\sigma^2)} \, dy
       = 0 + \sigma^2
       = \sigma^2.

So, the parameter σ² of a Normal(μ, σ²) distributed random variable represents its variance.
Example: Expectation of an indicator function. Let X ∼ Exponential(λ). Define the indicator function Y = 1{X ≤ 5}. Then

E[Y] = \int_{x=0}^{\infty} 1\{x \le 5\} \, \lambda e^{-\lambda x} \, dx
     = \int_{x=0}^{5} \lambda e^{-\lambda x} \, dx
     = P(X \le 5)
     = P(Y = 1).
In general, the expectation of an indicator random variable is the probability that this
random variable equals 1.
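A minimal simulation sketch of this fact (assuming NumPy is available; the rate is hypothetical):

    import numpy as np

    rng = np.random.default_rng(4)
    lam = 0.3                                          # hypothetical rate
    x = rng.exponential(scale=1.0 / lam, size=1_000_000)
    # The sample mean of the indicator 1{X <= 5} estimates P(X <= 5) = 1 - exp(-5*lam).
    print((x <= 5).mean(), 1 - np.exp(-lam * 5))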
