Cumulative Distribution Functions
A possible CDF for a continuous random variable
Probability Density Functions
The derivative of the CDF is the probability density function (pdf):

$$ f_X(x) = \frac{d}{dx} F_X(x) $$

Probability density can also be defined by

$$ f_X(x)\,dx = P(x < X \le x + dx) $$
Properties
$$ f_X(x) \ge 0, \quad -\infty < x < +\infty $$

$$ \int_{-\infty}^{+\infty} f_X(x)\,dx = 1 $$

$$ F_X(x) = \int_{-\infty}^{x} f_X(\lambda)\,d\lambda $$

$$ P(x_1 < X \le x_2) = \int_{x_1}^{x_2} f_X(x)\,dx $$
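These properties are easy to verify numerically. The sketch below is only an illustration, using an exponential pdf and a simple midpoint-rule integrator (the rate `lam` and the integration details are assumptions of this sketch, not from the slides):

```python
import math

# Illustrative example: exponential pdf f_X(x) = lam * exp(-lam * x), x >= 0.
lam = 2.0
f = lambda x: lam * math.exp(-lam * x)

def integrate(fn, a, b, n=100_000):
    """Simple midpoint-rule numerical integration."""
    h = (b - a) / n
    return sum(fn(a + (i + 0.5) * h) for i in range(n)) * h

# Property: the pdf integrates to 1 (upper limit 50 stands in for infinity).
total = integrate(f, 0.0, 50.0)

# F_X(1) as the integral of the pdf; closed form is 1 - e^(-lam).
F_at_1 = integrate(f, 0.0, 1.0)

# P(x1 < X <= x2) as the integral of the pdf between the limits.
p_interval = integrate(f, 0.5, 1.5)

print(total, F_at_1, p_interval)
```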
Probability Density Functions
We can also apply the concept of a pdf to a discrete random
variable if we allow the use of the impulse. Consider the
CDF for tossing a die illustrated below. If the pdf is the
derivative of the CDF, what does the pdf for tossing a die look
like? It is zero everywhere except at the points x = 1, 2, 3, 4, 5 or 6.
At those points, strictly speaking mathematically, the derivative
does not exist.
Introduction to the Impulse
Define a function

$$ \Delta_a(t) = \begin{cases} 1/a, & |t| < a/2 \\ 0, & |t| > a/2 \end{cases} $$

Then, in the limit as a approaches zero,

$$ g(0) = \int_{-\infty}^{+\infty} \delta(t)\,g(t)\,dt $$
The Unit Step and Unit Impulse
As a approaches zero, the integral of \Delta_a(t) approaches a unit
step and \Delta_a(t) approaches a unit impulse. The impulse has the
sampling property

$$ \int_{-\infty}^{+\infty} g(t)\,\delta(t - t_0)\,dt = g(t_0) $$

and the scaling property

$$ \delta(a(t - t_0)) = \frac{1}{|a|}\,\delta(t - t_0) $$
This property illustrates that the impulse is different from
ordinary mathematical functions.
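The limiting behavior of the pulse can be seen numerically. In this sketch, cos is an arbitrary smooth test function (an assumption for illustration): as the pulse width a shrinks, the integral of \Delta_a(t) g(t) approaches g(0).

```python
import math

def delta_a(t, a):
    """Rectangular pulse of width a and height 1/a -- unit area for every a."""
    return 1.0 / a if abs(t) < a / 2 else 0.0

def integrate(fn, lo, hi, n=200_000):
    h = (hi - lo) / n
    return sum(fn(lo + (i + 0.5) * h) for i in range(n)) * h

g = math.cos  # any smooth test function; g(0) = 1

# As a shrinks, the integral of delta_a(t) g(t) approaches g(0) = 1.
approx = [integrate(lambda t: delta_a(t, a) * g(t), -1, 1) for a in (0.5, 0.1, 0.01)]
print(approx)
```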
Probability Density Functions
Bernoulli -

$$ f_X(x) = (1-p)\,\delta(x) + p\,\delta(x-1) $$

Binomial -

$$ f_X(x) = \sum_{k=0}^{n} \binom{n}{k} p^k (1-p)^{n-k}\,\delta(x - k) $$

Geometric -

$$ f_X(x) = \sum_{k=1}^{\infty} p\,(1-p)^{k-1}\,\delta(x - k) $$

Negative Binomial (Pascal) -

$$ f_Y(y) = \sum_{k=r}^{\infty} \binom{k-1}{r-1} p^r (1-p)^{k-r}\,\delta(y - k) $$

Poisson -

$$ f_X(x) = \sum_{k=0}^{\infty} e^{-\lambda} \frac{\lambda^k}{k!}\,\delta(x - k) $$
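Since each impulse weight is simply the probability mass at that point, the weights for each distribution must sum to 1. A quick check (the parameter values n, p, λ, r are arbitrary choices, and the infinite sums are truncated where the tails are negligible):

```python
from math import comb, exp, factorial

n, p, lam, r = 10, 0.3, 4.0, 3

# The impulse weights are just the pmf values; each set must sum to 1.
binomial  = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
geometric = [p * (1 - p)**(k - 1) for k in range(1, 200)]
pascal    = [comb(k - 1, r - 1) * p**r * (1 - p)**(k - r) for k in range(r, 400)]
poisson   = [exp(-lam) * lam**k / factorial(k) for k in range(100)]

print(sum(binomial), sum(geometric), sum(pascal), sum(poisson))
```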
Expectation and Moments
For a discrete random variable, the expected value of X is

$$ E(X) = \sum_{i=1}^{M} x_i\,P(X = x_i) $$

For a continuous random variable,

$$ E(X) = \int_{-\infty}^{+\infty} x\,f_X(x)\,dx $$

Similarly,

$$ E(g(X)) = \int_{-\infty}^{+\infty} g(x)\,f_X(x)\,dx $$

$$ E(X^n) = \int_{-\infty}^{+\infty} x^n\,f_X(x)\,dx $$
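For the die example above, the discrete expectation formula gives E(X) and E(X^2) directly:

```python
# Fair die: P(X = x_i) = 1/6 for x_i = 1..6.
xs = range(1, 7)
p = 1 / 6

E_X  = sum(x * p for x in xs)        # first moment: 3.5
E_X2 = sum(x**2 * p for x in xs)     # second moment E(X^2): 91/6

print(E_X, E_X2)
```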
Expectation and Moments
The first moment of a continuous random variable is its expected
value,

$$ E(X) = \int_{-\infty}^{+\infty} x\,f_X(x)\,dx. $$

Its second moment is its mean-squared value,

$$ E(X^2) = \int_{-\infty}^{+\infty} x^2\,f_X(x)\,dx. $$

Its variance is its second central moment,

$$ \sigma_X^2 = E\!\left[(X - E(X))^2\right] = \int_{-\infty}^{+\infty} (x - E(X))^2\,f_X(x)\,dx $$
Expectation and Moments
$$ \sigma_X^2 = E(X^2) - E^2(X), \qquad \mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X) $$
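Both identities can be confirmed on the fair-die example (the constants a and b below are arbitrary choices for illustration):

```python
xs = [1, 2, 3, 4, 5, 6]
p = 1 / 6

E   = lambda vals: sum(v * p for v in vals)
var = lambda vals: E([v**2 for v in vals]) - E(vals)**2   # sigma^2 = E(X^2) - E^2(X)

a, b = 3, 7
var_X   = var(xs)                       # 35/12 for a fair die
var_aXb = var([a * x + b for x in xs])  # should equal a^2 * var_X

print(var_X, var_aXb)
```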
Probability Density Functions
The Gaussian pdf is

$$ f_X(x) = \frac{1}{\sigma_X \sqrt{2\pi}}\, e^{-(x-\mu_X)^2 / 2\sigma_X^2} $$

where

$$ \mu_X = E(X) \quad \text{and} \quad \sigma_X^2 = E\!\left[(X - E(X))^2\right]. $$
Probability Density Functions
The Gaussian pdf
Its maximum value occurs at the mean value of its argument.
It is symmetrical about the mean value.
The points of maximum absolute slope occur at one standard
deviation above and below the mean.
Its maximum value is inversely proportional to its standard
deviation.
The limit as the standard deviation approaches zero is a unit
impulse.
$$ \delta(x - \mu_X) = \lim_{\sigma_X \to 0} \frac{1}{\sigma_X \sqrt{2\pi}}\, e^{-(x-\mu_X)^2 / 2\sigma_X^2} $$
Probability Density Functions
The normal pdf is a Gaussian pdf with a mean of zero and
a variance of one.
$$ f_X(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2} $$

The central moments of the Gaussian pdf are

$$ E\!\left[(X - E(X))^n\right] = \begin{cases} 0, & n \text{ odd} \\ 1 \cdot 3 \cdot 5 \cdots (n-1)\,\sigma_X^n, & n \text{ even} \end{cases} $$
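The central-moment formula can be checked by numerical integration against the Gaussian pdf. In this sketch the values of μ and σ, the integration range, and the midpoint rule are all assumptions chosen for the demonstration:

```python
import math

mu, sigma = 1.0, 2.0
pdf = lambda x: math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

def central_moment(n, lo=-30.0, hi=32.0, steps=200_000):
    """E[(X - mu)^n] by midpoint-rule integration (range covers ~15 sigma)."""
    h = (hi - lo) / steps
    return sum((lo + (i + 0.5) * h - mu)**n * pdf(lo + (i + 0.5) * h)
               for i in range(steps)) * h

def double_factorial_formula(n):
    """1*3*5*...*(n-1) * sigma^n, the slide's formula for even n."""
    prod = 1
    for k in range(1, n, 2):
        prod *= k
    return prod * sigma**n

cm3, cm4 = central_moment(3), central_moment(4)
print(cm3, cm4, double_factorial_formula(4))  # odd moment ~ 0; E[(X-mu)^4] = 3*sigma^4
```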
Functions of a Random Variable
Let X and Y be continuous random variables and let Y = g(X).
Also, let the function g be invertible, meaning that an inverse
function X = g^{-1}(Y) exists and is single-valued, as in the
illustrations below. Then it can be shown that the pdfs of
X and Y are related by

$$ f_Y(y) = \frac{f_X(g^{-1}(y))}{|dy/dx|}. $$
Functions of a Random Variable
Let the pdf of X be f_X(x) = (1/4) rect((x-1)/4) and let
Y = 2X + 1. Then X = g^{-1}(Y) = (Y-1)/2 and dY/dX = 2.

$$ f_Y(y) = \frac{f_X((y-1)/2)}{2} = \frac{(1/4)\,\mathrm{rect}\!\left(\frac{(y-1)/2 - 1}{4}\right)}{2} = (1/8)\,\mathrm{rect}\!\left(\frac{y-3}{8}\right) $$
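A Monte Carlo check of this result: if X is uniform on (-1, 3), then Y = 2X + 1 should be uniform on (-1, 7) with height 1/8 (the sample size and seed are arbitrary choices):

```python
import random

random.seed(0)

# X uniform on (-1, 3): f_X(x) = (1/4) rect((x-1)/4).
# Y = 2X + 1 should then be uniform on (-1, 7): f_Y(y) = (1/8) rect((y-3)/8).
N = 200_000
ys = [2 * random.uniform(-1, 3) + 1 for _ in range(N)]

# Empirical P(3 <= Y <= 4) should be about (1/8) * 1 = 0.125.
p_hat = sum(3 <= y <= 4 for y in ys) / N
print(min(ys), max(ys), p_hat)
```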
Functions of a Random Variable
Now let Y = -2X + 5. Then

$$ X = g^{-1}(Y) = \frac{5 - Y}{2}, \qquad \frac{dY}{dX} = -2. $$

$$ f_Y(y) = \frac{f_X((5-y)/2)}{|-2|} = \frac{(1/4)\,\mathrm{rect}\!\left(\frac{(5-y)/2 - 1}{4}\right)}{2} = (1/8)\,\mathrm{rect}\!\left(\frac{3-y}{8}\right) = (1/8)\,\mathrm{rect}\!\left(\frac{y-3}{8}\right) $$
Functions of a Random Variable
Now let Y = g(X) = X^2. This is more complicated because
the event {1 < Y ≤ 4} is caused by the event {1 < X ≤ 2}, but it
is also caused by the event {-2 ≤ X < -1}. If we make the
transformation from the pdf of X to the pdf of Y in two steps,
the process is simpler to see. Let Z = |X|.
Functions of a Random Variable
First, the pdf of Z = |X| is

$$ f_Z(z) = \frac{1}{4}\,\mathrm{rect}(z - 1/2) + \frac{1}{4}\,\mathrm{rect}\!\left(\frac{z - 3/2}{3}\right) $$

Then, with Y = Z^2,

$$ Z = \sqrt{Y}, \quad Y \ge 0, \qquad \frac{dY}{dZ} = 2Z = 2\sqrt{Y}, \quad Y \ge 0 $$

$$ f_Y(y) = \frac{f_Z(g^{-1}(y))}{|dy/dz|}\,u(y) = \begin{cases} \dfrac{1}{2\sqrt{y}}\left[\dfrac{1}{4}\,\mathrm{rect}(\sqrt{y} - 1/2) + \dfrac{1}{4}\,\mathrm{rect}\!\left(\dfrac{\sqrt{y} - 3/2}{3}\right)\right], & y \ge 0 \\ 0, & y < 0 \end{cases} $$

$$ f_Y(y) = \frac{1}{8\sqrt{y}}\left[\mathrm{rect}\!\left(\frac{2\sqrt{y} - 1}{2}\right) + \mathrm{rect}\!\left(\frac{2\sqrt{y} - 3}{6}\right)\right] u(y) $$
Functions of a Random Variable
$$ \int_{-\infty}^{+\infty} f_Y(y)\,dy = 1 $$
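The unit-area property of the pdf obtained from the Y = X^2 example can be checked numerically (the midpoint grid is an implementation choice of this sketch; the support of f_Y is 0 < y < 9):

```python
import math

def rect(t):
    return 1.0 if abs(t) < 0.5 else 0.0

# f_Y from the two-step |X| -> X^2 transformation:
# f_Y(y) = (1/(8*sqrt(y))) * [rect((2*sqrt(y)-1)/2) + rect((2*sqrt(y)-3)/6)], y > 0
def f_Y(y):
    if y <= 0:
        return 0.0
    s = math.sqrt(y)
    return (rect((2 * s - 1) / 2) + rect((2 * s - 3) / 6)) / (8 * s)

# Midpoint-rule check that the pdf integrates to 1 over its support (0, 9).
n = 400_000
h = 9.0 / n
total = sum(f_Y((i + 0.5) * h) for i in range(n)) * h
print(total)
```

The 1/sqrt(y) singularity at y = 0 is integrable, which is why the numerical total still converges to 1.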
Functions of a Random Variable
In general, if Y = g(X) and the real solutions of this equation are
x_1, x_2, ..., x_N, then, for those ranges of Y for which there is a
corresponding X through Y = g(X), we can find the pdf of Y.
Notice that for some ranges of X and Y there are multiple real
solutions and for other ranges there may be fewer.
In this figure there are three real values of x which produce
y_1, but there is only one value of x which produces y_2.
Only the real solutions are
used. So in some transformations,
the transformation used may
depend on the range of values
of X and Y being considered.
Functions of a Random Variable
For those ranges of Y for which there is a corresponding real X
through Y = g(X),

$$ f_Y(y) = \frac{f_X(x_1)}{\left|\frac{dY}{dX}\right|_{X=x_1}} + \frac{f_X(x_2)}{\left|\frac{dY}{dX}\right|_{X=x_2}} + \cdots + \frac{f_X(x_N)}{\left|\frac{dY}{dX}\right|_{X=x_N}} $$
In the previous example, Y = g(X) = X^2 and x_{1,2} = \pm\sqrt{y}.

$$ f_Y(y) = \begin{cases} \dfrac{f_X(\sqrt{y})}{2\sqrt{y}} + \dfrac{f_X(-\sqrt{y})}{2\sqrt{y}} = \dfrac{\mathrm{rect}\!\left(\dfrac{\sqrt{y}-1}{4}\right) + \mathrm{rect}\!\left(\dfrac{\sqrt{y}+1}{4}\right)}{8\sqrt{y}}, & y \ge 0 \\ 0, & y < 0 \end{cases} $$
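The multiple-root formula can be compared against a direct simulation of the event {1 < Y ≤ 4} from earlier (seed and sample size are arbitrary; with X uniform on (-1, 3), only the branch 1 < X ≤ 2 carries probability, so the answer is 1/4):

```python
import math, random

random.seed(1)

def rect(t):
    return 1.0 if abs(t) < 0.5 else 0.0

# Analytic result: f_Y(y) = [rect((sqrt(y)-1)/4) + rect((sqrt(y)+1)/4)] / (8*sqrt(y)), y > 0.
def f_Y(y):
    s = math.sqrt(y)
    return (rect((s - 1) / 4) + rect((s + 1) / 4)) / (8 * s)

# Monte Carlo: X uniform on (-1, 3), Y = X^2; estimate P(1 < Y <= 4).
N = 200_000
p_hat = sum(1 < random.uniform(-1, 3)**2 <= 4 for _ in range(N)) / N

# Same probability from the analytic pdf (midpoint rule over [1, 4]).
n = 100_000
h = 3.0 / n
p_analytic = sum(f_Y(1 + (i + 0.5) * h) for i in range(n)) * h

print(p_hat, p_analytic)
```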
Functions of a Random Variable
One problem that arises when transforming continuous random
variables occurs when the derivative is zero. This occurs in
any type of transformation for which Y is constant over a non-zero
range of X. Since division by zero is undefined, the formula

$$ f_Y(y) = \frac{f_X(x_1)}{(dy/dx)_{x=x_1}} + \frac{f_X(x_2)}{(dy/dx)_{x=x_2}} + \cdots + \frac{f_X(x_N)}{(dy/dx)_{x=x_N}} $$

cannot be used at such points. Instead, each flat region of the
transformation contributes an impulse to f_Y(y) whose weight is the
probability that X falls in that region.
Functions of a Random Variable
For this example, the pdf of Y would be

$$ f_Y(y) = (2/3)\,\delta(y - 1) + (1/18)\,\mathrm{rect}\!\left(\frac{y+2}{18}\right) u(y - 1) $$

or, simplifying,

$$ f_Y(y) = (2/3)\,\delta(y - 1) + (1/18)\,\mathrm{rect}\!\left(\frac{y-4}{6}\right) $$
Conditional Probability
Distribution Function
$$ F_{X|A}(x) = P(X \le x \mid A) = \frac{P(X \le x, A)}{P(A)} $$

This is the distribution function for X given that the condition
A also occurs.

$$ 0 \le F_{X|A}(x) \le 1, \quad -\infty < x < +\infty $$

$$ F_{X|A}(-\infty) = 0 \quad \text{and} \quad F_{X|A}(+\infty) = 1 $$

$$ P(x_1 < X \le x_2 \mid A) = F_{X|A}(x_2) - F_{X|A}(x_1) $$

F_{X|A}(x) is a monotonic function of x.
Conditional Probability
Let the condition A be A = {X \le a}, where a is a constant. Then

$$ F_{X|A}(x) = P(X \le x \mid X \le a) = \frac{P(X \le x, X \le a)}{P(X \le a)} $$

If a \le x, then P(X \le x, X \le a) = P(X \le a) and

$$ F_{X|A}(x) = \frac{P(X \le a)}{P(X \le a)} = 1 $$

If x \le a, then P(X \le x, X \le a) = P(X \le x) and

$$ F_{X|A}(x) = \frac{P(X \le x)}{P(X \le a)} = \frac{F_X(x)}{F_X(a)} $$
Conditional Probability
Conditional pdf:

$$ f_{X|A}(x) = \frac{d}{dx}\left(F_{X|A}(x)\right) $$

$$ E(g(X) \mid A) = \int_{-\infty}^{+\infty} g(x)\,f_{X|A}(x)\,dx $$

Conditional mean:

$$ E(X \mid A) = \int_{-\infty}^{+\infty} x\,f_{X|A}(x)\,dx $$
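As a concrete instance (the uniform distribution and the value of a are assumptions chosen for illustration): take X uniform on (0, 1) and A = {X ≤ a}. Then F_{X|A}(x) = F_X(x)/F_X(a) = x/a on (0, a), so the conditional pdf is 1/a there and the conditional mean is a/2.

```python
# Condition A = {X <= a} with X uniform on (0, 1), a = 0.5 (illustrative values).
# F_{X|A}(x) = F_X(x)/F_X(a) = x/a for 0 < x <= a, so f_{X|A}(x) = 1/a on (0, a).
a = 0.5
n = 100_000
h = a / n

f_cond = 1 / a  # conditional pdf on (0, a)

# Conditional mean E(X | A) = integral of x * f_{X|A}(x) dx over (0, a) = a/2.
E_cond = sum((i + 0.5) * h * f_cond for i in range(n)) * h

print(E_cond)
```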