P1: IML/FFX

MOBK042-FM

P2: IML/FFX

QC: IML/FFX

MOBK042-Enderle.cls

T1: IML

October 30, 2006

19:55

Advanced Probability Theory


for Biomedical Engineers

Copyright 2006 by Morgan & Claypool

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in
any form or by any means (electronic, mechanical, photocopy, recording, or any other), except for brief quotations
in printed reviews, without the prior permission of the publisher.
Advanced Probability Theory for Biomedical Engineers
John D. Enderle, David C. Farden, and Daniel J. Krause
www.morganclaypool.com
ISBN-10: 1598291505 (paperback)
ISBN-13: 9781598291506 (paperback)

ISBN-10: 1598291513 (ebook)
ISBN-13: 9781598291513 (ebook)

DOI 10.2200/S00063ED1V01Y200610BME011
A lecture in the Morgan & Claypool Synthesis Series
SYNTHESIS LECTURES ON BIOMEDICAL ENGINEERING #11
Series Editor: John D. Enderle, University of Connecticut

Series ISSN: 1930-0328 (print)
Series ISSN: 1930-0336 (electronic)

First Edition
10 9 8 7 6 5 4 3 2 1
Printed in the United States of America


Advanced Probability Theory


for Biomedical Engineers
John D. Enderle
Program Director & Professor for Biomedical Engineering,
University of Connecticut

David C. Farden
Professor of Electrical and Computer Engineering,
North Dakota State University

Daniel J. Krause
Emeritus Professor of Electrical and Computer Engineering,
North Dakota State University

SYNTHESIS LECTURES ON BIOMEDICAL ENGINEERING #11

Morgan & Claypool Publishers


ABSTRACT
This is the third in a series of short books on probability theory and random processes for
biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson and Gaussian distributions are
introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF.
Many important properties of jointly Gaussian random variables are presented. The primary
subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the probability distribution of a function of one
random variable using the CDF and then the PDF. Next, the probability distribution for a
single random variable is determined from a function of two random variables using the CDF.
Then, the joint probability distribution is found from a function of two random variables using
the joint PDF and the CDF.
The aim of all three books is to provide an introduction to probability theory. The audience
includes students, engineers, and researchers presenting applications of this theory to a wide
variety of problems, as well as those pursuing these topics at a more advanced level. The theory
material is presented in a logical manner, developing special mathematical skills as needed.
The mathematical background required of the reader is basic knowledge of differential calculus.
Pertinent biomedical engineering examples appear throughout the text. Drill problems, straightforward exercises designed to reinforce concepts and develop problem solution skills, follow
most sections.

KEYWORDS
Probability theory, random processes, engineering statistics, probability and statistics for
biomedical engineers, exponential distributions, Poisson distributions, Gaussian distributions,
Bernoulli PMF, Gaussian CDF, Gaussian random variables

P1: IML/FFX
MOBK042-FM

P2: IML/FFX

QC: IML/FFX

MOBK042-Enderle.cls

T1: IML

October 30, 2006

19:55

Contents
5.

Standard Probability Distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1


5.1 Uniform Distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
5.2 Exponential Distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
5.3 Bernoulli Trials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
5.3.1 Poisson Approximation to Bernoulli . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
5.3.2 Gaussian Approximation to Bernoulli . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
5.4 Poisson Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
5.4.1 Interarrival Times. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .18
5.5 Univariate Gaussian Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
5.5.1 Marcum's Q Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
5.6 Bivariate Gaussian Random Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
5.6.1 Constant Contours. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .32
5.7 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
5.8 Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36

6.

Transformations of Random Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45


6.1 Univariate CDF Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
6.1.1 CDF Technique with Monotonic Functions . . . . . . . . . . . . . . . . . . . . . . . . 45
6.1.2 CDF Technique with Arbitrary Functions . . . . . . . . . . . . . . . . . . . . . . . . . . 46
6.2 Univariate PDF Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
6.2.1 Continuous Random Variable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
6.2.2 Mixed Random Variable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
6.2.3 Conditional PDF Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
6.3 One Function of Two Random Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
6.4 Bivariate Transformations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
6.4.1 Bivariate CDF Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
6.4.2 Bivariate PDF Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
6.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
6.6 Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75


Preface
This is the third in a series of short books on probability theory and random processes for
biomedical engineers. This text is written as an introduction to probability theory. The goal
was to prepare students at the sophomore, junior or senior level for the application of this
theory to a wide variety of problems, as well as to pursue these topics at a more advanced
level. Our approach is to present a unified treatment of the subject. There are only a few key
concepts involved in the basic theory of probability; these key concepts are all presented
in the first chapter. The second chapter introduces the topic of random variables. The third
chapter focuses on expectation, standard deviation, moments, and the characteristic function.
In addition, conditional expectation, conditional moments and the conditional characteristic
function are also discussed. The fourth chapter introduces jointly distributed random variables,
along with joint expectation, joint moments, and the joint characteristic function. Convolution
is also developed. Later chapters simply expand upon these key ideas and extend the range of
application.
This short book focuses on standard probability distributions commonly encountered in
biomedical engineering. Here in Chapter 5, the exponential, Poisson and Gaussian distributions
are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF.
Many important properties of jointly distributed Gaussian random variables are presented.
The primary subjects of Chapter 6 are methods for determining the probability distribution of
a function of a random variable. We first evaluate the probability distribution of a function of
one random variable using the CDF and then the PDF. Next, the probability distribution for a
single random variable is determined from a function of two random variables using the CDF.
Then, the joint probability distribution is found from a function of two random variables using
the joint PDF and the CDF.
A considerable effort has been made to develop the theory in a logical manner, developing
special mathematical skills as needed. The mathematical background required of the reader is
basic knowledge of differential calculus. Every effort has been made to be consistent with
commonly used notation and terminology, both within the engineering community and the
probability and statistics literature.
The applications and examples given reflect the authors' backgrounds in teaching probability theory and random processes for many years. We have found it best to introduce this
material using simple examples such as dice and cards, rather than more complex biological

and biomedical phenomena. However, we do introduce some pertinent biomedical engineering


examples throughout the text.
Students in other fields should also find the approach useful. Drill problems, straightforward exercises designed to reinforce concepts and develop problem solution skills, follow most
sections. The answers to the drill problems follow the problem statement in random order.
At the end of each chapter is a wide selection of problems, ranging from simple to difficult,
presented in the same general order as covered in the textbook.
We acknowledge and thank William Pruehsner for the technical illustrations. Many of the
examples and end of chapter problems are based on examples from the textbook by Drake [9].

CHAPTER 5

Standard Probability Distributions


A surprisingly small number of probability distributions describe many natural probabilistic
phenomena. This chapter presents some of these discrete and continuous probability distributions that occur often enough in a variety of problems to deserve special mention. We will see
that many random variables and their corresponding experiments have similar properties and
can be described by the same probability distribution. Each section introduces a new PMF or
PDF. Following this, the mean, variance, and characteristic function are found. Additionally,
special properties are pointed out along with relationships among other probability distributions. In some instances, the PMF or PDF is derived according to the characteristics of the
experiment. Because of the vast number of probability distributions, we cannot possibly discuss
them all in this chapter.

5.1 UNIFORM DISTRIBUTIONS

Definition 5.1.1. The discrete RV x has a uniform distribution over n points (n > 1) on the
interval [a, b] if x is a lattice RV with span h = (b − a)/(n − 1) and PMF

$$p_x(\alpha) = \begin{cases} 1/n, & \alpha = kh + a,\ k = 0, 1, \ldots, n-1 \\ 0, & \text{otherwise.} \end{cases} \tag{5.1}$$
The mean and variance of a discrete uniform RV are easily computed with the aid of
Lemma 2.3.1:

$$\eta_x = \frac{1}{n}\sum_{k=0}^{n-1}(kh+a) = \frac{h}{n}\,\frac{n(n-1)}{2} + a = \frac{b-a}{2} + a = \frac{b+a}{2}, \tag{5.2}$$

and

$$\sigma_x^2 = \frac{1}{n}\sum_{k=0}^{n-1}\left(kh - \frac{b-a}{2}\right)^2 = (b-a)^2\left[\frac{1}{n}\sum_{k=0}^{n-1}\frac{k^2}{(n-1)^2} - \frac{1}{n}\sum_{k=0}^{n-1}\frac{k}{n-1} + \frac{1}{4}\right]. \tag{5.3}$$

Simplifying,

$$\sigma_x^2 = \frac{(b-a)^2}{12}\,\frac{n+1}{n-1}. \tag{5.4}$$
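The closed forms (5.2) and (5.4) can be checked with exact rational arithmetic. The sketch below is ours, not from the text; the helper name is arbitrary.

```python
from fractions import Fraction

def discrete_uniform_moments(a, b, n):
    """Mean and variance of the uniform PMF over n lattice points on [a, b]."""
    h = Fraction(b - a, n - 1)              # span between adjacent lattice points
    points = [a + k * h for k in range(n)]  # alpha = kh + a, k = 0..n-1
    mean = sum(points) / n
    var = sum((p - mean) ** 2 for p in points) / n
    return mean, var

a, b, n = 0, 1, 20
mean, var = discrete_uniform_moments(a, b, n)
assert mean == Fraction(b + a, 2)                                   # (5.2)
assert var == Fraction((b - a) ** 2, 12) * Fraction(n + 1, n - 1)   # (5.4)
```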

P1: IML/FFX
MOBK042-05

P2: IML
MOBK042-Enderle.cls

October 30, 2006

19:51

ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS



FIGURE 5.1: (a) PMF and (b) characteristic function magnitude for discrete RV with uniform distribution over 20 points on [0, 1].

The characteristic function can be found using the sum of a geometric series:

$$\phi_x(t) = \frac{e^{jat}}{n}\sum_{k=0}^{n-1}\left(e^{jht}\right)^k = \frac{e^{jat}}{n}\,\frac{1-e^{jhnt}}{1-e^{jht}}. \tag{5.5}$$

Simplifying with the aid of Euler's identity,

$$\phi_x(t) = \exp\!\left(j\,\frac{a+b}{2}\,t\right)\frac{\sin\!\left(\frac{b-a}{2}\,\frac{n}{n-1}\,t\right)}{n\,\sin\!\left(\frac{b-a}{2}\,\frac{1}{n-1}\,t\right)}. \tag{5.6}$$

Figure 5.1 illustrates the PMF and the magnitude of the characteristic function for a discrete RV
which is uniformly distributed over 20 points on [0, 1]. The characteristic function is plotted
over [0, π/h], where the span h = 1/19. Recall from Section 3.3 that φ_x(−t) = φ_x*(t) and that
φ_x(t) is periodic in t with period 2π/h. Thus, Figure 5.1 illustrates one-half period of |φ_x(t)|.
Definition 5.1.2. The continuous RV x has a uniform distribution on the interval [a, b] if x has
PDF

$$f_x(\alpha) = \begin{cases} 1/(b-a), & a \le \alpha \le b \\ 0, & \text{otherwise.} \end{cases} \tag{5.7}$$

The mean and variance of a continuous uniform RV are easily computed directly:

$$\eta_x = \frac{1}{b-a}\int_a^b \alpha\,d\alpha = \frac{b^2-a^2}{2(b-a)} = \frac{b+a}{2}, \tag{5.8}$$



FIGURE 5.2: (a) PDF and (b) characteristic function magnitude for continuous RV with uniform
distribution on [0, 1].

and

$$\sigma_x^2 = \frac{1}{b-a}\int_a^b\left(\alpha-\frac{b+a}{2}\right)^2 d\alpha = \frac{(b-a)^2}{12}. \tag{5.9}$$
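As a numerical sanity check of (5.8) and (5.9), the sketch below (ours, not from the text) integrates with a simple midpoint rule rather than any library quadrature:

```python
# Verify the continuous uniform mean (5.8) and variance (5.9) on [a, b] = [1, 5]
a, b = 1.0, 5.0
N = 100_000
width = (b - a) / N
xs = [a + (i + 0.5) * width for i in range(N)]  # midpoints of the subintervals
pdf = 1.0 / (b - a)

mean = sum(x * pdf * width for x in xs)
var = sum((x - mean) ** 2 * pdf * width for x in xs)

assert abs(mean - (b + a) / 2) < 1e-6          # (5.8): (b + a)/2 = 3
assert abs(var - (b - a) ** 2 / 12) < 1e-6     # (5.9): (b - a)^2/12 = 4/3
```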

The characteristic function can be found as

$$\phi_x(t) = \frac{1}{b-a}\int_a^b e^{j\alpha t}\,d\alpha = \frac{\exp\!\left(j\,\frac{b+a}{2}\,t\right)}{b-a}\int_{-(b-a)/2}^{(b-a)/2} e^{j\alpha t}\,d\alpha.$$

Simplifying with the aid of Euler's identity,

$$\phi_x(t) = \exp\!\left(j\,\frac{a+b}{2}\,t\right)\frac{\sin\!\left(\frac{b-a}{2}\,t\right)}{\frac{b-a}{2}\,t}. \tag{5.10}$$

Figure 5.2 illustrates the PDF and the magnitude of the characteristic function for a continuous
RV uniformly distributed on [0, 1]. Note that the characteristic function in this case is not
periodic, but φ_x(−t) = φ_x*(t).
Drill Problem 5.1.1. A pentahedral die (with faces labeled 0, 1, 2, 3, 4) is tossed once. Let x be a
random variable equaling ten times the number tossed. Determine: (a) p_x(20), (b) P(10 ≤ x ≤ 50),
(c) E(x), (d) σ_x².
Answers: 20, 0.8, 200, 0.2.
Drill Problem 5.1.2. Random variable x is uniformly distributed on the interval [−1, 5]. Determine: (a) F_x(0), (b) F_x(5), (c) η_x, (d) σ_x².
Answers: 1, 1/6, 3, 2.

P1: IML/FFX
MOBK042-05

P2: IML
MOBK042-Enderle.cls

October 30, 2006

19:51

ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS

5.2 EXPONENTIAL DISTRIBUTIONS

Definition 5.2.1. The discrete RV x has a geometric distribution or discrete exponential distribution with parameter p (0 < p < 1) if x has PMF

$$p_x(\ell) = \begin{cases} p(1-p)^{\ell-1}, & \ell = 1, 2, \ldots \\ 0, & \text{otherwise.} \end{cases} \tag{5.11}$$
The characteristic function can be found using the sum of a geometric series (q = 1 − p):

$$\phi_x(t) = \frac{p}{q}\sum_{k=1}^{\infty}\left(qe^{jt}\right)^k = \frac{p\,e^{jt}}{1-q\,e^{jt}}. \tag{5.12}$$

The mean and variance of a discrete RV with a geometric distribution can be computed using
the moment generating property of the characteristic function. The results are

$$\eta_x = \frac{1}{p}, \quad \text{and} \quad \sigma_x^2 = \frac{q}{p^2}. \tag{5.13}$$
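The moments in (5.13) can be checked by simulation, sampling a geometric RV as the number of Bernoulli trials up to and including the first success. This is an illustrative sketch (ours); the sample size, seed, and tolerances are arbitrary choices.

```python
import random

# Empirical check of (5.13): eta = 1/p and sigma^2 = q/p^2 for a geometric RV.
random.seed(1)
p, q = 0.25, 0.75
samples = []
for _ in range(200_000):
    trials = 1
    while random.random() >= p:   # repeat Bernoulli trials until the first success
        trials += 1
    samples.append(trials)

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)

assert abs(mean - 1 / p) < 0.05       # 1/p = 4
assert abs(var - q / p ** 2) < 0.3    # q/p^2 = 12
```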

Figure 5.3 illustrates the PMF and the characteristic function magnitude for a discrete RV with
geometric distribution and parameter p = 0.18127.
It can be shown that a discrete exponentially distributed RV has a memoryless property:

$$p_{x|x>\ell}(\alpha\,|\,x>\ell) = p_x(\alpha-\ell), \quad \ell \ge 0. \tag{5.14}$$

Definition 5.2.2. The continuous RV x has an exponential distribution with parameter λ (λ > 0)
if x has PDF

$$f_x(\alpha) = \lambda e^{-\lambda\alpha}u(\alpha), \tag{5.15}$$

where u(·) is the unit step function.



FIGURE 5.3: (a) PMF and (b) characteristic function magnitude for discrete RV with geometric distribution [p = 0.18127].


FIGURE 5.4: (a) PDF and (b) characteristic function magnitude for continuous RV with exponential distribution and parameter λ = 0.2.

The exponential probability distribution is also a very important probability density function
in biomedical engineering applications, arising in situations involving reliability theory and
queuing problems. Reliability theory, which describes the time to failure for a system or component, grew primarily out of military applications and experiences with multicomponent systems.
Queuing theory describes the waiting times between events.
The characteristic function can be found as

$$\phi_x(t) = \int_0^{\infty}\lambda e^{-(\lambda-jt)\alpha}\,d\alpha = \frac{\lambda}{\lambda-jt}. \tag{5.16}$$

Figure 5.4 illustrates the PDF and the magnitude of the characteristic function for a continuous
RV with exponential distribution and parameter λ = 0.2.
The mean and variance of a continuous exponentially distributed RV can be obtained
using the moment generating property of the characteristic function. The results are
$$\eta_x = \frac{1}{\lambda}, \quad \sigma_x^2 = \frac{1}{\lambda^2}. \tag{5.17}$$

A continuous exponentially distributed RV, like its discrete counterpart, satisfies a memoryless
property:

$$f_{x|x>\tau}(\alpha\,|\,x>\tau) = f_x(\alpha-\tau), \quad \tau \ge 0. \tag{5.18}$$
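The memoryless property can be confirmed directly from the exponential survival function P(x > s) = e^(−λs): the conditional survival probability P(x > τ + s | x > τ) equals P(x > s). The short check below is ours, with an arbitrary λ.

```python
import math

# Numerical check of the memoryless property (5.18) via survival functions.
lam = 0.5

def survival(s):
    """P(x > s) for an exponential RV with rate lam."""
    return math.exp(-lam * s)

tau, s = 2.0, 3.0
conditional = survival(tau + s) / survival(tau)  # P(x > tau + s | x > tau)
assert abs(conditional - survival(s)) < 1e-12
```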

Example 5.2.1. Suppose a system contains a component that has an exponential failure rate. Reliability engineers determined its reliability at 5000 hours to be 95%. Determine the number of hours
reliable at 99%.


Solution. First, the parameter λ is determined from

$$0.95 = P(x > 5000) = \int_{5000}^{\infty}\lambda e^{-\lambda\alpha}\,d\alpha = e^{-5000\lambda}.$$

Thus

$$\lambda = -\frac{\ln(0.95)}{5000} = 1.03\times 10^{-5}.$$

Then, to determine the number of hours τ reliable at 99%, we solve for τ from

$$P(x > \tau) = e^{-\lambda\tau} = 0.99,$$

or

$$\tau = -\frac{\ln(0.99)}{\lambda} = 980 \text{ hours}.$$
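The arithmetic of Example 5.2.1 can be reproduced in a few lines (our sketch; variable names are arbitrary):

```python
import math

# Example 5.2.1: solve for lambda from 95% reliability at 5000 hours,
# then find the time tau at which reliability is 99%.
lam = -math.log(0.95) / 5000      # e^{-5000 lam} = 0.95
tau = -math.log(0.99) / lam       # e^{-lam tau} = 0.99

assert abs(lam - 1.03e-5) < 1e-7  # matches the text to the digits shown
assert abs(tau - 980) < 1
```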

Drill Problem 5.2.1. Suppose a system has an exponential failure rate in years to failure with
λ = 0.02. Determine the number of years reliable at: (a) 90%, (b) 95%, (c) 99%.
Answers: 0.5, 2.6, 5.3.
Drill Problem 5.2.2. Random variable x, representing the length of time in hours to complete an
examination in Introduction to Random Processes, has PDF

$$f_x(\alpha) = \frac{4}{3}e^{-4\alpha/3}u(\alpha).$$

The examination results are given by

$$g(x) = \begin{cases} 75, & 0 < x < 4/3 \\ 75 + 39.44(x - 4/3), & x \ge 4/3 \\ 0, & \text{otherwise.} \end{cases}$$

Determine the average examination grade.


Answer: 80.
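A numerical check of this drill problem (our sketch, using a midpoint rule and truncating the infinite integral at an arbitrary upper limit):

```python
import math

# Compute E[g(x)] for Drill Problem 5.2.2, where f_x(a) = (4/3) e^{-4a/3} u(a).
lam = 4.0 / 3.0

def g(x):
    if 0 < x < 4.0 / 3.0:
        return 75.0
    if x >= 4.0 / 3.0:
        return 75.0 + 39.44 * (x - 4.0 / 3.0)
    return 0.0

N, upper = 400_000, 40.0   # truncation point; the tail beyond 40 is negligible
width = upper / N
expected = sum(
    g((i + 0.5) * width) * lam * math.exp(-lam * (i + 0.5) * width) * width
    for i in range(N)
)
assert abs(expected - 80.0) < 0.1
```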

5.3 BERNOULLI TRIALS

A Bernoulli experiment consists of a number of repeated (independent) trials with only two
possible events for each trial. The events for each trial can be thought of as any two events which
partition the sample space, such as a head and a tail in a coin toss, a zero or one in a computer


bit, or an even and odd number in a die toss. Let us call one of the events a success, the other
a failure. The Bernoulli PMF describes the probability of k successes in n trials of a Bernoulli
experiment. The first two chapters used this PMF repeatedly in problems dealing with games
of chance and in situations where there were only two possible outcomes in any given trial.
For biomedical engineers, the Bernoulli distribution is used in infectious disease problems and
other applications. The Bernoulli distribution is also known as a Binomial distribution.
Definition 5.3.1. A discrete RV x is Bernoulli distributed if the PMF for x is

$$p_x(k) = \begin{cases} \dbinom{n}{k} p^k q^{n-k}, & k = 0, 1, \ldots, n \\ 0, & \text{otherwise,} \end{cases} \tag{5.19}$$

where p = probability of success and q = 1 − p.


The characteristic function can be found using the binomial theorem:

$$\phi_x(t) = \sum_{k=0}^{n}\binom{n}{k}\left(pe^{jt}\right)^k q^{n-k} = \left(q + pe^{jt}\right)^n. \tag{5.20}$$

Figure 5.5 illustrates the PMF and the characteristic function magnitude for a discrete RV with
Bernoulli distribution, p = 0.2, and n = 30.
Using the moment generating property of characteristic functions, the mean and variance
of a Bernoulli RV can be shown to be

$$\eta_x = np, \quad \sigma_x^2 = npq. \tag{5.21}$$


FIGURE 5.5: (a) PMF and (b) characteristic function magnitude for discrete RV with Bernoulli distribution, p = 0.2 and n = 30.


Unlike the preceding distributions, a closed form expression for the Bernoulli CDF is not
easily obtained. Tables A.1–A.3 in the Appendix list values of the Bernoulli CDF for p =
0.05, 0.1, 0.15, . . . , 0.5 and n = 5, 10, 15, and 20. Let k ∈ {0, 1, . . . , n − 1} and define

$$G(n,k,p) = \sum_{\ell=0}^{k}\binom{n}{\ell} p^{\ell}(1-p)^{n-\ell}.$$

Making the change of variable m = n − ℓ yields

$$G(n,k,p) = \sum_{m=n-k}^{n}\binom{n}{n-m} p^{n-m}(1-p)^m.$$

Now, since

$$\binom{n}{n-m} = \frac{n!}{m!\,(n-m)!} = \binom{n}{m},$$

$$G(n,k,p) = \sum_{m=0}^{n}\binom{n}{m} p^{n-m}(1-p)^m - \sum_{m=0}^{n-k-1}\binom{n}{m} p^{n-m}(1-p)^m.$$

Using the binomial theorem,

$$G(n,k,p) = 1 - G(n, n-k-1, 1-p). \tag{5.22}$$

This result is easily applied to obtain values of the Bernoulli CDF for values of p > 0.5 from
Tables A.1–A.3.
Example 5.3.1. The probability that Fargo Polytechnic Institute wins a game is 0.7. In a 15 game
season, what is the probability that they win: (a) at least 10 games, (b) from 9 to 12 games, (c) exactly
11 games? (d) With x denoting the number of games won, find x and x2 .
Solution. With x a Bernoulli random variable, we consult Table A.2, using (5.22) with n = 15,
k = 9, and p = 0.7, to find

a) P(x ≥ 10) = 1 − F_x(9) = 1.0 − 0.2784 = 0.7216,
b) P(9 ≤ x ≤ 12) = F_x(12) − F_x(8) = 0.8732 − 0.1311 = 0.7421,
c) p_x(11) = F_x(11) − F_x(10) = 0.7031 − 0.4845 = 0.2186,
d) η_x = np = 10.5, σ_x² = np(1 − p) = 3.15.
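These table lookups can be reproduced directly from the Bernoulli PMF (our sketch; the tolerances reflect the four-decimal rounding of the tables):

```python
from math import comb

# Recompute Example 5.3.1 from the Bernoulli PMF with n = 15, p = 0.7.
n, p = 15, 0.7
pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

P_at_least_10 = sum(pmf[10:])   # part (a)
P_9_to_12 = sum(pmf[9:13])      # part (b)
P_exactly_11 = pmf[11]          # part (c)

assert abs(P_at_least_10 - 0.7216) < 5e-4
assert abs(P_9_to_12 - 0.7421) < 5e-4
assert abs(P_exactly_11 - 0.2186) < 5e-4
assert abs(n * p - 10.5) < 1e-12 and abs(n * p * (1 - p) - 3.15) < 1e-12
```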


We now consider the number of trials needed for k successes in a sequence of Bernoulli trials.
Let

$$p(k,n) = P(k \text{ successes in } n \text{ trials}) = \begin{cases} \dbinom{n}{k} p^k q^{n-k}, & k = 0, 1, \ldots, n \\ 0, & \text{otherwise,} \end{cases} \tag{5.23}$$

where p = p(1, 1) and q = 1 − p. Let RV n_r represent the number of trials to obtain exactly
r successes (r ≥ 1). Note that

$$P(\text{success in } \ell\text{th trial}\mid r-1 \text{ successes in previous } \ell-1 \text{ trials}) = p; \tag{5.24}$$

hence, for ℓ = r, r + 1, . . . , we have

$$P(n_r = \ell) = p(r-1,\,\ell-1)\,p. \tag{5.25}$$

Discrete RV n_r thus has PMF

$$p_{n_r}(\ell) = \begin{cases} \dbinom{\ell-1}{r-1} p^r q^{\ell-r}, & \ell = r, r+1, \ldots \\ 0, & \text{otherwise,} \end{cases} \tag{5.26}$$

where the parameter r is a positive integer. The PMF for the RV n_r is called the negative
binomial distribution, also known as the Pólya and the Pascal distribution. Note that with
r = 1 the negative binomial PMF is the geometric PMF.
The moment generating function for n_r can be expressed as

$$M_{n_r}(\lambda) = \sum_{\ell=r}^{\infty}\frac{(\ell-1)(\ell-2)\cdots(\ell-r+1)}{(r-1)!}\,p^r q^{\ell-r} e^{\ell\lambda}.$$

Letting m = ℓ − r, we obtain

$$M_{n_r}(\lambda) = \frac{e^{r\lambda} p^r}{(r-1)!}\sum_{m=0}^{\infty}(m+r-1)(m+r-2)\cdots(m+1)\left(qe^{\lambda}\right)^m.$$

With

$$s(x) = \sum_{k=0}^{\infty} x^k = \frac{1}{1-x}, \quad |x| < 1,$$

we have

$$s^{(\ell)}(x) = \sum_{k=\ell}^{\infty} k(k-1)\cdots(k-\ell+1)\,x^{k-\ell} = \sum_{m=0}^{\infty}(m+\ell)(m+\ell-1)\cdots(m+1)\,x^m = \frac{\ell!}{(1-x)^{\ell+1}}.$$

Hence

$$M_{n_r}(\lambda) = \left(\frac{pe^{\lambda}}{1-qe^{\lambda}}\right)^{\!r}, \quad qe^{\lambda} < 1. \tag{5.27}$$

The mean and variance for n_r are found to be

$$\eta_{n_r} = \frac{r}{p}, \quad \text{and} \quad \sigma_{n_r}^2 = \frac{rq}{p^2}. \tag{5.28}$$

We note that the characteristic function is simply

$$\phi_{n_r}(t) = M_{n_r}(jt) = \phi_x^r(t), \tag{5.29}$$

where RV x has a discrete geometric distribution. Figure 5.6 illustrates the PMF and the
characteristic function magnitude for a discrete RV with negative binomial distribution, r = 3,
and p = 0.18127.

FIGURE 5.6: (a) PMF and (b) characteristic function magnitude for discrete RV with negative binomial distribution, r = 3, and p = 0.18127.
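The mean and variance in (5.28) can be checked against the PMF (5.26) by direct summation (our sketch; the truncation point L is an arbitrary choice):

```python
from math import comb

# Negative binomial PMF (5.26) with the figure's parameters r = 3, p = 0.18127.
r, p = 3, 0.18127
q = 1 - p
L = 2000  # truncate the infinite sum; the tail beyond this is negligible

pmf = {l: comb(l - 1, r - 1) * p**r * q ** (l - r) for l in range(r, L)}
mean = sum(l * pr for l, pr in pmf.items())
var = sum((l - mean) ** 2 * pr for l, pr in pmf.items())

assert abs(sum(pmf.values()) - 1.0) < 1e-9
assert abs(mean - r / p) < 1e-6          # (5.28): r/p ~ 16.55
assert abs(var - r * q / p**2) < 1e-3    # (5.28): rq/p^2 ~ 74.7
```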


5.3.1 Poisson Approximation to Bernoulli

When n becomes large in the Bernoulli PMF in such a way that np = η = constant, the
Bernoulli PMF approaches another important PMF known as the Poisson PMF. The Poisson
PMF is treated in the following section.
Lemma 5.3.1. We have

$$p(k) = \lim_{n\to\infty,\ np=\eta}\binom{n}{k} p^k q^{n-k} = \begin{cases} \dfrac{\eta^k e^{-\eta}}{k!}, & k = 0, 1, \ldots \\ 0, & \text{otherwise.} \end{cases} \tag{5.30}$$

Proof. Substituting p = η/n and q = 1 − η/n,

$$p(k) = \lim_{n\to\infty}\frac{1}{k!}\,\eta^k\left(1-\frac{\eta}{n}\right)^{n-k}\frac{1}{n^k}\prod_{i=0}^{k-1}(n-i).$$

Note that

$$\lim_{n\to\infty}\left(1-\frac{\eta}{n}\right)^{-k}\frac{1}{n^k}\prod_{i=0}^{k-1}(n-i) = 1,$$

so that

$$p(k) = \frac{\eta^k}{k!}\lim_{n\to\infty}\left(1-\frac{\eta}{n}\right)^{n}.$$

Now,

$$\lim_{n\to\infty}\ln\left(1-\frac{\eta}{n}\right)^{n} = \lim_{n\to\infty}\frac{\ln\left(1-\frac{\eta}{n}\right)}{1/n} = -\eta,$$

so that

$$\lim_{n\to\infty}\left(1-\frac{\eta}{n}\right)^{n} = e^{-\eta},$$

from which the desired result follows.

We note that the limiting value p(k) may be used as an approximation for the Bernoulli
PMF when p is small by substituting η = np. While there are no prescribed rules regarding the
values of n and p for this approximation, the larger the value of n and the smaller the value of p,
the better the approximation. Satisfactory results are obtained with np < 10. The motivation
for using this approximation is that when n is large, Tables A.1–A.3 are useless for finding
values for the Bernoulli CDF.


Example 5.3.2. Suppose x is a Bernoulli random variable with n = 5000 and p = 0.001. Find
P(x ≤ 5).

Solution. Our solution involves approximating the Bernoulli PMF with the Poisson PMF
since n is quite large (and the Bernoulli CDF table is useless), and p is very close to zero.
Since η = np = 5, we find from Table A.5 (the Poisson CDF is treated in Section 5.4) that
P(x ≤ 5) = 0.6160.

Incidentally, if p is close to one, we can still use this approximation by reversing our definition of
success and failure in the Bernoulli experiment, which results in a value of p close to zero; see
(5.22).
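The comparison behind this example can be made explicit by computing both the exact Bernoulli CDF and its Poisson approximation (our sketch, not from the text):

```python
from math import comb, exp, factorial

# Example 5.3.2: n = 5000, p = 0.001, eta = np = 5; compare P(x <= 5).
n, p = 5000, 0.001
eta = n * p

exact = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(6))
approx = sum(eta**k * exp(-eta) / factorial(k) for k in range(6))

assert abs(approx - 0.6160) < 5e-4   # matches the Table A.5 value
assert abs(exact - approx) < 5e-3    # the Poisson approximation is close here
```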

5.3.2 Gaussian Approximation to Bernoulli

Previously, the Poisson PMF was used to approximate a Bernoulli PMF under certain conditions,
that is, when n is large, p is small, and np < 10. This approximation is quite useful since the
Bernoulli table lists only CDF values for n up to 20. The Gaussian PDF (see Section 5.5)
is also used to approximate a Bernoulli PMF under certain conditions. The accuracy of this
approximation is best when n is large, p is close to 1/2, and npq > 3. Notice that in some
circumstances np < 10 and npq > 3. Then either the Poisson or the Gaussian approximation
will yield good results.
Lemma 5.3.2. Let

$$y = \frac{x - np}{\sqrt{npq}}, \tag{5.31}$$

where x is a Bernoulli RV. Then the characteristic function for y satisfies

$$\Phi(t) = \lim_{n\to\infty}\phi_y(t) = e^{-t^2/2}. \tag{5.32}$$

Proof. We have

$$\phi_y(t) = \exp\!\left(-j\,\frac{np\,t}{\sqrt{npq}}\right)\phi_x\!\left(\frac{t}{\sqrt{npq}}\right).$$

Substituting for φ_x(t),

$$\phi_y(t) = \exp\!\left(-jt\sqrt{\frac{np}{q}}\right)\left(q + p\exp\!\left(j\,\frac{t}{\sqrt{npq}}\right)\right)^{\!n}.$$

Simplifying,

$$\phi_y(t) = \left(q\exp\!\left(-jt\sqrt{\frac{p}{qn}}\right) + p\exp\!\left(jt\sqrt{\frac{q}{np}}\right)\right)^{\!n}.$$

Letting

$$\alpha = \sqrt{\frac{q}{p}}, \quad \text{and} \quad \gamma = \frac{1}{\sqrt{n}},$$

we obtain

$$\lim_{n\to\infty}\ln\phi_y(t) = \lim_{\gamma\to 0}\frac{\ln\!\left(p\alpha^2 e^{-jt\gamma/\alpha} + p\,e^{jt\gamma\alpha}\right)}{\gamma^2}.$$

Applying L'Hôpital's rule twice,

$$\lim_{n\to\infty}\ln\phi_y(t) = \frac{-t^2 p - t^2\alpha^2 p}{2} = -\frac{t^2}{2},$$

since p + α²p = p + q = 1. Consequently,

$$\lim_{n\to\infty}\phi_y(t) = \exp\!\left(\lim_{n\to\infty}\ln\phi_y(t)\right) = e^{-t^2/2}.$$


The limiting (t) in the above lemma is the characteristic function for a Gaussian RV
with zero mean and unit variance. Hence, for large n and a < b
P (a < x < b) = P (a < y < b ) F(b ) F(a ),

(5.33)

where
1
F( ) =
2

/2

d = 1 Q( )

(5.34)

b np
b =
,
npq

(5.35)

is the standard Gaussian CDF,


a np
a =
,
npq

and Q() is Marcums Q function which is tabulated in Tables A.8 and A.9 of the Appendix.
Evaluation of the above integral as well as the Gaussian PDF are treated in Section 5.5.
Example 5.3.3. Suppose x is a Bernoulli random variable with n = 5000 and p = 0.4. Find
P(x ≤ 2048).

Solution. The solution involves approximating the Bernoulli CDF with the Gaussian CDF
since npq = 1200 > 3. With np = 2000, npq = 1200, and b* = (2048 − 2000)/34.641 =
1.39, we find from Table A.8 that

$$P(x \le 2048) \approx F(1.39) = 1 - Q(1.39) = 0.91774.$$


When approximating the Bernoulli CDF with the Gaussian CDF, a continuous distribution is
used to calculate probabilities for a discrete RV. It is important to note that while the approximation is excellent in terms of the CDFs, the PDF of any discrete RV is never approximated
with a continuous PDF. Operationally, to compute the probability that a Bernoulli RV takes an
integer value using the Gaussian approximation, we compute the probability over an interval of
unit length centered on that integer, as in the following example.
Example 5.3.4. Suppose x is a Bernoulli random variable with n = 20 and p = 0.5. Find
P(x = 8).

Solution. Since npq = 5 > 3, the Gaussian approximation is used to evaluate the Bernoulli
PMF p_x(8). With np = 10, a* = (7.5 − 10)/√5 = −1.12, and b* = (8.5 − 10)/√5 = −0.67,
we have

$$p_x(8) = P(7.5 < x < 8.5) \approx F(-0.67) - F(-1.12) = 0.25143 - 0.13136;$$

hence, p_x(8) ≈ 0.12007. From the Bernoulli table, p_x(8) = 0.1201, which is very close to the
above approximation.
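The computations in Examples 5.3.3 and 5.3.4 can be reproduced with the error function, since F(γ) = (1 + erf(γ/√2))/2 (our sketch; the function names are ours):

```python
from math import comb, erf, sqrt

def F(g):
    """Standard Gaussian CDF via the error function."""
    return 0.5 * (1 + erf(g / sqrt(2)))

# Example 5.3.4: n = 20, p = 0.5; approximate p_x(8) with the unit-length
# interval (7.5, 8.5) under the Gaussian approximation.
n, p = 20, 0.5
mu, sd = n * p, sqrt(n * p * (1 - p))

approx = F((8.5 - mu) / sd) - F((7.5 - mu) / sd)
exact = comb(n, 8) * p**8 * (1 - p) ** (n - 8)

assert abs(exact - 0.1201) < 5e-5    # the Bernoulli table value
assert abs(approx - exact) < 2e-3    # the Gaussian approximation is close
```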

Drill Problem 5.3.1. A survey of residents in Fargo, North Dakota revealed that 30% preferred a
white automobile over all other colors. Determine the probability that: (a) exactly five of the next 20
cars purchased will be white, (b) at least five of the next twenty cars purchased will be white, (c) from
two to five of the next twenty cars purchased will be white.
Answers: 0.1789, 0.4088, 0.7625.
Drill Problem 5.3.2. Prof. Rensselaer is an avid albeit inaccurate marksman. The probability she
will hit the target is only 0.3. Determine: (a) the expected number of hits scored in 15 shots, (b) the
standard deviation for 15 shots, (c) the number of times she must fire so that the probability of hitting
the target at least once is greater than 1/2.
Answers: 2, 4.5, 1.7748.

5.4 POISSON DISTRIBUTION

A Poisson PMF describes the number of successes occurring on a continuous line, typically a
time interval, or within a given region. For example, a Poisson random variable might represent
the number of telephone calls per hour, or the number of errors per page in this textbook.
In the previous section, we found that the limit (as n → ∞ with constant mean np = η) of a
Bernoulli PMF is a Poisson PMF. In this section, we derive the Poisson probability distribution
from two fundamental assumptions about the phenomenon based on physical characteristics.


The following development makes use of the order notation o(h) to denote any function
g(h) which satisfies

$$\lim_{h\to 0}\frac{g(h)}{h} = 0. \tag{5.36}$$

For example, g(h) = 15h² + 7h³ = o(h).


We use the notation

$$p(k, \tau) = P(k \text{ successes in interval } [0, \tau]). \tag{5.37}$$

The Poisson probability distribution is characterized by the following two properties:

(1) The number of successes occurring in a time interval or region is independent of the
number of successes occurring in any other non-overlapping time interval or region. Thus, with

$$A = \{k \text{ successes in interval } I_1\} \tag{5.38}$$

and

$$B = \{\ell \text{ successes in interval } I_2\}, \tag{5.39}$$

we have

$$P(A \cap B) = P(A)P(B), \quad \text{if } I_1 \cap I_2 = \emptyset. \tag{5.40}$$

As we will see, the number of successes depends only on the length of the time interval
and not the location of the interval on the time axis.
(2) The probability of a single success during a very small time interval is proportional to
the length of the interval. The longer the interval, the greater the probability of success. The
probability of more than one success occurring during an interval vanishes as the length of the
interval approaches zero. Hence

$$p(1, h) = \lambda h + o(h) \tag{5.41}$$

and

$$p(0, h) = 1 - \lambda h + o(h). \tag{5.42}$$

This second property indicates that for a series of very small intervals, the Poisson process is
composed of a series of Bernoulli trials, each with a probability of success p = λh + o(h).
Since [0, τ + h] = [0, τ] ∪ (τ, τ + h] and [0, τ] ∩ (τ, τ + h] = ∅, we have

$$p(0, \tau + h) = p(0, \tau)\,p(0, h) = p(0, \tau)(1 - \lambda h + o(h)).$$


Noting that

(p(0, τ + h) − p(0, τ))/h = (−λh p(0, τ) + o(h))/h

and taking the limit as h → 0,

dp(0, τ)/dτ = −λ p(0, τ),  p(0, 0) = 1.    (5.43)

This differential equation has solution

p(0, τ) = e^{−λτ} u(τ).    (5.44)

Applying the above properties, it is readily seen that

p(k, τ + h) = p(k − 1, τ) p(1, h) + p(k, τ) p(0, h) + o(h),

or

p(k, τ + h) = p(k − 1, τ) λh + p(k, τ)(1 − λh) + o(h),

so that

(p(k, τ + h) − p(k, τ))/h + λ p(k, τ) = λ p(k − 1, τ) + o(h)/h.

Taking the limit as h → 0,

dp(k, τ)/dτ + λ p(k, τ) = λ p(k − 1, τ),  k = 1, 2, . . . ,    (5.45)

with p(k, 0) = 0. It can be shown ([7, 8]) that

p(k, τ) = λ e^{−λτ} ∫₀^τ e^{λt} p(k − 1, t) dt    (5.46)

and hence that

p(k, τ) = ((λτ)^k e^{−λτ}/k!) u(τ),  k = 0, 1, . . . .    (5.47)

The RV x = number of successes thus has a Poisson distribution with parameter λτ and PMF
p_x(k) = p(k, τ). The rate of the Poisson process is λ and the interval length is τ.
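As a quick numerical check (not part of the text), the closed form (5.47) can be compared against the integral recursion (5.46) using simple quadrature; the rate λ = 2 and interval τ = 1.5 below are arbitrary illustrative choices.

```python
import math

def poisson_pmf(k, lam, tau):
    # closed form (5.47): p(k, tau) = (lam*tau)^k e^{-lam*tau} / k!
    return (lam * tau) ** k * math.exp(-lam * tau) / math.factorial(k)

def recursion_rhs(k, lam, tau, steps=20000):
    # right-hand side of (5.46): lam * e^{-lam*tau} * integral_0^tau e^{lam t} p(k-1, t) dt,
    # evaluated with the trapezoidal rule
    h = tau / steps
    total = 0.0
    for i in range(steps + 1):
        t = i * h
        w = 0.5 if i in (0, steps) else 1.0
        total += w * math.exp(lam * t) * poisson_pmf(k - 1, lam, t)
    return lam * math.exp(-lam * tau) * total * h

lam, tau = 2.0, 1.5
for k in range(1, 6):
    assert abs(poisson_pmf(k, lam, tau) - recursion_rhs(k, lam, tau)) < 1e-6
```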
For ease in subsequent development, we replace the parameter λτ with λ. The characteristic
function for a Poisson RV x with parameter λ is found as (with parameter λ, p_x(k) = p(k, 1))

φ_x(t) = e^{−λ} Σ_{k=0}^∞ (λe^{jt})^k/k! = e^{−λ} exp(λe^{jt}) = exp(λ(e^{jt} − 1)).    (5.48)


FIGURE 5.7: (a) PMF and (b) magnitude characteristic function for Poisson distributed RV with
parameter λ = 10.

Figure 5.7 illustrates the PMF and characteristic function magnitude for a discrete RV with
Poisson distribution and parameter λ = 10.
It is of interest to note that if x₁ and x₂ are independent Poisson RVs with parameters λ₁
and λ₂, respectively, then

φ_{x₁+x₂}(t) = exp((λ₁ + λ₂)(e^{jt} − 1));    (5.49)

i.e., x₁ + x₂ is also a Poisson RV, with parameter λ₁ + λ₂.
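This closure property can also be checked directly: the PMF of x₁ + x₂ is the discrete convolution of the two individual PMFs, and it should match a Poisson PMF with parameter λ₁ + λ₂. A minimal sketch (the parameter values are arbitrary):

```python
import math

def poisson_pmf(k, lam):
    # Poisson PMF with parameter lam, as in (5.47) with tau absorbed into lam
    return lam ** k * math.exp(-lam) / math.factorial(k)

lam1, lam2 = 3.0, 7.0
# PMF of x1 + x2 by discrete convolution of the two Poisson PMFs
for n in range(20):
    conv = sum(poisson_pmf(k, lam1) * poisson_pmf(n - k, lam2) for k in range(n + 1))
    assert abs(conv - poisson_pmf(n, lam1 + lam2)) < 1e-12
```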


The moments of a Poisson RV are tedious to compute using techniques we have seen so
far. Consider the function

ψ_x(γ) = E(γ^x)    (5.50)

and note that

ψ_x^{(k)}(γ) = E(γ^{x−k} ∏_{i=0}^{k−1} (x − i)),

so that

E(∏_{i=0}^{k−1} (x − i)) = ψ_x^{(k)}(1).    (5.51)

If x is Poisson distributed with parameter λ, then

ψ_x(γ) = e^{λ(γ−1)},

so that

ψ_x^{(k)}(γ) = λ^k e^{λ(γ−1)};    (5.52)


hence,

E(∏_{i=0}^{k−1} (x − i)) = λ^k.    (5.53)

In particular, E(x) = λ and E(x(x − 1)) = λ² = E(x²) − λ, so that σ_x² = λ² + λ − λ² = λ.
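A short numerical sanity check of E(x) = λ and σ_x² = λ, truncating the PMF sum at k = 60 (the tail beyond that is negligible for the assumed λ = 4):

```python
import math

lam = 4.0
terms = range(60)  # the tail beyond k = 60 is negligible for lam = 4
pmf = [lam ** k * math.exp(-lam) / math.factorial(k) for k in terms]
mean = sum(k * p for k, p in zip(terms, pmf))
second = sum(k * k * p for k, p in zip(terms, pmf))
assert abs(mean - lam) < 1e-9                 # E(x) = lambda
assert abs(second - mean ** 2 - lam) < 1e-9   # var(x) = lambda
```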


While it is quite easy to calculate the value of the Poisson PMF for a particular number
of successes, hand computation of the CDF is quite tedious. Therefore, the Poisson CDF is
tabulated in Tables A.4–A.7 of the Appendix for selected values of λ ranging from 0.1 to 18.
From the Poisson CDF table, we note that the value of the Poisson PMF increases as the number
of successes k increases from zero to the mean, and then decreases in value as k increases from
the mean. Additionally, note that the table is written with a finite number of entries for each
value of λ because the PMF values are written with six decimal place accuracy, even though an
infinite number of Poisson successes are theoretically possible.
Example 5.4.1. On the average, Professor Rensselaer grades 10 problems per day. What is the
probability that on a given day (a) 8 problems are graded, (b) 8–10 problems are graded, and (c) at
least 15 problems are graded?
Solution. With x a Poisson random variable, we consult the Poisson CDF table with λ = 10,
and find
a) p_x(8) = F_x(8) − F_x(7) = 0.3328 − 0.2202 = 0.1126,
b) P(8 ≤ x ≤ 10) = F_x(10) − F_x(7) = 0.5830 − 0.2202 = 0.3628,
c) P(x ≥ 15) = 1 − F_x(14) = 1 − 0.9165 = 0.0835.
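The same three probabilities can be computed directly from the Poisson CDF rather than read from the tables; the sketch below reproduces the tabulated values to four decimal places:

```python
import math

def poisson_cdf(k, lam):
    # F_x(k) = sum_{i=0}^{k} lam^i e^{-lam} / i!
    return sum(lam ** i * math.exp(-lam) / math.factorial(i) for i in range(k + 1))

lam = 10.0
a = poisson_cdf(8, lam) - poisson_cdf(7, lam)    # P(x = 8)
b = poisson_cdf(10, lam) - poisson_cdf(7, lam)   # P(8 <= x <= 10)
c = 1.0 - poisson_cdf(14, lam)                   # P(x >= 15)
assert abs(a - 0.1126) < 5e-4
assert abs(b - 0.3628) < 5e-4
assert abs(c - 0.0835) < 5e-4
```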

5.4.1 Interarrival Times

In many instances, the length of time between successes, known as an interarrival time, of a
Poisson random variable is more important than the actual number of successes. For example,
in evaluating the reliability of a medical device, the time to failure is far more significant to the
biomedical engineer than the fact that the device failed. Indeed, the subject of reliability theory
is so important that entire textbooks are devoted to the topic. Here, however, we will briefly
examine the subject of interarrival times from the basis of the Poisson PMF.
Let RV t_r denote the length of the time interval from zero to the r th success. Then

P(τ − h < t_r ≤ τ) = p(r − 1, τ − h) p(1, h)
= p(r − 1, τ − h) λh + o(h),

so that

(F_{t_r}(τ) − F_{t_r}(τ − h))/h = λ p(r − 1, τ − h) + o(h)/h.

Taking the limit as h → 0 we find that the PDF for the r th order interarrival time, that is, the
time interval from any starting point to the r th success after it, is

f_{t_r}(τ) = (λ^r τ^{r−1} e^{−λτ}/(r − 1)!) u(τ),  r = 1, 2, . . . .    (5.54)

This PDF is known as the Erlang PDF. Clearly, with r = 1, we have the exponential PDF:

f_t(τ) = λ e^{−λτ} u(τ).    (5.55)

The RV t is called the first-order interarrival time.
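The interpretation of t_r as a sum of r independent first-order (exponential) interarrival times suggests a simple simulation check; the rate λ = 2 and order r = 3 below are arbitrary choices, and the asserted mean r/λ and variance r/λ² agree with the gamma moments in (5.60):

```python
import random
import statistics

random.seed(1)
lam, r = 2.0, 3  # assumed rate and order, for illustration only
# r-th order interarrival time = sum of r independent exponential(lam) gaps
samples = [sum(random.expovariate(lam) for _ in range(r)) for _ in range(100000)]
assert abs(statistics.mean(samples) - r / lam) < 0.02           # Erlang mean r/lam
assert abs(statistics.variance(samples) - r / lam ** 2) < 0.02  # Erlang variance r/lam^2
```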


The Erlang PDF is a special case of the gamma PDF:

f_x(α) = (λ^r α^{r−1} e^{−λα}/Γ(r)) u(α),    (5.56)

for any real r > 0, λ > 0, where Γ is the gamma function

Γ(r) = ∫₀^∞ τ^{r−1} e^{−τ} dτ.    (5.57)

Straightforward integration reveals that Γ(1) = 1 and Γ(r + 1) = rΓ(r) so that if r is a positive
integer then Γ(r) = (r − 1)!—for this reason the gamma function is often called the factorial
function. Using the above definition for Γ(r), it is easily shown that the moment generating
function for a gamma-distributed RV is

M_x(η) = (λ/(λ − η))^r,  for η < λ.    (5.58)

The characteristic function is thus

φ_x(t) = (λ/(λ − jt))^r.    (5.59)

It follows that the mean and variance are

η_x = r/λ  and  σ_x² = r/λ².    (5.60)

Figure 5.8 illustrates the PDF and magnitude of the characteristic function for a RV with
gamma distribution with r = 3 and λ = 0.2.



FIGURE 5.8: (a) PDF and (b) magnitude characteristic function for gamma distributed RV with r = 3
and parameter λ = 0.2.
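As a numerical sketch of (5.56) and (5.60) with the Figure 5.8 parameters (r = 3, λ = 0.2), crude quadrature of the PDF should recover unit area, mean r/λ = 15, and variance r/λ² = 75:

```python
import math

r, lam = 3, 0.2  # parameters used in Figure 5.8

def gamma_pdf(a):
    # (5.56) with integer r, so Gamma(r) = (r - 1)!
    return lam ** r * a ** (r - 1) * math.exp(-lam * a) / math.factorial(r - 1)

# crude Riemann moments over a range capturing essentially all the mass
h, upper = 0.01, 400.0
xs = [i * h for i in range(int(upper / h) + 1)]
m0 = sum(gamma_pdf(x) for x in xs) * h
m1 = sum(x * gamma_pdf(x) for x in xs) * h
m2 = sum(x * x * gamma_pdf(x) for x in xs) * h
assert abs(m0 - 1.0) < 1e-3                    # PDF integrates to 1
assert abs(m1 - r / lam) < 1e-2                # mean r/lam = 15
assert abs(m2 - m1 ** 2 - r / lam ** 2) < 0.1  # variance r/lam^2 = 75
```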

Drill Problem 5.4.1. On the average, Professor S. Rensselaer makes five blunders per lecture.
Determine the probability that she makes (a) less than six blunders in the next lecture; (b) exactly five
blunders in the next lecture; (c) from three to seven blunders in the next lecture; (d) zero blunders in
the next lecture.
Answers: 0.6160, 0.0067, 0.7419, 0.1755.
Drill Problem 5.4.2. A process yields 0.001% defective items. If one million items are produced,
determine the probability that the number of defective items exceeds twelve.
Answer: 0.2084.
Drill Problem 5.4.3. Professor S. Rensselaer designs her examinations so that the probability of at
least one extremely difficult problem is 0.632. Determine the average number of extremely difficult
problems on a Rensselaer examination.
Answer: 1.

5.5 UNIVARIATE GAUSSIAN DISTRIBUTION

The Gaussian PDF is the most important probability distribution in the field of biomedical
engineering. Plentiful applications arise in industry, research, and nature, ranging from
instrumentation errors to scores on examinations. The PDF is named in honor of Gauss (1777–1855),
who derived the equation based on an error study involving repeated measurements of the same
quantity. However, De Moivre is first credited with describing the PDF in 1733. Applications
also abound in other areas outside of biomedical engineering since the distribution fits the
observed data in many processes. Incidentally, statisticians refer to the Gaussian PDF as the
normal PDF.


Definition 5.5.1. A continuous RV z is a standardized Gaussian RV if the PDF is

f_z(α) = (1/√(2π)) e^{−α²/2}.    (5.61)

The moment generating function for a standardized Gaussian RV can be found as follows:

M_z(η) = (1/√(2π)) ∫_{−∞}^{∞} e^{ηα} e^{−α²/2} dα
= (1/√(2π)) ∫_{−∞}^{∞} e^{−((α−η)² − η²)/2} dα.

Making the change of variable β = α − η we find

M_z(η) = e^{η²/2} ∫_{−∞}^{∞} f_z(β) dβ = e^{η²/2},    (5.62)

for all real η. We have made use of the fact that the function f_z is a bona fide PDF, as treated
in Problem 42.
in Problem 42. Using the Taylor series expansion for an exponential,
e 2 =
1 2



2k
Mx(n) (0)n
=
,
2k k!
n!
k=0
n=0

so that all moments of z exist and


E(z 2k ) =

(2k)!
,
2k k!

k = 0, 1, 2, . . . ,

(5.63)

and
E(z 2k+1 ) = 0,

k = 0, 1, 2, . . . .

(5.64)

Consequently, a standardized Gaussian RV has zero mean and unit variance. Extending the
range of definition of M_z(η) to include the finite complex plane, we find that the characteristic
function is

φ_z(t) = M_z(jt) = e^{−t²/2}.

Letting the RV x = σz + η we find that E(x) = η and σ_x² = σ². For σ > 0,

F_x(α) = P(σz + η ≤ α) = F_z((α − η)/σ),    (5.65)

so that x has the general Gaussian PDF

f_x(α) = (1/√(2πσ²)) exp(−(α − η)²/(2σ²)).    (5.66)

Similarly, with σ < 0 and x = σz + η we find

F_x(α) = P(σz + η ≤ α) = 1 − F_z((α − η)/σ),

so that f_x is as above. We will have occasion to use the shorthand notation x ∼ G(η, σ²) to
denote that the RV x has a Gaussian PDF with mean η and variance σ². Note that if x ∼ G(η, σ²)
then (x = σz + η)

φ_x(t) = e^{jηt} e^{−σ²t²/2}.    (5.67)

The Gaussian PDF, illustrated with η = 75 and σ² = 25, as well as with η = 75 and σ² = 9
in Figure 5.9, is a bell-shaped curve completely determined by its mean and variance. As can
be seen, the Gaussian PDF is symmetrical about the vertical axis through the expected value.
If, in fact, η = 25, identically shaped curves could be drawn, centered now at 25 instead of
75. Additionally, the maximum value of the Gaussian PDF, (2πσ²)^{−1/2}, occurs at α = η. The
PDF approaches zero asymptotically as α approaches ±∞. Naturally, the larger the value of the
variance, the more spread in the distribution and the smaller the maximum value of the PDF.
For any combination of the mean and variance, the Gaussian PDF curve must be symmetrical
as previously described, and the area under the curve must equal one.
FIGURE 5.9: Gaussian probability density function for η = 75 and σ² = 9, 25.

Unfortunately, a closed form expression does not exist for the Gaussian CDF, which
necessitates numerical integration. Rather than attempting to tabulate the general Gaussian
CDF, a normalization is performed to obtain a standardized Gaussian RV (with zero mean


and unit variance). If x ∼ G(η, σ²), the RV z = (x − η)/σ is a standardized Gaussian RV:
z ∼ G(0, 1). This transformation is always applied when using standard tables for computing
probabilities for Gaussian RVs. The probability P(α₁ < x ≤ α₂) can be obtained as

P(α₁ < x ≤ α₂) = F_x(α₂) − F_x(α₁),    (5.68)

using the fact that

F_x(α) = F_z((α − η)/σ).    (5.69)
Note that

F_z(α) = (1/√(2π)) ∫_{−∞}^{α} e^{−τ²/2} dτ = 1 − Q(α),    (5.70)

where Q(α) is Marcum's Q function:

Q(α) = (1/√(2π)) ∫_{α}^{∞} e^{−τ²/2} dτ.    (5.71)

Marcum's Q function is tabulated in Tables A.8 and A.9 for 0 ≤ α < 4 using the approximation
presented in Section 5.5.1. It is easy to show that

Q(−α) = 1 − Q(α) = F_z(α).    (5.72)

The error and complementary error functions, defined by

erf(α) = (2/√π) ∫₀^α e^{−t²} dt    (5.73)

and

erfc(α) = (2/√π) ∫_α^∞ e^{−t²} dt = 1 − erf(α),    (5.74)

are also often used to evaluate the standard normal integral. A simple change of variable reveals
that

erfc(α) = 2Q(α√2).    (5.75)
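In modern practice these quantities are computed rather than tabulated; for instance, Python's math.erfc yields Q(α) and F_z(α) directly through (5.70), (5.72) and (5.75). A minimal sketch:

```python
import math

def Q(alpha):
    # Q(alpha) = 0.5 * erfc(alpha / sqrt(2)), a rearrangement of (5.75)
    return 0.5 * math.erfc(alpha / math.sqrt(2.0))

def Fz(alpha):
    # standard Gaussian CDF, F_z(alpha) = 1 - Q(alpha), as in (5.70)
    return 1.0 - Q(alpha)

assert abs(Q(0.0) - 0.5) < 1e-12
assert abs(Q(-1.0) - (1.0 - Q(1.0))) < 1e-12                   # (5.72)
assert abs(math.erfc(1.0) - 2.0 * Q(math.sqrt(2.0))) < 1e-12   # (5.75)
assert abs(Fz(1.74) - 0.95907) < 5e-5                          # matches Table A.8
```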
Example 5.5.1. Compute F_z(−1.74), where z ∼ G(0, 1).


Solution. To compute F_z(−1.74), we find

F_z(−1.74) = 1 − Q(−1.74) = Q(1.74) = 0.04093,

using (5.72) and Table A.8.

While the value a Gaussian random variable takes on is any real number between negative
infinity and positive infinity, the realistic range of values is much smaller. From Table A.9,
we note that 99.73% of the area under the curve is contained between −3.0 and 3.0. From
the transformation z = (x − η)/σ, the range of values random variable x takes on is then
approximately η ± 3σ. This notion does not imply that random variable x cannot take on a value
outside this interval, but the probability of it occurring is really very small (2Q(3) = 0.0027).
Example 5.5.2. Suppose x is a Gaussian random variable with η = 35 and σ = 10. Sketch the
PDF and then find P(37 ≤ x ≤ 51). Indicate this probability on the sketch.
Solution. The PDF is essentially zero outside the interval [η − 3σ, η + 3σ] = [5, 65]. The
sketch of this PDF is shown in Figure 5.10 along with the indicated probability. With

z = (x − 35)/10

we have

P(37 ≤ x ≤ 51) = P(0.2 ≤ z ≤ 1.6) = F_z(1.6) − F_z(0.2).

Hence P(37 ≤ x ≤ 51) = Q(0.2) − Q(1.6) = 0.36594 from Table A.9.
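The same probability follows without tables once Q is expressed through math.erfc (a sketch, using the example's η = 35 and σ = 10):

```python
import math

def Q(alpha):
    return 0.5 * math.erfc(alpha / math.sqrt(2.0))

eta, sigma = 35.0, 10.0
# P(37 <= x <= 51) = Q(0.2) - Q(1.6) after standardizing z = (x - eta)/sigma
p = Q((37.0 - eta) / sigma) - Q((51.0 - eta) / sigma)
assert abs(p - 0.36594) < 1e-4
```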

FIGURE 5.10: PDF for Example 5.5.2.


Example 5.5.3. A machine makes capacitors with a mean value of 25 μF and a standard deviation
of 6 μF. Assuming that capacitance follows a Gaussian distribution, find the probability that the value
of capacitance exceeds 31 μF if capacitance is measured to the nearest μF.
Solution. Let the RV x denote the value of a capacitor. Since we are measuring to the nearest
μF, the probability that the measured value exceeds 31 μF is

P(31.5 ≤ x) = P(1.083 ≤ z) = Q(1.083) = 0.13941,

where z = (x − 25)/6 ∼ G(0, 1). This result is determined by linear interpolation of the CDF
between α equal to 1.08 and 1.09.


5.5.1 Marcum's Q Function

Marcum's Q function, defined by

Q(γ) = (1/√(2π)) ∫_γ^∞ e^{−τ²/2} dτ,    (5.76)

has been extensively studied. If the RV z ∼ G(0, 1) then

Q(γ) = 1 − F_z(γ);    (5.77)

i.e., Q(γ) is the complement of the standard Gaussian CDF. Note that Q(0) = 0.5, Q(∞) = 0,
and that F_z(−γ) = Q(γ). A very accurate approximation to Q(γ) is presented in [1, p. 932]:

Q(γ) ≈ e^{−γ²/2} h(t),  γ > 0,    (5.78)

where

t = 1/(1 + 0.2316419γ),    (5.79)

and

h(t) = (1/√(2π)) t(a₁ + t(a₂ + t(a₃ + t(a₄ + a₅t)))).    (5.80)

The constants are

i    aᵢ
1    0.31938153
2    −0.356563782
3    1.781477937
4    −1.821255978
5    1.330274429

The error in using this approximation is less than 7.5 × 10⁻⁸.
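The approximation (5.78)–(5.80) is easy to implement and to test against an exact evaluation via the complementary error function; the sketch below checks the stated 7.5 × 10⁻⁸ error bound at a few points:

```python
import math

# coefficients a_1 .. a_5 from the table above (a_2 and a_4 are negative)
A = [0.31938153, -0.356563782, 1.781477937, -1.821255978, 1.330274429]

def q_approx(gamma):
    # (5.78)-(5.80): Q(gamma) ~ exp(-gamma^2/2) * h(t), valid for gamma > 0
    t = 1.0 / (1.0 + 0.2316419 * gamma)
    poly = t * (A[0] + t * (A[1] + t * (A[2] + t * (A[3] + A[4] * t))))
    return math.exp(-gamma ** 2 / 2.0) * poly / math.sqrt(2.0 * math.pi)

def q_exact(gamma):
    # exact tail via erfc, using (5.75)
    return 0.5 * math.erfc(gamma / math.sqrt(2.0))

for g in [0.1, 0.5, 1.0, 1.74, 3.0, 4.0]:
    assert abs(q_approx(g) - q_exact(g)) < 7.5e-8  # stated error bound
```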

P1: IML/FFX
MOBK042-05

P2: IML
MOBK042-Enderle.cls

26

October 30, 2006

19:51

ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS

A very useful bound for Q(α) is [1, p. 298]:

(2/√(2π)) e^{−α²/2}/(α + √(α² + 4)) < Q(α) ≤ (2/√(2π)) e^{−α²/2}/(α + √(α² + 0.5π)).    (5.81)

The ratio of the upper bound to the lower bound is 0.946 when α = 3 and 0.967 when α = 4.
The bound improves as α increases.
Sometimes, it is desired to find the value of γ for which Q(γ) = q. Helstrom [14] offers
an iterative procedure which begins with an initial guess γ₀ > 0. Then compute

tᵢ = 1/(1 + 0.2316419γᵢ)    (5.82)

and

γᵢ₊₁ = (2 ln(h(tᵢ)/q))^{1/2},  i = 0, 1, . . . .    (5.83)

The procedure is terminated when γᵢ₊₁ ≈ γᵢ to the desired degree of accuracy.
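A sketch of the iteration (5.82)–(5.83), with h(t) as in (5.80); the initial guess γ₀ = 1, the tolerance, and the iteration cap are arbitrary choices:

```python
import math

def h(t):
    # polynomial h(t) of (5.80) with the tabulated constants
    a = [0.31938153, -0.356563782, 1.781477937, -1.821255978, 1.330274429]
    return t * (a[0] + t * (a[1] + t * (a[2] + t * (a[3] + a[4] * t)))) / math.sqrt(2.0 * math.pi)

def inverse_q(q, gamma0=1.0, tol=1e-10):
    # iterate (5.82)-(5.83): gamma_{i+1} = sqrt(2 ln(h(t_i)/q))
    gamma = gamma0
    for _ in range(200):
        t = 1.0 / (1.0 + 0.2316419 * gamma)
        new = math.sqrt(2.0 * math.log(h(t) / q))
        if abs(new - gamma) < tol:
            return new
        gamma = new
    return gamma

gamma = inverse_q(0.05)
# the returned gamma satisfies Q(gamma) = 0.05 to within the approximation error
assert abs(0.5 * math.erfc(gamma / math.sqrt(2.0)) - 0.05) < 1e-6
```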


Drill Problem 5.5.1. Students attend Fargo Polytechnic Institute for an average of four years with a
standard deviation of one-half year. Let the random variable x denote the length of attendance and
assume that x is Gaussian. Determine: (a) P(1 < x < 3), (b) P(x > 4), (c) P(x = 4), (d) F_x(4.721).
Answers: 0.5, 0, 0.02275, 0.92535.
Drill Problem 5.5.2. The quality point averages of 2500 freshmen at Fargo Polytechnic Institute
follow a Gaussian distribution with a mean of 2.5 and a standard deviation of 0.7. Suppose grade
point averages are computed to the nearest tenth. Determine the number of freshmen you would expect
to score: (a) from 2.6 to 3.0, (b) less than 2.5, (c) between 3.0 and 3.5, (d) greater than 3.5.
Answers: 167, 322, 639, 1179.
Drill Problem 5.5.3. Professor Rensselaer loves the game of golf. She has determined that the distance
the ball travels on her first shot follows a Gaussian distribution with a mean of 150 and a standard
deviation of 17. Determine the value of d so that the range, 150 ± d, covers 95% of the shots.
Answer: 33.32.

5.6 BIVARIATE GAUSSIAN RANDOM VARIABLES

The previous section introduced the univariate Gaussian PDF along with some general characteristics. Now, we discuss the joint Gaussian PDF and its characteristics by drawing on our
univariate Gaussian PDF experiences, and significantly expanding the scope of applications.


Numerous applications of this joint PDF are found throughout the field of biomedical
engineering and, like the univariate case, the joint Gaussian PDF is considered the most important
joint distribution for biomedical engineers.
Definition 5.6.1. The bivariate RV z = (x, y) is a bivariate Gaussian RV if every linear
combination of x and y has a univariate Gaussian distribution. In this case we also say that the RVs x and
y are jointly distributed Gaussian RVs.
Let the RV w = ax + by, and let x and y be jointly distributed Gaussian RVs. Then
w is a univariate Gaussian RV for all real constants a and b. In particular, x ∼ G(η_x, σ_x²) and
y ∼ G(η_y, σ_y²); i.e., the marginal PDFs for a joint Gaussian PDF are univariate Gaussian. The
above definition of a bivariate Gaussian RV is sufficient for determining the bivariate PDF,
which we now proceed to do.
The following development is significantly simplified by considering the standardized
versions of x and y. Also, we assume that |ρ_{x,y}| < 1, σ_x ≠ 0, and σ_y ≠ 0. Let

z₁ = (x − η_x)/σ_x  and  z₂ = (y − η_y)/σ_y,    (5.84)

so that z₁ ∼ G(0, 1) and z₂ ∼ G(0, 1). Below, we first find the joint characteristic function
for the standardized RVs z₁ and z₂, then the conditional PDF f_{z₂|z₁} and the joint PDF f_{z₁,z₂}.
Next, the results for z₁ and z₂ are applied to obtain corresponding quantities φ_{x,y}, f_{y|x} and f_{x,y}.
Finally, the special cases ρ_{x,y} = ±1, σ_x = 0, and σ_y = 0 are discussed.
Since z₁ and z₂ are jointly Gaussian, the RV t₁z₁ + t₂z₂ is univariate Gaussian:

t₁z₁ + t₂z₂ ∼ G(0, t₁² + 2ρt₁t₂ + t₂²).

Completing the square,

t₁² + 2ρt₁t₂ + t₂² = (t₁ + ρt₂)² + (1 − ρ²)t₂²,

so that

φ_{z₁,z₂}(t₁, t₂) = E(e^{jt₁z₁+jt₂z₂}) = e^{−½(1−ρ²)t₂² − ½(t₁+ρt₂)²}.    (5.85)

From (6) we have

f_{z₁,z₂}(α, β) = (1/2π) ∫_{−∞}^{∞} I(α, t₂) e^{−jβt₂} dt₂,    (5.86)

where

I(α, t₂) = (1/2π) ∫_{−∞}^{∞} φ_{z₁,z₂}(t₁, t₂) e^{−jαt₁} dt₁.

Substituting (5.85) and letting τ = t₁ + ρt₂, we obtain

I(α, t₂) = e^{−½(1−ρ²)t₂²} (1/2π) ∫_{−∞}^{∞} e^{−τ²/2} e^{−jα(τ−ρt₂)} dτ,

or

I(α, t₂) = γ(t₂) f_{z₁}(α),

where

γ(t₂) = e^{jαρt₂} e^{−½(1−ρ²)t₂²}.

Substituting into (5.86) we find

f_{z₁,z₂}(α, β) = f_{z₁}(α) (1/2π) ∫_{−∞}^{∞} γ(t₂) e^{−jβt₂} dt₂,

and recognize that γ is the characteristic function for a Gaussian RV with mean ρα and variance
1 − ρ². Thus

f_{z₂|z₁}(β|α) = f_{z₁,z₂}(α, β)/f_{z₁}(α) = (1/√(2π(1 − ρ²))) exp(−(β − ρα)²/(2(1 − ρ²))),    (5.87)

so that

E(z₂ | z₁) = ρz₁    (5.88)

and

σ²_{z₂|z₁} = 1 − ρ².    (5.89)

After some algebra, we find

f_{z₁,z₂}(α, β) = (1/(2π(1 − ρ²)^{1/2})) exp(−(α² − 2ραβ + β²)/(2(1 − ρ²))).    (5.90)
We now turn our attention to using the above results for z₁ and z₂ to obtain similar results for
x and y. From (5.84) we find that

x = σ_x z₁ + η_x  and  y = σ_y z₂ + η_y,


so that the joint characteristic function for x and y is

φ_{x,y}(t₁, t₂) = E(e^{jt₁x+jt₂y}) = E(e^{jt₁σ_xz₁+jt₂σ_yz₂}) e^{jt₁η_x+jt₂η_y}.

Consequently, the joint characteristic function for x and y can be found from the joint
characteristic function of z₁ and z₂ as

φ_{x,y}(t₁, t₂) = φ_{z₁,z₂}(σ_xt₁, σ_yt₂) e^{jη_xt₁} e^{jη_yt₂}.    (5.91)

Using (4.66), the joint characteristic function φ_{x,y} can be transformed to obtain the joint PDF
f_{x,y}(α, β) as

f_{x,y}(α, β) = (1/(2π)²) ∫_{−∞}^{∞} ∫_{−∞}^{∞} φ_{z₁,z₂}(σ_xt₁, σ_yt₂) e^{−j(α−η_x)t₁} e^{−j(β−η_y)t₂} dt₁ dt₂.    (5.92)

Making the change of variables τ₁ = σ_xt₁, τ₂ = σ_yt₂, we obtain

f_{x,y}(α, β) = (1/(σ_xσ_y)) f_{z₁,z₂}((α − η_x)/σ_x, (β − η_y)/σ_y).    (5.93)
Since

f_{x,y}(α, β) = f_{y|x}(β|α) f_x(α)

and

f_x(α) = (1/σ_x) f_{z₁}((α − η_x)/σ_x),

we may apply (5.93) to obtain

f_{y|x}(β|α) = (1/σ_y) f_{z₂|z₁}((β − η_y)/σ_y | (α − η_x)/σ_x).    (5.94)
Substituting (5.90) and (5.87) into (5.93) and (5.94) we find

f_{x,y}(α, β) = (1/(2πσ_xσ_y(1 − ρ²)^{1/2})) exp(−(1/(2(1 − ρ²)))[(α − η_x)²/σ_x² − 2ρ(α − η_x)(β − η_y)/(σ_xσ_y) + (β − η_y)²/σ_y²])    (5.95)

and

f_{y|x}(β|α) = (1/√(2πσ_y²(1 − ρ²))) exp(−(β − η_y − ρ(σ_y/σ_x)(α − η_x))²/(2(1 − ρ²)σ_y²)).    (5.96)


It follows that

E(y|x) = η_y + ρ(σ_y/σ_x)(x − η_x)    (5.97)

and

σ²_{y|x} = σ_y²(1 − ρ²).    (5.98)

By interchanging x with y and α with β,

f_{x|y}(α|β) = (1/√(2πσ_x²(1 − ρ²))) exp(−(α − η_x − ρ(σ_x/σ_y)(β − η_y))²/(2σ_x²(1 − ρ²))),    (5.99)

E(x|y) = η_x + ρ(σ_x/σ_y)(y − η_y),    (5.100)

and

σ²_{x|y} = σ_x²(1 − ρ²).    (5.101)

A three-dimensional plot of a bivariate Gaussian PDF is shown in Figure 5.11.


The bivariate characteristic function for x and y is easily obtained as follows. Since x and
y are jointly Gaussian, the RV t₁x + t₂y is a univariate Gaussian RV:

t₁x + t₂y ∼ G(t₁η_x + t₂η_y, t₁²σ_x² + 2t₁t₂σ_xσ_yρ_{x,y} + t₂²σ_y²).

FIGURE 5.11: Bivariate Gaussian PDF f_{x,y}(α, β) with σ_x = σ_y = 1, η_x = η_y = 0, and ρ = 0.8.


Consequently, the joint characteristic function for x and y is

φ_{x,y}(t₁, t₂) = e^{−½(t₁²σ_x² + 2ρσ_xσ_yt₁t₂ + t₂²σ_y²)} e^{jt₁η_x+jt₂η_y},    (5.102)

which is valid for all ρ_{x,y}, σ_x and σ_y.


We now consider some special cases of the bivariate Gaussian PDF. If ρ = 0 then (from
(5.95))

f_{x,y}(α, β) = f_x(α) f_y(β);    (5.103)

i.e., the RVs x and y are independent.


As 1, from (5.97) and (5.98) we find
E(y|x) y y

x x
x

2
and y|x
0. Hence,

y y y
in probability1 . We conclude that

x x
x

x
f x,y (, ) = f x () y y
x


(5.104)

for = 1. Interchanging the roles of x and y we find that the joint PDF for x and y may also
be written as


y
f x,y (, ) = f y () x x
(5.105)
y
when = 1. These results can also be obtained directly from the joint characteristic function
for x and y.
A very special property of jointly Gaussian RVs is presented in the following theorem.
Theorem 5.6.1. The jointly Gaussian RVs x and y are independent iff ρ_{x,y} = 0.
Proof. We showed previously that if x and y are independent, then ρ_{x,y} = 0.
Suppose that ρ = ρ_{x,y} = 0. Then f_{y|x}(β|α) = f_y(β), so that x and y are independent.

Example 5.6.1. Let x and y be jointly Gaussian with zero means, σ_x² = σ_y² = 1, and ρ ≠ ±1.
Find constants a and b such that v = ax + by ∼ G(0, 1) and such that v and x are independent.
¹As the variance of a RV decreases to zero, the probability that the RV deviates from its mean by more than an
arbitrarily small fixed amount approaches zero. This is an application of the Chebyshev Inequality.


Solution. We have E(v) = 0. We require

σ_v² = a² + b² + 2abρ_{x,y} = 1

and

E(vx) = a + bρ_{x,y} = 0.

Hence a = −bρ_{x,y} and b² = 1/(1 − ρ²_{x,y}), so that

v = (y − ρ_{x,y}x)/√(1 − ρ²_{x,y})

is independent of x and σ_v² = 1.
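A Monte Carlo sketch of this result (the value ρ = 0.6, the seed, and the sample size are arbitrary choices): v constructed this way should have unit variance and zero correlation with x.

```python
import math
import random

random.seed(7)
rho = 0.6  # arbitrary correlation for the check
n = 200000
xs, vs = [], []
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    # generate y jointly Gaussian with x, correlation rho
    y = rho * x + math.sqrt(1.0 - rho ** 2) * random.gauss(0.0, 1.0)
    v = (y - rho * x) / math.sqrt(1.0 - rho ** 2)
    xs.append(x)
    vs.append(v)
mean_v = sum(vs) / n
cov_vx = sum(vi * xi for vi, xi in zip(vs, xs)) / n
var_v = sum(vi * vi for vi in vs) / n - mean_v ** 2
assert abs(cov_vx) < 0.02       # v and x uncorrelated (hence independent: both Gaussian)
assert abs(var_v - 1.0) < 0.02  # unit variance
```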

5.6.1 Constant Contours

Returning to the normalized jointly Gaussian RVs z₁ and z₂, we now investigate the shape of
the joint PDF f_{z₁,z₂}(α, β) by finding the locus of points where the PDF is constant. We assume
that |ρ| < 1. By inspection of (5.90), we find that f_{z₁,z₂}(α, β) is constant for α and β satisfying

α² − 2ραβ + β² = c²,    (5.106)

where c is a positive constant.


If = 0 the contours where the joint PDF is constant is a circle of radius c centered at
the origin.
Along the line = q we find that
2 (1 2q + q 2 ) = c 2

(5.107)

so that the constant contours are parameterized by


c
=
,
1 2q + q 2

(5.108)

and
=

c q
1 2q + q 2

(5.109)

The square of the distance from a point (, ) on the contour to the origin is given by
d 2 (q ) = 2 + 2 =

c 2 (1 + q 2 )
.
1 2q + q 2

(5.110)


Differentiating, we find that d(q) attains its extremal values at q = ±1. Thus, the line β = α
intersects the constant contour at

α = β = ±c/√(2(1 − ρ)).    (5.111)

Similarly, the line β = −α intersects the constant contour at

α = −β = ±c/√(2(1 + ρ)).    (5.112)

Consider the rotated coordinates α′ = (α + β)/√2 and β′ = (α − β)/√2, so that

α = (α′ + β′)/√2    (5.113)

and

β = (α′ − β′)/√2.    (5.114)

The rotated coordinate system is a rotation by π/4 counterclockwise. Thus

α² − 2ραβ + β² = c²    (5.115)

is transformed into

α′²/(1 + ρ) + β′²/(1 − ρ) = c²/(1 − ρ²).    (5.116)

The above equation represents an ellipse with major axis length 2c/√(1 − |ρ|) and minor axis
length 2c/√(1 + |ρ|). In the α-β plane, the major and minor axes of the ellipse are along the
lines β = ±α.
From (5.93), the constant contours for f_{x,y}(α, β) are solutions to

((α − η_x)/σ_x)² − 2ρ((α − η_x)/σ_x)((β − η_y)/σ_y) + ((β − η_y)/σ_y)² = c².    (5.117)

Using the transformation

α′ = (1/√2)((α − η_x)/σ_x + (β − η_y)/σ_y),  β′ = (1/√2)((α − η_x)/σ_x − (β − η_y)/σ_y)

transforms the constant contour to (5.116). With α′ = 0 we find that one axis is along

(β − η_y)/σ_y = −(α − η_x)/σ_x    (5.118)


with endpoints at

α = ±cσ_x/√(2(1 + ρ)) + η_x,  β = ∓cσ_y/√(2(1 + ρ)) + η_y;

the length of this axis in the α-β plane is

2c√((σ_x² + σ_y²)/(2(1 + ρ))).

With β′ = 0 we find that the other axis is along

(β − η_y)/σ_y = (α − η_x)/σ_x,

with endpoints at

α = ±cσ_x/√(2(1 − ρ)) + η_x,  β = ±cσ_y/√(2(1 − ρ)) + η_y;

the length of this axis in the α-β plane is

2c√((σ_x² + σ_y²)/(2(1 − ρ))).
Points on this ellipse in the α-β plane satisfy (5.117); the value of the joint PDF f_{x,y} on this
curve is

(1/(2πσ_xσ_y√(1 − ρ²))) exp(−c²/(2(1 − ρ²))).    (5.119)
A further transformation,

α″ = α′/√(1 + ρ),  β″ = β′/√(1 − ρ),

transforms the ellipse in the α′-β′ plane to a circle in the α″-β″ plane:

α″² + β″² = c²/(1 − ρ²).

This transformation provides a straightforward way to compute the probability that the jointly
Gaussian RVs x and y lie within the region bounded by the ellipse specified by (5.117). Letting
A denote the region bounded by the ellipse in the α-β plane and A″ denote the image (a circle)

in the α″-β″ plane, we have

∬_A f_{x,y}(α, β) dα dβ = ∬_{A″} e^{−½(α″² + β″²)}/(2πσ_xσ_y√(1 − ρ²) |J(α″, β″)|) dα″ dβ″,

where the Jacobian of the transformation is

J(α″, β″) = (∂α″/∂α)(∂β″/∂β) − (∂α″/∂β)(∂β″/∂α).

Computing the indicated derivatives, we have

∂α″/∂α = 1/(σ_x√(2(1 + ρ))),  ∂α″/∂β = 1/(σ_y√(2(1 + ρ))),
∂β″/∂α = 1/(σ_x√(2(1 − ρ))),  ∂β″/∂β = −1/(σ_y√(2(1 − ρ))),

so that

|J(α″, β″)| = 1/(σ_xσ_y√(1 − ρ²)).

Substituting and transforming to polar coordinates, we find

∬_A f_{x,y}(α, β) dα dβ = (1/2π) ∬_{A″} e^{−½(α″² + β″²)} dα″ dβ″
= (1/2π) ∫₀^{2π} ∫₀^{c/√(1−ρ²)} r e^{−r²/2} dr dθ
= ∫₀^{c²/(2(1−ρ²))} e^{−u} du
= 1 − e^{−c²/(2(1−ρ²))}.

This bivariate Gaussian probability computation is one of the few which is easily accomplished.
Additional techniques for treating these computations are given in [1, pp. 956–958].
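The closed form 1 − e^{−c²/(2(1−ρ²))} can be checked by Monte Carlo on the standardized RVs z₁ and z₂ (the values of ρ and c², the seed, and the sample size below are arbitrary illustration choices):

```python
import math
import random

random.seed(3)
rho, c2 = 0.5, 1.0  # arbitrary illustration values
closed_form = 1.0 - math.exp(-c2 / (2.0 * (1.0 - rho ** 2)))

# Monte Carlo on standardized variables: count samples with
# alpha^2 - 2*rho*alpha*beta + beta^2 <= c^2, as in (5.106)/(5.117)
n, hits = 200000, 0
for _ in range(n):
    a = random.gauss(0.0, 1.0)
    b = rho * a + math.sqrt(1.0 - rho ** 2) * random.gauss(0.0, 1.0)
    if a * a - 2.0 * rho * a * b + b * b <= c2:
        hits += 1
assert abs(hits / n - closed_form) < 0.01
```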
Drill Problem 5.6.1. Given that x and y are jointly distributed Gaussian random variables with
E(y|x) = 2 + 1.5x, E(x|y) = 7/6 + y/6, and σ²_{x|y} = 0.75. Determine: (a) η_x, (b) η_y, (c) σ_x², (d)
σ_y², and (e) ρ_{x,y}.
Answers: 0.5, 1, 9, 2, 5.


Drill Problem 5.6.2. Random variables x and y are jointly Gaussian with η_x = −2, η_y = 3,
σ_x² = 21, σ_y² = 31, and ρ = −0.3394. With c² = 0.2212 in (5.117), find: (a) the smallest angle
that either the minor or major axis makes with the positive α axis in the α-β plane, (b) the length
of the minor axis, (c) the length of the major axis.
Answers: 3, 2, 30°.
Drill Problem 5.6.3. Random variables x and y are jointly Gaussian with η_x = −2, η_y = 3,
σ_x² = 21, σ_y² = 31, and ρ = −0.3394. Find: (a) E(y | x = 0), (b) P(1 < y ≤ 10 | x = 0),
(c) P(−1 < x < 7).
Answers: 0.5212, 0.3889, 2.1753.

5.7 SUMMARY

This chapter introduces certain probability distributions commonly encountered in biomedical
engineering. Special emphasis is placed on the exponential, Poisson and Gaussian distributions.
Important approximations to the Bernoulli PMF and Gaussian CDF are developed.
Bernoulli event probabilities may be approximated by the Poisson PMF when np < 10
or by the Gaussian PDF when npq > 3. For the Poisson approximation use λ = np. For the
Gaussian approximation use η = np and σ² = npq.
Many important properties of jointly Gaussian random variables are presented.
Drill Problem 5.7.1. The length of time William Smith plays a video game is given by random
variable x distributed exponentially with a mean of four minutes. His play during each game is
independent from all other games. Determine: (a) the probability that William is still playing after
four minutes, (b) the probability that, out of five games, he has played at least one game for more than
four minutes.
Answers: exp(−1), 0.899.
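A two-line check of this drill problem (exponential with mean four minutes, hence rate 1/4):

```python
import math

# exponential with mean 4 minutes => rate 1/4
p_over_four = math.exp(-4.0 / 4.0)               # (a) P(x > 4) = e^{-1}
p_at_least_one = 1.0 - (1.0 - p_over_four) ** 5  # (b) at least one of five games over 4 min
assert abs(p_over_four - math.exp(-1.0)) < 1e-12
assert abs(p_at_least_one - 0.899) < 5e-4
```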

5.8 PROBLEMS
1. Assume x is a Bernoulli random variable. Determine P(x ≤ 3) using the Bernoulli
1. Assume x is a Bernoulli random variable. Determine P (x 3) using the Bernoulli
CDF table if: (a) n = 5, p = 0.1; (b) n = 10, p = 0.1; (c) n = 20, p = 0.1; (d) n = 5,
p = 0.3; (e) n = 10, p = 0.3; (f ) n = 20, p = 0.3; (g) n = 5, p = 0.6; (h) n = 10,
p = 0.6; (i) n = 20, p = 0.6.
2. Suppose you are playing a game with a friend in which you roll a die 10 times. If the
die comes up with an even number, your friend gives you a dollar and if the die comes
up with an odd number you give your friend a dollar. Unfortunately, the die is loaded


so that a 1 or a 3 are three times as likely to occur as a 2, a 4, a 5 or a 6. Determine:


(a) how many dollars your friend can expect to win in this game; (b) the probability of
your friend winning more than 4 dollars.
3. The probability that a basketball player makes a basket is 0.4. If he makes 10 attempts,
what is the probability he will make: (a) at least 4 baskets; (b) 4 baskets; (c) from 7 to
9 baskets; (d) less than 2 baskets; (e) the expected number of baskets.
4. The probability that Professor Rensselaer bowls a strike is 0.2. Determine the probability
that: (a) 3 of the next 20 rolls are strikes; (b) at least 4 of the next 20 rolls are strikes;
(c) from 3 to 7 of the next 20 rolls are strikes. (d) She is to keep rolling the ball until
she gets a strike. Determine the probability it will take more than 5 rolls. Determine
the: (e) expected number of strikes in 20 rolls; (f ) variance for the number of strikes in
20 rolls; (g) standard deviation for the number of strikes in 20 rolls.
5. The probability of a man hitting a target is 0.3. (a) If he tries 15 times to hit the target,
what is the probability of him hitting it at least 5 but less than 10 times? (b) What
is the average number of hits in 30 tries? (c) What is the probability of him getting
exactly the average number of hits in 30 tries? (d) How many times must the man try
to hit the target if he wants the probability of hitting it to be at least 2/3? (e) What is
the probability that no more than three tries are required to hit the target for the first
time?
6. In Junior Bioinstrumentation Lab, one experiment introduces students to the transistor.
Each student is given only one transistor to use. The probability of a student destroying
a transistor is 0.7. One lab class has 5 students and they will perform this experiment
next week. Let random variable x show the possible numbers of students who destroy
transistors. (a) Sketch the PMF for x. Determine: (b) the expected number of destroyed
transistors, (c) the probability that fewer than 2 transistors are destroyed.
7. On a frosty January morning in Fargo, North Dakota, the probability that a car parked
outside will start is 0.6. (a) If we take a sample of 20 cars, what is the probability that
exactly 12 cars will start and 8 will not? (b) What is the probability that the number of
cars starting out of 20 is between 9 and 15?
8. Consider Problem 7. If there are 20,000 cars to be started, find the probability that: (a)
at least 12,100 will start; (b) exactly 12,000 will start; (c) the number starting is between
11,900 and 12,150; (d) the number starting is less than 12,500.
9. A dart player has found that the probability of hitting the dart board in any one throw
is 0.2. How many times must he throw the dart so that the probability of hitting the
dart board is at least 0.6?


10. Let random variable x be Bernoulli with n = 15 and p = 0.4. Determine E(x 2 ).
11. Suppose x is a Bernoulli random variable with η = 10 and σ² = 10/3. Determine:
(a) q, (b) n, (c) p.
12. An electronics manufacturer is evaluating its quality control program. The current
procedure is to take a sample of 5 from 1000 and pass the shipment if not more than 1
component is found defective. What proportion of 20% defective components will be
shipped?
13. Repeat Problem 1, when appropriate, using the Poisson approximation to the Bernoulli
PMF.
14. A certain intersection averages 3 traffic accidents per week. What is the probability that
more than 2 accidents will occur during any given week?
15. Suppose that on the average, a student makes 6 mistakes per test. Determine the
probability that the student makes: (a) at least 1 mistake; (b) from 3 to 5 mistakes;
(c) exactly 2 mistakes; (d) more than the expected number of mistakes.
16. On the average, Professor Rensselaer gives 11 quizzes per quarter in Introduction to
Random Processes. Determine the probability that: (a) from 8 to 12 quizzes are given
during the quarter; (b) exactly 11 quizzes are given during the quarter; (c) at least 10
quizzes are given during the quarter; (d) at most 9 quizzes are given during the quarter.
17. Suppose a typist makes an average of 30 mistakes per page. (a) If you give him a one
page letter to type, what is the probability that he makes exactly 30 mistakes? (b) The
typist decides to take typing lessons, and, after the lessons, he averages 5 mistakes per
page. You give him another one page letter to type. What is the probability of him
making fewer than 5 mistakes? (c) With the 5 mistakes per page average, what is the
probability of him making fewer than 50 mistakes in a 25 page report?
18. On the average, a sample of radioactive material emits 20 alpha particles per minute.
What is the probability of 10 alpha particles being emitted in: (a) 1 min, (b) 10 min?
(c) Many years later, the material averages 6 alpha particles emitted per minute. What
is the probability of at least 6 alpha particles being emitted in 1 min?
19. At Fargo Polytechnic Institute (FPI), a student may take a course as many times as
desired. Suppose the average number of times a student takes Introduction to Random
Processes is 1.5. (Professor Rensselaer, the course instructor, thinks so many students
repeat the course because they enjoy it so much.) (a) Determine the probability that a
student takes the course more than once. (b) The academic vice-president of FPI wants
to ensure that on the average, 80% of the students take the course at most one time. To
what value should the mean be adjusted to ensure this?


20. Suppose 1% of the transistors in a box are defective. Determine the probability that
there are: (a) 3 defective transistors in a sample of 200 transistors; (b) more than 15
defective transistors in a sample of 1000 transistors; (c) 0 defective transistors in a
sample of 20 transistors.
21. A perfect car is assembled with a probability of 2 × 10⁻⁵. If 15,000 cars are produced
in a month, what is the probability that none are perfect?
22. FPI admits only 1000 freshmen per year. The probability that a student will major in
Bioengineering is 0.01. Determine the probability that fewer than 9 students major in
Bioengineering.
23. (a) Every time a carpenter pounds in a nail, the probability that he hits his thumb is
0.002. If in building a house he pounds 1250 nails, what is the probability of him hitting
his thumb at least once while working on the house? (b) If he takes five extra minutes
off every time he hits his thumb, how many extra minutes can he expect to take off in
building a house with 3000 nails?
24. The manufacturer of Leaping Lizards, a bran cereal with milk expanding (exploding)
marshmallow lizards, wants to ensure that on the average, 95% of the spoonfuls will
each have at least one lizard. Assuming that the lizards are randomly distributed in the
cereal box, to what value should the mean number of lizards per spoonful be set to ensure
this?
25. The distribution for the number of students seeking advising help from Professor Rensselaer during any particular day is given by

    P(x = k) = 3^k e^{−3}/k!,   k = 0, 1, . . . .

    The PDF for the time interval between students seeking help for Introduction to Random Processes from Professor Rensselaer during any particular day is given by

    f_t(τ) = e^{−τ}u(τ).

    If random variable z equals the total number of students Professor Rensselaer helps each day, determine: (a) E(z), (b) σ_z.
26. This year, on its anniversary day, a computer store is going to run an advertising campaign in which the employees will telephone 5840 people selected at random from the population of North America. The caller will ask the person answering the phone if it is his or her birthday. If it is, then that lucky person will be mailed a brand new programmable calculator. Otherwise, that person will get nothing. Assuming that


the person answering the phone won't lie and that there is no such thing as leap year, find the probability that: (a) the computer store mails out exactly 16 calculators, (b) the computer store mails out from 20 to 40 calculators.
27. Random variable x is uniform between −2 and 3. Event A = {0 < x ≤ 2} and B = {−1 < x ≤ 0} ∪ {1 < x ≤ 2}. Find: (a) P(−1 < x < 0), (b) η_x, (c) σ_x, (d) f_{x|A}(α|A), (e) F_{x|A}(α|A), (f) f_{x|B}(α|B), (g) F_{x|B}(α|B).
28. The time it takes a runner to run a mile is equally likely to lie in an interval from 4.0 to
4.2 min. Determine: (a) the probability it takes the runner exactly 4 min to run a mile,
(b) the probability it takes the runner from 4.1 to 4.15 min.
29. Assume x is a standard Gaussian random variable. Using Tables A.9 and A.10, determine: (a) P(x = 0), (b) P(x < 0), (c) P(x < 0.2), (d) P(−1.583 ≤ x < 1.471), (e) P(−2.1 < x ≤ 0.5), (f) P(x is an integer).
30. Repeat Problem 1, when appropriate, using the Gaussian approximation to the
Bernoulli PMF.
31. A light bulb manufacturer distributes light bulbs that have a length of life that is
normally distributed with a mean equal to 1200 h and a standard deviation of 40 h.
Find the probability that a bulb burns between 1000 and 1300 h.
32. A certain type of resistor has resistance values that are Gaussian distributed with a
mean of 50 ohms and a variance of 3. (a) Write the PDF for the resistance value. (b)
Find the probability that a particular resistor is within 2 ohms of the mean. (c) Find
P (49 < r < 54).
33. Consider Problem 32. If resistances are measured to the nearest ohm, find: (a) the
probability that a particular resistor is within 2 ohms of the mean, (b) P (49 < r < 54).
34. A battery manufacturer has found that 8.08% of their batteries last less than 2.3 years
and 2.5% of their batteries last more than 3.98 years. Assuming the battery lives are
Gaussian distributed, find: (a) the mean, (b) variance.
35. Assume that the scores on an examination are Gaussian distributed with mean 75 and
standard deviation 10. Grades are assigned as follows: A: 90–100, B: 80–90, C: 70–80, D: 60–70, and F: below 60. In a class of 25 students, what is the probability that grades
will be equally distributed?
36. A certain transistor has a current gain, h, that is Gaussian distributed with a mean of 77
and a variance of 11. Find: (a) P(h > 74), (b) P(73 < h ≤ 80), (c) P(|h − η_h| < 3σ_h).
37. Consider Problem 36. Find the value of d so that the range 77 ± d covers 95% of the current gains.


38. A 250 question multiple choice final exam is given. Each question has 5 possible answers
and only one correct answer. Determine the probability that a student guesses the correct answers for 20–25 of 85 questions about which the student has no knowledge.
39. The average golf score for Professor Rensselaer is 78 with a standard deviation of 3.
Assuming a Gaussian distribution for random variable x describing her golf game,
determine: (a) P(x = 78), (b) P(x ≤ 78), (c) P(70 < x ≤ 80), (d) the probability that x is less than 75 if the score is measured to the nearest unit.
40. Suppose a system contains a component whose length is normally distributed with a
mean of 2.0 and a standard deviation of 0.2. If 5 of these components are removed from
different systems, what is the probability that at least 2 have a length greater than 2.1?
41. A large box contains 10,000 resistors with resistances that are Gaussian distributed. If
the average resistance is 1000 ohms with a standard deviation of 200 ohms, how many
resistors have resistances that are within 10% of the average?
42. The RV x has PDF

    f_x(α) = a exp(−(α − η)²/(2σ²)).

    (a) Find the constant a. (Hint: assume RV y is independent of x and has PDF f_y(β) = f_x(β) and evaluate F_{x,y}(∞, ∞).) (b) Using direct integration, find E(x). (c) Find σ_x² using direct integration.
43. Assume x and y are jointly distributed Gaussian random variables with x ∼ G(2, 4), y ∼ G(3, 9), and ρ_{x,y} = 0. Find: (a) P(1 < y < 7 | x = 0), (b) P(1 < y < 7), (c) P(−1 < x < 1, 1 < y < 7).
44. Suppose x and y are jointly distributed Gaussian random variables with E(y|x) = 2.8 + 0.32x, E(x|y) = 1 + 0.5y, and σ_{y|x} = 3.67. Determine: (a) η_x, (b) η_y, (c) σ_x, (d) σ_y, (e) σ_{x,y}, (f) ρ_{x,y}.
45. Assume x ∼ G(3, 1), y ∼ G(2, 1), and that x and y are jointly Gaussian with ρ_{x,y} = 0.5. Draw a sketch of the joint Gaussian contour equation showing the original and the translated-rotated sets of axes.
46. Consider Problem 45. Determine: (a) E(y|x = 0), (b) f_{y|x}(β|0), (c) P(0 < y < 4 | x = 0), (d) P(3 < x < 10).
47. Assume x and y are jointly Gaussian with x ∼ G(2, 13), y ∼ G(1, 8), and σ_{x,y} = 5.8835. (a) Draw a sketch of the constant contour equation for the standardized RVs z₁ and z₂. (b) Using the results of (a), draw a sketch of the joint Gaussian constant contour for x and y.


48. Consider Problem 47. Determine: (a) E(y | x = 0), (b) f_{y|x}(β | 0), (c) P(0 < y < 4 | x = 0), (d) P(3 < x < 10).
49. Assume x and y are jointly Gaussian with x ∼ G(1, 4), y ∼ G(1.6, 7), and σ_{x,y} = 0.378. (a) Draw a sketch of the constant contour equation for the standardized RVs z₁ and z₂. (b) Using the results of (a), draw a sketch of the joint Gaussian contour for x and y.
50. Consider Problem 49. Determine: (a) E(y | x = 0), (b) f_{y|x}(β | 0), (c) P(0 < y < 4 | x = 0), (d) P(−3 < x < 0).
51. A component with an exponential failure rate is 90% reliable at 10,000 h. Determine
the number of hours reliable at 95%.
52. Suppose a system has an exponential failure rate in years to failure with λ = 2.5. Determine the number of years reliable at: (a) 90%, (b) 95%, (c) 99%.
53. Consider Problem 52. If 20 of these systems are installed, determine the probability
that 10 are operating at the end of 2.3 years.
54. In the circuit shown in Figure 5.12, each of the four components operates independently of one another and has an exponential failure rate (in hours) with λ = 10⁻⁵. For successful operation of the circuit, at least two of these components must connect A with B. Determine the probability that the circuit is operating successfully at 10,000 h.
55. The survival rate of individuals with a certain type of cancer is assumed to be exponential
with λ = 4 years. Five individuals have this cancer. Determine the probability that at
most three will be alive at the end of 2.0433 years.
56. Random variable t is exponential with λ = 2. Determine: (a) P(t > η), (b) f_z(τ) if z = t − T, where T is a constant.
57. William Smith is a varsity wrestler on his high school team. Without exception, if he does not pin his opponent with his trick move, he loses the match on points. William's


FIGURE 5.12: Circuit for Problem 54.


trick move also prevents him from ever getting pinned. The length of time it takes
William to pin his opponent in each period of a wrestling match is given by:
period 1: f_x(α) = 0.4598394 exp(−0.4598394α)u(α),
period 2: f_{x|A}(α|A) = 0.2299197 exp(−0.2299197α)u(α),
period 3: f_{x|A}(α|A) = 0.1149599 exp(−0.1149599α)u(α),
where A = {Smith did not pin his opponent during the previous periods}. Assume each period is 2 min and the match is 3 periods. Determine the probability that William Smith: (a) pins his opponent during the first period; (b) pins his opponent during the second period; (c) pins his opponent during the third period; (d) wins the match.
58. Consider Problem 57. Find the probability that William Smith wins: (a) at least 4 of
his first 5 matches, (b) more matches than he is expected to during a 10 match season.
59. The average time between power failures in a Co-op utility is once every 1.4 years.
Determine: (a) the probability that there will be at least one power failure during the
coming year, (b) the probability that there will be at least two power failures during the
coming year.
60. Consider the Problem 59 statement. Assume that a power failure will last at least 24 h. Suppose Fargo Community Hospital has a backup emergency generator to provide auxiliary
power during a power failure. Moreover, the emergency generator has an expected time
between failures of once every 200 h. What is the probability that the hospital will be
without power during the next 24 h?
61. The queue for the cash register at a popular package store near Fargo Polytechnic
Institute becomes quite long on Saturdays following football games. On the average,
the queue is 6.3 people long. Each customer takes 4 min to check out. Determine: (a)
your expected waiting time to make a purchase, (b) the probability that you will have
less than four people in the queue ahead of you, (c) the probability that you will have
more than five people in the queue ahead of you.
62. Outer Space Adventures, Inc. prints brochures describing their vacation packages as "Unique, Unexpected and Unexplained." The vacations certainly live up to their advertisement. In fact, they are so unexpected that the length of the vacations is random,
following an exponential distribution, having an average length of 6 months. Suppose
that you have signed up for a vacation trip that starts at the end of this quarter. What
is the probability that you will be back home in time for next fall quarter (that is, 9
months later)?


63. A certain brand of light bulbs has an average life-expectancy of 750 h. The failure rate
of these light bulbs follows an exponential PDF. Seven-hundred and fifty of these bulbs
were put in light fixtures in four rooms. The lights were turned on and left that way for
a different length of time in each room as follows:
    ROOM    TIME BULBS LEFT ON, HOURS    NUMBER OF BULBS
    1       1000                         125
    2       750                          250
    3       500                          150
    4       1500                         225

After the specified length of time, the bulbs were taken from the fixtures and placed
in a box. If a bulb is selected at random from the box, what is the probability that it is
burnt out?
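Most of the computations these problems call for reduce to evaluating a few standard formulas: binomial and Poisson PMF sums, Gaussian CDF values, and exponential reliabilities. The following Python sketch (standard library only; every parameter value below is illustrative, not tied to any particular problem) shows the typical evaluations:

```python
import math

def binomial_pmf(k, n, p):
    """P(x = k) for n Bernoulli trials with success probability p."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(x = k) for a Poisson random variable with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def gaussian_cdf(x, eta=0.0, sigma=1.0):
    """F_x(x) for a Gaussian random variable with mean eta, std dev sigma."""
    return 0.5 * (1 + math.erf((x - eta) / (sigma * math.sqrt(2))))

def exponential_reliability(t, mean_life):
    """P(lifetime > t) when lifetimes are exponential with the given mean."""
    return math.exp(-t / mean_life)

# Illustrative evaluations:
p_hits = sum(binomial_pmf(k, 15, 0.3) for k in range(5, 10))  # at least 5, fewer than 10
p_more = 1 - sum(poisson_pmf(k, 3.0) for k in range(3))       # more than 2 events
p_sig = gaussian_cdf(1) - gaussian_cdf(-1)                    # within one std dev
```

Interval probabilities come from sums of PMF terms or differences of CDF values, exactly as in the table-lookup approach the problems assume.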


CHAPTER 6

Transformations of Random Variables


Functions of random variables occur frequently in many applications of probability theory. For
example, a full wave rectifier circuit produces an output that is the absolute value of the input.
The input/output characteristics of many physical devices can be represented by a nonlinear
memoryless transformation of the input.
The primary subjects of this chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the probability distribution of
a function of one random variable using the CDF and then the PDF. Next, the probability
distribution for a single random variable is determined from a function of two random variables
using the CDF. Then, the joint probability distribution is found from a function of two random
variables using the joint PDF and the CDF.

6.1 UNIVARIATE CDF TECHNIQUE

This section introduces a method of computing the probability distribution of a function of a random variable using the CDF. We will refer to this method as the CDF technique. The CDF technique is applicable for all functions z = g(x), and for all types of continuous, discrete, and mixed random variables. Of course, we require that the function z : S → R*, with z(ζ) = g(x(ζ)), is a random variable on the probability space (S, ℱ, P); consequently, we require z to be a measurable function on the measurable space (S, ℱ) and P(z(ζ) ∈ {−∞, +∞}) = 0.
The ease of use of the CDF technique depends critically on the functional form of g (x).
To make the CDF technique easier to understand, we start the discussion of computing the
probability distribution of z = g (x) with the simplest case, a continuous monotonic function
g . (Recall that if g is a monotonic function then a one-to-one correspondence between g (x)
and x exists.) Then, the difficulties associated with computing the probability distribution of
z = g (x) are investigated when the restrictions on g (x) are relaxed.

6.1.1 CDF Technique with Monotonic Functions

Consider the problem where the CDF Fx is known for the RV x, and we wish to find the CDF
for random variable z = g (x). Proceeding from the definition of the CDF for z, we have for a


monotonic increasing function g(x)

    F_z(γ) = P(z = g(x) ≤ γ) = P(x ≤ g⁻¹(γ)) = F_x(g⁻¹(γ)),   (6.1)

where g⁻¹(γ) is the value of α for which g(α) = γ. As (6.1) indicates, the CDF of random variable z is written in terms of F_x(α), with the argument α replaced by g⁻¹(γ).
Similarly, if z = g(x) and g is monotone decreasing, then

    F_z(γ) = P(z = g(x) ≤ γ) = P(x ≥ g⁻¹(γ)) = 1 − F_x(g⁻¹(γ)⁻).   (6.2)

The following example illustrates this technique.


Example 6.1.1. Random variable x is uniformly distributed in the interval 0 to 4. Find the CDF for random variable z = 2x + 1.

Solution. Since random variable x is uniformly distributed, the CDF of x is

    F_x(α) =
        0,      α < 0
        α/4,    0 ≤ α < 4
        1,      4 ≤ α.

Letting g(x) = 2x + 1, we see that g is monotone increasing and that g⁻¹(γ) = (γ − 1)/2. Applying (6.1), the CDF for z is given by

    F_z(γ) = F_x((γ − 1)/2) =
        0,            (γ − 1)/2 < 0
        (γ − 1)/8,    0 ≤ (γ − 1)/2 < 4
        1,            4 ≤ (γ − 1)/2.

Simplifying, we obtain

    F_z(γ) =
        0,            γ < 1
        (γ − 1)/8,    1 ≤ γ < 9
        1,            9 ≤ γ,

which is also a uniform distribution.
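The result can be checked by simulation: if the derivation is correct, the empirical CDF of z = 2x + 1 must track the uniform CDF just derived. A minimal Python sketch (illustrative only, not part of the text's development):

```python
import random

random.seed(1)
N = 100_000
samples = [2 * random.uniform(0, 4) + 1 for _ in range(N)]

def F_z(gamma):
    """CDF derived in Example 6.1.1: uniform on (1, 9)."""
    if gamma < 1:
        return 0.0
    return (gamma - 1) / 8 if gamma < 9 else 1.0

# The empirical CDF should agree with F_z to within sampling error.
for gamma in (0.0, 2.0, 5.0, 8.0, 10.0):
    empirical = sum(v <= gamma for v in samples) / N
    assert abs(empirical - F_z(gamma)) < 0.01
```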

6.1.2 CDF Technique with Arbitrary Functions

In general, the relationship between x and z can take on any form, including discontinuities.
Additionally, the function does not have to be monotonic; more than one solution of z = g(x) can exist, resulting in a many-to-one mapping from x to z. In general, the only requirement
on g is that z = g (x) be a random variable. In this general case, Fz( ) is no longer found by
simple substitution. In fact, under these conditions it is impossible to write a general expression
for Fz( ) using the CDF technique. However, this case is conceptually no more difficult than


the previous case, and involves only careful bookkeeping.


Let

    A(γ) = {x : g(x) ≤ γ}.   (6.3)

Note that A(γ) = g⁻¹((−∞, γ]). Then

    F_z(γ) = P(g(x) ≤ γ) = P(x ∈ A(γ)).   (6.4)

Partition A(γ) into disjoint intervals {A_i(γ) : i = 1, 2, . . .} so that

    A(γ) = ⋃_{i=1}^{∞} A_i(γ).   (6.5)

Note that the intervals as well as the number of nonempty intervals depend on γ. Since the A_i's are disjoint,

    F_z(γ) = Σ_{i=1}^{∞} P(x ∈ A_i(γ)).   (6.6)

The above probabilities are easily found from the CDF F_x. If interval A_i(γ) is of the form

    A_i(γ) = (a_i(γ), b_i(γ)],   (6.7)

then

    P(x ∈ A_i(γ)) = F_x(b_i(γ)) − F_x(a_i(γ)).   (6.8)

Similarly, if interval A_i(γ) is of the form

    A_i(γ) = [a_i(γ), b_i(γ)],   (6.9)

then

    P(x ∈ A_i(γ)) = F_x(b_i(γ)) − F_x(a_i(γ)⁻).   (6.10)

The success of this method clearly depends on our ability to partition A(γ) into disjoint intervals. Using this method, any function g and CDF F_x are amenable to a solution for F_z(γ). The following examples illustrate various aspects of this technique.
Example 6.1.2. Random variable x has the CDF

    F_x(α) =
        0,                    α < −1
        (3α − α³ + 2)/4,      −1 ≤ α < 1
        1,                    1 ≤ α.

Find the CDF for the RV z = x².


Solution. Letting g(x) = x², we find

    A(γ) = g⁻¹((−∞, γ]) =
        ∅,             γ < 0
        [−√γ, √γ],     0 ≤ γ,

so that

    F_z(γ) = F_x(√γ) − F_x((−√γ)⁻).

Noting that F_x(α) is continuous and has the same functional form for −1 < α < 1, we obtain

    F_z(γ) =
        0,                    γ < 0
        (3√γ − (√γ)³)/2,      0 ≤ γ < 1
        1,                    1 ≤ γ.
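Since F_x here is continuous, the derived F_z must equal F_x(√γ) − F_x(−√γ) at every γ; the algebra above can be confirmed numerically with a small Python sketch (illustrative only):

```python
def F_x(a):
    """CDF of x from Example 6.1.2."""
    if a < -1:
        return 0.0
    return (3 * a - a**3 + 2) / 4 if a < 1 else 1.0

def F_z(gamma):
    """CDF of z = x^2 derived above."""
    if gamma < 0:
        return 0.0
    s = gamma**0.5
    return (3 * s - s**3) / 2 if gamma < 1 else 1.0

# F_z(gamma) must equal F_x(sqrt(gamma)) - F_x(-sqrt(gamma)) everywhere.
for g in (0.0, 0.1, 0.25, 0.5, 0.9, 1.0, 2.0):
    s = g**0.5
    assert abs(F_z(g) - (F_x(s) - F_x(-s))) < 1e-12
```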
Example 6.1.3. Random variable x is uniformly distributed in the interval −3 to 3. Random variable z is defined by

    z = g(x) =
        −1,         x < −2
        3x + 5,     −2 ≤ x < −1
        −3x − 1,    −1 ≤ x < 0
        3x − 1,     0 ≤ x < 1
        2,          1 ≤ x.

Find F_z(γ).
Solution. Plots for this example are given in Figure 6.1. The CDF for random variable x is

    F_x(α) =
        0,            α < −3
        (α + 3)/6,    −3 ≤ α < 3
        1,            3 ≤ α.

Let A(γ) = g⁻¹((−∞, γ]). Referring to Figure 6.1, we find

    A(γ) =
        ∅,                                              γ < −1
        (−∞, (γ − 5)/3] ∪ [−(γ + 1)/3, (γ + 1)/3],      −1 ≤ γ < 2
        (−∞, ∞),                                        2 ≤ γ.

Consequently,

    F_z(γ) =
        0,                                                      γ < −1
        F_x((γ − 5)/3) + F_x((γ + 1)/3) − F_x(−(γ + 1)/3),      −1 ≤ γ < 2
        1,                                                      2 ≤ γ.


FIGURE 6.1: Plots for Example 6.1.3: g(x), F_x(α), and F_z(γ).

Substituting, we obtain

    F_z(γ) = (1/6)[(γ − 5)/3 + 3 + (γ + 1)/3 + (γ + 1)/3] = (γ + 2)/6,   −1 ≤ γ < 2,

so that

    F_z(γ) =
        0,            γ < −1
        (γ + 2)/6,    −1 ≤ γ < 2
        1,            2 ≤ γ.

Note that F_z(γ) has jumps at γ = −1 and at γ = 2.


Whenever random variable x is continuous and g(x) is constant over some interval or intervals, then random variable z can be continuous, discrete, or mixed, depending on the CDF for random variable x. As presented in the previous example, random variable z is mixed due to the constant value of g(x) in the intervals −3 ≤ x < −2 and 1 ≤ x < 3. In fact, z is a mixed random variable whenever g(α) is constant in an interval where f_x(α) ≠ 0. This results in a jump in F_z and an impulse function in f_z. Moreover, z = g(x) is a discrete random variable if g(α) changes values only on intervals where f_x(α) = 0. Random variable z is continuous if x is continuous and g(x) is not equal to a constant over any interval where f_x(α) ≠ 0.


Example 6.1.4. Random variable x has the PDF

(1 + 2 )/6,
f x () =
0,

1 < < 2
otherwise.

Find the PDF of random variable z defined by

x 1,
z = g (x) =
0,

1,

x0
0 < x 0.5
0.5 < x.

Solution. To find f_z, we evaluate F_z first, then differentiate this result. The CDF for random variable x is

    F_x(α) =
        0,                    α < −1
        (α³ + 3α + 4)/18,     −1 ≤ α < 2
        1,                    2 ≤ α.

Figure 6.2 shows a plot of g(x). With the aid of Figure 6.2, we find

    A(γ) = {x : g(x) ≤ γ} =
        (−∞, γ + 1],    γ ≤ −1
        (−∞, 0],        −1 < γ < 0
        (−∞, 0.5],      0 ≤ γ < 1
        (−∞, ∞),        1 ≤ γ.

Consequently,

    F_z(γ) =
        F_x(γ + 1),    γ ≤ −1
        F_x(0),        −1 < γ < 0
        F_x(0.5),      0 ≤ γ < 1
        1,             1 ≤ γ.

FIGURE 6.2: Transformation for Example 6.1.4.


Substituting,

    F_z(γ) =
        0,                           γ < −2
        ((γ + 1)³ + 3γ + 7)/18,      −2 ≤ γ < −1
        2/9,                         −1 ≤ γ < 0
        5/16,                        0 ≤ γ < 1
        1,                           1 ≤ γ.

Note that F_z(γ) has jumps at γ = 0 and γ = 1 of heights 13/144 and 11/16, respectively. Differentiating F_z,

    f_z(γ) = ((3γ² + 6γ + 6)/18)(u(γ + 2) − u(γ + 1)) + (13/144)δ(γ) + (11/16)δ(γ − 1).

Example 6.1.5. Random variable x is uniformly distributed in the interval from 0 to 10. Find the CDF for random variable z = g(x) = −ln(x).

Solution. The CDF for x is

    F_x(α) =
        0,       α < 0
        α/10,    0 ≤ α < 10
        1,       10 ≤ α.

For any real γ, we find

    A(γ) = {x : −ln(x) ≤ γ} = [e^{−γ}, ∞),

so that F_z(γ) = 1 − F_x((e^{−γ})⁻). Note that P(x ≤ 0) = 0, as required, since g(x) = −ln(x) is not defined (or at least not real-valued) for x ≤ 0. We find

    F_z(γ) =
        0,                γ < ln(0.1)
        1 − 0.1e^{−γ},    ln(0.1) ≤ γ.
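As a numerical check of this example, simulated values of z = −ln(x) should follow the derived CDF; a short Python sketch (illustrative only):

```python
import math
import random

random.seed(2)
N = 100_000
# x is uniform on (0, 10); the "or" guards against an (improbable) exact-zero draw.
samples = [-math.log(random.uniform(0, 10) or 1e-300) for _ in range(N)]

def F_z(gamma):
    """CDF derived in Example 6.1.5."""
    if gamma < math.log(0.1):
        return 0.0
    return 1 - 0.1 * math.exp(-gamma)

for gamma in (-2.0, 0.0, 1.0, 3.0):
    empirical = sum(v <= gamma for v in samples) / N
    assert abs(empirical - F_z(gamma)) < 0.01
```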

The previous examples illustrated evaluating the probability distribution of a function
of a continuous random variable using the CDF technique. This technique is applicable for
all functions z = g (x), continuous and discontinuous. Additionally, the CDF technique is
applicable if random variable x is mixed or discrete. For mixed random variables, the CDF
technique is used without any changes or modifications as shown in the next example.
Example 6.1.6. Random variable x has PDF
    f_x(α) = 0.5(u(α) − u(α − 1)) + 0.5δ(α − 0.5).

Find the CDF for z = g(x) = 1/x².


Solution. The mixed random variable x has CDF

    F_x(α) =
        0,              α < 0
        0.5α,           0 ≤ α < 0.5
        0.5α + 0.5,     0.5 ≤ α < 1
        1,              1 ≤ α.

For γ < 0, F_z(γ) = 0. For γ > 0,

    A(γ) = {x : |x| ≥ γ^{−1/2}} = (−∞, −γ^{−1/2}] ∪ [γ^{−1/2}, ∞),

so that

    F_z(γ) = F_x(−γ^{−1/2}) + 1 − F_x((γ^{−1/2})⁻).

Since F_x(−γ^{−1/2}) = 0 for all real γ, we have

    F_z(γ) =
        1 − 1,                         1 ≤ γ^{−1/2}
        1 − (0.5 + 0.5γ^{−1/2}),       0.5 ≤ γ^{−1/2} < 1
        1 − 0.5γ^{−1/2},               0 < γ^{−1/2} < 0.5.

After some algebra,

    F_z(γ) =
        0,                     γ < 1
        0.5 − 0.5γ^{−1/2},     1 ≤ γ < 4
        1 − 0.5γ^{−1/2},       4 ≤ γ.
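The mixed case can be checked the same way: simulate x (half the probability as a point mass at 0.5, the rest uniform on (0, 1)) and compare the empirical CDF of z = 1/x² against the result above. A Python sketch (illustrative only):

```python
import random

random.seed(3)
N = 100_000

def sample_x():
    # Half the probability is a point mass at 0.5; the rest is uniform on (0, 1).
    return 0.5 if random.random() < 0.5 else (random.random() or 1e-12)

samples = [1 / sample_x() ** 2 for _ in range(N)]

def F_z(gamma):
    """CDF derived in Example 6.1.6."""
    if gamma < 1:
        return 0.0
    if gamma < 4:
        return 0.5 - 0.5 * gamma ** -0.5
    return 1 - 0.5 * gamma ** -0.5

for gamma in (0.5, 2.0, 4.0, 9.0):
    empirical = sum(v <= gamma for v in samples) / N
    assert abs(empirical - F_z(gamma)) < 0.01
```

The jump of height 0.5 at γ = 4 comes from the point mass at x = 0.5, since 1/0.5² = 4.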

Drill Problem 6.1.1. Random variable x is uniformly distributed in the interval −1 to 4. Random variable z = 3x + 2. Determine: (a) F_z(0), (b) F_z(1), (c) f_z(0), (d) f_z(15).
Answers: 0, 2/15, 1/15, 1/15.
Drill Problem 6.1.2. Random variable x has the PDF
    f_x(α) = 0.5α(u(α) − u(α − 2)).

Random variable z is defined by

    z =
        −1,    x < −1
        x,     −1 ≤ x ≤ 1
        1,     x > 1.

Determine: (a) F_z(−1/2), (b) F_z(1/2), (c) F_z(3/2), (d) f_z(1/2).


Answers: 0, 1, 1/16, 1/4.


Drill Problem 6.1.3. Random variable x has the PDF
    f_x(α) = 0.5α(u(α) − u(α − 2)).

Random variable z is defined by

    z =
        −1,         x ≤ 0.5
        x + 0.5,    0.5 < x ≤ 1
        3,          x > 1.

Determine: (a) F_z(1), (b) F_z(0), (c) F_z(3/2), (d) F_z(4).


Answers: 1/4, 1/16, 1/16, 1.
Drill Problem 6.1.4. Random variable x has PDF
    f_x(α) = e^{−α−1}u(α + 1).

Random variable z = 1/x². Determine: (a) F_z(1/8), (b) F_z(1/2), (c) F_z(4), (d) f_z(4).
Answers: 0.0519, 0.617, 0.089, 0.022.

6.2 UNIVARIATE PDF TECHNIQUE

The previous section solved the problem of determining the probability distribution of a function
of a random variable using the cumulative distribution function. Now, we introduce a second
method for calculating the probability distribution of a function z = g (x) using the probability
density function, called the PDF technique. The PDF technique, however, is only applicable
for functions of random variables in which z = g (x) is continuous and does not equal a constant
in any interval in which f x is nonzero. We introduce the PDF technique for two reasons. First,
in many situations it is much simpler to use than the CDF technique. Second, we will find the
PDF method most useful in extensions to multivariate functions. In this section, we discuss
a wide variety of situations using the PDF technique with functions of continuous random
variables. Then, a method for handling mixed random variables with the PDF technique is
introduced. Finally, we consider computing the conditional PDF of a function of a random
variable using the PDF technique.

6.2.1 Continuous Random Variable

Theorem 6.2.1. Let x be a continuous RV with PDF f_x(α) and let the RV z = g(x). Assume g is continuous and not constant over any interval for which f_x ≠ 0. Let

    ξ_i = ξ_i(γ) = g⁻¹(γ),   i = 1, 2, . . . ,   (6.11)


denote the distinct solutions to g(ξ_i) = γ. Then

    f_z(γ) = Σ_{i=1}^{∞} f_x(ξ_i(γ))/|g^{(1)}(ξ_i(γ))|,   (6.12)

where we interpret

    f_x(ξ_i(γ))/|g^{(1)}(ξ_i(γ))| = 0

if f_x(ξ_i(γ)) = 0.
Proof. Let h > 0 and define

    I(γ, h) = {x : γ − h < g(x) ≤ γ}.

Partition I(γ, h) into disjoint intervals of the form

    I_i(γ, h) = (a_i(γ, h), b_i(γ, h)),   i = 1, 2, . . . ,

such that

    I(γ, h) = ⋃_{i=1}^{∞} I_i(γ, h).

Then

    F_z(γ) − F_z(γ − h) = Σ_{i=1}^{∞} (F_x(b_i(γ, h)) − F_x(a_i(γ, h))).

By hypothesis,

    lim_{h→0} a_i(γ, h) = lim_{h→0} b_i(γ, h) = ξ_i(γ).

Note that (for all γ with f_x(ξ_i(γ)) ≠ 0)

    lim_{h→0} (b_i(γ, h) − a_i(γ, h))/h
        = lim_{h→0} (b_i(γ, h) − a_i(γ, h))/|g(b_i(γ, h)) − g(a_i(γ, h))|
        = 1/|g^{(1)}(ξ_i(γ))|,

and that

    lim_{h→0} (F_x(b_i(γ, h)) − F_x(a_i(γ, h)))/(b_i(γ, h) − a_i(γ, h)) = f_x(ξ_i(γ)).

The desired result follows by taking the product of the above limits. The absolute value appears because by construction we have b_i > a_i and h > 0, whereas g^{(1)} may be positive or negative.



Example 6.2.1. Random variable x is uniformly distributed in the interval 0 to 4. Find the PDF for random variable z = g(x) = 2x + 1.

Solution. In this case, there is only one solution to the equation γ = 2α + 1, given by ξ₁ = (γ − 1)/2. We easily find g^{(1)}(α) = 2. Hence

    f_z(γ) = f_x((γ − 1)/2)/2 =
        1/8,    1 < γ < 9
        0,      otherwise.
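A quick numerical check of this PDF: it should equal 1/8 on (1, 9) and integrate to one. A Python sketch (illustrative only):

```python
def f_x(a):
    """Uniform density on (0, 4)."""
    return 0.25 if 0 < a < 4 else 0.0

def f_z(gamma):
    """PDF from Example 6.2.1: one root xi = (gamma - 1)/2, |g'(xi)| = 2."""
    return f_x((gamma - 1) / 2) / 2

# f_z should be 1/8 on (1, 9); a midpoint-rule integral over (-1, 11) should be ~1.
n = 10_000
width = 12 / n
total = sum(f_z(-1 + (i + 0.5) * width) * width for i in range(n))
assert abs(total - 1) < 1e-3
assert f_z(5.0) == 0.125
```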
Example 6.2.2. Random variable x has PDF

    f_x(α) = 0.75(1 − α²)(u(α + 1) − u(α − 1)).

Find the PDF for random variable z = g(x) = 1/x².

Solution. For γ < 0, there are no solutions to g(ξ_i) = γ, so that f_z(γ) = 0 for γ < 0. For γ > 0 there are two solutions to g(ξ_i) = γ:

    ξ₁(γ) = γ^{−1/2},  and  ξ₂(γ) = −γ^{−1/2}.

Since γ = g(α) = α^{−2}, we have g^{(1)}(α) = −2α^{−3}; hence, |g^{(1)}(ξ_i)| = 2/|ξ_i|³ = 2γ^{3/2}, and

    f_z(γ) = ((f_x(γ^{−1/2}) + f_x(−γ^{−1/2}))/(2γ^{3/2})) u(γ).

Substituting,

    f_z(γ) = (0.75(1 − γ^{−1})/(2γ^{3/2}))(u(γ^{−1/2} + 1) − u(γ^{−1/2} − 1) + u(1 − γ^{−1/2}) − u(−γ^{−1/2} − 1)) u(γ).

Simplifying,

    f_z(γ) = (0.75(1 − γ^{−1})/(2γ^{3/2}))(1 − u(γ^{−1/2} − 1) + u(1 − γ^{−1/2}) − 0) u(γ),

or

    f_z(γ) = 0.75(γ^{−3/2} − γ^{−5/2})u(γ − 1).


Example 6.2.3. Random variable x has PDF
    f_x(α) = (1/6)(1 + α²)(u(α + 1) − u(α − 2)).

Find the PDF for random variable z = g(x) = x².


Solution. For γ < 0 there are no solutions to g(ξ_i) = γ, so that f_z(γ) = 0 for γ < 0. For γ > 0 there are two solutions to g(ξ_i) = γ:

    ξ₁(γ) = √γ,  and  ξ₂(γ) = −√γ.

Since γ = g(α) = α², we have g^{(1)}(α) = 2α; hence, |g^{(1)}(ξ_i)| = 2|ξ_i| = 2√γ, and

    f_z(γ) = ((f_x(√γ) + f_x(−√γ))/(2√γ)) u(γ).

Substituting,

    f_z(γ) = ((1 + γ)/(12√γ))(u(√γ + 1) − u(√γ − 2) + u(−√γ + 1) − u(−√γ − 2)) u(γ).

Simplifying,

    f_z(γ) = ((1 + γ)/(12√γ))(1 − u(γ − 4) + u(1 − γ) − 0) u(γ),

or

    f_z(γ) =
        (γ^{−1/2} + γ^{1/2})/6,     0 < γ < 1
        (γ^{−1/2} + γ^{1/2})/12,    1 < γ < 4
        0,                          elsewhere.
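The two-root result can be checked by simulation: drawing x from f_x by rejection sampling and squaring, the fraction of samples with z ≤ 1 should match ∫₀¹ f_z(γ) dγ = 4/9. A Python sketch (illustrative only):

```python
import random

random.seed(4)

def f_x(a):
    return (1 + a * a) / 6 if -1 < a < 2 else 0.0

def sample_x():
    # Rejection sampling: uniform(-1, 2) proposal; envelope max f_x = 5/6 at a = 2.
    while True:
        a = random.uniform(-1, 2)
        if random.uniform(0, 5 / 6) < f_x(a):
            return a

N = 50_000
z = [sample_x() ** 2 for _ in range(N)]

# From the result above, P(z <= 1) = integral of (g^(-1/2) + g^(1/2))/6 over (0, 1) = 4/9.
p_emp = sum(v <= 1 for v in z) / N
assert abs(p_emp - 4 / 9) < 0.01
```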

6.2.2 Mixed Random Variable

Consider the problem where random variable x is mixed, and we wish to find the PDF for
z = g (x). Here, we treat the discrete and continuous portions of f x separately, and then combine
the results to yield f z. The continuous part of the PDF of x is handled by the PDF technique.
To illustrate the use of this technique, consider the following example.
Example 6.2.4. Random variable x has PDF
    f_x(α) = (3/8)(u(α + 1) − u(α − 1)) + (1/8)δ(α + 0.5) + (1/8)δ(α − 0.5).

Find the PDF for the RV z = g(x) = e^x.
Solution. There is only one solution to g(ξ) = γ:

    ξ₁(γ) = ln(γ).

We have g^{(1)}(ξ₁) = e^{ln(γ)} = γ. The probability masses of 1/8 for x at −0.5 and 0.5 are mapped to probability masses of 1/8 for z at e^{−0.5} and e^{0.5}, respectively. For all γ > 0 such that


|ξ₁(γ)| ≠ 0.5 we have

    f_z(γ) = f_x(ln(γ))/γ.

Combining these results, we find

    f_z(γ) = (3/(8γ))(u(γ − e^{−1}) − u(γ − e)) + (1/8)δ(γ − e^{−0.5}) + (1/8)δ(γ − e^{0.5}).

6.2.3 Conditional PDF Technique

Since a conditional PDF is also a PDF, the above techniques apply to find the conditional PDF
for z = g (x), given event A. Consider the problem where random variable x has PDF f x , and
we wish to evaluate the conditional PDF for random variable z = g (x), given that event A
occurred. Depending on whether the event A is defined on the range or domain of z = g (x),
one of the following two methods may be used to determine the conditional PDF of z using
the PDF technique.
(i) If A is an event defined for an interval on z, the conditional PDF f_z|A is computed by first evaluating f_z using the technique in this section. Then, by the definition of a conditional PDF, we have

f_z|A(γ|A) = f_z(γ)/P(A),   γ ∈ A,                              (6.13)

and f_z|A(γ|A) = 0 for γ ∉ A.

(ii) If A is an event defined for an interval on x, we use the conditional PDF of x to evaluate the conditional PDF for z as

f_z|A(γ|A) = Σ_(i=1)^∞ f_x|A(α_i(γ)|A) / |g^(1)(α_i(γ))|.        (6.14)

Example 6.2.5. Random variable x has the PDF

f_x(α) = (1/6)(1 + α²)(u(α + 1) − u(α − 2)).

Find the PDF for random variable z = g(x) = x², given A = {x : x > 0}.
Solution. First, we solve for the conditional PDF for x and then find the conditional PDF for z, based on f_x|A. We have

P(A) = (1/6) ∫₀² (1 + α²) dα = 7/9,

so that

f_x|A(α|A) = (3/14)(1 + α²)(u(α) − u(α − 2)).

There is only one solution to γ = g(α) = α² on the interval 0 < α < 2 where f_x|A ≠ 0. We have α_1(γ) = √γ and |g^(1)(α_1(γ))| = 2√γ. Consequently,

f_z|A(γ|A) = (3/28)(γ^(−1/2) + γ^(1/2))(u(γ) − u(γ − 4)).
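Both P(A) and the normalization of the conditional PDF can be checked numerically; the sketch below is our own illustration (not from the text):

```python
def f_x(a):
    # unconditional PDF of x from Example 6.2.5
    return (1 + a * a) / 6.0 if -1 < a < 2 else 0.0

def f_z_given_A(g):
    # conditional PDF derived above: (3/28)(g^(-1/2) + g^(1/2)) on (0, 4)
    return (3.0 / 28.0) * (g ** -0.5 + g ** 0.5) if 0 < g < 4 else 0.0

n = 200000
# P(A) = integral of f_x over (0, 2); the text gives 7/9
h = 2.0 / n
p_A = sum(f_x((k + 0.5) * h) for k in range(n)) * h

# f_{z|A} must integrate to one over (0, 4)
h2 = 4.0 / n
mass = sum(f_z_given_A((k + 0.5) * h2) for k in range(n)) * h2
```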

Drill Problem 6.2.1. Random variable x has a uniform PDF on the interval 0 to 8. Random variable z = 3x + 1. Use the PDF method to determine: (a) f_z(0), (b) f_z(6), (c) E(z), (d) σ_z².
Answers: 13, 48, 0, 1/24.
Drill Problem 6.2.2. Random variable x has the PDF

f_x(α) = 9α²,        0 ≤ α < 0.5,
         3(1 − α²),  0.5 ≤ α ≤ 1,
         0,          otherwise.

Random variable z = x³. Use the PDF method to determine: (a) f_z(1/27), (b) f_z(1/4), (c) f_z|z>1/8(1/4|z > 1/8), (d) f_z(2).
Answers: 1.52, 0, 3, 2.43.
Drill Problem 6.2.3. Random variable x has the PDF

f_x(α) = (2α/9)(u(α) − u(α − 3)).

Random variable z = (x − 1)². Use the PDF method to determine: (a) f_z(1/4), (b) f_z(9/4), (c) f_z|z≤1(1/4|z ≤ 1), (d) E(z|z ≤ 1).
Answers: 4/9, 5/27, 1/3, 1.
Drill Problem 6.2.4. Random variable x has the PDF

f_x(α) = (2/9)(α + 1)(u(α + 1) − u(α − 2)).

Random variable z = 2x² and event A = {x : x ≥ 0}. Determine: (a) P(A), (b) f_x|A(1|A), (c) f_z|A(2|A), (d) f_z|A(9|A).
Answers: 0, 1/2, 1/8, 8/9.

6.3 ONE FUNCTION OF TWO RANDOM VARIABLES

Consider a random variable z = g (x, y) created from jointly distributed random variables x
and y. In this section, the probability distribution of z = g (x, y) is computed using a CDF
technique similar to the one at the start of this chapter. Because we are dealing with regions in
a plane instead of intervals on a line, these problems are not as straightforward and tractable as
before.
With z = g(x, y), we have

F_z(γ) = P(z ≤ γ) = P(g(x, y) ≤ γ) = P((x, y) ∈ A(γ)),          (6.15)

where

A(γ) = {(x, y) : g(x, y) ≤ γ}.                                  (6.16)

The CDF for the RV z can then be found by evaluating the integral

F_z(γ) = ∫∫_A(γ) dF_x,y(α, β).                                  (6.17)

This result cannot be taken further until a specific F_x,y and g(x, y) are considered. Remember that in the case of a single random variable, our efforts primarily dealt with algebraic manipulations. Here, our efforts are concentrated on evaluating F_z through integrals, with the ease of solution critically dependent on g(x, y).
The ease of solution for F_z depends on transforming A(γ) into proper limits of integration. Sketching the support region for f_x,y (the region where f_x,y ≠ 0, or where F_x,y is not constant) together with the region A(γ) is often most helpful, even crucial, in the problem solution. Pay careful attention to the limits of integration to determine where the integrand is zero because f_x,y = 0. Let us consider several examples to illustrate the mechanics of the CDF technique and also to provide further insight.
Example 6.3.1. Random variables x and y have joint PDF

f_x,y(α, β) = 1/4,  0 < α < 2, 0 < β < 2,
              0,    otherwise.

Find the CDF for z = x + y.
Solution. We have A(γ) = {(α, β) : α + β ≤ γ}. We require the volume under the surface f_x,y(α, β) where α + β ≤ γ:

F_z(γ) = ∫∫_(α+β≤γ) f_x,y(α, β) dβ dα.

FIGURE 6.3: Plots for Example 6.3.1: (a) 0 < γ < 2; (b) 2 < γ < 4.

For γ < 0 we have F_z(γ) = 0. For 0 ≤ γ < 2, with the aid of Figure 6.3(a) we obtain

F_z(γ) = ∫₀^γ ∫₀^(γ−α) (1/4) dβ dα = γ²/8.

For 2 ≤ γ < 4, referring to Figure 6.3(b), we consider the complementary region (to save some work):

F_z(γ) = 1 − ∫_(γ−2)^2 ∫_(γ−α)^2 (1/4) dβ dα = 1 − (4 − γ)²/8.

Finally, for 4 ≤ γ, F_z(γ) = 1.
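The CDF pieces can be checked by simulation; the following seeded Monte Carlo sketch is our own illustration (not from the text):

```python
import random

# Empirical CDF of z = x + y for x, y independent and uniform on
# (0, 2), to compare with the piecewise CDF derived above.
random.seed(12345)
n = 200000
samples = [random.uniform(0, 2) + random.uniform(0, 2) for _ in range(n)]

def F_emp(c):
    return sum(1 for s in samples if s <= c) / n
```

F_emp(1.0) should be near 1²/8 = 0.125, and F_emp(3.0) near 1 − (4 − 3)²/8 = 0.875.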


Example 6.3.2. Random variables x and y have joint PDF

f_x,y(α, β) = 1,  0 < α < 1, 0 < β < 1,
              0,  otherwise.

Find the PDF for z = x − y.
Solution. We have A(γ) = {(α, β) : α − β ≤ γ}. We require the volume under the surface f_x,y(α, β) where α ≤ γ + β:

F_z(γ) = ∫∫_(α−β≤γ) f_x,y(α, β) dα dβ.

For γ < −1 we have F_z(γ) = 0 and f_z(γ) = 0. With the aid of Figure 6.4(a), for −1 < γ ≤ 0,

F_z(γ) = ∫_(−γ)^1 ∫₀^(γ+β) dα dβ = (1 + γ)²/2,

FIGURE 6.4: Plots for Example 6.3.2: (a) −1 < γ < 0; (b) 0 < γ < 1.

so that f_z(γ) = 1 + γ. For 0 < γ ≤ 1, we consider the complementary region shown in Figure 6.4(b) (to save some work):

F_z(γ) = 1 − ∫₀^(1−γ) ∫_(γ+β)^1 dα dβ = 1 − (1 − γ)²/2,

so that f_z(γ) = 1 − γ. Finally, for 1 ≤ γ, F_z(γ) = 1, so that f_z(γ) = 0.
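The triangular shape of f_z is easy to verify by simulation (our own sketch, not from the text):

```python
import random

# Empirical CDF of z = x - y for x, y independent and uniform on
# (0, 1); the derived CDF is (1+c)^2/2 for c < 0, 1 - (1-c)^2/2 for c > 0.
random.seed(7)
n = 200000
samples = [random.random() - random.random() for _ in range(n)]

def F_emp(c):
    return sum(1 for s in samples if s <= c) / n
```

F_emp(−0.5) should be near (1 − 0.5)²/2 = 0.125, F_emp(0) near 0.5, and F_emp(0.5) near 0.875.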

Example 6.3.3. Find the CDF for z = x/y, where x and y have the joint PDF

f_x,y(α, β) = 1/α,  0 < β < α < 1,
              0,    otherwise.

Solution. We have A(γ) = {(α, β) : α/β ≤ γ}. Inside the support region for f_x,y, we have α/β > 1; hence, for γ < 1 we have F_z(γ) = 0. As shown in Figure 6.5 it is easiest to integrate with respect to β first: for 1 ≤ γ,

F_z(γ) = ∫₀^1 ∫_(α/γ)^α (1/α) dβ dα = 1 − 1/γ.
FIGURE 6.5: Integration region for Example 6.3.3.
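This joint PDF factors conveniently for simulation: the marginal of x is uniform on (0, 1) and y given x is uniform on (0, x), so z = x/y = 1/U with U uniform. The seeded sketch below is our own illustration (not from the text):

```python
import random

# f_{x,y}(a, b) = 1/a on 0 < b < a < 1 factors as x ~ uniform(0,1)
# and y | x ~ uniform(0, x).  Hence z = x/y = 1/U for U on (0, 1],
# and F_z(c) = 1 - 1/c for c >= 1.
random.seed(3)
n = 200000
samples = [1.0 / (1.0 - random.random()) for _ in range(n)]

F2 = sum(1 for s in samples if s <= 2.0) / n   # expect 1 - 1/2
F4 = sum(1 for s in samples if s <= 4.0) / n   # expect 1 - 1/4
```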


Example 6.3.4. Find the CDF for z = x² + y², where x and y have the joint PDF

f_x,y(α, β) = 3α,  0 < β < α < 1,
              0,   otherwise.

Solution. We have A(γ) = {(α, β) : α² + β² ≤ γ}. For γ < 0, we obtain F_z(γ) = 0. Transforming to polar coordinates: α = r cos(θ), β = r sin(θ),

F_z(γ) = ∫∫_(r²≤γ) f_x,y(r cos(θ), r sin(θ)) r dr dθ.

Referring to Figure 6.6(a), for 0 ≤ γ < 1,

F_z(γ) = ∫₀^(π/4) ∫₀^(√γ) 3r² cos(θ) dr dθ = (√2/2) γ^(3/2).

For 1 ≤ γ < 2, we split the integral into two parts: one with polar coordinates, the other using rectangular coordinates. With

sin(θ₁) = √(γ − 1)/√γ,

we find with the aid of Figure 6.6(b) that

F_z(γ) = ∫_(θ₁)^(π/4) ∫₀^(√γ) 3r² cos(θ) dr dθ + ∫₀^1 ∫₀^(α√(γ−1)) 3α dβ dα,

or

F_z(γ) = (√2/2) γ^(3/2) − (γ − 1)^(3/2).

Finally, we have F_z(γ) = 1 for 2 ≤ γ.

FIGURE 6.6: Integration regions for Example 6.3.4: (a) 0 < γ < 1; (b) 1 < γ < 2.
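Both CDF pieces can be verified by simulation, again using the factorization of the joint PDF into a marginal and a conditional (our own sketch, not from the text):

```python
import random

# Sample from f_{x,y}(a, b) = 3a on 0 < b < a < 1: the marginal of x
# has density 3a^2 (so x = U^(1/3)) and y | x is uniform on (0, x).
random.seed(42)
n = 200000
vals = []
for _ in range(n):
    a = random.random() ** (1.0 / 3.0)
    vals.append(a * a + (a * random.random()) ** 2)   # z = x^2 + y^2

F1 = sum(1 for v in vals if v <= 1.0) / n    # expect sqrt(2)/2 ~ 0.7071
F15 = sum(1 for v in vals if v <= 1.5) / n   # expect (sqrt(2)/2)*1.5**1.5 - 0.5**1.5
```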


Drill Problem 6.3.1. Random variables x and y have joint PDF

f_x,y(α, β) = e^(−α) e^(−β) u(α)u(β).

Random variable z = x − y/3. Find (a) F_z(−1/3), (b) f_z(−1/3), (c) F_z(1), and (d) f_z(1).
Answers: (1/4)e^(−1), (3/4)e^(−1), 1 − (3/4)e^(−1), (3/4)e^(−1).

6.4 BIVARIATE TRANSFORMATIONS

In this section, we find the joint distribution of random variables z = g (x, y) and w = h(x, y)
from jointly distributed random variables x and y. First, we consider a bivariate CDF technique.
Then, the joint PDF technique for finding the joint PDF for random variables z and w formed
from x and y is described. Next, the case of one function of two random variables is treated by
using the joint PDF technique with an auxiliary random variable. Finally, the conditional joint
PDF is presented.

6.4.1 Bivariate CDF Technique

Let x and y be jointly distributed RVs on the probability space (S, ℱ, P), and let z = g(x, y) and w = h(x, y). Define A(γ, ψ) to be the region of the x–y plane for which z = g(x, y) ≤ γ and w = h(x, y) ≤ ψ; i.e.,

A(γ, ψ) = {(x, y) : g(x, y) ≤ γ, h(x, y) ≤ ψ}.                  (6.18)

Note that

A(γ, ψ) = g^(−1)((−∞, γ]) ∩ h^(−1)((−∞, ψ]).                    (6.19)

Then

F_z,w(γ, ψ) = ∫∫_A(γ,ψ) dF_x,y(α, β).                            (6.20)

It is often difficult to perform the integration indicated in (6.20).


Example 6.4.1. Random variables x and y have the joint PDF

f_x,y(α, β) = 0.25,  0 ≤ α ≤ 2, 0 ≤ β ≤ 2,
              0,     otherwise.

With z = g(x, y) = x + y and w = h(x, y) = y, find the joint PDF f_z,w.
Solution. We have

g^(−1)((−∞, γ]) = {(x, y) : x + y ≤ γ}

FIGURE 6.7: Integration region for Example 6.4.1.

and

h^(−1)((−∞, ψ]) = {(x, y) : y ≤ ψ}.

The intersection of these regions is

A(γ, ψ) = {(x, y) : y ≤ min(γ − x, ψ)},

which is illustrated in Figure 6.7. With the aid of Figure 6.7 and (6.20) we find

F_z,w(γ, ψ) = ∫_(−∞)^ψ ∫_(−∞)^(γ−β) dF_x,y(α, β).

Instead of carrying out the above integration and then differentiating the result, we differentiate the above integral to obtain the PDF f_z,w directly. We find

∂F_z,w(γ, ψ)/∂γ = lim_(h₁→0) (1/h₁) ∫_(−∞)^ψ ∫_(γ−h₁−β)^(γ−β) dF_x,y(α, β),

and

∂²F_z,w(γ, ψ)/(∂ψ ∂γ) = lim_(h₂→0) lim_(h₁→0) (1/(h₁h₂)) ∫_(ψ−h₂)^ψ ∫_(γ−h₁−β)^(γ−β) dF_x,y(α, β).

Performing the indicated limits, we find that f_z,w(γ, ψ) = f_x,y(γ − ψ, ψ); substituting, we obtain

f_z,w(γ, ψ) = 0.25,  0 < γ − ψ < 2, 0 < ψ < 2,
              0,     otherwise.

When the RVs x and y are jointly continuous, it is usually easier to find the joint PDF f_z,w than to carry out the integral indicated in (6.20).

6.4.2 Bivariate PDF Technique
A very important special case of bivariate transformations is when the RVs x and y are jointly continuous and the mapping determined by g and h is continuous, with g and h having continuous partial derivatives. Let h₁ > 0, h₂ > 0 and define

I(γ, h₁, ψ, h₂) = g^(−1)((γ − h₁, γ]) ∩ h^(−1)((ψ − h₂, ψ]).      (6.21)

Partition I into disjoint regions so that

I(γ, h₁, ψ, h₂) = ∪_i I_i(γ, h₁, ψ, h₂).                          (6.22)

Let (α_i, β_i) denote the unique element of

lim_(h₁→0) lim_(h₂→0) I_i(γ, h₁, ψ, h₂).

Then for small h₁ and small h₂ we have

h₁ h₂ f_z,w(γ, ψ) ≈ Σ_i f_x,y(α_i, β_i) ∫∫_(I_i(γ,h₁,ψ,h₂)) dβ dα.   (6.23)

Dividing both sides of the above by h₁h₂ and letting h₁ → 0 and h₂ → 0, the approximation becomes an equality. The result is summarized by the theorem below.
Theorem 6.4.1. Let x and y be jointly continuous RVs with PDF f_x,y, and let z = g(x, y) and w = h(x, y). Let

(α_i(γ, ψ), β_i(γ, ψ)),   i = 1, 2, …,                            (6.24)

be the distinct solutions to the simultaneous equations

g(α_i, β_i) = γ                                                   (6.25)

and

h(α_i, β_i) = ψ.                                                  (6.26)

Define the Jacobian

J(α, β) = | ∂g(α, β)/∂α   ∂g(α, β)/∂β |
          | ∂h(α, β)/∂α   ∂h(α, β)/∂β |,                          (6.27)

where the indicated partial derivatives are assumed to exist. Then

f_z,w(γ, ψ) = Σ_(i=1)^∞ f_x,y(α_i, β_i) / |J(α_i, β_i)|.          (6.28)

The most difficult aspects of finding f_z,w using the joint PDF technique are solving the simultaneous equations γ = g(α_i, β_i) and ψ = h(α_i, β_i) for α_i and β_i, and determining the support region for f_z,w. Let us consider several examples to illustrate the mechanics of this PDF technique.
Example 6.4.2. Random variables x and y have the joint PDF

f_x,y(α, β) = 1/4,  0 < α < 2, 0 < β < 2,
              0,    otherwise.

Find the joint PDF for z = g(x, y) = x + y and w = h(x, y) = y.
Solution. There is only one solution to γ = α + β and ψ = β: α₁ = γ − ψ and β₁ = ψ. The Jacobian of the transformation is

J = | 1  1 |
    | 0  1 |  = 1·1 − 1·0 = 1.

Applying Theorem 6.4.1, we find

f_z,w(γ, ψ) = f_x,y(α₁, β₁)/|J| = f_x,y(γ − ψ, ψ).

Substituting, we obtain

f_z,w(γ, ψ) = 1/4,  0 < γ − ψ < 2, 0 < ψ < 2,
              0,    otherwise.

The region of support for f_z,w is illustrated in Figure 6.8.
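The theorem's prediction can be checked empirically by binning samples of (z, w) near an interior point of the support; the seeded sketch below is our own illustration (not from the text):

```python
import random

# For z = x + y, w = y with f_{x,y} = 1/4 on (0,2)^2, the theorem
# gives f_{z,w} = 1/4 on 0 < z - w < 2, 0 < w < 2.  Estimate the
# density near (z, w) = (2, 1) by counting samples in a small box.
random.seed(9)
n = 200000
hits = 0
for _ in range(n):
    y = random.uniform(0, 2)
    z = random.uniform(0, 2) + y
    if 1.75 < z < 2.25 and 0.75 < y < 1.25:
        hits += 1

density = hits / n / (0.5 * 0.5)   # expect ~ 1/4
```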


Example 6.4.3. Random variables x and y have the joint PDF

f_x,y(α, β) = 12αβ(1 − α),  0 < α < 1, 0 < β < 1,
              0,            otherwise.

Find the joint PDF for z = g(x, y) = x²y and w = h(x, y) = y.


FIGURE 6.8: Support region for Example 6.4.2.

Solution. Solving γ = α²β and ψ = β, we find α₁ = √(γ/ψ) and β₁ = ψ. The other solution, α₂ = −√(γ/ψ), is not needed since f_x,y(α, β) = 0 for α < 0. The Jacobian is

J₁ = | 2α₁β₁  α₁² |
     | 0      1   |  = 2α₁β₁ = 2√(γψ).

Applying Theorem 6.4.1, we find

f_z,w(γ, ψ) = f_x,y(√(γ/ψ), ψ)/(2√(γψ))
            = 6(1 − √(γ/ψ)),  0 < √(γ/ψ) < 1, 0 < ψ < 1,
              0,              otherwise.
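A deterministic normalization check of this result (our own sketch, not from the text) integrates f_z,w over its support 0 < γ < ψ < 1:

```python
import math

# Check that f_{z,w}(g, p) = 6(1 - sqrt(g/p)) on 0 < g < p < 1
# integrates to one (midpoint rule on a 1000 x 1000 grid).
m = 1000
h = 1.0 / m
total = 0.0
for i in range(m):
    p = (i + 0.5) * h
    for j in range(m):
        g = (j + 0.5) * h
        if g < p:
            total += 6.0 * (1.0 - math.sqrt(g / p)) * h * h
```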

Example 6.4.4. Random variables x and y have the joint PDF

f_x,y(α, β) = 0.25(αβ + 1),  −1 < α < 1, −1 < β < 1,
              0,             otherwise.

Find the joint PDF for z = g(x, y) = xy and w = h(x, y) = y/x.
Solution. Solving γ = αβ and ψ = β/α, we find β² = γψ, so that β = ±√(γψ). Letting β₁ = √(γψ) we have α₁ = β₁/ψ = √(γ/ψ). Then β₂ = −√(γψ) and α₂ = −√(γ/ψ). Note that the solution (α₁, β₁) is in the first quadrant of the α–β plane; the solution (α₂, β₂) is in the third quadrant of the α–β plane. We find

J = | β       α   |
    | −β/α²   1/α |  = β/α + β/α = 2β/α = 2ψ,

so that J₁ = 2ψ = J₂. Hence

f_z,w(γ, ψ) = f_x,y(√(γ/ψ), √(γψ))/(2|ψ|) + f_x,y(−√(γ/ψ), −√(γψ))/(2|ψ|).

In the γ–ψ plane,

f_x,y(√(γ/ψ), √(γψ))/(2|ψ|)

has support region specified by

0 < √(γ/ψ) < 1 and 0 < √(γψ) < 1.

Similarly,

f_x,y(−√(γ/ψ), −√(γψ))/(2|ψ|)

has support region specified by

−1 < −√(γ/ψ) < 0 and −1 < −√(γψ) < 0,

or

0 < √(γ/ψ) < 1 and 0 < √(γψ) < 1.

Consequently, the two components of the PDF f_z,w have identical regions of support in the γ–ψ plane. In the first quadrant of the γ–ψ plane, this support region is easily seen to be 0 < γ < ψ < 1/γ. Similarly, in the third quadrant, the support region is 1/γ < ψ < γ < 0. This support region is illustrated in Figure 6.9. Finally, since αβ = γ at both solutions, we find

f_z,w(γ, ψ) = (γ + 1)/(4|ψ|),  0 < γ < ψ < 1/γ, or 1/γ < ψ < γ < 0,
              0,               otherwise.
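Integrating this f_z,w over ψ gives the marginal f_z(γ) = −(γ + 1) ln|γ|/2 on −1 < γ < 1, which should integrate to one; the numeric check below is our own sketch (not from the text):

```python
import math

# Integrating f_{z,w}(g, p) = (g + 1)/(4|p|) over its support in p
# (g < p < 1/g for 0 < g < 1, and 1/g < p < g for -1 < g < 0) gives
# the marginal f_z(g) = -(g + 1) ln|g| / 2; check it sums to one.
n = 200000
h = 2.0 / n
total = 0.0
for k in range(n):
    g = -1.0 + (k + 0.5) * h   # midpoints never hit g = 0 exactly
    total += -(g + 1.0) * math.log(abs(g)) / 2.0 * h
```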
Auxiliary Random Variables
The joint PDF technique can also be used to transform random variables x and y with joint PDF f_x,y to the random variable z = g(x, y) by introducing an auxiliary random variable w = h(x, y), finding f_z,w, and then finding the marginal PDF f_z using

f_z(γ) = ∫_(−∞)^∞ f_z,w(γ, ψ) dψ.                                 (6.29)

It is usually advisable to let the auxiliary random variable w equal a quantity which allows a convenient solution of the Jacobian and/or the inverse equations. This method is an alternative to the CDF technique presented earlier.

FIGURE 6.9: Support region for Example 6.4.4.

Example 6.4.5. Random variables x and y have the joint PDF

f_x,y(α, β) = 4αβ,  0 < α < 1, 0 < β < 1,
              0,    otherwise.

Find the PDF for z = g(x, y) = x².
Solution. Let auxiliary variable w = h(x, y) = y. Solving γ = α² and ψ = β, we find α = ±√γ and β = ψ. The only solution inside the support region for f_x,y is α₁ = √γ and β₁ = ψ. The Jacobian of the transformation is

J = | 2α₁  0 |
    | 0    1 |  = 2α₁ = 2√γ,

so that

f_z,w(γ, ψ) = f_x,y(√γ, ψ)/(2√γ) = 2ψ,  0 < γ < 1, 0 < ψ < 1,
              0,                        otherwise.

We find the marginal PDF for z as

f_z(γ) = ∫_(−∞)^∞ f_z,w(γ, ψ) dψ = ∫₀¹ 2ψ dψ = 1

for 0 < γ < 1 and f_z(γ) = 0, otherwise.
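So z is uniform on (0, 1), which a quick seeded simulation confirms (our own sketch, not from the text):

```python
import math
import random

# With f_{x,y}(a, b) = 4ab on (0,1)^2, the marginal of x has density
# 2a, so x = sqrt(U); then z = x^2 is uniform on (0, 1), matching
# f_z(g) = 1 there.
random.seed(5)
n = 200000
hits = 0
for _ in range(n):
    x = math.sqrt(random.random())
    if x * x <= 0.3:
        hits += 1

frac = hits / n   # expect F_z(0.3) = 0.3
```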


FIGURE 6.10: Support region for Example 6.4.6.

Example 6.4.6. Random variables x and y have the joint PDF

f_x,y(α, β) = 4αβ,  0 < α < 1, 0 < β < 1,
              0,    otherwise.

Find the PDF for z = g(x, y) = x + y.
Solution. Let auxiliary variable w = h(x, y) = y. Solving γ = α + β and ψ = β, we find α = γ − ψ and β = ψ. We find

J = | 1  1 |
    | 0  1 |  = 1,

so that

f_z,w(γ, ψ) = f_x,y(γ − ψ, ψ) = 4(γ − ψ)ψ,  0 < γ − ψ < 1, 0 < ψ < 1,
              0,                            otherwise.

The support region for f_z,w is shown in Figure 6.10. Referring to Figure 6.10, for 0 < γ ≤ 1,

f_z(γ) = ∫₀^γ 4(γ − ψ)ψ dψ = (2/3)γ³.

For 1 < γ < 2,

f_z(γ) = ∫_(γ−1)^1 4(γ − ψ)ψ dψ = −(2/3)γ³ + 4γ − 8/3.

Otherwise, f_z(γ) = 0.
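The two pieces should join continuously at γ = 1 (both equal 2/3), vanish at γ = 2, and integrate to one; the deterministic check below is our own sketch (not from the text):

```python
def f_z(g):
    # the two pieces of f_z derived in Example 6.4.6
    if 0 < g <= 1:
        return 2.0 * g ** 3 / 3.0
    if 1 < g < 2:
        return -2.0 * g ** 3 / 3.0 + 4.0 * g - 8.0 / 3.0
    return 0.0

# midpoint-rule integral over (0, 2)
n = 100000
h = 2.0 / n
total = sum(f_z((k + 0.5) * h) for k in range(n)) * h
```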


Conditional PDF Technique


Since a conditional PDF is also a PDF, the above techniques apply to find the conditional PDF for z = g(x, y), given event A. Consider the problem where random variables x and y have joint PDF f_x,y, and we wish to evaluate the conditional PDF for random variable z = g(x, y), given that event A occurred. Depending on whether the event A is defined on the range or domain of z = g(x, y), one of the following two methods may be used to determine the conditional PDF of z using the bivariate PDF technique.
i. If A is an event defined for an interval on z, the conditional PDF f_z|A is computed by first evaluating f_z using the technique in this section. Then, by the definition of a conditional PDF, we have

f_z|A(γ|A) = f_z(γ)/P(A),   γ ∈ A.                                (6.30)

ii. If A is an event defined for a region in the x–y plane, we use the conditional PDF f_x,y|A to evaluate the conditional PDF f_z|A as follows. First, introduce an auxiliary random variable w = h(x, y), and evaluate

f_z,w|A(γ, ψ|A) = Σ_(i=1)^∞ f_x,y|A(α_i(γ, ψ), β_i(γ, ψ)|A) / |J(α_i(γ, ψ), β_i(γ, ψ))|,   (6.31)

where (α_i, β_i), i = 1, 2, …, are the solutions to γ = g(α, β) and ψ = h(α, β) in region A. Then evaluate the marginal conditional PDF

f_z|A(γ|A) = ∫_(−∞)^∞ f_z,w|A(γ, ψ|A) dψ.                          (6.32)

Example 6.4.7. Random variables x and y have joint PDF

f_x,y(α, β) = 3α,  0 < β < α < 1,
              0,   otherwise.

Find the PDF for z = x + y, given event A = {max(x, y) ≤ 0.5}.
Solution. We begin by finding the conditional PDF for x and y, given A. From Figure 6.11(a) we find

P(A) = ∫∫_A f_x,y(α, β) dβ dα = ∫₀^0.5 ∫₀^α 3α dβ dα = 1/8;

consequently,

f_x,y|A(α, β|A) = f_x,y(α, β)/P(A) = 24α,  0 < β < α < 0.5,
                  0,                       otherwise.

FIGURE 6.11: Support region for Example 6.4.7.

Letting the auxiliary random variable w = h(x, y) = y, the only solution to γ = g(α, β) = α + β and ψ = h(α, β) = β is α = γ − ψ and β = ψ. The Jacobian of the transformation is J = 1 so that

f_z,w|A(γ, ψ|A) = f_x,y|A(γ − ψ, ψ|A)/1.

Substituting, we find

f_z,w|A(γ, ψ|A) = 24(γ − ψ),  0 < ψ < γ − ψ < 0.5,
                  0,          otherwise.

The support region for f_z,w|A is illustrated in Figure 6.11(b). The conditional PDF for z, given A, can be found from

f_z|A(γ|A) = ∫_(−∞)^∞ f_z,w|A(γ, ψ|A) dψ.

The integration is easily carried out with the aid of Figure 6.11(b). For 0 < γ ≤ 0.5 we have

f_z|A(γ|A) = ∫₀^(γ/2) 24(γ − ψ) dψ = 9γ².

For 0.5 < γ < 1 we obtain

f_z|A(γ|A) = ∫_(γ−0.5)^(γ/2) 24(γ − ψ) dψ = 3(1 − γ²).
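A rejection-style simulation conditioned on A confirms both P(A) and the conditional CDF implied by the 9γ² piece (our own sketch, not from the text):

```python
import random

# Sample (x, y) from f_{x,y}(a, b) = 3a on 0 < b < a < 1, keep only
# samples with max(x, y) <= 0.5 (event A), and check P(A) = 1/8 and
# F_{z|A}(0.5) = integral of 9 g^2 over (0, 0.5) = 0.375.
random.seed(11)
n = 400000
kept = []
for _ in range(n):
    a = random.random() ** (1.0 / 3.0)   # marginal density 3a^2
    b = a * random.random()              # y | x uniform on (0, x)
    if max(a, b) <= 0.5:
        kept.append(a + b)

p_A = len(kept) / n
F_half = sum(1 for s in kept if s <= 0.5) / len(kept)
```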



Drill Problem 6.4.1. Random variables x and y have joint PDF

f_x,y(α, β) = e^(−α−β) u(α)u(β).

Random variables z = x and w = xy. Find: (a) f_z,w(1, −1), (b) f_z,w(1, 1).
Answers: 0, e^(−2).
Drill Problem 6.4.2. Random variables x and y have joint PDF

f_x,y(α, β) = 4αβ,  0 < α < 1, 0 < β < 1,
              0,    otherwise.

Random variables z = x + y and w = y². Determine: (a) f_z,w(1, 1/4), (b) f_z(−1/2), (c) f_z(1/2), (d) f_z(3/2).
Answers: 0, 1/12, 1, 13/12.
Drill Problem 6.4.3. Random variables x and y have joint PDF

f_x,y(α, β) = 1.2(α² + β),  0 < α < 1, 0 < β < 1,
              0,            otherwise.

Random variable z = x²y. Determine: (a) f_z(−1), (b) f_z(1/4), (c) f_z(1/2), (d) f_z(3/2).
Answers: 0.7172, 0, 0, 1.3.
Drill Problem 6.4.4. Random variables x and y have joint PDF

f_x,y(α, β) = 2β,  0 < α < 1, 0 < β < 1,
              0,   otherwise.

Random variable z = x − y and event A = {(x, y) : x + y ≤ 1}. Determine: (a) f_z|A(−3/2|A), (b) f_z|A(1/2|A), (c) F_z|A(0|A), (d) f_z|A(−1/2|A).
Answers: 0, 3/4, 15/16, 3/16.

6.5 SUMMARY

This chapter presented a number of different approaches to find the probability distribution of functions of random variables.
Two methods are presented to find the probability distribution of a function of one random variable, z = g(x). The first, and the most general approach, is called the CDF technique. From the definition of the CDF, we write

F_z(γ) = P(z ≤ γ) = P(g(x) ≤ γ) = P(x ∈ A(γ)),                   (6.33)

where A(γ) = {x : g(x) ≤ γ}. By partitioning A(γ) into disjoint intervals, the CDF F_z may be found using the CDF F_x. The second approach, called the PDF technique, involves evaluating

f_z(γ) = Σ_(i=1)^∞ f_x(α_i(γ)) / |g^(1)(α_i(γ))|,                 (6.34)

where

α_i = α_i(γ) = g^(−1)(γ),   i = 1, 2, …,                          (6.35)

denote the distinct solutions to g(α_i) = γ. Typically, the PDF technique is much simpler to use than the CDF technique. However, the PDF technique is applicable only when z = g(x) is continuous and does not equal a constant in any interval in which f_x is nonzero.
Next, we evaluated the probability distribution of a random variable z = g(x, y) created from jointly distributed random variables x and y using two approaches. The first approach, a CDF technique, involves evaluating

F_z(γ) = P(z ≤ γ) = P(g(x, y) ≤ γ) = P((x, y) ∈ A(γ)),           (6.36)

where

A(γ) = {(x, y) : g(x, y) ≤ γ}.                                    (6.37)

The CDF for the RV z can then be found by evaluating the integral

F_z(γ) = ∫∫_A(γ) dF_x,y(α, β).                                    (6.38)

The ease of solution here involves transforming A(γ) into proper limits of integration. We wish to remind the reader that the special case of a convolution integral is obtained when random variables x and y are independent and z = x + y. The second approach involves introducing an auxiliary random variable and using the PDF technique applied to two functions of two random variables.
To find the joint probability distribution of random variables z = g(x, y) and w = h(x, y) from jointly distributed random variables x and y, a bivariate CDF as well as a joint PDF technique were presented. Using the joint PDF technique, the joint PDF for z and w can be found as

f_z,w(γ, ψ) = Σ_(i=1)^∞ f_x,y(α_i, β_i) / |J(α_i, β_i)|,          (6.39)

where

(α_i(γ, ψ), β_i(γ, ψ)),   i = 1, 2, …,                            (6.40)

are the distinct solutions to the simultaneous equations

g(α_i, β_i) = γ                                                   (6.41)

and

h(α_i, β_i) = ψ.                                                  (6.42)

The Jacobian is

J(α, β) = | ∂g(α, β)/∂α   ∂g(α, β)/∂β |
          | ∂h(α, β)/∂α   ∂h(α, β)/∂β |,                          (6.43)

where the indicated partial derivatives are assumed to exist. Typically, the most difficult aspect of the joint PDF technique is in solving the simultaneous equations. The joint PDF technique can also be used to find the probability distribution for z = g(x, y) from f_x,y by introducing an auxiliary random variable w = h(x, y) to find f_z,w, and then integrating to obtain the marginal PDF f_z.

6.6 PROBLEMS
1. Let random variable x be uniform between 0 and 2 with z = exp(x). Find Fz using the
CDF technique.
2. Given

f_x(α) = 0.5(1 + α),  −1 < α < 1,
         0,           otherwise,

and

z = x − 1,  x < 0,
    x + 1,  x > 0.

Use the CDF technique to find F_z.
3. Suppose random variable x has the CDF

F_x(α) = 0,   α < 0,
         α²,  0 ≤ α < 1,
         1,   1 ≤ α,

and z = exp(x). Using the CDF technique, find F_z.
4. The PDF of random variable x is

f_x(α) = (1/2) e^(−(α+2)/2) u(α + 2).

Find F_z using the CDF technique when z = |x|.
5. Suppose random variable x has the PDF

f_x(α) = 2α,  0 < α < 1,
         0,   otherwise.

Random variable z is defined by

z = −1,  x ≤ −1,
    x,   −1 < x < 1,
    1,   x ≥ 1.

Using the CDF method, determine F_z.


6. Random variable x represents the input to a half-wave rectifier and z represents the output, so that z = xu(x). Given that x is uniformly distributed between −2 and 2, find: (a) E(z) using f_x, (b) F_z using the CDF method, (c) f_z, (d) E(z) using f_z (compare with the answer to part a).
7. Random variable x represents the input to a full-wave rectifier and z represents the output, so that z = |x|. Given that x is uniformly distributed between −2 and 2, find: (a) E(z) using f_x, (b) F_z using the CDF method, (c) f_z, (d) E(z) using f_z (compare with the answer to part a).
8. Random variable x has the PDF f_x(α) = e^(−α) u(α). Find F_z for z = x² using the CDF method.
9. Given

f_x(α) = a/(1 + α²)

and

z = −1,  x < −1,
    x,   −1 ≤ x ≤ 2,
    2,   x > 2.

Determine: (a) a, (b) F_z.

10. Random variables x and z are the input and output of a quantizer. The relationship between them is defined by:

z = 0,  x < 0.5,
    1,  0.5 ≤ x < 1.5,
    2,  1.5 ≤ x < 2.5,
    3,  2.5 ≤ x < 3.5,
    4,  3.5 ≤ x.

Given that the input follows a Gaussian distribution with x ∼ G(2.25, 0.49), find f_z using the CDF method.
11. Random variable x has the PDF f_x(α) = e^(−α) u(α). With z = 100 − 25x, find: (a) F_z using the CDF technique, (b) F_z using the PDF technique.
12. Random variable x has the PDF

f_x(α) = 4α³,  0 < α < 1,
         0,    otherwise.

Find the PDFs for the following random variables: (a) z = x³, (b) z = (x − 1/2)², (c) z = (x − 1/4)².
13. Random variable x has the CDF

F_x(α) = 0,                α < −1,
         (α² + 2α + 1)/9,  −1 ≤ α < 2,
         1,                2 ≤ α,

and

z = −x,  x < −0.5,
    0,   −0.5 ≤ x ≤ 0,
    x,   0 < x.

Determine F_z.
14. Suppose

f_x(α) = (1 + α²)/6,  −1 < α < 2,
         0,           otherwise.

Let z = 1/x². Use the CDF technique to determine F_z.

FIGURE 6.12: Plot for Problem 15.

15. Random variable x has the CDF

F_x(α) = 0,            α < −1,
         0.25(α + 1),  −1 ≤ α < 3,
         1,            3 ≤ α.

Find the CDF F_z, with RV z = g(x), and g(x) shown in Figure 6.12. Assume that g(x) is a second degree polynomial for x ≥ 0.
16. The voltage x in Figure 6.13 is a random variable which is uniformly distributed from −1 to 2. Find the PDF f_z. Assume the diode is ideal.
17. The voltage x in Figure 6.14 is a Gaussian random variable with mean η_x = 0 and standard deviation σ_x = 3. Find the PDF f_z. Assume the diodes are ideal.
18. The voltage x in Figure 6.15 is a Gaussian random variable with mean η_x = 1 and standard deviation σ_x = 1. Find the PDF f_z. Assume the diode and the operational amplifier are ideal.
19. Random variable x is Gaussian with mean η_x = 0 and standard deviation σ_x = 1. With z = g(x) shown in Figure 6.16, find the PDF f_z.

FIGURE 6.13: Circuit for Problem 16.

FIGURE 6.14: Circuit for Problem 17.

20. Random variable x has the PDF

f_x(α) = α + 0.5δ(α − 0.5),  0 < α < 1,
         0,                  otherwise.

Find F_z if z = x².
21. Random variable x is uniform on the interval 0 to 12. Random variable z = 4x + 2. Find f_z using the PDF technique.
22. Find f_z if z = 1/x² and x is uniform on 1 to 2.
23. Random variable x has the PDF

f_x(α) = 2(α + 1)/9,  −1 < α < 2,
         0,           otherwise.

Find the PDF of z = 2x² using the PDF technique.


24. Let z = cos(x). Find f_z if:
(a)

f_x(α) = 1/π,  |α| < π/2,
         0,    otherwise.

FIGURE 6.15: Circuit for Problem 18.

FIGURE 6.16: Transformation for Problem 19.

(b)

f_x(α) = 8α/π²,  0 < α < π/2,
         0,      otherwise.

25. Given that x has the CDF

F_x(α) = 0,  α < 0,
         α,  0 ≤ α < 1,
         1,  1 ≤ α.

Find the PDF of z = −2 ln(x) using the PDF technique.


26. Random variable x has the PDF

f_x(α) = 2α/9,  0 < α < 3,
         0,     otherwise.

Random variable z = (x − 1)² and event A = {x : x ≥ 1/2}. Find the PDF of random variable z, given event A.
27. Random variable x is uniform between −1 and 1. Random variable

z = x²,  x < 0,
    x,   x ≥ 0.

Using the PDF technique, find f_z.
28. A voltage v is a Gaussian random variable with η_v = 0 and σ_v = 2. Random variable w = v²/R represents the power dissipated in a resistor of resistance R with v volts across the resistor. Find (a) f_w, (b) f_w|A if A = {v ≥ 0}.

29. Random variable x has an exponential distribution with mean one. If z = e^(−x), use the PDF technique to determine: (a) f_z|A(γ|A) if A = {x : x > 2}, (b) f_z|B(γ|B) if B = {z : z < 1/2}.
30. Find f_z if z = 1/x and

f_x(α) = 1/(π(α² + 1)).

31. Given that random variable x has the PDF

f_x(α) = α,      0 < α < 1,
         2 − α,  1 < α < 2,
         0,      otherwise.

Using the PDF technique, find the PDF of z = x².
32. Suppose

f_x(α) = 2α,  0 < α < 1,
         0,   otherwise,

and z = 8x³. Determine f_z using the PDF technique.
33. Given

f_x(α) = 9α²,        0 < α < 0.5,
         3(1 − α²),  0.5 < α < 1,
         0,          otherwise,

and z = −2 ln(x). Determine: (a) f_z, (b) F_z, (c) E(z), (d) E(z²).
34. Using the PDF technique, find f_z if z = |x| and x is a standard Gaussian random variable.
35. Suppose RV x has PDF f_x(α) = 0.5(u(α) − u(α − 2)). Find a transformation g such that z = g(x) has PDF

f_z(γ) = cγ²(u(γ) − u(γ − 1)).

36. Let

f_x(α) = 0.75(1 − α²),  −1 < α < 1,
         0,             otherwise,

and z = x². Determine f_z using the PDF technique.

37. Find f_z if z = sin(x) and

f_x(α) = 1/(2π),  0 ≤ α < 2π,
         0,       otherwise.

38. Random variable x has the PDF

f_x(α) = 0.5,          −1 < α < 0,
         0.5 − 0.25α,  0 < α < 2,
         0,            otherwise.

(a) Find the transformation z = g(x) so that

f_z(γ) = 1 − 0.25γ,    0 < γ < 1,
         0.5 − 0.25γ,  1 < γ < 2,
         0,            otherwise.

(b) Determine f_z if z = (2x + 2)u(x).


39. Let random variable x have the PDF

f_x(α) = 1/12,  −2 < α < 1,
         1/4,   1 < α < 2;

in addition, P(x = −2) = P(x = 2) = 0.25. If z = 1/x, find f_z.
40. Random variable x has PDF

f_x(α) = (1/4)δ(α + 1) + (1/4)δ(α) + (1/4)(u(α) − u(α − 2)).

Random variable z = g(x), with g shown in Figure 6.17. Find the PDF f_z.
41. Random variable x has PDF

f_x(α) = (α²/3)(u(α + 1) − u(α − 2)).

FIGURE 6.17: Transformation for Problem 40.

FIGURE 6.18: Transformation for Problem 41.

Random variable z = g(x), with g shown in Figure 6.18. Use the PDF technique to find: (a) f_z, (b) f_z|A with A = {x > 0}.
42. The PDF for random variable x is f_x(α) = e^(−α) u(α). With z = e^(−x), determine: (a) f_z|A(γ|A) where A = {x : x > 2}, (b) f_z|B(γ|B) where B = {z : z < 1/2}.
43. The joint PDF of x and y is

f_x,y(α, β) = e^(−β),  0 < α < β,
              0,       otherwise.

With z = x + y, write an expression(s) for F_z(γ) (do not solve, just write the integral(s) necessary to find F_z).
44. If z = x/y, find F_z when x and y have the joint PDF

f_x,y(α, β) = 0.25,  0 < α < β < 1,
              0,     otherwise.

45. Resistors R1 and R2 have values r1 and r2 which are independent RVs uniformly distributed between 1 Ω and 2 Ω. With r denoting the equivalent resistance of a series connection of R1 and R2, find the PDF f_r using convolution.
46. Resistors R1 and R2 have values r1 and r2 which are independent RVs uniformly distributed between 1 Ω and 2 Ω. With g denoting the equivalent conductance of a parallel connection of R1 and R2, find the PDF f_g using convolution.
47. Suppose the voltage v across a resistor is a random variable with PDF

f_v(α) = 6α(1 − α),  0 < α < 1,
         0,          otherwise,

and that resistance r is a random variable with PDF

f_r(α) = 1/12,  94 < α < 106,
         0,     otherwise.

Moreover, suppose that v and r are independent random variables. Find the PDF for power, p = v²/r.
48. The joint PDF for random variables x and y is

f_x,y(α, β) = α + β,  0 < α < 1, 0 < β < 1,
              0,      otherwise.

Let z = 2x + y and w = x + 2y. Find f_z,w.
49. The joint PDF for random variables x and y is

f_x,y(α, β) = 4αβ,  0 < α < 1, 0 < β < 1,
              0,    otherwise.

Let z = x² and w = xy. Find: (a) f_z,w, (b) f_z.
50. The joint PDF for random variables x and y is

f_x,y(α, β) = 3(α² + β²),  0 < β < α < 1,
              0,           otherwise.

Find the joint PDF for random variables z = x + y and w = x − y.
51. The joint PDF for random variables x and y is

f_x,y(α, β) = 0.5e^(−α),  0 < α, 0 < β < 2,
              0,          otherwise.

With z = y/x, find f_z.
52. The joint PDF for random variables x and y is

f_x,y(α, β) = 8αβ,  0 < α² + β² < 1, 0 < α, 0 < β,
              0,    otherwise.

Let A = {(x, y) : x > y}, z = x and w = x² + y². Find: (a) f_z,w|A, (b) f_z|A.
53. The joint PDF for random variables x and y is

f_x,y(α, β) = 1/α,  0 < β < α < 1,
              0,    otherwise.

Let z = x/y and w = y. Find f_z,w.

TRANSFORMATIONS OF RANDOM VARIABLES

54. The joint PDF for random variables x and y is

        f_x,y(α, β) = α sin(β),  0 < α < 1, 0 < β < π,
                      0,         otherwise.

    Let z = x^2 and w = cos(y). Find f_z,w.
55. The joint PDF for random variables x and y is

        f_x,y(α, β) = 12αβ(1 - β),  0 < α < 1, 0 < β < 1,
                      0,            otherwise.

    Let z = x^2 y. Find f_z.
56. The joint PDF for random variables x and y is

        f_x,y(α, β) = e^(-α-β),  0 < α, 0 < β,
                      0,         otherwise.

    Let z = x + y and w = x/(x + y). Find: (a) f_z,w, (b) f_z.


57. The joint PDF for random variables x and y is

        f_x,y(α, β) = 0.25(|α| + |β|),  |α| < 1, |β| < 1,
                      0,                otherwise.

    Let z = xy and w = y/x. Find f_z,w using the joint CDF technique.
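Problems 48–57 all rest on the bivariate change-of-variable formula f_z,w(γ, ω) = f_x,y(x(γ, ω), y(γ, ω)) |J|, where |J| is the Jacobian magnitude of the inverse map. A minimal numerical sketch of that bookkeeping (not a substitute for the derivations the problems ask for), using Problem 56 as the example: with z = x + y and w = x/(x + y), the inverse map is x = zw, y = z(1 - w), and |J| = z.

```python
import math

def f_xy(a, b):
    """Joint PDF of Problem 56: e^(-a-b) on the positive quadrant."""
    return math.exp(-a - b) if a > 0 and b > 0 else 0.0

def f_zw(g, w):
    """Change of variables for z = x + y, w = x/(x + y).
    Inverse map: x = z*w, y = z*(1 - w); the Jacobian determinant of
    (x, y) with respect to (z, w) has magnitude z."""
    x, y = g * w, g * (1.0 - w)
    return f_xy(x, y) * g

# f_zw factors as (g * e^-g) * 1 on 0 < w < 1, so z is gamma-distributed
# and w is uniform on (0, 1):
print(round(f_zw(2.0, 0.3), 6))  # 0.270671  (= 2 * e^-2)
print(f_zw(2.0, 1.5))            # 0.0  (w outside (0, 1))
```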


APPENDIX A

Distribution Tables
TABLE A.1: Bernoulli CDF for n = 5 and n = 10 (entries give P(x <= k))

n = 5
  p       k=0       k=1       k=2       k=3       k=4
  0.05    0.773781  0.977408  0.998842  0.999970  1.000000
  0.1     0.590490  0.918540  0.991440  0.999540  0.999990
  0.15    0.443705  0.835210  0.973388  0.997773  0.999924
  0.2     0.327680  0.737280  0.942080  0.993280  0.999680
  0.25    0.237305  0.632813  0.896484  0.984375  0.999023
  0.3     0.168070  0.528220  0.836920  0.969220  0.997570
  0.35    0.116029  0.428415  0.764831  0.945978  0.994748
  0.4     0.077760  0.336960  0.682560  0.912960  0.989760
  0.45    0.050328  0.256218  0.593127  0.868780  0.981547
  0.5     0.031250  0.187500  0.500000  0.812500  0.968750

n = 10
  p = 0.05   k=0-4:  0.598737  0.913862  0.988496  0.998971  0.999936
             k=5-9:  0.999997  1.000000  1.000000  1.000000  1.000000
  p = 0.1    k=0-4:  0.348678  0.736099  0.929809  0.987205  0.998365
             k=5-9:  0.999853  0.999991  1.000000  1.000000  1.000000
  p = 0.15   k=0-4:  0.196874  0.544300  0.820197  0.950030  0.990126
             k=5-9:  0.998617  0.999865  0.999991  1.000000  1.000000
  p = 0.2    k=0-4:  0.107374  0.375810  0.677800  0.879126  0.967206
             k=5-9:  0.993631  0.999136  0.999922  0.999996  1.000000
  p = 0.25   k=0-4:  0.056314  0.244025  0.525593  0.775875  0.921873
             k=5-9:  0.980272  0.996494  0.999584  0.999970  0.999999
  p = 0.3    k=0-4:  0.028248  0.149308  0.382783  0.649611  0.849732
             k=5-9:  0.952651  0.989408  0.998410  0.999856  0.999994
  p = 0.35   k=0-4:  0.013463  0.085954  0.261607  0.513827  0.751495
             k=5-9:  0.905066  0.973976  0.995179  0.999460  0.999972
  p = 0.4    k=0-4:  0.006047  0.046357  0.167290  0.382281  0.633103
             k=5-9:  0.833761  0.945238  0.987705  0.998322  0.999895
  p = 0.45   k=0-4:  0.002533  0.023257  0.099560  0.266038  0.504405
             k=5-9:  0.738437  0.898005  0.972608  0.995498  0.999659
  p = 0.5    k=0-4:  0.000977  0.010742  0.054688  0.171875  0.376953
             k=5-9:  0.623047  0.828125  0.945313  0.989258  0.999023
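Each entry of Tables A.1–A.3 is the CDF of the number of successes in n Bernoulli trials, P(x <= k) = sum over j = 0..k of C(n, j) p^j (1 - p)^(n-j). A short sketch to regenerate any entry (function name is ours):

```python
from math import comb

def bernoulli_cdf(n, p, k):
    """CDF of the number of successes in n Bernoulli trials with success
    probability p: P(x <= k) = sum_{j=0}^{k} C(n, j) p^j (1 - p)^(n - j)."""
    return sum(comb(n, j) * p**j * (1.0 - p)**(n - j) for j in range(k + 1))

# reproduce entries of Table A.1:
print(round(bernoulli_cdf(5, 0.5, 2), 6))   # 0.5
print(round(bernoulli_cdf(10, 0.2, 3), 6))  # 0.879126
```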

TABLE A.2: Bernoulli CDF for n = 15 (entries give P(x <= k))

  p = 0.05   k=0-4:    0.463291  0.829047  0.963800  0.994533  0.999385
             k=5-9:    0.999947  0.999996  1.000000  1.000000  1.000000
             k=10-14:  1.000000  1.000000  1.000000  1.000000  1.000000
  p = 0.1    k=0-4:    0.205891  0.549043  0.815939  0.944444  0.987280
             k=5-9:    0.997750  0.999689  0.999966  0.999997  1.000000
             k=10-14:  1.000000  1.000000  1.000000  1.000000  1.000000
  p = 0.15   k=0-4:    0.087354  0.318586  0.604225  0.822655  0.938295
             k=5-9:    0.983190  0.996394  0.999390  0.999919  0.999992
             k=10-14:  0.999999  1.000000  1.000000  1.000000  1.000000
  p = 0.2    k=0-4:    0.035184  0.167126  0.398023  0.648162  0.835766
             k=5-9:    0.938949  0.981941  0.995760  0.999215  0.999887
             k=10-14:  0.999988  0.999999  1.000000  1.000000  1.000000
  p = 0.25   k=0-4:    0.013363  0.080181  0.236088  0.461287  0.686486
             k=5-9:    0.851632  0.943380  0.982700  0.995807  0.999205
             k=10-14:  0.999885  0.999988  0.999999  1.000000  1.000000
  p = 0.3    k=0-4:    0.004748  0.035268  0.126828  0.296868  0.515491
             k=5-9:    0.721621  0.868857  0.949987  0.984757  0.996347
             k=10-14:  0.999328  0.999908  0.999991  1.000000  1.000000
  p = 0.35   k=0-4:    0.001562  0.014179  0.061734  0.172696  0.351943
             k=5-9:    0.564282  0.754842  0.886769  0.957806  0.987557
             k=10-14:  0.997169  0.999521  0.999943  0.999996  1.000000
  p = 0.4    k=0-4:    0.000470  0.005172  0.027114  0.090502  0.217278
             k=5-9:    0.403216  0.609813  0.786897  0.904953  0.966167
             k=10-14:  0.990652  0.998072  0.999721  0.999975  0.999999
  p = 0.45   k=0-4:    0.000127  0.001692  0.010652  0.042421  0.120399
             k=5-9:    0.260760  0.452160  0.653504  0.818240  0.923071
             k=10-14:  0.974534  0.993673  0.998893  0.999879  0.999994
  p = 0.5    k=0-4:    0.000031  0.000488  0.003693  0.017578  0.059235
             k=5-9:    0.150879  0.303619  0.500000  0.696381  0.849121
             k=10-14:  0.940765  0.982422  0.996307  0.999512  0.999969

TABLE A.3: Bernoulli CDF for n = 20 (entries give P(x <= k))

  p = 0.05   k=0-4:    0.358486  0.735840  0.924516  0.984098  0.997426
             k=5-9:    0.999671  0.999966  0.999997  1.000000  1.000000
             k=10-14:  1.000000  1.000000  1.000000  1.000000  1.000000
             k=15-19:  1.000000  1.000000  1.000000  1.000000  1.000000
  p = 0.1    k=0-4:    0.121577  0.391747  0.676927  0.867047  0.956825
             k=5-9:    0.988747  0.997614  0.999584  0.999940  0.999993
             k=10-14:  0.999999  1.000000  1.000000  1.000000  1.000000
             k=15-19:  1.000000  1.000000  1.000000  1.000000  1.000000
  p = 0.15   k=0-4:    0.038760  0.175558  0.404896  0.647725  0.829847
             k=5-9:    0.932692  0.978065  0.994079  0.998671  0.999752
             k=10-14:  0.999961  0.999995  0.999999  1.000000  1.000000
             k=15-19:  1.000000  1.000000  1.000000  1.000000  1.000000
  p = 0.2    k=0-4:    0.011529  0.069175  0.206085  0.411449  0.629648
             k=5-9:    0.804208  0.913307  0.967857  0.990018  0.997405
             k=10-14:  0.999437  0.999898  0.999985  0.999998  1.000000
             k=15-19:  1.000000  1.000000  1.000000  1.000000  1.000000
  p = 0.25   k=0-4:    0.003171  0.024313  0.091260  0.225156  0.414842
             k=5-9:    0.617173  0.785782  0.898188  0.959075  0.986136
             k=10-14:  0.996058  0.999065  0.999816  0.999970  0.999996
             k=15-19:  1.000000  1.000000  1.000000  1.000000  1.000000
  p = 0.3    k=0-4:    0.000798  0.007637  0.035483  0.107087  0.237508
             k=5-9:    0.416371  0.608010  0.772272  0.886669  0.952038
             k=10-14:  0.982855  0.994862  0.998721  0.999739  0.999957
             k=15-19:  0.999994  0.999999  1.000000  1.000000  1.000000
  p = 0.35   k=0-4:    0.000181  0.002133  0.012118  0.044376  0.118197
             k=5-9:    0.245396  0.416625  0.601027  0.762378  0.878219
             k=10-14:  0.946833  0.980421  0.993985  0.998479  0.999689
             k=15-19:  0.999950  0.999994  0.999999  1.000000  1.000000
  p = 0.4    k=0-4:    0.000037  0.000524  0.003611  0.015961  0.050952
             k=5-9:    0.125599  0.250011  0.415893  0.595599  0.755337
             k=10-14:  0.872479  0.943474  0.978971  0.993534  0.998388
             k=15-19:  0.999683  0.999953  0.999995  1.000000  1.000000
  p = 0.45   k=0-4:    0.000006  0.000111  0.000927  0.004933  0.018863
             k=5-9:    0.055334  0.129934  0.252006  0.414306  0.591361
             k=10-14:  0.750711  0.869235  0.941966  0.978586  0.993566
             k=15-19:  0.998469  0.999723  0.999964  0.999997  1.000000
  p = 0.5    k=0-4:    0.000001  0.000020  0.000201  0.001288  0.005909
             k=5-9:    0.020695  0.057659  0.131588  0.251722  0.411901
             k=10-14:  0.588099  0.748278  0.868412  0.942341  0.979305
             k=15-19:  0.994091  0.998712  0.999799  0.999980  0.999999

TABLE A.4: Poisson CDF for λ = 0.1, 0.2, . . . , 1, 1.5, 2, . . . , 4.5 (entries give P(x <= k))

  λ = 0.1    k=0-4:    0.904837  0.995321  0.999845  0.999996  1.000000
  λ = 0.2    k=0-4:    0.818731  0.982477  0.998852  0.999943  0.999998
  λ = 0.3    k=0-4:    0.740818  0.963064  0.996400  0.999734  0.999984
  λ = 0.4    k=0-4:    0.670320  0.938448  0.992074  0.999224  0.999939
  λ = 0.5    k=0-4:    0.606531  0.909796  0.985612  0.998248  0.999828
             k=5-9:    0.999986  0.999999  1.000000  1.000000  1.000000
  λ = 0.6    k=0-4:    0.548812  0.878099  0.976885  0.996642  0.999605
             k=5-9:    0.999961  0.999997  1.000000  1.000000  1.000000
  λ = 0.7    k=0-4:    0.496585  0.844195  0.965858  0.994247  0.999214
             k=5-9:    0.999910  0.999991  0.999999  1.000000  1.000000
  λ = 0.8    k=0-4:    0.449329  0.808792  0.952577  0.990920  0.998589
             k=5-9:    0.999816  0.999979  0.999998  1.000000  1.000000
  λ = 0.9    k=0-4:    0.406570  0.772482  0.937143  0.986541  0.997656
             k=5-9:    0.999656  0.999957  0.999995  1.000000  1.000000
  λ = 1      k=0-4:    0.367879  0.735759  0.919699  0.981012  0.996340
             k=5-9:    0.999406  0.999917  0.999990  0.999999  1.000000
  λ = 1.5    k=0-4:    0.223130  0.557825  0.808847  0.934358  0.981424
             k=5-9:    0.995544  0.999074  0.999830  0.999972  0.999996
  λ = 2      k=0-4:    0.135335  0.406006  0.676676  0.857123  0.947347
             k=5-9:    0.983436  0.995466  0.998903  0.999763  0.999954
  λ = 2.5    k=0-4:    0.082085  0.287297  0.543813  0.757576  0.891178
             k=5-9:    0.957979  0.985813  0.995753  0.998860  0.999723
             k=10-14:  0.999938  0.999987  0.999998  1.000000  1.000000
  λ = 3      k=0-4:    0.049787  0.199148  0.423190  0.647232  0.815263
             k=5-9:    0.916082  0.966491  0.988095  0.996197  0.998897
             k=10-14:  0.999708  0.999929  0.999984  0.999997  0.999999
  λ = 3.5    k=0-4:    0.030197  0.135888  0.320847  0.536633  0.725445
             k=5-9:    0.857614  0.934712  0.973261  0.990126  0.996685
             k=10-14:  0.998981  0.999711  0.999924  0.999981  0.999996
  λ = 4      k=0-4:    0.018316  0.091578  0.238103  0.433470  0.628837
             k=5-9:    0.785130  0.889326  0.948866  0.978637  0.991868
             k=10-14:  0.997160  0.999085  0.999726  0.999924  0.999980
  λ = 4.5    k=0-4:    0.011109  0.061099  0.173578  0.342296  0.532104
             k=5-9:    0.702930  0.831051  0.913414  0.959743  0.982907
             k=10-14:  0.993331  0.997596  0.999195  0.999748  0.999926
             k=15-19:  0.999980  0.999995  0.999999  1.000000  1.000000

TABLE A.5: Poisson CDF for λ = 5, 5.5, . . . , 8.5 (entries give P(x <= k))

  λ = 5      k=0-4:    0.006738  0.040428  0.124652  0.265026  0.440493
             k=5-9:    0.615961  0.762183  0.866628  0.931906  0.968172
             k=10-14:  0.986305  0.994547  0.997981  0.999302  0.999774
             k=15-19:  0.999931  0.999980  0.999995  0.999999  1.000000
  λ = 5.5    k=0-4:    0.004087  0.026564  0.088376  0.201699  0.357518
             k=5-9:    0.528919  0.686036  0.809485  0.894357  0.946223
             k=10-14:  0.974749  0.989012  0.995549  0.998315  0.999401
             k=15-19:  0.999800  0.999937  0.999981  0.999995  0.999999
  λ = 6      k=0-4:    0.002479  0.017351  0.061969  0.151204  0.285057
             k=5-9:    0.445680  0.606303  0.743980  0.847238  0.916076
             k=10-14:  0.957379  0.979908  0.991173  0.996372  0.998600
             k=15-19:  0.999491  0.999825  0.999943  0.999982  0.999995
  λ = 6.5    k=0-4:    0.001503  0.011276  0.043036  0.111850  0.223672
             k=5-9:    0.369041  0.526524  0.672758  0.791573  0.877384
             k=10-14:  0.933161  0.966120  0.983973  0.992900  0.997044
             k=15-19:  0.998840  0.999570  0.999849  0.999949  0.999984
  λ = 7      k=0-4:    0.000912  0.007295  0.029636  0.081765  0.172992
             k=5-9:    0.300708  0.449711  0.598714  0.729091  0.830496
             k=10-14:  0.901479  0.946650  0.973000  0.987189  0.994283
             k=15-19:  0.997593  0.999042  0.999638  0.999870  0.999956
  λ = 7.5    k=0-4:    0.000553  0.004701  0.020257  0.059145  0.132062
             k=5-9:    0.241436  0.378155  0.524639  0.661967  0.776408
             k=10-14:  0.862238  0.920759  0.957334  0.978435  0.989740
             k=15-19:  0.995392  0.998041  0.999210  0.999697  0.999889
             k=20-24:  0.999961  0.999987  0.999996  0.999999  1.000000
  λ = 8      k=0-4:    0.000335  0.003019  0.013754  0.042380  0.099632
             k=5-9:    0.191236  0.313374  0.452961  0.592547  0.716624
             k=10-14:  0.815886  0.888076  0.936203  0.965819  0.982743
             k=15-19:  0.991769  0.996282  0.998406  0.999350  0.999747
             k=20-24:  0.999906  0.999967  0.999989  0.999996  0.999999
  λ = 8.5    k=0-4:    0.000203  0.001933  0.009283  0.030109  0.074364
             k=5-9:    0.149597  0.256178  0.385597  0.523105  0.652974
             k=10-14:  0.763362  0.848662  0.909083  0.948589  0.972575
             k=15-19:  0.986167  0.993387  0.996998  0.998703  0.999465
             k=20-24:  0.999789  0.999921  0.999971  0.999990  0.999997

TABLE A.6: Poisson CDF for λ = 9, 9.5, 10, 11, 12, 13 (entries give P(x <= k))

  λ = 9      k=0-4:    0.000123  0.001234  0.006232  0.021226  0.054964
             k=5-9:    0.115691  0.206781  0.323897  0.455653  0.587408
             k=10-14:  0.705988  0.803008  0.875773  0.926149  0.958534
             k=15-19:  0.977964  0.988894  0.994680  0.997574  0.998944
             k=20-24:  0.999561  0.999825  0.999933  0.999975  0.999991
  λ = 9.5    k=0-4:    0.000075  0.000786  0.004164  0.014860  0.040263
             k=5-9:    0.088528  0.164949  0.268663  0.391824  0.521826
             k=10-14:  0.645328  0.751990  0.836430  0.898136  0.940008
             k=15-19:  0.966527  0.982273  0.991072  0.995716  0.998038
             k=20-24:  0.999141  0.999639  0.999855  0.999944  0.999979
  λ = 10     k=0-4:    0.000045  0.000499  0.002769  0.010336  0.029253
             k=5-9:    0.067086  0.130141  0.220221  0.332820  0.457930
             k=10-14:  0.583040  0.696776  0.791556  0.864464  0.916542
             k=15-19:  0.951260  0.972958  0.985722  0.992813  0.996546
             k=20-24:  0.998412  0.999300  0.999704  0.999880  0.999953
  λ = 11     k=0-4:    0.000017  0.000200  0.001211  0.004916  0.015105
             k=5-9:    0.037520  0.078614  0.143192  0.231985  0.340511
             k=10-14:  0.459889  0.579267  0.688697  0.781291  0.854044
             k=15-19:  0.907396  0.944076  0.967809  0.982313  0.990711
             k=20-24:  0.995329  0.997748  0.998958  0.999536  0.999801
             k=25-29:  0.999918  0.999967  0.999987  0.999995  0.999998
  λ = 12     k=0-4:    0.000006  0.000080  0.000522  0.002292  0.007600
             k=5-9:    0.020341  0.045822  0.089505  0.155028  0.242392
             k=10-14:  0.347229  0.461597  0.575965  0.681536  0.772025
             k=15-19:  0.844416  0.898709  0.937034  0.962584  0.978720
             k=20-24:  0.988402  0.993935  0.996953  0.998527  0.999314
             k=25-29:  0.999692  0.999867  0.999944  0.999977  0.999991
  λ = 13     k=0-4:    0.000002  0.000032  0.000223  0.001050  0.003740
             k=5-9:    0.010734  0.025887  0.054028  0.099758  0.165812
             k=10-14:  0.251682  0.353165  0.463105  0.573045  0.675132
             k=15-19:  0.763607  0.835493  0.890465  0.930167  0.957331
             k=20-24:  0.974988  0.985919  0.992378  0.996028  0.998006
             k=25-29:  0.999034  0.999548  0.999796  0.999911  0.999962

TABLE A.7: Poisson CDF for λ = 14, 15, 16, 17, 18 (entries give P(x <= k))

  λ = 14     k=0-4:    0.000001  0.000012  0.000094  0.000474  0.001805
             k=5-9:    0.005532  0.014228  0.031620  0.062055  0.109399
             k=10-14:  0.175681  0.260040  0.358458  0.464448  0.570437
             k=15-19:  0.669360  0.755918  0.827201  0.882643  0.923495
             k=20-24:  0.952092  0.971156  0.983288  0.990672  0.994980
             k=25-29:  0.997392  0.998691  0.999365  0.999702  0.999864
             k=30-34:  0.999940  0.999974  0.999989  0.999996  0.999998
  λ = 15     k=0-4:    0.000000  0.000005  0.000039  0.000211  0.000857
             k=5-9:    0.002792  0.007632  0.018002  0.037446  0.069854
             k=10-14:  0.118464  0.184752  0.267611  0.363218  0.465654
             k=15-19:  0.568090  0.664123  0.748859  0.819472  0.875219
             k=20-24:  0.917029  0.946894  0.967256  0.980535  0.988835
             k=25-29:  0.993815  0.996688  0.998284  0.999139  0.999582
             k=30-34:  0.999803  0.999910  0.999960  0.999983  0.999993
  λ = 16     k=0-4:    0.000000  0.000002  0.000016  0.000093  0.000400
             k=5-9:    0.001384  0.004006  0.010000  0.021987  0.043298
             k=10-14:  0.077396  0.126993  0.193122  0.274511  0.367527
             k=15-19:  0.466745  0.565962  0.659344  0.742349  0.812249
             k=20-24:  0.868168  0.910773  0.941759  0.963314  0.977685
             k=25-29:  0.986881  0.992541  0.995895  0.997811  0.998869
             k=30-34:  0.999433  0.999724  0.999869  0.999940  0.999973
  λ = 17     k=0-4:    0.000000  0.000001  0.000007  0.000041  0.000185
             k=5-9:    0.000675  0.002062  0.005433  0.012596  0.026125
             k=10-14:  0.049124  0.084669  0.135024  0.200873  0.280833
             k=15-19:  0.371454  0.467738  0.564023  0.654958  0.736322
             k=20-24:  0.805481  0.861466  0.904728  0.936704  0.959354
             k=25-29:  0.974755  0.984826  0.991166  0.995016  0.997273
             k=30-34:  0.998552  0.999253  0.999626  0.999817  0.999913
  λ = 18     k=0-4:    0.000000  0.000000  0.000003  0.000018  0.000084
             k=5-9:    0.000324  0.001043  0.002893  0.007056  0.015381
             k=10-14:  0.030366  0.054887  0.091669  0.142598  0.208077
             k=15-19:  0.286653  0.375050  0.468648  0.562245  0.650916
             k=20-24:  0.730720  0.799124  0.855090  0.898890  0.931740
             k=25-29:  0.955392  0.971766  0.982682  0.989700  0.994056
             k=30-34:  0.996669  0.998187  0.999040  0.999506  0.999752
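The Poisson entries in Tables A.4–A.7 are P(x <= k) = e^(-λ) sum over j = 0..k of λ^j / j!. A sketch that builds the PMF terms iteratively, avoiding large factorials (function name is ours):

```python
from math import exp

def poisson_cdf(lam, k):
    """CDF of a Poisson random variable with parameter lam:
    P(x <= k) = e^-lam * sum_{j=0}^{k} lam^j / j!.
    Each PMF term is obtained from the previous one via term *= lam/(j+1)."""
    term, total = exp(-lam), 0.0
    for j in range(k + 1):
        total += term
        term *= lam / (j + 1)
    return total

# reproduce entries of Tables A.4 and A.6:
print(round(poisson_cdf(1.0, 2), 6))   # 0.919699
print(round(poisson_cdf(10.0, 9), 6))  # 0.45793 (table: 0.457930)
```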

TABLE A.8: Marcum's Q function for γ = 0, 0.01, . . . , 1.99

  γ       +0.00     +0.01     +0.02     +0.03     +0.04
  0.00    0.500000  0.496011  0.492022  0.488033  0.484046
  0.05    0.480061  0.476078  0.472097  0.468119  0.464144
  0.10    0.460172  0.456205  0.452242  0.448283  0.444330
  0.15    0.440382  0.436441  0.432505  0.428576  0.424655
  0.20    0.420740  0.416834  0.412936  0.409046  0.405165
  0.25    0.401294  0.397432  0.393580  0.389739  0.385908
  0.30    0.382089  0.378281  0.374484  0.370700  0.366928
  0.35    0.363169  0.359424  0.355691  0.351973  0.348268
  0.40    0.344578  0.340903  0.337243  0.333598  0.329969
  0.45    0.326355  0.322758  0.319178  0.315614  0.312067
  0.50    0.308538  0.305026  0.301532  0.298056  0.294598
  0.55    0.291160  0.287740  0.284339  0.280957  0.277595
  0.60    0.274253  0.270931  0.267629  0.264347  0.261086
  0.65    0.257846  0.254627  0.251429  0.248252  0.245097
  0.70    0.241964  0.238852  0.235762  0.232695  0.229650
  0.75    0.226627  0.223627  0.220650  0.217695  0.214764
  0.80    0.211855  0.208970  0.206108  0.203269  0.200454
  0.85    0.197662  0.194894  0.192150  0.189430  0.186733
  0.90    0.184060  0.181411  0.178786  0.176186  0.173609
  0.95    0.171056  0.168528  0.166023  0.163543  0.161087
  1.00    0.158655  0.156248  0.153864  0.151505  0.149170
  1.05    0.146859  0.144572  0.142310  0.140071  0.137857
  1.10    0.135666  0.133500  0.131357  0.129238  0.127143
  1.15    0.125072  0.123024  0.121001  0.119000  0.117023
  1.20    0.115070  0.113140  0.111233  0.109349  0.107488
  1.25    0.105650  0.103835  0.102042  0.100273  0.098525
  1.30    0.096801  0.095098  0.093418  0.091759  0.090123
  1.35    0.088508  0.086915  0.085344  0.083793  0.082264
  1.40    0.080757  0.079270  0.077804  0.076359  0.074934
  1.45    0.073529  0.072145  0.070781  0.069437  0.068112
  1.50    0.066807  0.065522  0.064256  0.063008  0.061780
  1.55    0.060571  0.059380  0.058208  0.057053  0.055917
  1.60    0.054799  0.053699  0.052616  0.051551  0.050503
  1.65    0.049471  0.048457  0.047460  0.046479  0.045514
  1.70    0.044565  0.043633  0.042716  0.041815  0.040929
  1.75    0.040059  0.039204  0.038364  0.037538  0.036727
  1.80    0.035930  0.035148  0.034379  0.033625  0.032884
  1.85    0.032157  0.031443  0.030742  0.030054  0.029379
  1.90    0.028716  0.028067  0.027429  0.026803  0.026190
  1.95    0.025588  0.024998  0.024419  0.023852  0.023295

TABLE A.9: Marcum's Q function for γ = 2, 2.01, . . . , 3.99

  γ       +0.00     +0.01     +0.02     +0.03     +0.04
  2.00    0.022750  0.022216  0.021692  0.021178  0.020675
  2.05    0.020182  0.019699  0.019226  0.018763  0.018309
  2.10    0.017864  0.017429  0.017003  0.016586  0.016177
  2.15    0.015778  0.015386  0.015003  0.014629  0.014262
  2.20    0.013903  0.013553  0.013209  0.012874  0.012545
  2.25    0.012224  0.011911  0.011604  0.011304  0.011011
  2.30    0.010724  0.010444  0.010170  0.009903  0.009642
  2.35    0.009387  0.009137  0.008894  0.008656  0.008424
  2.40    0.008198  0.007976  0.007760  0.007549  0.007344
  2.45    0.007143  0.006947  0.006756  0.006569  0.006387
  2.50    0.006210  0.006037  0.005868  0.005703  0.005543
  2.55    0.005386  0.005234  0.005085  0.004940  0.004799
  2.60    0.004661  0.004527  0.004397  0.004269  0.004145
  2.65    0.004025  0.003907  0.003793  0.003681  0.003573
  2.70    0.003467  0.003364  0.003264  0.003167  0.003072
  2.75    0.002980  0.002890  0.002803  0.002718  0.002635
  2.80    0.002555  0.002477  0.002401  0.002327  0.002256
  2.85    0.002186  0.002118  0.002052  0.001988  0.001926
  2.90    0.001866  0.001807  0.001750  0.001695  0.001641
  2.95    0.001589  0.001538  0.001489  0.001441  0.001395
  3.00    0.001350  0.001306  0.001264  0.001223  0.001183
  3.05    0.001144  0.001107  0.001070  0.001035  0.001001
  3.10    0.000968  0.000936  0.000904  0.000874  0.000845
  3.15    0.000816  0.000789  0.000762  0.000736  0.000711
  3.20    0.000687  0.000664  0.000641  0.000619  0.000598
  3.25    0.000577  0.000557  0.000538  0.000519  0.000501
  3.30    0.000483  0.000467  0.000450  0.000434  0.000419
  3.35    0.000404  0.000390  0.000376  0.000362  0.000350
  3.40    0.000337  0.000325  0.000313  0.000302  0.000291
  3.45    0.000280  0.000270  0.000260  0.000251  0.000242
  3.50    0.000233  0.000224  0.000216  0.000208  0.000200
  3.55    0.000193  0.000185  0.000179  0.000172  0.000165
  3.60    0.000159  0.000153  0.000147  0.000142  0.000136
  3.65    0.000131  0.000126  0.000121  0.000117  0.000112
  3.70    0.000108  0.000104  0.000100  0.000096  0.000092
  3.75    0.000088  0.000085  0.000082  0.000078  0.000075
  3.80    0.000072  0.000070  0.000067  0.000064  0.000062
  3.85    0.000059  0.000057  0.000054  0.000052  0.000050
  3.90    0.000048  0.000046  0.000044  0.000042  0.000041
  3.95    0.000039  0.000037  0.000036  0.000034  0.000033
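The entries of Tables A.8 and A.9 match the standard normal tail probability Q(γ) = 1 - Φ(γ) = 0.5 erfc(γ/√2); for example Q(0) = 0.500000 and Q(1.00) = 0.158655 as tabulated. A sketch using the standard-library erfc (function name is ours):

```python
from math import erfc, sqrt

def q_function(g):
    """Standard normal tail probability:
    Q(g) = P(N(0,1) > g) = 0.5 * erfc(g / sqrt(2))."""
    return 0.5 * erfc(g / sqrt(2.0))

# reproduce entries of Tables A.8 and A.9:
print(round(q_function(0.0), 6))   # 0.5
print(round(q_function(1.0), 6))   # 0.158655
print(round(q_function(3.0), 6))   # 0.00135 (table: 0.001350)
```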
