MOBK042FM MOBK042Enderle.cls October 30, 2006 19:55
Advanced Probability Theory
for Biomedical Engineers
Copyright © 2006 by Morgan & Claypool
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopy, recording, or any other), except for brief quotations in printed reviews, without the prior permission of the publisher.
Advanced Probability Theory for Biomedical Engineers
John D. Enderle, David C. Farden, and Daniel J. Krause
www.morganclaypool.com
ISBN-10: 1598291505 (paperback)
ISBN-13: 9781598291506 (paperback)
ISBN-10: 1598291513 (ebook)
ISBN-13: 9781598291513 (ebook)
DOI 10.2200/S00063ED1V01Y200610BME011
A lecture in the Morgan & Claypool Synthesis Series
SYNTHESIS LECTURES ON BIOMEDICAL ENGINEERING #11
Lecture #11
Series Editor: John D. Enderle, University of Connecticut
Series ISSN: 1930-0328 (print)
Series ISSN: 1930-0336 (electronic)
First Edition
10 9 8 7 6 5 4 3 2 1
Printed in the United States of America
Advanced Probability Theory
for Biomedical Engineers
John D. Enderle
Program Director & Professor for Biomedical Engineering,
University of Connecticut
David C. Farden
Professor of Electrical and Computer Engineering,
North Dakota State University
Daniel J. Krause
Emeritus Professor of Electrical and Computer Engineering,
North Dakota State University
SYNTHESIS LECTURES ON BIOMEDICAL ENGINEERING #11
Morgan & Claypool Publishers
ABSTRACT
This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson, and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the probability distribution of a function of one random variable using the CDF and then the PDF. Next, the probability distribution for a single random variable is determined from a function of two random variables using the CDF. Then, the joint probability distribution is found from a function of two random variables using the joint PDF and the CDF.
The aim of all three books is to provide an introduction to probability theory. The audience includes students, engineers, and researchers presenting applications of this theory to a wide variety of problems, as well as those pursuing these topics at a more advanced level. The theory material is presented in a logical manner, developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Pertinent biomedical engineering examples appear throughout the text. Drill problems, straightforward exercises designed to reinforce concepts and develop problem solution skills, follow most sections.
KEYWORDS
Probability theory, random processes, engineering statistics, probability and statistics for biomedical engineers, exponential distributions, Poisson distributions, Gaussian distributions, Bernoulli PMF, Gaussian CDF, Gaussian random variables
Contents
5. Standard Probability Distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
5.1 Uniform Distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
5.2 Exponential Distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
5.3 Bernoulli Trials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
5.3.1 Poisson Approximation to Bernoulli . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
5.3.2 Gaussian Approximation to Bernoulli . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
5.4 Poisson Distribution. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
5.4.1 Interarrival Times. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
5.5 Univariate Gaussian Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
5.5.1 Marcum’s Q Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
5.6 Bivariate Gaussian Random Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
5.6.1 Constant Contours. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
5.7 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
5.8 Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
6. Transformations of Random Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
6.1 Univariate CDF Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
6.1.1 CDF Technique with Monotonic Functions . . . . . . . . . . . . . . . . . . . . . . . . 45
6.1.2 CDF Technique with Arbitrary Functions . . . . . . . . . . . . . . . . . . . . . . . . . . 46
6.2 Univariate PDF Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
6.2.1 Continuous Random Variable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
6.2.2 Mixed Random Variable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
6.2.3 Conditional PDF Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
6.3 One Function of Two Random Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
6.4 Bivariate Transformations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
6.4.1 Bivariate CDF Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
6.4.2 Bivariate PDF Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
6.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
6.6 Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
Preface
This is the third in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students at the sophomore, junior, or senior level for the application of this theory to a wide variety of problems, as well as to pursue these topics at a more advanced level. Our approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability. These key concepts are all presented in the first chapter. The second chapter introduces the topic of random variables. The third chapter focuses on expectation, standard deviation, moments, and the characteristic function. In addition, conditional expectation, conditional moments, and the conditional characteristic function are also discussed. The fourth chapter introduces jointly distributed random variables, along with joint expectation, joint moments, and the joint characteristic function. Convolution is also developed. Later chapters simply expand upon these key ideas and extend the range of application.
This short book focuses on standard probability distributions commonly encountered in biomedical engineering. Here in Chapter 5, the exponential, Poisson, and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly distributed Gaussian random variables are presented. The primary subjects of Chapter 6 are methods for determining the probability distribution of a function of a random variable. We first evaluate the probability distribution of a function of one random variable using the CDF and then the PDF. Next, the probability distribution for a single random variable is determined from a function of two random variables using the CDF. Then, the joint probability distribution is found from a function of two random variables using the joint PDF and the CDF.
A considerable effort has been made to develop the theory in a logical manner, developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Every effort has been made to be consistent with commonly used notation and terminology, both within the engineering community and the probability and statistics literature.
The applications and examples given reflect the authors' background in teaching probability theory and random processes for many years. We have found it best to introduce this material using simple examples such as dice and cards, rather than more complex biological
and biomedical phenomena. However, we do introduce some pertinent biomedical engineering examples throughout the text.
Students in other fields should also find the approach useful. Drill problems, straightforward exercises designed to reinforce concepts and develop problem solution skills, follow most sections. The answers to the drill problems follow the problem statement in random order. At the end of each chapter is a wide selection of problems, ranging from simple to difficult, presented in the same general order as covered in the textbook.
We acknowledge and thank William Pruehsner for the technical illustrations. Many of the examples and end-of-chapter problems are based on examples from the textbook by Drake [9].
CHAPTER 5
Standard Probability Distributions
A surprisingly small number of probability distributions describe many natural probabilistic phenomena. This chapter presents some of these discrete and continuous probability distributions that occur often enough in a variety of problems to deserve special mention. We will see that many random variables and their corresponding experiments have similar properties and can be described by the same probability distribution. Each section introduces a new PMF or PDF. Following this, the mean, variance, and characteristic function are found. Additionally, special properties are pointed out along with relationships among other probability distributions. In some instances, the PMF or PDF is derived according to the characteristics of the experiment. Because of the vast number of probability distributions, we cannot possibly discuss them all here in this chapter.
5.1 UNIFORM DISTRIBUTIONS
Definition 5.1.1. The discrete RV x has a uniform distribution over n points (n > 1) on the interval [a, b] if x is a lattice RV with span h = (b − a)/(n − 1) and PMF

$$p_x(\alpha) = \begin{cases} 1/n, & \alpha = kh + a, \; k = 0, 1, \ldots, n-1 \\ 0, & \text{otherwise.} \end{cases} \qquad (5.1)$$
The mean and variance of a discrete uniform RV are easily computed with the aid of Lemma 2.3.1:

$$\eta_x = \frac{1}{n}\sum_{k=0}^{n-1}(kh + a) = \frac{h}{n}\,\frac{n(n-1)}{2} + a = \frac{1}{n}\,\frac{b-a}{n-1}\,\frac{n(n-1)}{2} + a = \frac{b+a}{2}, \qquad (5.2)$$

and

$$\sigma_x^2 = \frac{1}{n}\sum_{k=0}^{n-1}\left(kh - \frac{b-a}{2}\right)^2 = \frac{(b-a)^2}{n}\sum_{k=0}^{n-1}\left(\frac{k^2}{(n-1)^2} - \frac{k}{n-1} + \frac{1}{4}\right). \qquad (5.3)$$

Simplifying,

$$\sigma_x^2 = \frac{(b-a)^2}{12}\,\frac{n+1}{n-1}. \qquad (5.4)$$
FIGURE 5.1 (plot omitted): (a) PMF and (b) characteristic function magnitude for discrete RV with uniform distribution over 20 points on [0, 1].
The characteristic function can be found using the sum of a geometric series:

$$\phi_x(t) = \frac{e^{jat}}{n}\sum_{k=0}^{n-1}\left(e^{jht}\right)^k = \frac{e^{jat}}{n}\,\frac{1 - e^{jhnt}}{1 - e^{jht}}. \qquad (5.5)$$
Simplifying with the aid of Euler's identity,

$$\phi_x(t) = \exp\left(j\,\frac{a+b}{2}\,t\right)\frac{\sin\left(\frac{b-a}{2}\,\frac{n}{n-1}\,t\right)}{n\,\sin\left(\frac{b-a}{2}\,\frac{1}{n-1}\,t\right)}. \qquad (5.6)$$
Figure 5.1 illustrates the PMF and the magnitude of the characteristic function for a discrete RV which is uniformly distributed over 20 points on [0, 1]. The characteristic function is plotted over [0, π/h], where the span h = 1/19. Recall from Section 3.3 that φ_x(−t) = φ_x*(t) and that φ_x(t) is periodic in t with period 2π/h. Thus, Figure 5.1 illustrates one-half period of φ_x(·).
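As a quick numerical check (not part of the original text), the closed forms (5.2) and (5.4) can be verified with exact rational arithmetic from the Python standard library; the helper name below is ours.

```python
from fractions import Fraction

def discrete_uniform_stats(a, b, n):
    """Exact mean and variance of the uniform PMF over n lattice points on [a, b]."""
    h = Fraction(b - a, n - 1)                     # span h = (b - a)/(n - 1)
    points = [a + k * h for k in range(n)]         # alpha = kh + a, each with mass 1/n
    mean = sum(points) / n
    var = sum((pt - mean) ** 2 for pt in points) / n
    return mean, var

mean, var = discrete_uniform_stats(0, 1, 20)
assert mean == Fraction(1, 2)                      # (b + a)/2, per (5.2)
assert var == Fraction(1, 12) * Fraction(21, 19)   # (b - a)^2/12 * (n + 1)/(n - 1), per (5.4)
```

Because the arithmetic is exact, the asserts confirm (5.2) and (5.4) with no floating-point tolerance.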
Definition 5.1.2. The continuous RV x has a uniform distribution on the interval [a, b] if x has PDF

$$f_x(\alpha) = \begin{cases} 1/(b-a), & a \le \alpha \le b \\ 0, & \text{otherwise.} \end{cases} \qquad (5.7)$$
The mean and variance of a continuous uniform RV are easily computed directly:

$$\eta_x = \frac{1}{b-a}\int_a^b \alpha\,d\alpha = \frac{b^2 - a^2}{2(b-a)} = \frac{b+a}{2}, \qquad (5.8)$$
FIGURE 5.2 (plot omitted): (a) PDF and (b) characteristic function magnitude for continuous RV with uniform distribution on [0, 1].
and

$$\sigma_x^2 = \frac{1}{b-a}\int_a^b \left(\alpha - \frac{b+a}{2}\right)^2 d\alpha = \frac{(b-a)^2}{12}. \qquad (5.9)$$
The characteristic function can be found as

$$\phi_x(t) = \frac{1}{b-a}\int_a^b e^{j\alpha t}\,d\alpha = \frac{\exp\left(j\,\frac{b+a}{2}\,t\right)}{b-a}\int_{-(b-a)/2}^{(b-a)/2} e^{j\alpha t}\,d\alpha.$$
Simplifying with the aid of Euler's identity,

$$\phi_x(t) = \exp\left(j\,\frac{a+b}{2}\,t\right)\frac{\sin\left(\frac{b-a}{2}\,t\right)}{\frac{b-a}{2}\,t}. \qquad (5.10)$$
Figure 5.2 illustrates the PDF and the magnitude of the characteristic function for a continuous RV uniformly distributed on [0, 1]. Note that the characteristic function in this case is not periodic, but φ_x(−t) = φ_x*(t).
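The moments (5.8) and (5.9) can likewise be checked by simulation; the sketch below uses only the standard library, with a fixed seed and loose tolerances of our choosing.

```python
import random

random.seed(1)
a, b = -1.0, 5.0
samples = [random.uniform(a, b) for _ in range(200_000)]

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)

assert abs(mean - (b + a) / 2) < 0.02        # (5.8): (b + a)/2 = 2
assert abs(var - (b - a) ** 2 / 12) < 0.05   # (5.9): (b - a)^2/12 = 3
```

The same interval [−1, 5] appears in Drill Problem 5.1.2 below, so the simulated values also preview answers (c) and (d) there.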
Drill Problem 5.1.1. A pentahedral die (with faces labeled 0, 1, 2, 3, 4) is tossed once. Let x be a random variable equaling ten times the number tossed. Determine: (a) p_x(20), (b) P(10 ≤ x ≤ 50), (c) E(x), (d) σ_x².

Answers: 20, 0.8, 200, 0.2.
Drill Problem 5.1.2. Random variable x is uniformly distributed on the interval [−1, 5]. Determine: (a) F_x(0), (b) F_x(5), (c) η_x, (d) σ_x².

Answers: 1, 1/6, 3, 2.
5.2 EXPONENTIAL DISTRIBUTIONS
Definition 5.2.1. The discrete RV x has a geometric distribution or discrete exponential distribution with parameter p (0 < p < 1) if x has PMF

$$p_x(\alpha) = \begin{cases} p(1-p)^{\alpha-1}, & \alpha = 1, 2, \ldots \\ 0, & \text{otherwise.} \end{cases} \qquad (5.11)$$
The characteristic function can be found using the sum of a geometric series (q = 1 − p):

$$\phi_x(t) = \frac{p}{q}\sum_{k=1}^{\infty}\left(qe^{jt}\right)^k = \frac{pe^{jt}}{1 - qe^{jt}}. \qquad (5.12)$$
The mean and variance of a discrete RV with a geometric distribution can be computed using the moment generating property of the characteristic function. The results are

$$\eta_x = \frac{1}{p}, \quad \text{and} \quad \sigma_x^2 = \frac{q}{p^2}. \qquad (5.13)$$
Figure 5.3 illustrates the PMF and the characteristic function magnitude for a discrete RV with geometric distribution and parameter p = 0.18127.
It can be shown that a discrete exponentially distributed RV has a memoryless property:

$$p_{x \mid x > \ell}(\alpha \mid x > \ell) = p_x(\alpha - \ell), \quad \ell \ge 0. \qquad (5.14)$$
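The memoryless property (5.14) can be verified exactly for a small case; p = 0.3 and ℓ = 4 below are arbitrary choices of ours, not values from the text.

```python
# Geometric PMF with parameter p; q = 1 - p.
p, q = 0.3, 0.7
pmf = lambda alpha: p * q ** (alpha - 1) if alpha >= 1 else 0.0

ell = 4
# P(x > ell) = q**ell; summing a long truncated tail gives the same value.
tail = sum(pmf(alpha) for alpha in range(ell + 1, 400))
assert abs(tail - q**ell) < 1e-12

# The conditional PMF given x > ell equals the PMF shifted by ell, as in (5.14).
for alpha in range(ell + 1, 25):
    conditional = pmf(alpha) / tail
    assert abs(conditional - pmf(alpha - ell)) < 1e-9
```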
Definition 5.2.2. The continuous RV x has an exponential distribution with parameter λ (λ > 0) if x has PDF

$$f_x(\alpha) = \lambda e^{-\lambda\alpha} u(\alpha), \qquad (5.15)$$

where u(·) is the unit step function.
FIGURE 5.3 (plot omitted): (a) PMF and (b) characteristic function magnitude for discrete RV with geometric distribution [p = 0.18127].
FIGURE 5.4 (plot omitted): (a) PDF and (b) characteristic function magnitude for continuous RV with exponential distribution and parameter λ = 0.2.
The exponential probability distribution is also a very important probability density function in biomedical engineering applications, arising in situations involving reliability theory and queuing problems. Reliability theory, which describes the time to failure for a system or component, grew primarily out of military applications and experiences with multicomponent systems. Queuing theory describes the waiting times between events.
The characteristic function can be found as

$$\phi_x(t) = \lambda\int_0^{\infty} e^{\alpha(jt-\lambda)}\,d\alpha = \frac{\lambda}{\lambda - jt}. \qquad (5.16)$$
Figure 5.4 illustrates the PDF and the magnitude of the characteristic function for a continuous
RV with exponential distribution and parameter λ = 0.2.
The mean and variance of a continuous exponentially distributed RV can be obtained using the moment generating property of the characteristic function. The results are

$$\eta_x = \frac{1}{\lambda}, \quad \sigma_x^2 = \frac{1}{\lambda^2}. \qquad (5.17)$$
A continuous exponentially distributed RV, like its discrete counterpart, satisfies a memoryless property:

$$f_{x \mid x > \tau}(\alpha \mid x > \tau) = f_x(\alpha - \tau), \quad \tau \ge 0. \qquad (5.18)$$
Example 5.2.1. Suppose a system contains a component that has an exponential failure rate. Reliability engineers determined its reliability at 5000 hours to be 95%. Determine the number of hours reliable at 99%.
Solution. First, the parameter λ is determined from

$$0.95 = P(x > 5000) = \int_{5000}^{\infty} \lambda e^{-\lambda\alpha}\,d\alpha = e^{-5000\lambda}.$$

Thus

$$\lambda = \frac{-\ln(0.95)}{5000} = 1.03 \times 10^{-5}.$$

Then, to determine the number of hours reliable at 99%, we solve for α from

$$P(x > \alpha) = e^{-\lambda\alpha} = 0.99,$$

or

$$\alpha = \frac{-\ln(0.99)}{\lambda} = 980 \text{ hours.}$$
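The arithmetic of Example 5.2.1 is a two-line computation with the standard library:

```python
import math

lam = -math.log(0.95) / 5000        # from 0.95 = exp(-5000 * lambda)
hours_99 = -math.log(0.99) / lam    # solve exp(-lambda * alpha) = 0.99 for alpha

assert abs(lam - 1.03e-5) < 1e-7    # lambda is approximately 1.03e-5
assert abs(hours_99 - 980) < 1      # approximately 980 hours
```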
Drill Problem 5.2.1. Suppose a system has an exponential failure rate in years to failure with
λ = 0.02. Determine the number of years reliable at: (a) 90%, (b) 95%, (c) 99%.
Answers: 0.5, 2.6, 5.3.
Drill Problem 5.2.2. Random variable x, representing the length of time in hours to complete an examination in Introduction to Random Processes, has PDF

$$f_x(\alpha) = \frac{4}{3}\,e^{-\frac{4}{3}\alpha}\,u(\alpha).$$

The examination results are given by

$$g(x) = \begin{cases} 75, & 0 < x < 4/3 \\ 75 + 39.44(x - 4/3), & x \ge 4/3 \\ 0, & \text{otherwise.} \end{cases}$$

Determine the average examination grade.

Answer: 80.
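The stated answer can be confirmed by numerical integration of E[g(x)] = ∫ g(α) f_x(α) dα; the trapezoidal rule below is our sketch, not the text's method.

```python
import math

lam = 4.0 / 3.0
f = lambda a: lam * math.exp(-lam * a)                            # PDF of x
g = lambda a: 75.0 if a < 4.0 / 3.0 else 75.0 + 39.44 * (a - 4.0 / 3.0)

# Trapezoidal rule on [0, 30]; the integrand beyond 30 is negligible.
n, hi = 200_000, 30.0
h = hi / n
total = sum(g(i * h) * f(i * h) for i in range(n + 1))
total -= 0.5 * (g(0.0) * f(0.0) + g(hi) * f(hi))
average_grade = h * total

assert abs(average_grade - 80.0) < 0.1
```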
5.3 BERNOULLI TRIALS
A Bernoulli experiment consists of a number of repeated (independent) trials with only two
possible events for each trial. The events for each trial can be thought of as any two events which
partition the sample space, such as a head and a tail in a coin toss, a zero or one in a computer
bit, or an even and odd number in a die toss. Let us call one of the events a success, the other
a failure. The Bernoulli PMF describes the probability of k successes in n trials of a Bernoulli
experiment. The ﬁrst two chapters used this PMF repeatedly in problems dealing with games
of chance and in situations where there were only two possible outcomes in any given trial.
For biomedical engineers, the Bernoulli distribution is used in infectious disease problems and
other applications. The Bernoulli distribution is also known as a Binomial distribution.
Definition 5.3.1. A discrete RV x is Bernoulli distributed if the PMF for x is

$$p_x(k) = \begin{cases} \dbinom{n}{k} p^k q^{n-k}, & k = 0, 1, \ldots, n \\ 0, & \text{otherwise,} \end{cases} \qquad (5.19)$$

where p = probability of success and q = 1 − p.
The characteristic function can be found using the binomial theorem:

$$\phi_x(t) = \sum_{k=0}^{n} \binom{n}{k}\left(pe^{jt}\right)^k q^{n-k} = \left(q + pe^{jt}\right)^n. \qquad (5.20)$$
Figure 5.5 illustrates the PMF and the characteristic function magnitude for a discrete RV with
Bernoulli distribution, p = 0.2, and n = 30.
Using the moment generating property of characteristic functions, the mean and variance of a Bernoulli RV can be shown to be

$$\eta_x = np, \quad \sigma_x^2 = npq. \qquad (5.21)$$
FIGURE 5.5 (plot omitted): (a) PMF and (b) characteristic function magnitude for discrete RV with Bernoulli distribution, p = 0.2 and n = 30.
Unlike the preceding distributions, a closed form expression for the Bernoulli CDF is not easily obtained. Tables A.1–A.3 in the Appendix list values of the Bernoulli CDF for p = 0.05, 0.1, 0.15, . . . , 0.5 and n = 5, 10, 15, and 20. Let k ∈ {0, 1, . . . , n − 1} and define

$$G(n, k, p) = \sum_{\ell=0}^{k} \binom{n}{\ell} p^{\ell} (1-p)^{n-\ell}.$$

Making the change of variable m = n − ℓ yields

$$G(n, k, p) = \sum_{m=n-k}^{n} \binom{n}{n-m} p^{n-m} (1-p)^{m}.$$

Now, since

$$\binom{n}{n-m} = \frac{n!}{m!\,(n-m)!} = \binom{n}{m},$$

$$G(n, k, p) = \sum_{m=0}^{n} \binom{n}{m} p^{n-m} (1-p)^{m} - \sum_{m=0}^{n-k-1} \binom{n}{m} p^{n-m} (1-p)^{m}.$$

Using the Binomial Theorem,

$$G(n, k, p) = 1 - G(n, n-k-1, 1-p). \qquad (5.22)$$

This result is easily applied to obtain values of the Bernoulli CDF for values of p > 0.5 from Tables A.1–A.3.
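The reflection identity (5.22) is easy to verify by direct summation; this check uses only the standard library.

```python
from math import comb

def G(n, k, p):
    """Bernoulli (binomial) CDF: probability of at most k successes in n trials."""
    return sum(comb(n, l) * p**l * (1 - p) ** (n - l) for l in range(k + 1))

# Verify (5.22): G(n, k, p) = 1 - G(n, n - k - 1, 1 - p).
for n in (5, 10, 15, 20):
    for k in range(n):
        for p in (0.2, 0.5, 0.7):
            assert abs(G(n, k, p) - (1 - G(n, n - k - 1, 1 - p))) < 1e-12
```

The identity holds because k or fewer successes at probability p is the same event as n − k or more failures at probability 1 − p.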
Example 5.3.1. The probability that Fargo Polytechnic Institute wins a game is 0.7. In a 15 game season, what is the probability that they win: (a) at least 10 games, (b) from 9 to 12 games, (c) exactly 11 games? (d) With x denoting the number of games won, find η_x and σ_x².
Solution. With x a Bernoulli random variable, we consult Table A.2 and use (5.22) with n = 15, k = 9, and p = 0.7 to find

a) P(x ≥ 10) = 1 − F_x(9) = 1.0 − 0.2784 = 0.7216,
b) P(9 ≤ x ≤ 12) = F_x(12) − F_x(8) = 0.8732 − 0.1311 = 0.7421,
c) p_x(11) = F_x(11) − F_x(10) = 0.7031 − 0.4845 = 0.2186,
d) η_x = np = 10.5, σ_x² = np(1 − p) = 3.15.
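With `math.comb`, the table lookups of Example 5.3.1 can be reproduced directly (the printed values are rounded to four places):

```python
from math import comb

n, p = 15, 0.7
pmf = lambda k: comb(n, k) * p**k * (1 - p) ** (n - k)
cdf = lambda k: sum(pmf(i) for i in range(k + 1))

assert abs((1 - cdf(9)) - 0.7216) < 5e-4          # (a) P(x >= 10)
assert abs((cdf(12) - cdf(8)) - 0.7421) < 5e-4    # (b) P(9 <= x <= 12)
assert abs(pmf(11) - 0.2186) < 5e-4               # (c) p_x(11)
assert abs(n * p - 10.5) < 1e-9                   # (d) mean
assert abs(n * p * (1 - p) - 3.15) < 1e-9         # (d) variance
```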
We now consider the number of trials needed for k successes in a sequence of Bernoulli trials. Let

$$p(k, n) = P(k \text{ successes in } n \text{ trials}) = \begin{cases} \dbinom{n}{k} p^k q^{n-k}, & k = 0, 1, \ldots, n \\ 0, & \text{otherwise,} \end{cases} \qquad (5.23)$$

where p = p(1, 1) and q = 1 − p. Let RV n_r represent the number of trials to obtain exactly r successes (r ≥ 1). Note that

$$P(\text{success in } \ell\text{th trial} \mid r-1 \text{ successes in previous } \ell-1 \text{ trials}) = p; \qquad (5.24)$$

hence, for ℓ = r, r + 1, . . . , we have

$$P(n_r = \ell) = p(r-1, \ell-1)\,p. \qquad (5.25)$$
Discrete RV n_r thus has PMF

$$p_{n_r}(\ell) = \begin{cases} \dbinom{\ell-1}{r-1} p^r q^{\ell-r}, & \ell = r, r+1, \ldots \\ 0, & \text{otherwise,} \end{cases} \qquad (5.26)$$

where the parameter r is a positive integer. The PMF for the RV n_r is called the negative binomial distribution, also known as the Pólya and the Pascal distribution. Note that with r = 1 the negative binomial PMF is the geometric PMF.
The moment generating function for n_r can be expressed as

$$M_{n_r}(\lambda) = \sum_{\ell=r}^{\infty} \frac{(\ell-1)(\ell-2)\cdots(\ell-r+1)}{(r-1)!}\, p^r q^{\ell-r} e^{\lambda\ell}.$$

Letting m = ℓ − r, we obtain

$$M_{n_r}(\lambda) = \frac{e^{\lambda r} p^r}{(r-1)!} \sum_{m=0}^{\infty} (m+r-1)(m+r-2)\cdots(m+1)\left(qe^{\lambda}\right)^m.$$
With

$$s(x) = \sum_{k=0}^{\infty} x^k = \frac{1}{1-x}, \quad |x| < 1,$$

we have

$$s^{(\ell)}(x) = \sum_{k=\ell}^{\infty} k(k-1)\cdots(k-\ell+1)\,x^{k-\ell} = \sum_{m=0}^{\infty} (m+\ell)(m+\ell-1)\cdots(m+1)\,x^m = \frac{\ell!}{(1-x)^{\ell+1}}.$$
Hence

$$M_{n_r}(\lambda) = \left(\frac{pe^{\lambda}}{1 - qe^{\lambda}}\right)^r, \quad qe^{\lambda} < 1. \qquad (5.27)$$
The mean and variance for n_r are found to be

$$\eta_{n_r} = \frac{r}{p}, \quad \text{and} \quad \sigma_{n_r}^2 = \frac{rq}{p^2}. \qquad (5.28)$$
We note that the characteristic function is simply

$$\phi_{n_r}(t) = M_{n_r}(jt) = \phi_x^r(t), \qquad (5.29)$$

where RV x has a discrete geometric distribution. Figure 5.6 illustrates the PMF and the characteristic function magnitude for a discrete RV with negative binomial distribution, r = 3, and p = 0.18127.
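A short simulation of "trials until the r-th success" supports (5.28); the seed, sample count, and tolerances below are arbitrary choices of ours.

```python
import random

random.seed(2)
r, p = 3, 0.18127
q = 1 - p

def trials_until(r, p):
    """Count Bernoulli trials until the r-th success."""
    count = successes = 0
    while successes < r:
        count += 1
        if random.random() < p:
            successes += 1
    return count

runs = [trials_until(r, p) for _ in range(100_000)]
mean = sum(runs) / len(runs)
var = sum((x - mean) ** 2 for x in runs) / len(runs)

assert abs(mean - r / p) < 0.3          # eta = r/p, about 16.55, per (5.28)
assert abs(var - r * q / p**2) < 3.0    # sigma^2 = rq/p^2, about 74.7, per (5.28)
```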
FIGURE 5.6 (plot omitted): (a) PMF and (b) characteristic function magnitude for discrete RV with negative binomial distribution, r = 3, and p = 0.18127.
5.3.1 Poisson Approximation to Bernoulli
When n becomes large in the Bernoulli PMF in such a way that np = λ = constant, the
Bernoulli PMF approaches another important PMF known as the Poisson PMF. The Poisson
PMF is treated in the following section.
Lemma 5.3.1. We have

$$p(k) = \lim_{n\to\infty,\; np=\lambda} \binom{n}{k} p^k q^{n-k} = \begin{cases} \dfrac{\lambda^k e^{-\lambda}}{k!}, & k = 0, 1, \ldots \\ 0, & \text{otherwise.} \end{cases} \qquad (5.30)$$
Proof. Substituting p = λ/n and q = 1 − λ/n,

$$p(k) = \lim_{n\to\infty} \frac{1}{k!}\left(\frac{\lambda}{n}\right)^k \left(1 - \frac{\lambda}{n}\right)^{n-k} \prod_{i=0}^{k-1}(n-i).$$

Note that

$$\lim_{n\to\infty} n^{-k}\left(1 - \frac{\lambda}{n}\right)^{-k} \prod_{i=0}^{k-1}(n-i) = 1,$$

so that

$$p(k) = \lim_{n\to\infty} \frac{\lambda^k}{k!}\left(1 - \frac{\lambda}{n}\right)^{n}.$$

Now,

$$\lim_{n\to\infty} \ln\left(1 - \frac{\lambda}{n}\right)^{n} = \lim_{n\to\infty} \frac{\ln\left(1 - \frac{\lambda}{n}\right)}{\frac{1}{n}} = -\lambda,$$

so that

$$\lim_{n\to\infty} \left(1 - \frac{\lambda}{n}\right)^{n} = e^{-\lambda},$$

from which the desired result follows.
We note that the limiting value p(k) may be used as an approximation for the Bernoulli
PMF when p is small by substituting λ = np. While there are no prescribed rules regarding the
values of n and p for this approximation, the larger the value of n and the smaller the value of p,
the better the approximation. Satisfactory results are obtained with np < 10. The motivation
for using this approximation is that when n is large, Tables A.1–A.3 are useless for ﬁnding
values for the Bernoulli CDF.
Example 5.3.2. Suppose x is a Bernoulli random variable with n = 5000 and p = 0.001. Find P(x ≤ 5).
Solution. Our solution involves approximating the Bernoulli PMF with the Poisson PMF since n is quite large (and the Bernoulli CDF table is useless), and p is very close to zero. Since λ = np = 5, we find from Table A.5 (the Poisson CDF table is covered in Section 5.4) that P(x ≤ 5) = 0.6160.
Incidentally, if p is close to one, we can still use this approximation by reversing our definition of success and failure in the Bernoulli experiment, which results in a value of p close to zero; see (5.22).
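Since `math.comb` handles n = 5000 exactly, we can compare the exact Bernoulli CDF of Example 5.3.2 with the Poisson value read from the table:

```python
import math
from math import comb

n, p = 5000, 0.001
lam = n * p   # lambda = np = 5

exact = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(6))
poisson = sum(lam**k * math.exp(-lam) / math.factorial(k) for k in range(6))

assert abs(poisson - 0.6160) < 5e-4    # the tabulated Poisson CDF value
assert abs(exact - poisson) < 2e-3     # the approximation error is tiny here
```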
5.3.2 Gaussian Approximation to Bernoulli
Previously, the Poisson PMF was used to approximate a Bernoulli PMF under certain conditions, that is, when n is large, p is small, and np < 10. This approximation is quite useful since the Bernoulli table lists only CDF values for n up to 20. The Gaussian PDF (see Section 5.5) is also used to approximate a Bernoulli PMF under certain conditions. The accuracy of this approximation is best when n is large, p is close to 1/2, and npq > 3. Notice that in some circumstances np < 10 and npq > 3. Then either the Poisson or the Gaussian approximation will yield good results.
Lemma 5.3.2. Let

$$y = \frac{x - np}{\sqrt{npq}}, \qquad (5.31)$$

where x is a Bernoulli RV. Then the characteristic function for y satisfies

$$\phi(t) = \lim_{n\to\infty} \phi_y(t) = e^{-t^2/2}. \qquad (5.32)$$
Proof. We have

$$\phi_y(t) = \exp\left(-j\,\frac{np}{\sqrt{npq}}\,t\right)\phi_x\!\left(\frac{t}{\sqrt{npq}}\right).$$

Substituting for φ_x(t),

$$\phi_y(t) = \exp\left(-j\sqrt{\frac{np}{q}}\,t\right)\left(q + p\,\exp\left(j\,\frac{t}{\sqrt{npq}}\right)\right)^{n}.$$

Simplifying,

$$\phi_y(t) = \left(q\,\exp\left(-jt\sqrt{\frac{p}{qn}}\right) + p\,\exp\left(jt\sqrt{\frac{q}{np}}\right)\right)^{n}.$$

Letting

$$\beta = \sqrt{\frac{q}{p}}, \quad \text{and} \quad \alpha = \sqrt{\frac{1}{n}},$$

we obtain

$$\lim_{n\to\infty} \ln\phi_y(t) = \lim_{\alpha\to 0} \frac{\ln\left(p\beta^2 e^{-jt\alpha/\beta} + p\,e^{jt\beta\alpha}\right)}{\alpha^2}.$$

Applying L'Hôpital's Rule twice,

$$\lim_{\alpha\to 0} \ln\phi_y(t) = \lim_{\alpha\to 0} \frac{-jtp\beta\,e^{-jt\alpha/\beta} + jtp\beta\,e^{jt\beta\alpha}}{2\alpha} = \frac{-t^2 p - t^2\beta^2 p}{2} = -\frac{t^2}{2}.$$

Consequently,

$$\lim_{n\to\infty} \phi_y(t) = \exp\left(\lim_{n\to\infty} \ln\phi_y(t)\right) = e^{-t^2/2}.$$
The limiting φ(t) in the above lemma is the characteristic function for a Gaussian RV with zero mean and unit variance. Hence, for large n and a < b,

$$P(a < x < b) = P(a' < y < b') \approx F(b') - F(a'), \qquad (5.33)$$

where

$$F(\gamma) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\gamma} e^{-\alpha^2/2}\,d\alpha = 1 - Q(\gamma) \qquad (5.34)$$

is the standard Gaussian CDF,

$$a' = \frac{a - np}{\sqrt{npq}}, \quad b' = \frac{b - np}{\sqrt{npq}}, \qquad (5.35)$$

and Q(·) is Marcum's Q function, which is tabulated in Tables A.8 and A.9 of the Appendix. Evaluation of the above integral as well as the Gaussian PDF are treated in Section 5.5.
Example 5.3.3. Suppose x is a Bernoulli random variable with n = 5000 and p = 0.4. Find P(x ≤ 2048).
Solution. The solution involves approximating the Bernoulli CDF with the Gaussian CDF since npq = 1200 > 3. With np = 2000, npq = 1200, and b' = (2048 − 2000)/34.641 = 1.39, we find from Table A.8 that

$$P(x \le 2048) \approx F(1.39) = 1 - Q(1.39) = 0.91774.$$
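The table lookup can be replaced by `math.erf`, since the standard Gaussian CDF satisfies F(γ) = (1 + erf(γ/√2))/2:

```python
import math

def F(gamma):
    """Standard Gaussian CDF, F = 1 - Q, computed via the error function."""
    return 0.5 * (1.0 + math.erf(gamma / math.sqrt(2.0)))

n, p = 5000, 0.4
b_prime = (2048 - n * p) / math.sqrt(n * p * (1 - p))

assert abs(b_prime - 1.39) < 0.01        # 48 / 34.641, rounded in the text
assert abs(F(1.39) - 0.91774) < 1e-4     # matches the Table A.8 value
```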
When approximating the Bernoulli CDF with the Gaussian CDF, a continuous distribution is used to calculate probabilities for a discrete RV. It is important to note that while the approximation is excellent in terms of the CDFs, the PDF of any discrete RV is never approximated with a continuous PDF. Operationally, to compute the probability that a Bernoulli RV takes an integer value using the Gaussian approximation, we must round off to the nearest integer.
Example 5.3.4. Suppose x is a Bernoulli random variable with n = 20 and p = 0.5. Find P(x = 8).
Solution. Since npq = 5 > 3, the Gaussian approximation is used to evaluate the Bernoulli PMF, p_x(8). With np = 10, npq = 5, a' = (7.5 − 10)/√5 = −1.12, and b' = (8.5 − 10)/√5 = −0.67, we have

$$p_x(8) = P(7.5 < x < 8.5) \approx F(-0.67) - F(-1.12) = 0.25143 - 0.13136;$$

hence, p_x(8) ≈ 0.12007. From the Bernoulli table, p_x(8) = 0.1201, which is very close to the above approximation.
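This half-unit-interval computation, and the exact PMF value it approximates, can both be checked in a few lines (`Phi` here denotes the standard Gaussian CDF F):

```python
import math
from math import comb

Phi = lambda g: 0.5 * (1.0 + math.erf(g / math.sqrt(2.0)))  # standard Gaussian CDF

n, p = 20, 0.5
exact = comb(n, 8) * p**8 * (1 - p) ** (n - 8)              # p_x(8) exactly

mu, sd = n * p, math.sqrt(n * p * (1 - p))
approx = Phi((8.5 - mu) / sd) - Phi((7.5 - mu) / sd)

assert abs(exact - 0.1201) < 5e-5      # the tabulated value
assert abs(approx - exact) < 2e-3      # the Gaussian approximation is close
```

The small residual difference from the text's 0.12007 comes from rounding a' and b' to two decimals before the table lookup.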
Drill Problem 5.3.1. A survey of residents in Fargo, North Dakota revealed that 30% preferred a
white automobile over all other colors. Determine the probability that: (a) exactly ﬁve of the next 20
cars purchased will be white, (b) at least ﬁve of the next twenty cars purchased will be white, (c) from
two to ﬁve of the next twenty cars purchased will be white.
Answers: 0.1789, 0.4088, 0.7625.
Drill Problem 5.3.2. Prof. Rensselaer is an avid albeit inaccurate marksman. The probability she
will hit the target is only 0.3. Determine: (a) the expected number of hits scored in 15 shots, (b) the
standard deviation for 15 shots, (c) the number of times she must ﬁre so that the probability of hitting
the target at least once is greater than 1/2.
Answers: 2, 4.5, 1.7748.
5.4 POISSON DISTRIBUTION
A Poisson PMF describes the number of successes occurring on a continuous line, typically a time interval, or within a given region. For example, a Poisson random variable might represent the number of telephone calls per hour, or the number of errors per page in this textbook.
In the previous section, we found that the limit (as n → ∞ with constant mean np) of a Bernoulli PMF is a Poisson PMF. In this section, we derive the Poisson probability distribution from two fundamental assumptions about the phenomenon based on physical characteristics.
The following development makes use of the order notation o(h) to denote any function g(h) which satisfies

$$\lim_{h\to 0} \frac{g(h)}{h} = 0. \qquad (5.36)$$

For example, g(h) = 15h² + 7h³ = o(h).
We use the notation

$$p(k, \tau) = P(k \text{ successes in interval } [0, \tau]). \qquad (5.37)$$

The Poisson probability distribution is characterized by the following two properties:
(1) The number of successes occurring in a time interval or region is independent of the number of successes occurring in any other nonoverlapping time interval or region. Thus, with

$$A = \{k \text{ successes in interval } I_1\}, \qquad (5.38)$$

and

$$B = \{\ell \text{ successes in interval } I_2\}, \qquad (5.39)$$

we have

$$P(A \cap B) = P(A)P(B), \quad \text{if } I_1 \cap I_2 = \emptyset. \qquad (5.40)$$
As we will see, the number of successes depends only on the length of the time interval
and not the location of the interval on the time axis.
(2) The probability of a single success during a very small time interval is proportional to the length of the interval. The longer the interval, the greater the probability of success. The probability of more than one success occurring during an interval vanishes as the length of the interval approaches zero. Hence
$$p(1,h) = \lambda h + o(h), \qquad(5.41)$$
and
$$p(0,h) = 1 - \lambda h + o(h). \qquad(5.42)$$
This second property indicates that for a series of very small intervals, the Poisson process is composed of a series of Bernoulli trials, each with a probability of success p = λh + o(h).
Since [0, τ + h] = [0, τ] ∪ (τ, τ + h] and [0, τ] ∩ (τ, τ + h] = ∅, we have
$$p(0,\tau+h) = p(0,\tau)\,p(0,h) = p(0,\tau)(1-\lambda h + o(h)).$$
Noting that
$$\frac{p(0,\tau+h) - p(0,\tau)}{h} = \frac{-\lambda h\,p(0,\tau) + o(h)}{h}$$
and taking the limit as h → 0,
$$\frac{dp(0,\tau)}{d\tau} = -\lambda p(0,\tau), \qquad p(0,0) = 1. \qquad(5.43)$$
This differential equation has solution
$$p(0,\tau) = e^{-\lambda\tau}u(\tau). \qquad(5.44)$$
Applying the above properties, it is readily seen that
$$p(k,\tau+h) = p(k-1,\tau)\,p(1,h) + p(k,\tau)\,p(0,h) + o(h),$$
or
$$p(k,\tau+h) = p(k-1,\tau)\lambda h + p(k,\tau)(1-\lambda h) + o(h),$$
so that
$$\frac{p(k,\tau+h) - p(k,\tau)}{h} + \lambda p(k,\tau) = \lambda p(k-1,\tau) + \frac{o(h)}{h}.$$
Taking the limit as h → 0,
$$\frac{dp(k,\tau)}{d\tau} + \lambda p(k,\tau) = \lambda p(k-1,\tau), \qquad k = 1, 2, \ldots, \qquad(5.45)$$
with p(k, 0) = 0. It can be shown ([7, 8]) that
$$p(k,\tau) = \lambda e^{-\lambda\tau}\int_0^{\tau} e^{\lambda t}\,p(k-1,t)\,dt \qquad(5.46)$$
and hence that
$$p(k,\tau) = \frac{(\lambda\tau)^k e^{-\lambda\tau}}{k!}\,u(\tau), \qquad k = 0, 1, \ldots. \qquad(5.47)$$
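The recursion (5.46) can be checked numerically. The following sketch (illustrative, not from the text; the grid size and the values of λ and τ are arbitrary choices) iterates the integral recursion with a trapezoidal rule and compares the result against the closed form (5.47):

```python
import math

def poisson_recursion(k_max, lam, tau, n=20000):
    # Iterate (5.46): p(k, tau) = lam * e^{-lam*tau} * integral_0^tau e^{lam*t} p(k-1, t) dt,
    # starting from p(0, t) = e^{-lam*t}, using a trapezoidal rule on an n-step grid.
    h = tau / n
    t = [i * h for i in range(n + 1)]
    p_prev = [math.exp(-lam * ti) for ti in t]
    result = {0: p_prev[-1]}
    for k in range(1, k_max + 1):
        integrand = [math.exp(lam * ti) * pi for ti, pi in zip(t, p_prev)]
        cum, p_k = 0.0, [0.0]                      # p(k, 0) = 0 for k >= 1
        for i in range(1, n + 1):
            cum += 0.5 * h * (integrand[i - 1] + integrand[i])
            p_k.append(lam * math.exp(-lam * t[i]) * cum)
        result[k] = p_k[-1]
        p_prev = p_k
    return result

def poisson_pmf(k, lam, tau):
    # Closed form (5.47)
    return (lam * tau) ** k * math.exp(-lam * tau) / math.factorial(k)
```

For λ = 2 and τ = 1.5, the recursion and the closed form agree to several decimal places for small k.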
The RV x = number of successes thus has a Poisson distribution with parameter λτ and PMF p_x(k) = p(k, τ). The rate of the Poisson process is λ and the interval length is τ.
For ease in subsequent development, we replace the parameter λτ with λ. The characteristic function for a Poisson RV x with parameter λ is found as (with parameter λ, p_x(k) = p(k, 1))
$$\phi_x(t) = e^{-\lambda}\sum_{k=0}^{\infty}\frac{(\lambda e^{jt})^k}{k!} = e^{-\lambda}\exp(\lambda e^{jt}) = \exp(\lambda(e^{jt}-1)). \qquad(5.48)$$
FIGURE 5.7: (a) PMF and (b) magnitude of the characteristic function for a Poisson distributed RV with parameter λ = 10.
Figure 5.7 illustrates the PMF and characteristic function magnitude for a discrete RV with Poisson distribution and parameter λ = 10.
It is of interest to note that if x_1 and x_2 are independent Poisson RVs with parameters λ_1 and λ_2, respectively, then
$$\phi_{x_1+x_2}(t) = \exp((\lambda_1+\lambda_2)(e^{jt}-1)); \qquad(5.49)$$
i.e., x_1 + x_2 is also Poisson with parameter λ_1 + λ_2.
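This additivity property can be verified numerically by convolving two Poisson PMFs; a short sketch (the parameter values and truncation point are illustrative choices):

```python
import math

def poisson_pmf(k, lam):
    return lam ** k * math.exp(-lam) / math.factorial(k)

lam1, lam2, n = 3.0, 4.5, 60                 # n truncates the (infinite) PMFs
p1 = [poisson_pmf(k, lam1) for k in range(n)]
p2 = [poisson_pmf(k, lam2) for k in range(n)]

# PMF of x1 + x2 by discrete convolution of the two PMFs
p_sum = [sum(p1[i] * p2[k - i] for i in range(k + 1)) for k in range(n)]

# Largest deviation from a single Poisson PMF with parameter lam1 + lam2
max_err = max(abs(p_sum[k] - poisson_pmf(k, lam1 + lam2)) for k in range(30))
```

The convolution matches the Poisson PMF with parameter λ_1 + λ_2 to floating-point accuracy, as the binomial theorem guarantees.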
The moments of a Poisson RV are tedious to compute using techniques we have seen so far. Consider the function
$$\psi_x(\gamma) = E(\gamma^x) \qquad(5.50)$$
and note that
$$\psi_x^{(k)}(\gamma) = E\!\left(\gamma^{x-k}\prod_{i=0}^{k-1}(x-i)\right),$$
so that
$$E\!\left(\prod_{i=0}^{k-1}(x-i)\right) = \psi_x^{(k)}(1). \qquad(5.51)$$
If x is Poisson distributed with parameter λ, then
$$\psi_x(\gamma) = e^{\lambda(\gamma-1)}, \qquad(5.52)$$
so that
$$\psi_x^{(k)}(\gamma) = \lambda^k e^{\lambda(\gamma-1)};$$
hence,
$$E\!\left(\prod_{i=0}^{k-1}(x-i)\right) = \lambda^k. \qquad(5.53)$$
In particular, E(x) = λ and E(x(x−1)) = λ² = E(x²) − λ, so that σ_x² = λ² + λ − λ² = λ.
While it is quite easy to calculate the value of the Poisson PMF for a particular number of successes, hand computation of the CDF is quite tedious. Therefore, the Poisson CDF is tabulated in Tables A.4-A.7 of the Appendix for selected values of λ ranging from 0.1 to 18. From the Poisson CDF table, we note that the value of the Poisson PMF increases as the number of successes k increases from zero to the mean, and then decreases in value as k increases from the mean. Additionally, note that the table is written with a finite number of entries for each value of λ because the PMF values are written with six decimal place accuracy, even though an infinite number of Poisson successes are theoretically possible.
Example 5.4.1. On the average, Professor Rensselaer grades 10 problems per day. What is the
probability that on a given day (a) 8 problems are graded, (b) 8–10 problems are graded, and (c) at
least 15 problems are graded?
Solution. With x a Poisson random variable, we consult the Poisson CDF table with λ = 10, and find
a) p_x(8) = F_x(8) − F_x(7) = 0.3328 − 0.2202 = 0.1126,
b) P(8 ≤ x ≤ 10) = F_x(10) − F_x(7) = 0.5830 − 0.2202 = 0.3628,
c) P(x ≥ 15) = 1 − F_x(14) = 1 − 0.9165 = 0.0835.
5.4.1 Interarrival Times
In many instances, the length of time between successes, known as an interarrival time, of a
Poisson random variable is more important than the actual number of successes. For example,
in evaluating the reliability of a medical device, the time to failure is far more signiﬁcant to the
biomedical engineer than the fact that the device failed. Indeed, the subject of reliability theory
is so important that entire textbooks are devoted to the topic. Here, however, we will brieﬂy
examine the subject of interarrival times from the basis of the Poisson PMF.
Let RV t_r denote the length of the time interval from zero to the r-th success. Then
$$P(\tau - h < t_r \le \tau) = p(r-1,\tau-h)\,p(1,h) = p(r-1,\tau-h)\lambda h + o(h),$$
so that
$$\frac{F_{t_r}(\tau) - F_{t_r}(\tau-h)}{h} = \lambda p(r-1,\tau-h) + \frac{o(h)}{h}.$$
Taking the limit as h → 0 we find that the PDF for the r-th order interarrival time, that is, the time interval from any starting point to the r-th success after it, is
$$f_{t_r}(\tau) = \frac{\lambda^r \tau^{r-1} e^{-\lambda\tau}}{(r-1)!}\,u(\tau), \qquad r = 1, 2, \ldots. \qquad(5.54)$$
This PDF is known as the Erlang PDF. Clearly, with r = 1, we have the exponential PDF:
$$f_t(\tau) = \lambda e^{-\lambda\tau}u(\tau). \qquad(5.55)$$
The RV t is called the first-order interarrival time.
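A quick simulation sketch of this relationship (the values of r, λ, the seed, and the sample size are arbitrary choices): summing r independent first-order (exponential) interarrival times produces an r-th order interarrival time, whose sample mean and variance should be close to r/λ = 15 and r/λ² = 75 for the parameters chosen here.

```python
import random
import statistics

random.seed(1)
r, lam, n = 3, 0.2, 50000      # illustrative parameter choices
# Each sample: sum of r independent exponential interarrival times with rate lam
samples = [sum(random.expovariate(lam) for _ in range(r)) for _ in range(n)]

mean = statistics.fmean(samples)       # should be near r/lam = 15
var = statistics.variance(samples)     # should be near r/lam**2 = 75
```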
The Erlang PDF is a special case of the gamma PDF:
$$f_x(\alpha) = \frac{\lambda^r \alpha^{r-1} e^{-\lambda\alpha}}{\Gamma(r)}\,u(\alpha), \qquad(5.56)$$
for any real r > 0, λ > 0, where Γ is the gamma function
$$\Gamma(r) = \int_0^{\infty} \alpha^{r-1} e^{-\alpha}\,d\alpha. \qquad(5.57)$$
Straightforward integration reveals that Γ(1) = 1 and Γ(r+1) = rΓ(r), so that if r is a positive integer then Γ(r) = (r−1)!; for this reason the gamma function is often called the factorial function. Using the above definition for Γ(r), it is easily shown that the moment generating function for a gamma-distributed RV is
$$M_x(\eta) = \left(\frac{\lambda}{\lambda-\eta}\right)^{r}, \qquad \text{for } \eta < \lambda. \qquad(5.58)$$
The characteristic function is thus
$$\phi_x(t) = \left(\frac{\lambda}{\lambda - jt}\right)^{r}. \qquad(5.59)$$
It follows that the mean and variance are
$$\eta_x = \frac{r}{\lambda}, \qquad \text{and} \qquad \sigma_x^2 = \frac{r}{\lambda^2}. \qquad(5.60)$$
Figure 5.8 illustrates the PDF and magnitude of the characteristic function for a RV with gamma distribution with r = 3 and λ = 0.2.
FIGURE 5.8: (a) PDF and (b) magnitude characteristic function for gamma distributed RV with r = 3 and parameter λ = 0.2.
Drill Problem 5.4.1. On the average, Professor S. Rensselaer makes five blunders per lecture. Determine the probability that she makes (a) fewer than six blunders in the next lecture; (b) exactly five blunders in the next lecture; (c) from three to seven blunders in the next lecture; (d) zero blunders in the next lecture.
Answers: 0.6160, 0.0067, 0.7419, 0.1755.
Drill Problem 5.4.2. A process yields 0.001% defective items. If one million items are produced,
determine the probability that the number of defective items exceeds twelve.
Answer: 0.2084.
Drill Problem 5.4.3. Professor S. Rensselaer designs her examinations so that the probability of at
least one extremely difﬁcult problem is 0.632. Determine the average number of extremely difﬁcult
problems on a Rensselaer examination.
Answer: 1.
5.5 UNIVARIATE GAUSSIAN DISTRIBUTION
The Gaussian PDF is the most important probability distribution in the ﬁeld of biomedical
engineering. Plentiful applications arise in industry, research, and nature, ranging from instrumentation errors to scores on examinations. The PDF is named in honor of Gauss (1777–1855),
who derived the equation based on an error study involving repeated measurements of the same
quantity. However, De Moivre is ﬁrst credited with describing the PDF in 1733. Applications
also abound in other areas outside of biomedical engineering since the distribution ﬁts the
observed data in many processes. Incidentally, statisticians refer to the Gaussian PDF as the
normal PDF.
P1: IML/FFX P2: IML
MOBK04205 MOBK042Enderle.cls October 30, 2006 19:51
STANDARDPROBABILITY DISTRIBUTIONS 21
Definition 5.5.1. A continuous RV z is a standardized Gaussian RV if the PDF is
$$f_z(\alpha) = \frac{1}{\sqrt{2\pi}}\,e^{-\frac{1}{2}\alpha^2}. \qquad(5.61)$$
The moment generating function for a standardized Gaussian RV can be found as follows:
$$M_z(\lambda) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{\lambda\alpha - \frac{1}{2}\alpha^2}\,d\alpha = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-\frac{1}{2}\left((\alpha-\lambda)^2 - \lambda^2\right)}\,d\alpha.$$
Making the change of variable β = α − λ we find
$$M_z(\lambda) = e^{\frac{1}{2}\lambda^2}\int_{-\infty}^{\infty} f_z(\beta)\,d\beta = e^{\frac{1}{2}\lambda^2}, \qquad(5.62)$$
for all real λ. We have made use of the fact that the function f_z is a bona fide PDF, as treated in Problem 42. Using the Taylor series expansion for an exponential,
$$e^{\frac{1}{2}\lambda^2} = \sum_{k=0}^{\infty}\frac{\lambda^{2k}}{2^k k!} = \sum_{n=0}^{\infty}\frac{M_z^{(n)}(0)\,\lambda^n}{n!},$$
so that all moments of z exist and
$$E(z^{2k}) = \frac{(2k)!}{2^k k!}, \qquad k = 0, 1, 2, \ldots, \qquad(5.63)$$
and
$$E(z^{2k+1}) = 0, \qquad k = 0, 1, 2, \ldots. \qquad(5.64)$$
Consequently, a standardized Gaussian RV has zero mean and unit variance. Extending the range of definition of M_z(λ) to include the finite complex plane, we find that the characteristic function is
$$\phi_z(t) = M_z(jt) = e^{-\frac{1}{2}t^2}. \qquad(5.65)$$
Letting the RV x = σz + η we find that E(x) = η and σ_x² = σ². For σ > 0,
$$F_x(\alpha) = P(\sigma z + \eta \le \alpha) = F_z((\alpha-\eta)/\sigma),$$
so that x has the general Gaussian PDF
$$f_x(\alpha) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\!\left(-\frac{1}{2\sigma^2}(\alpha-\eta)^2\right). \qquad(5.66)$$
Similarly, with σ > 0 and x = −σz + η we find
$$F_x(\alpha) = P(-\sigma z + \eta \le \alpha) = 1 - F_z((\eta-\alpha)/\sigma),$$
so that f_x is as above. We will have occasion to use the shorthand notation x ∼ G(η, σ²) to denote that the RV has a Gaussian PDF with mean η and variance σ². Note that if x ∼ G(η, σ²) then (x = σz + η)
$$\phi_x(t) = e^{j\eta t}\,e^{-\frac{1}{2}\sigma^2 t^2}. \qquad(5.67)$$
The Gaussian PDF, illustrated with η = 75 and σ² = 25, as well as with η = 75 and σ² = 9 in Figure 5.9, is a bell-shaped curve completely determined by its mean and variance. As can be seen, the Gaussian PDF is symmetrical about the vertical axis through the expected value. If, in fact, η = 25, identically shaped curves could be drawn, centered now at 25 instead of 75. Additionally, the maximum value of the Gaussian PDF, (2πσ²)^{−1/2}, occurs at α = η. The PDF approaches zero asymptotically as α approaches ±∞. Naturally, the larger the value of the variance, the more spread in the distribution and the smaller the maximum value of the PDF. For any combination of the mean and variance, the Gaussian PDF curve must be symmetrical as previously described, and the area under the curve must equal one.
Unfortunately, a closed form expression does not exist for the Gaussian CDF, which necessitates numerical integration. Rather than attempting to tabulate the general Gaussian CDF, a normalization is performed to obtain a standardized Gaussian RV (with zero mean
FIGURE 5.9: Gaussian probability density function for η = 75 and σ² = 9, 25.
and unit variance). If x ∼ G(η, σ²), the RV z = (x − η)/σ is a standardized Gaussian RV: z ∼ G(0, 1). This transformation is always applied when using standard tables for computing probabilities for Gaussian RVs. The probability P(α_1 < x ≤ α_2) can be obtained as
$$P(\alpha_1 < x \le \alpha_2) = F_x(\alpha_2) - F_x(\alpha_1), \qquad(5.68)$$
using the fact that
$$F_x(\alpha) = F_z((\alpha-\eta)/\sigma). \qquad(5.69)$$
Note that
$$F_z(\alpha) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\alpha} e^{-\frac{1}{2}\tau^2}\,d\tau = 1 - Q(\alpha), \qquad(5.70)$$
where Q(·) is Marcum's Q function:
$$Q(\alpha) = \frac{1}{\sqrt{2\pi}}\int_{\alpha}^{\infty} e^{-\frac{1}{2}\tau^2}\,d\tau. \qquad(5.71)$$
Marcum's Q function is tabulated in Tables A.8 and A.9 for 0 ≤ α < 4 using the approximation presented in Section 5.5.1. It is easy to show that
$$Q(-\alpha) = 1 - Q(\alpha) = F_z(\alpha). \qquad(5.72)$$
The error and complementary error functions, defined by
$$\mathrm{erf}(\alpha) = \frac{2}{\sqrt{\pi}}\int_0^{\alpha} e^{-t^2}\,dt \qquad(5.73)$$
and
$$\mathrm{erfc}(\alpha) = \frac{2}{\sqrt{\pi}}\int_{\alpha}^{\infty} e^{-t^2}\,dt = 1 - \mathrm{erf}(\alpha), \qquad(5.74)$$
are also often used to evaluate the standard normal integral. A simple change of variable reveals that
$$\mathrm{erfc}(\alpha) = 2Q(\sqrt{2}\,\alpha). \qquad(5.75)$$
Example 5.5.1. Compute F_z(−1.74), where z ∼ G(0, 1).
Solution. To compute F_z(−1.74), we find
$$F_z(-1.74) = 1 - Q(-1.74) = Q(1.74) = 0.04093,$$
using (5.72) and Table A.8.
While the value a Gaussian random variable takes on is any real number between negative infinity and positive infinity, the realistic range of values is much smaller. From Table A.9, we note that 99.73% of the area under the curve is contained between −3.0 and 3.0. From the transformation z = (x − η)/σ, the range of values random variable x takes on is then approximately η ± 3σ. This notion does not imply that random variable x cannot take on a value outside this interval, but the probability of it occurring is really very small (2Q(3) = 0.0027).
Example 5.5.2. Suppose x is a Gaussian random variable with η = 35 and σ = 10. Sketch the
PDF and then ﬁnd P(37 ≤ x ≤ 51). Indicate this probability on the sketch.
Solution. The PDF is essentially zero outside the interval [η − 3σ, η + 3σ] = [5, 65]. The sketch of this PDF is shown in Figure 5.10 along with the indicated probability. With
$$z = \frac{x - 35}{10}$$
we have
$$P(37 \le x \le 51) = P(0.2 \le z \le 1.6) = F_z(1.6) - F_z(0.2).$$
Hence P(37 ≤ x ≤ 51) = Q(0.2) − Q(1.6) = 0.36594 from Table A.9.
FIGURE 5.10: PDF for Example 5.5.2.
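In place of Table A.9, the probability in Example 5.5.2 can be evaluated with the complementary error function, using Q(x) = erfc(x/√2)/2 from (5.75); a small sketch:

```python
import math

def Q(x):
    # Standard Gaussian tail probability via the complementary error function
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Example 5.5.2: x ~ G(35, 100), so z = (x - 35)/10
p = Q((37.0 - 35.0) / 10.0) - Q((51.0 - 35.0) / 10.0)
```

Rounded to five places, p reproduces the table-based result 0.36594.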
Example 5.5.3. A machine makes capacitors with a mean value of 25 μF and a standard deviation
of 6 μF. Assuming that capacitance follows a Gaussian distribution, ﬁnd the probability that the value
of capacitance exceeds 31 μF if capacitance is measured to the nearest μF.
Solution. Let the RV x denote the value of a capacitor. Since we are measuring to the nearest μF, the probability that the measured value exceeds 31 μF is
$$P(31.5 \le x) = P(1.083 \le z) = Q(1.083) = 0.13941,$$
where z = (x − 25)/6 ∼ G(0, 1). This result is determined by linear interpolation of the CDF between the tabulated values at 1.08 and 1.09.
5.5.1 Marcum's Q Function
Marcum's Q function, defined by
$$Q(\gamma) = \frac{1}{\sqrt{2\pi}}\int_{\gamma}^{\infty} e^{-\frac{1}{2}\alpha^2}\,d\alpha, \qquad(5.76)$$
has been extensively studied. If the RV z ∼ G(0, 1) then
$$Q(\gamma) = 1 - F_z(\gamma); \qquad(5.77)$$
i.e., Q(γ) is the complement of the standard Gaussian CDF. Note that Q(0) = 0.5, Q(∞) = 0, and that F_z(−γ) = Q(γ). A very accurate approximation to Q(γ) is presented in [1, p. 932]:
$$Q(\gamma) \approx e^{-\frac{1}{2}\gamma^2}h(t), \qquad \gamma > 0, \qquad(5.78)$$
where
$$t = \frac{1}{1 + 0.2316419\gamma}, \qquad(5.79)$$
and
$$h(t) = \frac{1}{\sqrt{2\pi}}\,t\left(a_1 + t\left(a_2 + t\left(a_3 + t\left(a_4 + a_5 t\right)\right)\right)\right). \qquad(5.80)$$
The constants are

  i     a_i
  1     0.31938153
  2     −0.356563782
  3     1.781477937
  4     −1.821255978
  5     1.330274429

The error in using this approximation is less than 7.5 × 10⁻⁸.
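The approximation (5.78)-(5.80) is straightforward to implement; a sketch, with the exact Q computed from the complementary error function for comparison:

```python
import math

A = [0.31938153, -0.356563782, 1.781477937, -1.821255978, 1.330274429]

def q_approx(gamma):
    # Approximation (5.78)-(5.80), valid for gamma > 0
    t = 1.0 / (1.0 + 0.2316419 * gamma)
    h = t * (A[0] + t * (A[1] + t * (A[2] + t * (A[3] + A[4] * t)))) / math.sqrt(2.0 * math.pi)
    return math.exp(-0.5 * gamma * gamma) * h

def q_exact(gamma):
    # Q(gamma) = erfc(gamma / sqrt(2)) / 2
    return 0.5 * math.erfc(gamma / math.sqrt(2.0))
```

For example, q_approx(1.74) rounds to 0.04093, the table value used in Example 5.5.1.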
P1: IML/FFX P2: IML
MOBK04205 MOBK042Enderle.cls October 30, 2006 19:51
26 ADVANCEDPROBABILITY THEORY FORBIOMEDICAL ENGINEERS
A very useful bound for Q(α) is [1, p. 298]:
$$\sqrt{\frac{2}{\pi}}\,\frac{e^{-\frac{1}{2}\alpha^2}}{\alpha + \sqrt{\alpha^2 + 4}} \;<\; Q(\alpha) \;\le\; \sqrt{\frac{2}{\pi}}\,\frac{e^{-\frac{1}{2}\alpha^2}}{\alpha + \sqrt{\alpha^2 + 0.5\pi}}. \qquad(5.81)$$
The ratio of the lower bound to the upper bound is 0.946 when α = 3 and 0.967 when α = 4. The bound improves as α increases.
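These bound ratios are easy to reproduce; a brief sketch (the comparison against the exact Q at α = 3 is an added sanity check):

```python
import math

def lower(a):
    # Lower bound in (5.81)
    return math.sqrt(2.0 / math.pi) * math.exp(-0.5 * a * a) / (a + math.sqrt(a * a + 4.0))

def upper(a):
    # Upper bound in (5.81)
    return math.sqrt(2.0 / math.pi) * math.exp(-0.5 * a * a) / (a + math.sqrt(a * a + 0.5 * math.pi))

q3 = 0.5 * math.erfc(3.0 / math.sqrt(2.0))   # exact Q(3) for comparison
ratio3 = lower(3.0) / upper(3.0)             # about 0.946
ratio4 = lower(4.0) / upper(4.0)             # about 0.967
```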
Sometimes, it is desired to find the value of γ for which Q(γ) = q. Helstrom [14] offers an iterative procedure which begins with an initial guess γ_0 > 0. Then compute
$$t_i = \frac{1}{1 + 0.2316419\gamma_i} \qquad(5.82)$$
and
$$\gamma_{i+1} = \left(2\ln\!\left(\frac{h(t_i)}{q}\right)\right)^{1/2}, \qquad i = 0, 1, \ldots. \qquad(5.83)$$
The procedure is terminated when γ_{i+1} ≈ γ_i to the desired degree of accuracy.
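A sketch of this iteration (the initial guess, tolerance, and iteration cap are arbitrary choices); it reuses h(t) from (5.80):

```python
import math

A = [0.31938153, -0.356563782, 1.781477937, -1.821255978, 1.330274429]

def h(t):
    # h(t) from (5.80)
    return t * (A[0] + t * (A[1] + t * (A[2] + t * (A[3] + A[4] * t)))) / math.sqrt(2.0 * math.pi)

def q_inverse(q, gamma0=1.0, tol=1e-9, max_iter=100):
    # Helstrom's iteration (5.82)-(5.83): find gamma such that Q(gamma) = q
    gamma = gamma0
    for _ in range(max_iter):
        t = 1.0 / (1.0 + 0.2316419 * gamma)
        gamma_next = math.sqrt(2.0 * math.log(h(t) / q))
        if abs(gamma_next - gamma) < tol:
            return gamma_next
        gamma = gamma_next
    return gamma
```

For q = 0.025 the iteration settles near 1.95996, the familiar 97.5th percentile of the standard Gaussian.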
Drill Problem 5.5.1. Students attend Fargo Polytechnic Institute for an average of four years with a standard deviation of one-half year. Let the random variable x denote the length of attendance and assume that x is Gaussian. Determine: (a) P(1 < x < 3), (b) P(x > 4), (c) P(x = 4), (d) F_x(4.721).
Answers: 0.5, 0, 0.02275, 0.92535.
Drill Problem 5.5.2. The quality point averages of 2500 freshmen at Fargo Polytechnic Institute
follow a Gaussian distribution with a mean of 2.5 and a standard deviation of 0.7. Suppose grade
point averages are computed to the nearest tenth. Determine the number of freshmen you would expect
to score: (a) from 2.6 to 3.0, (b) less than 2.5, (c) between 3.0 and 3.5, (d) greater than 3.5.
Answers: 167, 322, 639, 1179.
Drill Problem 5.5.3. Professor Rensselaer loves the game of golf. She has determined that the distance
the ball travels on her ﬁrst shot follows a Gaussian distribution with a mean of 150 and a standard
deviation of 17. Determine the value of d so that the range, 150 ±d, covers 95% of the shots.
Answer: 33.32.
5.6 BIVARIATE GAUSSIAN RANDOM VARIABLES
The previous section introduced the univariate Gaussian PDF along with some general characteristics. Now, we discuss the joint Gaussian PDF and its characteristics by drawing on our univariate Gaussian PDF experiences, and significantly expanding the scope of applications.
Numerous applications of this joint PDF are found throughout the field of biomedical engineering and, like the univariate case, the joint Gaussian PDF is considered the most important joint distribution for biomedical engineers.
Definition 5.6.1. The bivariate RV z = (x, y) is a bivariate Gaussian RV if every linear combination of x and y has a univariate Gaussian distribution. In this case we also say that the RVs x and y are jointly distributed Gaussian RVs.
Let the RV w = ax + by, and let x and y be jointly distributed Gaussian RVs. Then w is a univariate Gaussian RV for all real constants a and b. In particular, x ∼ G(η_x, σ_x²) and y ∼ G(η_y, σ_y²); i.e., the marginal PDFs for a joint Gaussian PDF are univariate Gaussian. The above definition of a bivariate Gaussian RV is sufficient for determining the bivariate PDF, which we now proceed to do.
The following development is significantly simplified by considering the standardized versions of x and y. Also, we assume that |ρ_{x,y}| < 1, σ_x ≠ 0, and σ_y ≠ 0. Let
$$z_1 = \frac{x - \eta_x}{\sigma_x} \qquad \text{and} \qquad z_2 = \frac{y - \eta_y}{\sigma_y}, \qquad(5.84)$$
so that z_1 ∼ G(0, 1) and z_2 ∼ G(0, 1). Below, we first find the joint characteristic function for the standardized RVs z_1 and z_2, then the conditional PDF f_{z_2|z_1} and the joint PDF f_{z_1,z_2}. Next, the results for z_1 and z_2 are applied to obtain the corresponding quantities φ_{x,y}, f_{y|x} and f_{x,y}. Finally, the special cases ρ_{x,y} = ±1, σ_x = 0, and σ_y = 0 are discussed.
Since z_1 and z_2 are jointly Gaussian, the RV t_1 z_1 + t_2 z_2 is univariate Gaussian:
$$t_1 z_1 + t_2 z_2 \sim G\!\left(0,\; t_1^2 + 2t_1 t_2\rho + t_2^2\right).$$
Completing the square,
$$t_1^2 + 2t_1 t_2\rho + t_2^2 = (t_1 + \rho t_2)^2 + (1-\rho^2)t_2^2,$$
so that
$$\phi_{z_1,z_2}(t_1,t_2) = E\!\left(e^{jt_1 z_1 + jt_2 z_2}\right) = e^{-\frac{1}{2}(1-\rho^2)t_2^2}\,e^{-\frac{1}{2}(t_1+\rho t_2)^2}. \qquad(5.85)$$
From (6) we have
$$f_{z_1,z_2}(\alpha,\beta) = \frac{1}{2\pi}\int_{-\infty}^{\infty} I(\alpha,t_2)\,e^{-j\beta t_2}\,dt_2, \qquad(5.86)$$
where
$$I(\alpha,t_2) = \frac{1}{2\pi}\int_{-\infty}^{\infty} \phi_{z_1,z_2}(t_1,t_2)\,e^{-j\alpha t_1}\,dt_1.$$
Substituting (5.85) and letting τ = t_1 + ρt_2, we obtain
$$I(\alpha,t_2) = e^{-\frac{1}{2}(1-\rho^2)t_2^2}\,\frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-\frac{1}{2}\tau^2}\,e^{-j\alpha(\tau - \rho t_2)}\,d\tau,$$
or
$$I(\alpha,t_2) = \phi(t_2)\,f_{z_1}(\alpha),$$
where
$$\phi(t_2) = e^{j\alpha\rho t_2}\,e^{-\frac{1}{2}(1-\rho^2)t_2^2}.$$
Substituting into (5.86) we find
$$f_{z_1,z_2}(\alpha,\beta) = f_{z_1}(\alpha)\,\frac{1}{2\pi}\int_{-\infty}^{\infty}\phi(t_2)\,e^{-j\beta t_2}\,dt_2,$$
and recognize that φ is the characteristic function for a Gaussian RV with mean αρ and variance 1 − ρ². Thus
$$\frac{f_{z_1,z_2}(\alpha,\beta)}{f_{z_1}(\alpha)} = f_{z_2|z_1}(\beta|\alpha) = \frac{1}{\sqrt{2\pi(1-\rho^2)}}\exp\!\left(-\frac{(\beta-\rho\alpha)^2}{2(1-\rho^2)}\right), \qquad(5.87)$$
so that
$$E(z_2 \,|\, z_1) = \rho z_1 \qquad(5.88)$$
and
$$\sigma^2_{z_2|z_1} = 1 - \rho^2. \qquad(5.89)$$
After some algebra, we find
$$f_{z_1,z_2}(\alpha,\beta) = \frac{1}{2\pi(1-\rho^2)^{1/2}}\exp\!\left(-\frac{\alpha^2 - 2\rho\alpha\beta + \beta^2}{2(1-\rho^2)}\right). \qquad(5.90)$$
We now turn our attention to using the above results for z_1 and z_2 to obtain similar results for x and y. From (5.84) we find that
$$x = \sigma_x z_1 + \eta_x \qquad \text{and} \qquad y = \sigma_y z_2 + \eta_y,$$
so that the joint characteristic function for x and y is
$$\phi_{x,y}(t_1,t_2) = E\!\left(e^{jt_1 x + jt_2 y}\right) = E\!\left(e^{jt_1\sigma_x z_1 + jt_2\sigma_y z_2}\right)e^{jt_1\eta_x + jt_2\eta_y}.$$
Consequently, the joint characteristic function for x and y can be found from the joint characteristic function of z_1 and z_2 as
$$\phi_{x,y}(t_1,t_2) = \phi_{z_1,z_2}(\sigma_x t_1, \sigma_y t_2)\,e^{j\eta_x t_1}\,e^{j\eta_y t_2}. \qquad(5.91)$$
Using (4.66), the joint characteristic function φ_{x,y} can be transformed to obtain the joint PDF f_{x,y}(α, β) as
$$f_{x,y}(\alpha,\beta) = \frac{1}{(2\pi)^2}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\phi_{z_1,z_2}(\sigma_x t_1,\sigma_y t_2)\,e^{-j(\alpha-\eta_x)t_1}\,e^{-j(\beta-\eta_y)t_2}\,dt_1\,dt_2. \qquad(5.92)$$
Making the change of variables τ_1 = σ_x t_1, τ_2 = σ_y t_2, we obtain
$$f_{x,y}(\alpha,\beta) = \frac{1}{\sigma_x\sigma_y}\,f_{z_1,z_2}\!\left(\frac{\alpha-\eta_x}{\sigma_x},\; \frac{\beta-\eta_y}{\sigma_y}\right). \qquad(5.93)$$
Since
$$f_{x,y}(\alpha,\beta) = f_{y|x}(\beta|\alpha)\,f_x(\alpha)$$
and
$$f_x(\alpha) = \frac{1}{\sigma_x}\,f_{z_1}\!\left(\frac{\alpha-\eta_x}{\sigma_x}\right),$$
we may apply (5.93) to obtain
$$f_{y|x}(\beta|\alpha) = \frac{1}{\sigma_y}\,f_{z_2|z_1}\!\left(\frac{\beta-\eta_y}{\sigma_y}\,\Bigg|\,\frac{\alpha-\eta_x}{\sigma_x}\right). \qquad(5.94)$$
Substituting (5.90) and (5.87) into (5.93) and (5.94) we find
$$f_{x,y}(\alpha,\beta) = \frac{\exp\!\left(-\dfrac{1}{2(1-\rho^2)}\left(\dfrac{(\alpha-\eta_x)^2}{\sigma_x^2} - \dfrac{2\rho(\alpha-\eta_x)(\beta-\eta_y)}{\sigma_x\sigma_y} + \dfrac{(\beta-\eta_y)^2}{\sigma_y^2}\right)\right)}{2\pi\sigma_x\sigma_y(1-\rho^2)^{1/2}} \qquad(5.95)$$
and
$$f_{y|x}(\beta|\alpha) = \frac{\exp\!\left(-\dfrac{\left(\beta-\eta_y-\rho\sigma_y\dfrac{\alpha-\eta_x}{\sigma_x}\right)^2}{2(1-\rho^2)\sigma_y^2}\right)}{\sqrt{2\pi\sigma_y^2(1-\rho^2)}}. \qquad(5.96)$$
It follows that
$$E(y|x) = \eta_y + \rho\sigma_y\frac{x-\eta_x}{\sigma_x} \qquad(5.97)$$
and
$$\sigma^2_{y|x} = \sigma_y^2(1-\rho^2). \qquad(5.98)$$
By interchanging x with y and α with β,
$$f_{x|y}(\alpha|\beta) = \frac{\exp\!\left(-\dfrac{\left(\alpha-\eta_x-\rho\sigma_x\dfrac{\beta-\eta_y}{\sigma_y}\right)^2}{2(1-\rho^2)\sigma_x^2}\right)}{\sqrt{2\pi\sigma_x^2(1-\rho^2)}}, \qquad(5.99)$$
$$E(x|y) = \eta_x + \rho\sigma_x\frac{y-\eta_y}{\sigma_y}, \qquad(5.100)$$
and
$$\sigma^2_{x|y} = \sigma_x^2(1-\rho^2). \qquad(5.101)$$
A three-dimensional plot of a bivariate Gaussian PDF is shown in Figure 5.11.
The bivariate characteristic function for x and y is easily obtained as follows. Since x and y are jointly Gaussian, the RV t_1 x + t_2 y is a univariate Gaussian RV:
$$t_1 x + t_2 y \sim G\!\left(t_1\eta_x + t_2\eta_y,\; t_1^2\sigma_x^2 + 2t_1 t_2\sigma_{x,y} + t_2^2\sigma_y^2\right).$$
FIGURE 5.11: Bivariate Gaussian PDF f_{x,y}(α, β) with σ_x = σ_y = 1, η_x = η_y = 0, and ρ = −0.8.
Consequently, the joint characteristic function for x and y is
$$\phi_{x,y}(t_1,t_2) = e^{-\frac{1}{2}\left(t_1^2\sigma_x^2 + 2t_1 t_2\sigma_{x,y} + t_2^2\sigma_y^2\right)}\,e^{jt_1\eta_x + jt_2\eta_y}, \qquad(5.102)$$
which is valid for all σ_{x,y}, σ_x and σ_y.
We now consider some special cases of the bivariate Gaussian PDF. If ρ = 0 then (from (5.95))
$$f_{x,y}(\alpha,\beta) = f_x(\alpha)\,f_y(\beta); \qquad(5.103)$$
i.e., RVs x and y are independent.
As ρ → ±1, from (5.97) and (5.98) we find
$$E(y|x) \to \eta_y \pm \sigma_y\frac{x-\eta_x}{\sigma_x}$$
and σ²_{y|x} → 0. Hence,
$$y \to \eta_y \pm \sigma_y\frac{x-\eta_x}{\sigma_x}$$
in probability¹. We conclude that
$$f_{x,y}(\alpha,\beta) = f_x(\alpha)\,\delta\!\left(\beta - \eta_y \mp \sigma_y\frac{\alpha-\eta_x}{\sigma_x}\right) \qquad(5.104)$$
for ρ = ±1. Interchanging the roles of x and y we find that the joint PDF for x and y may also be written as
$$f_{x,y}(\alpha,\beta) = f_y(\beta)\,\delta\!\left(\alpha - \eta_x \mp \sigma_x\frac{\beta-\eta_y}{\sigma_y}\right) \qquad(5.105)$$
when ρ = ±1. These results can also be obtained "directly" from the joint characteristic function for x and y.
A very special property of jointly Gaussian RVs is presented in the following theorem.
Theorem 5.6.1. The jointly Gaussian RVs x and y are independent iff ρ_{x,y} = 0.
Proof. We showed previously that if x and y are independent, then ρ_{x,y} = 0. Suppose that ρ = ρ_{x,y} = 0. Then f_{y|x}(β|α) = f_y(β), so that x and y are independent.
Example 5.6.1. Let x and y be jointly Gaussian with zero means, σ_x² = σ_y² = 1, and ρ ≠ ±1. Find constants a and b such that v = ax + by ∼ G(0, 1) and such that v and x are independent.
¹ As the variance of a RV decreases to zero, the probability that the RV deviates from its mean by more than an arbitrarily small fixed amount approaches zero. This is an application of the Chebyshev Inequality.
Solution. We have E(v) = 0. We require
$$\sigma_v^2 = a^2 + b^2 + 2ab\rho_{x,y} = 1$$
and
$$E(vx) = a + b\rho_{x,y} = 0.$$
Hence a = −bρ_{x,y} and b² = 1/(1 − ρ²_{x,y}), so that
$$v = \frac{y - \rho_{x,y}\,x}{\sqrt{1-\rho^2_{x,y}}}$$
is independent of x and σ_v² = 1.
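A simulation sketch of this construction (ρ = 0.6, the seed, and the sample size are illustrative choices): build correlated standardized Gaussians, form v = (y − ρx)/√(1 − ρ²), and check that v has unit variance and is uncorrelated with x.

```python
import math
import random
import statistics

random.seed(3)
rho, n = 0.6, 100000
s = math.sqrt(1.0 - rho * rho)

xs, vs = [], []
for _ in range(n):
    g1, g2 = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    x = g1
    y = rho * g1 + s * g2                  # (x, y) jointly Gaussian with correlation rho
    xs.append(x)
    vs.append((y - rho * x) / s)

var_v = statistics.variance(vs)                             # should be near 1
cov_vx = statistics.fmean(v * x for v, x in zip(vs, xs))    # near 0 (zero means)
```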
5.6.1 Constant Contours
Returning to the normalized jointly Gaussian RVs z_1 and z_2, we now investigate the shape of the joint PDF f_{z_1,z_2}(α, β) by finding the locus of points where the PDF is constant. We assume that |ρ| < 1. By inspection of (5.90), we find that f_{z_1,z_2}(α, β) is constant for α and β satisfying
$$\alpha^2 - 2\rho\alpha\beta + \beta^2 = c^2, \qquad(5.106)$$
where c is a positive constant.
If ρ = 0 the contour where the joint PDF is constant is a circle of radius c centered at the origin.
Along the line β = qα we find that
$$\alpha^2(1 - 2\rho q + q^2) = c^2, \qquad(5.107)$$
so that the constant contours are parameterized by
$$\alpha = \frac{\pm c}{\sqrt{1 - 2\rho q + q^2}}, \qquad(5.108)$$
and
$$\beta = \frac{\pm cq}{\sqrt{1 - 2\rho q + q^2}}. \qquad(5.109)$$
The square of the distance from a point (α, β) on the contour to the origin is given by
$$d^2(q) = \alpha^2 + \beta^2 = \frac{c^2(1+q^2)}{1 - 2\rho q + q^2}. \qquad(5.110)$$
Differentiating, we find that d²(q) attains its extremal values at q = ±1. Thus, the line β = α intersects the constant contour at
$$\pm\beta = \alpha = \frac{\pm c}{\sqrt{2(1-\rho)}}. \qquad(5.111)$$
Similarly, the line β = −α intersects the constant contour at
$$\pm\beta = -\alpha = \frac{\pm c}{\sqrt{2(1+\rho)}}. \qquad(5.112)$$
Consider the rotated coordinates α′ = (α + β)/√2 and β′ = (β − α)/√2, so that
$$\frac{\alpha' + \beta'}{\sqrt{2}} = \beta \qquad(5.113)$$
and
$$\frac{\alpha' - \beta'}{\sqrt{2}} = \alpha. \qquad(5.114)$$
The rotated coordinate system is a rotation by π/4 counterclockwise. Thus
$$\alpha^2 - 2\rho\alpha\beta + \beta^2 = c^2 \qquad(5.115)$$
is transformed into
$$\frac{\alpha'^2}{1+\rho} + \frac{\beta'^2}{1-\rho} = \frac{c^2}{1-\rho^2}. \qquad(5.116)$$
The above equation represents an ellipse with major axis length 2c/√(1−ρ) and minor axis length 2c/√(1+ρ). In the α-β plane, the major and minor axes of the ellipse are along the lines β = ±α.
From (5.93), the constant contours for f_{x,y}(α, β) are solutions to
$$\left(\frac{\alpha-\eta_x}{\sigma_x}\right)^2 - 2\rho\left(\frac{\alpha-\eta_x}{\sigma_x}\right)\left(\frac{\beta-\eta_y}{\sigma_y}\right) + \left(\frac{\beta-\eta_y}{\sigma_y}\right)^2 = c^2. \qquad(5.117)$$
Using the transformation
$$\alpha' = \frac{1}{\sqrt{2}}\left(\frac{\alpha-\eta_x}{\sigma_x} + \frac{\beta-\eta_y}{\sigma_y}\right), \qquad \beta' = \frac{1}{\sqrt{2}}\left(\frac{\beta-\eta_y}{\sigma_y} - \frac{\alpha-\eta_x}{\sigma_x}\right) \qquad(5.118)$$
transforms the constant contour to (5.116). With α′ = 0 we find that one axis is along
$$\frac{\alpha-\eta_x}{\sigma_x} = -\frac{\beta-\eta_y}{\sigma_y}$$
with endpoints at
$$\alpha = \frac{\pm c\,\sigma_x}{\sqrt{2}\sqrt{1+\rho}} + \eta_x, \qquad \beta = \frac{\mp c\,\sigma_y}{\sqrt{2}\sqrt{1+\rho}} + \eta_y;$$
the length of this axis in the α-β plane is
$$\sqrt{2}\,c\sqrt{\frac{\sigma_x^2 + \sigma_y^2}{1+\rho}}.$$
With β′ = 0 we find that the other axis is along
$$\frac{\alpha-\eta_x}{\sigma_x} = \frac{\beta-\eta_y}{\sigma_y}$$
with endpoints at
$$\alpha = \frac{\pm c\,\sigma_x}{\sqrt{2}\sqrt{1-\rho}} + \eta_x, \qquad \beta = \frac{\pm c\,\sigma_y}{\sqrt{2}\sqrt{1-\rho}} + \eta_y;$$
the length of this axis in the α-β plane is
$$\sqrt{2}\,c\sqrt{\frac{\sigma_x^2 + \sigma_y^2}{1-\rho}}.$$
Points on this ellipse in the α-β plane satisfy (5.117); the value of the joint PDF f_{x,y} on this curve is
$$\frac{1}{2\pi\sigma_x\sigma_y\sqrt{1-\rho^2}}\exp\!\left(-\frac{c^2}{2(1-\rho^2)}\right). \qquad(5.119)$$
A further transformation
$$\alpha'' = \frac{\alpha'}{\sqrt{1+\rho}}, \qquad \beta'' = \frac{\beta'}{\sqrt{1-\rho}}$$
transforms the ellipse in the α′-β′ plane to a circle in the α″-β″ plane:
$$\alpha''^2 + \beta''^2 = \frac{c^2}{1-\rho^2}.$$
This transformation provides a straightforward way to compute the probability that the joint Gaussian RVs x and y lie within the region bounded by the ellipse specified by (5.117). Letting A denote the region bounded by the ellipse in the α-β plane and A″ denote the image (a circle) in the α″-β″ plane, we have
$$\iint_A f_{x,y}(\alpha,\beta)\,d\alpha\,d\beta = \iint_{A''} \frac{1}{2\pi\sigma_x\sigma_y\sqrt{1-\rho^2}}\,\frac{e^{-\frac{1}{2}(\alpha''^2+\beta''^2)}}{|J(\alpha,\beta)|}\,d\alpha''\,d\beta'',$$
where the Jacobian of the transformation is
$$J(\alpha,\beta) = \begin{vmatrix} \dfrac{\partial\alpha''}{\partial\alpha} & \dfrac{\partial\alpha''}{\partial\beta} \\[2mm] \dfrac{\partial\beta''}{\partial\alpha} & \dfrac{\partial\beta''}{\partial\beta} \end{vmatrix}.$$
Computing the indicated derivatives, we have
$$J(\alpha,\beta) = \begin{vmatrix} \dfrac{1}{\sigma_x\sqrt{2}\sqrt{1+\rho}} & \dfrac{1}{\sigma_y\sqrt{2}\sqrt{1+\rho}} \\[2mm] \dfrac{-1}{\sigma_x\sqrt{2}\sqrt{1-\rho}} & \dfrac{1}{\sigma_y\sqrt{2}\sqrt{1-\rho}} \end{vmatrix},$$
so that
$$J(\alpha,\beta) = \frac{1}{\sigma_x\sigma_y\sqrt{1-\rho^2}}.$$
Substituting and transforming to polar coordinates, we find
$$\iint_A f_{x,y}(\alpha,\beta)\,d\alpha\,d\beta = \iint_{A''}\frac{1}{2\pi}e^{-\frac{1}{2}(\alpha''^2+\beta''^2)}\,d\alpha''\,d\beta'' = \frac{1}{2\pi}\int_0^{2\pi}\!\!\int_0^{c/\sqrt{1-\rho^2}} r\,e^{-\frac{1}{2}r^2}\,dr\,d\theta = \int_0^{c^2/(2-2\rho^2)} e^{-u}\,du = 1 - e^{-c^2/(2-2\rho^2)}.$$
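A Monte Carlo sketch of this result for the standardized RVs (ρ, c, the seed, and the sample size are illustrative choices): the fraction of correlated Gaussian pairs falling inside the ellipse α² − 2ραβ + β² = c² should approach 1 − exp(−c²/(2 − 2ρ²)).

```python
import math
import random

random.seed(2)
rho, c, n = 0.5, 1.0, 200000
s = math.sqrt(1.0 - rho * rho)

hits = 0
for _ in range(n):
    g1, g2 = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    z1, z2 = g1, rho * g1 + s * g2        # standardized pair with correlation rho
    if z1 * z1 - 2.0 * rho * z1 * z2 + z2 * z2 <= c * c:
        hits += 1

estimate = hits / n
exact = 1.0 - math.exp(-c * c / (2.0 - 2.0 * rho * rho))
```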
This bivariate Gaussian probability computation is one of the few which is "easily" accomplished. Additional techniques for treating these computations are given in [1, pp. 956–958].
Drill Problem 5.6.1. Given that x and y are jointly distributed Gaussian random variables with E(y|x) = 2 + 1.5x, E(x|y) = 7/6 + y/6, and σ²_{x|y} = 0.75. Determine: (a) η_x, (b) η_y, (c) σ_x², (d) σ_y², and (e) ρ_{x,y}.
Answers: 0.5, 1, 9, 2, 5.
Drill Problem 5.6.2. Random variables x and y are jointly Gaussian with η_x = −2, η_y = 3, σ_x² = 21, σ_y² = 31, and ρ = −0.3394. With c² = 0.2212 in (5.117), find: (a) the smallest angle that either the minor or major axis makes with the positive α axis in the α-β plane, (b) the length of the minor axis, (c) the length of the major axis.
Answers: 3, 2, 30°.
Drill Problem 5.6.3. Random variables x and y are jointly Gaussian with η_x = −2, η_y = 3, σ_x² = 21, σ_y² = 31, and ρ = −0.3394. Find: (a) E(y | x = 0), (b) P(1 < y ≤ 10 | x = 0), (c) P(−1 < x < 7).
Answers: 0.5212, 0.3889, 2.1753.
5.7 SUMMARY
This chapter introduces certain probability distributions commonly encountered in biomedical
engineering. Special emphasis is placed on the exponential, Poisson and Gaussian distributions.
Important approximations to the Bernoulli PMF and Gaussian CDF are developed.
Bernoulli event probabilities may be approximated by the Poisson PMF when np < 10 or by the Gaussian PDF when npq > 3. For the Poisson approximation use η = np. For the Gaussian approximation use η = np and σ² = npq.
Many important properties of jointly Gaussian random variables are presented.
Drill Problem 5.7.1. The length of time William Smith plays a video game is given by random
variable x distributed exponentially with a mean of four minutes. His play during each game is
independent from all other games. Determine: (a) the probability that William is still playing after
four minutes, (b) the probability that, out of ﬁve games, he has played at least one game for more than
four minutes.
Answers: exp(−1), 0.899.
5.8 PROBLEMS
1. Assume x is a Bernoulli random variable. Determine P(x ≤ 3) using the Bernoulli
CDF table if: (a) n = 5, p = 0.1; (b) n = 10, p = 0.1; (c) n = 20, p = 0.1; (d) n = 5,
p = 0.3; (e) n = 10, p = 0.3; (f ) n = 20, p = 0.3; (g) n = 5, p = 0.6; (h) n = 10,
p = 0.6; (i) n = 20, p = 0.6.
2. Suppose you are playing a game with a friend in which you roll a die 10 times. If the
die comes up with an even number, your friend gives you a dollar and if the die comes
up with an odd number you give your friend a dollar. Unfortunately, the die is loaded
so that a 1 or a 3 are three times as likely to occur as a 2, a 4, a 5 or a 6. Determine:
(a) how many dollars your friend can expect to win in this game; (b) the probability of
your friend winning more than 4 dollars.
3. The probability that a basketball player makes a basket is 0.4. If he makes 10 attempts,
what is the probability he will make: (a) at least 4 baskets; (b) 4 baskets; (c) from 7 to
9 baskets; (d) less than 2 baskets; (e) the expected number of baskets.
4. The probability that Professor Rensselaer bowls a strike is 0.2. Determine the probability
that: (a) 3 of the next 20 rolls are strikes; (b) at least 4 of the next 20 rolls are strikes;
(c) from 3 to 7 of the next 20 rolls are strikes. (d) She is to keep rolling the ball until
she gets a strike. Determine the probability it will take more than 5 rolls. Determine
the: (e) expected number of strikes in 20 rolls; (f ) variance for the number of strikes in
20 rolls; (g) standard deviation for the number of strikes in 20 rolls.
5. The probability of a man hitting a target is 0.3. (a) If he tries 15 times to hit the target,
what is the probability of him hitting it at least 5 but less than 10 times? (b) What
is the average number of hits in 30 tries? (c) What is the probability of him getting
exactly the average number of hits in 30 tries? (d) How many times must the man try
to hit the target if he wants the probability of hitting it to be at least 2/3? (e) What is
the probability that no more than three tries are required to hit the target for the ﬁrst
time?
6. In Junior Bioinstrumentation Lab, one experiment introduces students to the transistor.
Each student is given only one transistor to use. The probability of a student destroying
a transistor is 0.7. One lab class has 5 students and they will perform this experiment
next week. Let random variable x show the possible numbers of students who destroy
transistors. (a) Sketch the PMF for x. Determine: (b) the expected number of destroyed
transistors, (c) the probability that fewer than 2 transistors are destroyed.
7. On a frosty January morning in Fargo, North Dakota, the probability that a car parked
outside will start is 0.6. (a) If we take a sample of 20 cars, what is the probability that
exactly 12 cars will start and 8 will not? (b) What is the probability that the number of
cars starting out of 20 is between 9 and 15?
8. Consider Problem 7. If there are 20,000 cars to be started, ﬁnd the probability that: (a)
at least 12,100 will start; (b) exactly 12,000 will start; (c) the number starting is between
11,900 and 12,150; (d) the number starting is less than 12,500.
9. A dart player has found that the probability of hitting the dart board in any one throw
is 0.2. How many times must he throw the dart so that the probability of hitting the
dart board is at least 0.6?
10. Let random variable x be Bernoulli with n = 15 and p = 0.4. Determine E(x²).
11. Suppose x is a Bernoulli random variable with η = 10 and σ² = 10/3. Determine: (a) q, (b) n, (c) p.
12. An electronics manufacturer is evaluating its quality control program. The current
procedure is to take a sample of 5 from 1000 and pass the shipment if not more than 1
component is found defective. What proportion of 20% defective components will be
shipped?
13. Repeat Problem 1, when appropriate, using the Poisson approximation to the Bernoulli
PMF.
14. A certain intersection averages 3 traffic accidents per week. What is the probability that
more than 2 accidents will occur during any given week?
15. Suppose that on the average, a student makes 6 mistakes per test. Determine the
probability that the student makes: (a) at least 1 mistake; (b) from 3 to 5 mistakes;
(c) exactly 2 mistakes; (d) more than the expected number of mistakes.
16. On the average, Professor Rensselaer gives 11 quizzes per quarter in Introduction to
Random Processes. Determine the probability that: (a) from 8 to 12 quizzes are given
during the quarter; (b) exactly 11 quizzes are given during the quarter; (c) at least 10
quizzes are given during the quarter; (d) at most 9 quizzes are given during the quarter.
17. Suppose a typist makes an average of 30 mistakes per page. (a) If you give him a one
page letter to type, what is the probability that he makes exactly 30 mistakes? (b) The
typist decides to take typing lessons, and, after the lessons, he averages 5 mistakes per
page. You give him another one page letter to type. What is the probability of him
making fewer than 5 mistakes? (c) With the 5 mistakes per page average, what is the
probability of him making fewer than 50 mistakes in a 25 page report?
18. On the average, a sample of radioactive material emits 20 alpha particles per minute.
What is the probability of 10 alpha particles being emitted in: (a) 1 min, (b) 10 min?
(c) Many years later, the material averages 6 alpha particles emitted per minute. What
is the probability of at least 6 alpha particles being emitted in 1 min?
19. At Fargo Polytechnic Institute (FPI), a student may take a course as many times as
desired. Suppose the average number of times a student takes Introduction to Random
Processes is 1.5. (Professor Rensselaer, the course instructor, thinks so many students
repeat the course because they enjoy it so much.) (a) Determine the probability that a
student takes the course more than once. (b) The academic vice-president of FPI wants
to ensure that on the average, 80% of the students take the course at most one time. To
what value should the mean be adjusted to ensure this?
20. Suppose 1% of the transistors in a box are defective. Determine the probability that
there are: (a) 3 defective transistors in a sample of 200 transistors; (b) more than 15
defective transistors in a sample of 1000 transistors; (c) 0 defective transistors in a
sample of 20 transistors.
21. A perfect car is assembled with a probability of 2 × 10⁻⁵. If 15,000 cars are produced in a month, what is the probability that none are perfect?
22. FPI admits only 1000 freshmen per year. The probability that a student will major in
Bioengineering is 0.01. Determine the probability that fewer than 9 students major in
Bioengineering.
23. (a) Every time a carpenter pounds in a nail, the probability that he hits his thumb is
0.002. If in building a house he pounds 1250 nails, what is the probability of him hitting
his thumb at least once while working on the house? (b) If he takes ﬁve extra minutes
off every time he hits his thumb, how many extra minutes can he expect to take off in
building a house with 3000 nails?
24. The manufacturer of Leaping Lizards, a bran cereal with milk expanding (exploding)
marshmallow lizards, wants to ensure that on the average, 95% of the spoonfuls will
each have at least one lizard. Assuming that the lizards are randomly distributed in the
cereal box, to what value should the mean number of lizards per spoonful be set to ensure this?
25. The distribution for the number of students seeking advising help from Professor Rensselaer during any particular day is given by
P(x = k) = 3^k e^{−3}/k!, k = 0, 1, . . . .
The PDF for the time interval between students seeking help for Introduction to
Random Processes from Professor Rensselaer during any particular day is given by
f_t(τ) = e^{−τ} u(τ).
If random variable z equals the total number of students Professor Rensselaer helps
each day, determine: (a) E(z), (b) σ_z.
26. This year, on its anniversary day, a computer store is going to run an advertising campaign in which the employees will telephone 5840 people selected at random from the
population of North America. The caller will ask the person answering the phone
if it’s his or her birthday. If it is, then that lucky person will be mailed a brand
new programmable calculator. Otherwise, that person will get nothing. Assuming that
the person answering the phone won’t lie and that there is no such thing as leap year,
ﬁnd the probability that: (a) the computer store mails out exactly 16 calculators, (b) the
computer store mails out from 20 to 40 calculators.
27. Random variable x is uniform between −2 and 3. Event A = {0 < x ≤ 2} and B = {−1 < x ≤ 0} ∪ {1 < x ≤ 2}. Find: (a) P(−1 < x < 0), (b) η_x, (c) σ_x, (d) f_{x|A}(α|A), (e) F_{x|A}(α|A), (f) f_{x|B}(α|B), (g) F_{x|B}(α|B).
28. The time it takes a runner to run a mile is equally likely to lie in an interval from 4.0 to
4.2 min. Determine: (a) the probability it takes the runner exactly 4 min to run a mile,
(b) the probability it takes the runner from 4.1 to 4.15 min.
29. Assume x is a standard Gaussian random variable. Using Tables A.9 and A.10, determine: (a) P(x = 0), (b) P(x < 0), (c) P(x < 0.2), (d) P(−1.583 ≤ x < 1.471), (e) P(−2.1 < x ≤ −0.5), (f) P(x is an integer).
30. Repeat Problem 1, when appropriate, using the Gaussian approximation to the
Bernoulli PMF.
31. A light bulb manufacturer distributes light bulbs that have a length of life that is
normally distributed with a mean equal to 1200 h and a standard deviation of 40 h.
Find the probability that a bulb burns between 1000 and 1300 h.
32. A certain type of resistor has resistance values that are Gaussian distributed with a
mean of 50 ohms and a variance of 3. (a) Write the PDF for the resistance value. (b)
Find the probability that a particular resistor is within 2 ohms of the mean. (c) Find
P(49 < r < 54).
33. Consider Problem 32. If resistances are measured to the nearest ohm, ﬁnd: (a) the
probability that a particular resistor is within 2 ohms of the mean, (b) P(49 < r < 54).
34. A battery manufacturer has found that 8.08% of their batteries last less than 2.3 years
and 2.5% of their batteries last more than 3.98 years. Assuming the battery lives are
Gaussian distributed, ﬁnd: (a) the mean, (b) variance.
35. Assume that the scores on an examination are Gaussian distributed with mean 75 and
standard deviation 10. Grades are assigned as follows: A: 90–100, B: 80–90, C: 70–80,
D: 60–70, and F: below 60. In a class of 25 students, what is the probability that grades
will be equally distributed?
36. A certain transistor has a current gain, h, that is Gaussian distributed with a mean of 77 and a variance of 11. Find: (a) P(h > 74), (b) P(73 < h ≤ 80), (c) P(|h − η_h| < 3σ_h).
37. Consider Problem 36. Find the value of d so that the range 77 ± d covers 95% of the current gains.
38. A 250-question multiple choice final exam is given. Each question has 5 possible answers and only one correct answer. Determine the probability that a student guesses the correct answers for 20–25 of 85 questions about which the student has no knowledge.
39. The average golf score for Professor Rensselaer is 78 with a standard deviation of 3.
Assuming a Gaussian distribution for random variable x describing her golf game,
determine: (a) P(x = 78), (b) P(x ≤ 78), (c) P(70 < x ≤ 80), (d) the probability that
x is less than 75 if the score is measured to the nearest unit.
40. Suppose a system contains a component whose length is normally distributed with a
mean of 2.0 and a standard deviation of 0.2. If 5 of these components are removed from
different systems, what is the probability that at least 2 have a length greater than 2.1?
41. A large box contains 10,000 resistors with resistances that are Gaussian distributed. If
the average resistance is 1000 ohms with a standard deviation of 200 ohms, how many
resistors have resistances that are within 10% of the average?
42. The RV x has PDF
f_x(α) = a exp(−(α − η)²/(2σ²)).
(a) Find the constant a. (Hint: assume RV y is independent of x and has PDF f_y(β) = f_x(β) and evaluate F_{x,y}(∞, ∞).) (b) Using direct integration, find E(x). (c) Find σ_x² using direct integration.
43. Assume x and y are jointly distributed Gaussian random variables with x ∼ G(−2, 4), y ∼ G(3, 9), and ρ_{x,y} = 0. Find: (a) P(1 < y < 7 | x = 0), (b) P(1 < y < 7), (c) P(−1 < x < 1, 1 < y < 7).
44. Suppose x and y are jointly distributed Gaussian random variables with E(y|x) = 2.8 + 0.32x, E(x|y) = −1 + 0.5y, and σ_{y|x} = 3.67. Determine: (a) η_x, (b) η_y, (c) σ_x, (d) σ_y, (e) ρ_{x,y}, (f) σ_{x,y}.
45. Assume x ∼ G(3, 1), y ∼ G(−2, 1), and that x and y are jointly Gaussian with ρ_{x,y} = −0.5. Draw a sketch of the joint Gaussian contour equation showing the original and the translated-rotated sets of axes.
46. Consider Problem 45. Determine: (a) E(y|x = 0), (b) f_{y|x}(β|0), (c) P(0 < y < 4 | x = 0), (d) P(3 < x < 10).
47. Assume x and y are jointly Gaussian with x ∼ G(2, 13), y ∼ G(1, 8), and ρ_{x,y} = −5.8835. (a) Draw a sketch of the constant contour equation for the standardized RVs z₁ and z₂. (b) Using the results of (a), draw a sketch of the joint Gaussian constant contour for x and y.
48. Consider Problem 47. Determine: (a) E(y|x = 0), (b) f_{y|x}(β|0), (c) P(0 < y < 4 | x = 0), (d) P(3 < x < 10).
49. Assume x and y are jointly Gaussian with x ∼ G(−1, 4), y ∼ G(1.6, 7), and ρ_{x,y} = 0.378. (a) Draw a sketch of the constant contour equation for the standardized RVs z₁ and z₂. (b) Using the results of (a), draw a sketch of the joint Gaussian contour for x and y.
50. Consider Problem 49. Determine: (a) E(y|x = 0), (b) f_{y|x}(β|0), (c) P(0 < y < 4 | x = 0), (d) P(−3 < x < 0).
51. A component with an exponential failure rate is 90% reliable at 10,000 h. Determine
the number of hours reliable at 95%.
52. Suppose a system has an exponential failure rate in years to failure with η = 2.5. Determine the number of years reliable at: (a) 90%, (b) 95%, (c) 99%.
53. Consider Problem 52. If 20 of these systems are installed, determine the probability
that 10 are operating at the end of 2.3 years.
54. In the circuit shown in Figure 5.12, each of the four components operate independently
of one another and have an exponential failure rate (in hours) with η = 10⁵. For successful operation of the circuit, at least two of these components must connect A with
B. Determine the probability that the circuit is operating successfully at 10,000 h.
55. The survival rate of individuals with a certain type of cancer is assumed to be exponential
with η = 4 years. Five individuals have this cancer. Determine the probability that at
most three will be alive at the end of 2.0433 years.
56. Random variable t is exponential with η = 2. Determine: (a) P(t > η), (b) f_z(α) if z = t − T, where T is a constant.
57. William Smith is a varsity wrestler on his high school team. Without exception, if he
does not pin his opponent with his trick move, he loses the match on points. William’s
FIGURE 5.12: Circuit for Problem 54 (components 1–4 connecting terminals A and B).
trick move also prevents him from ever getting pinned. The length of time it takes William to pin his opponent in each period of a wrestling match is given by:
period 1: f_x(α) = 0.4598394 exp(−0.4598394α)u(α),
period 2: f_{x|A}(α|A) = 0.2299197 exp(−0.2299197α)u(α),
period 3: f_{x|A}(α|A) = 0.1149599 exp(−0.1149599α)u(α),
where A = {Smith did not pin his opponent during the previous periods}. Assume each period is 2 min and the match is 3 periods. Determine the probability that William Smith: (a) pins his opponent during the first period; (b) pins his opponent during the second period; (c) pins his opponent during the third period; (d) wins the match.
58. Consider Problem 57. Find the probability that William Smith wins: (a) at least 4 of
his ﬁrst 5 matches, (b) more matches than he is expected to during a 10 match season.
59. The average time between power failures in a Co-op utility is once every 1.4 years.
Determine: (a) the probability that there will be at least one power failure during the
coming year, (b) the probability that there will be at least two power failures during the
coming year.
60. Consider Problem 59 statement. Assume that a power failure will last at least 24 h. Suppose Fargo Community Hospital has a backup emergency generator to provide auxiliary power during a power failure. Moreover, the emergency generator has an expected time
between failures of once every 200 h. What is the probability that the hospital will be
without power during the next 24 h?
61. The queue for the cash register at a popular package store near Fargo Polytechnic
Institute becomes quite long on Saturdays following football games. On the average,
the queue is 6.3 people long. Each customer takes 4 min to check out. Determine: (a)
your expected waiting time to make a purchase, (b) the probability that you will have
less than four people in the queue ahead of you, (c) the probability that you will have
more than ﬁve people in the queue ahead of you.
62. Outer Space Adventures, Inc. prints brochures describing their vacation packages as
“Unique, Unexpected and Unexplained.” The vacations certainly live up to their advertisement. In fact, they are so unexpected that the length of the vacations is random,
following an exponential distribution, having an average length of 6 months. Suppose
that you have signed up for a vacation trip that starts at the end of this quarter. What
is the probability that you will be back home in time for next fall quarter (that is, 9
months later)?
63. A certain brand of light bulbs has an average life expectancy of 750 h. The failure rate of these light bulbs follows an exponential PDF. Seven hundred and fifty of these bulbs
were put in light ﬁxtures in four rooms. The lights were turned on and left that way for
a different length of time in each room as follows:
ROOM    TIME BULBS LEFT ON, HOURS    NUMBER OF BULBS
1       1000                         125
2       750                          250
3       500                          150
4       1500                         225
After the speciﬁed length of time, the bulbs were taken from the ﬁxtures and placed
in a box. If a bulb is selected at random from the box, what is the probability that it is
burnt out?
CHAPTER 6
Transformations of Random Variables
Functions of random variables occur frequently in many applications of probability theory. For
example, a full wave rectiﬁer circuit produces an output that is the absolute value of the input.
The input/output characteristics of many physical devices can be represented by a nonlinear
memoryless transformation of the input.
The primary subjects of this chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the probability distribution of
a function of one random variable using the CDF and then the PDF. Next, the probability
distribution for a single random variable is determined from a function of two random variables
using the CDF. Then, the joint probability distribution is found from a function of two random
variables using the joint PDF and the CDF.
6.1 UNIVARIATE CDF TECHNIQUE
This section introduces a method of computing the probability distribution of a function of
a random variable using the CDF. We will refer to this method as the CDF technique. The
CDF technique is applicable for all functions z = g(x), and for all types of continuous, discrete,
and mixed random variables. Of course, we require that the function z : S → ℝ*, with z(ζ) = g(x(ζ)), is a random variable on the probability space (S, ℱ, P); consequently, we require z to be a measurable function on the measurable space (S, ℱ) and P(z(ζ) ∈ {−∞, +∞}) = 0.
The ease of use of the CDF technique depends critically on the functional form of g(x).
To make the CDF technique easier to understand, we start the discussion of computing the
probability distribution of z = g(x) with the simplest case, a continuous monotonic function
g. (Recall that if g is a monotonic function then a one-to-one correspondence between g(x)
and x exists.) Then, the difﬁculties associated with computing the probability distribution of
z = g(x) are investigated when the restrictions on g(x) are relaxed.
6.1.1 CDF Technique with Monotonic Functions
Consider the problem where the CDF F_x is known for the RV x, and we wish to find the CDF for random variable z = g(x). Proceeding from the definition of the CDF for z, we have for a
monotonic increasing function g(x)
F_z(γ) = P(z = g(x) ≤ γ) = P(x ≤ g⁻¹(γ)) = F_x(g⁻¹(γ)), (6.1)
where g⁻¹(γ) is the value of α for which g(α) = γ. As (6.1) indicates, the CDF of random variable z is written in terms of F_x(α), with the argument α replaced by g⁻¹(γ).
Similarly, if z = g(x) and g is monotone decreasing, then
F_z(γ) = P(z = g(x) ≤ γ) = P(x ≥ g⁻¹(γ)) = 1 − F_x(g⁻¹(γ)⁻). (6.2)
The following example illustrates this technique.
Example 6.1.1. Random variable x is uniformly distributed in the interval 0 to 4. Find the CDF
for random variable z = 2x +1.
Solution. Since random variable x is uniformly distributed, the CDF of x is
F_x(α) = { 0,     α < 0
           α/4,   0 ≤ α < 4
           1,     4 ≤ α.
Letting g(x) = 2x + 1, we see that g is monotone increasing and that g⁻¹(γ) = (γ − 1)/2.
Applying (6.1), the CDF for z is given by
F_z(γ) = F_x((γ − 1)/2) = { 0,            (γ − 1)/2 < 0
                            (γ − 1)/8,    0 ≤ (γ − 1)/2 < 4
                            1,            4 ≤ (γ − 1)/2.
Simplifying, we obtain
F_z(γ) = { 0,            γ < 1
           (γ − 1)/8,    1 ≤ γ < 9
           1,            9 ≤ γ,
which is also a uniform distribution.
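The derived CDF is easy to verify numerically. A short Python sketch (ours, for illustration only) compares the empirical CDF of z = 2x + 1 with the piecewise result of Example 6.1.1:

```python
import random

random.seed(0)
# x uniform on (0, 4); z = 2x + 1
samples = [2 * random.uniform(0, 4) + 1 for _ in range(200_000)]

def F_z(gamma):
    """CDF derived in Example 6.1.1."""
    if gamma < 1:
        return 0.0
    if gamma < 9:
        return (gamma - 1) / 8
    return 1.0

# empirical CDF should match the closed form at several test points
for g in (0.0, 2.0, 5.0, 8.0, 10.0):
    empirical = sum(s <= g for s in samples) / len(samples)
    assert abs(empirical - F_z(g)) < 0.01
```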
6.1.2 CDF Technique with Arbitrary Functions
In general, the relationship between x and z can take on any form, including discontinuities.
Additionally, the function does not have to be monotonic; more than one solution of z = g(x) can exist, resulting in a many-to-one mapping from x to z. In general, the only requirement
on g is that z = g(x) be a random variable. In this general case, F_z(γ) is no longer found by simple substitution. In fact, under these conditions it is impossible to write a general expression for F_z(γ) using the CDF technique. However, this case is conceptually no more difficult than
the previous case, and involves only careful bookkeeping.
Let
A(γ ) = {x : g(x) ≤ γ }. (6.3)
Note that A(γ) = g⁻¹((−∞, γ]). Then
F_z(γ) = P(g(x) ≤ γ) = P(x ∈ A(γ)). (6.4)
Partition A(γ) into disjoint intervals {A_i(γ) : i = 1, 2, . . .} so that
A(γ) = ⋃_{i=1}^{∞} A_i(γ). (6.5)
Note that the intervals, as well as the number of nonempty intervals, depend on γ. Since the A_i's are disjoint,
F_z(γ) = Σ_{i=1}^{∞} P(x ∈ A_i(γ)). (6.6)
The above probabilities are easily found from the CDF F_x. If interval A_i(γ) is of the form
A_i(γ) = (a_i(γ), b_i(γ)], (6.7)
then
P(x ∈ A_i(γ)) = F_x(b_i(γ)) − F_x(a_i(γ)). (6.8)
Similarly, if interval A_i(γ) is of the form
A_i(γ) = [a_i(γ), b_i(γ)], (6.9)
then
P(x ∈ A_i(γ)) = F_x(b_i(γ)) − F_x(a_i(γ)⁻). (6.10)
The success of this method clearly depends on our ability to partition A(γ) into disjoint intervals. Using this method, any function g and CDF F_x are amenable to a solution for F_z(γ). The following examples illustrate various aspects of this technique.
Example 6.1.2. Random variable x has the CDF
F_x(α) = { 0,                  α < −1
           (3α − α³ + 2)/4,    −1 ≤ α < 1
           1,                  1 ≤ α.
Find the CDF for the RV z = x².
Solution. Letting g(x) = x², we find
A(γ) = g⁻¹((−∞, γ]) = { ∅,             γ < 0
                        [−√γ, √γ],     γ ≥ 0,
so that
F_z(γ) = F_x(√γ) − F_x((−√γ)⁻).
Noting that F_x(α) is continuous and has the same functional form for −1 < α < 1, we obtain
F_z(γ) = { 0,                    γ < 0
           (3√γ − (√γ)³)/2,      0 ≤ γ < 1
           1,                    1 ≤ γ.
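Since f_x(α) = 0.75(1 − α²) on (−1, 1) (the derivative of the F_x above), this result can be checked by rejection sampling. The sketch below (illustrative Python, not from the text) compares the empirical CDF of z = x² with (3√γ − (√γ)³)/2:

```python
import random

random.seed(0)

def sample_x():
    # rejection sampling: f_x(a) = 0.75(1 - a^2) on (-1, 1), bounded by 0.75
    while True:
        a = random.uniform(-1, 1)
        if random.uniform(0, 0.75) <= 0.75 * (1 - a * a):
            return a

z = [sample_x() ** 2 for _ in range(100_000)]

def F_z(g):
    """CDF derived in Example 6.1.2."""
    if g < 0:
        return 0.0
    if g < 1:
        return (3 * g ** 0.5 - g ** 1.5) / 2
    return 1.0

for g in (0.1, 0.25, 0.5, 0.9):
    emp = sum(v <= g for v in z) / len(z)
    assert abs(emp - F_z(g)) < 0.01
```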
Example 6.1.3. Random variable x is uniformly distributed in the interval −3 to 3. Random
variable z is defined by
z = g(x) = { −1,        x < −2
             3x + 5,    −2 ≤ x < −1
             −3x − 1,   −1 ≤ x < 0
             3x − 1,    0 ≤ x < 1
             2,         1 ≤ x.
Find F_z(γ).
Solution. Plots for this example are given in Figure 6.1. The CDF for random variable x is
F_x(α) = { 0,            α < −3
           (α + 3)/6,    −3 ≤ α < 3
           1,            3 ≤ α.
Let A(γ) = g⁻¹((−∞, γ]). Referring to Figure 6.1 we find
A(γ) = { ∅,                                             γ < −1
         (−∞, (γ − 5)/3] ∪ [(−γ − 1)/3, (γ + 1)/3],     −1 ≤ γ < 2
         ℝ,                                             2 ≤ γ.
Consequently,
F_z(γ) = { 0,                                                        γ < −1
           F_x((γ − 5)/3) + F_x((γ + 1)/3) − F_x(((−γ − 1)/3)⁻),     −1 ≤ γ < 2
           1,                                                        2 ≤ γ.
P1: IML/FFX P2: IML
MOBK04206 MOBK042Enderle.cls October 30, 2006 19:53
TRANSFORMATIONS OF RANDOMVARIABLES 49
FIGURE 6.1: Plots of g(x), F_x(α), and F_z(γ) for Example 6.1.3.
Substituting, we obtain
F_z(γ) = { 0,                                                               γ < −1
           (1/6)[(γ − 5)/3 + 3 + (γ + 1)/3 − (−γ − 1)/3] = (γ + 2)/6,       −1 ≤ γ < 2
           1,                                                               2 ≤ γ.
Note that F_z(γ) has jumps at γ = −1 and at γ = 2.
Whenever random variable x is continuous and g(x) is constant over some interval or intervals, then random variable z can be continuous, discrete, or mixed, depending on the CDF for random variable x. As presented in the previous example, random variable z is mixed due to the constant value of g(x) in the intervals −3 ≤ x < −2 and 1 ≤ x < 3. In fact, z is a mixed random variable whenever g(α) is constant in an interval where f_x(α) ≠ 0. This results in a jump in F_z and an impulse function in f_z. Moreover, z = g(x) is a discrete random variable if g(α) changes values only on intervals where f_x(α) = 0. Random variable z is continuous if x is continuous and g(x) is not equal to a constant over any interval where f_x(α) ≠ 0.
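The atoms described above can be seen directly by simulating Example 6.1.3, where g(x) is constant on the intervals x < −2 and x ≥ 1 while f_x is nonzero there. A Python sketch (ours, for illustration):

```python
import random

random.seed(0)

def g(x):
    # piecewise transformation from Example 6.1.3
    if x < -2:
        return -1.0
    if x < -1:
        return 3 * x + 5
    if x < 0:
        return -3 * x - 1
    if x < 1:
        return 3 * x - 1
    return 2.0

N = 100_000
z = [g(random.uniform(-3, 3)) for _ in range(N)]

# atoms where g is constant on sets of positive probability:
# P(z = -1) = P(x < -2) = 1/6, P(z = 2) = P(x >= 1) = 1/3
p_minus1 = sum(v == -1.0 for v in z) / N
p_two = sum(v == 2.0 for v in z) / N
assert abs(p_minus1 - 1 / 6) < 0.01
assert abs(p_two - 1 / 3) < 0.01
```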
Example 6.1.4. Random variable x has the PDF
f_x(α) = { (1 + α²)/6,    −1 < α < 2
           0,             otherwise.
Find the PDF of random variable z defined by
z = g(x) = { x − 1,    x ≤ 0
             0,        0 < x ≤ 0.5
             1,        0.5 < x.
Solution. To find f_z, we evaluate F_z first, then differentiate this result. The CDF for random variable x is
F_x(α) = { 0,                    α < −1
           (α³ + 3α + 4)/18,     −1 ≤ α < 2
           1,                    2 ≤ α.
Figure 6.2 shows a plot of g(x). With the aid of Figure 6.2, we find
A(γ) = {x : g(x) ≤ γ} = { (−∞, γ + 1],    γ ≤ −1
                          (−∞, 0],        −1 ≤ γ < 0
                          (−∞, 0.5],      0 ≤ γ < 1
                          (−∞, ∞),        1 ≤ γ.
Consequently,
F_z(γ) = { F_x(γ + 1),    γ ≤ −1
           F_x(0),        −1 ≤ γ < 0
           F_x(0.5),      0 ≤ γ < 1
           1,             1 ≤ γ.
FIGURE 6.2: Transformation for Example 6.1.4.
Substituting,
F_z(γ) = { 0,                           γ < −2
           ((γ + 1)³ + 3γ + 7)/18,      −2 ≤ γ < −1
           2/9,                         −1 ≤ γ < 0
           5/16,                        0 ≤ γ < 1
           1,                           1 ≤ γ.
Note that F_z(γ) has jumps at γ = 0 and γ = 1 of heights 13/144 and 11/16, respectively.
Differentiating F_z,
f_z(γ) = [(3γ² + 6γ + 6)/18](u(γ + 2) − u(γ + 1)) + (13/144)δ(γ) + (11/16)δ(γ − 1).
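The two impulse weights can be checked by simulation: sample x from f_x by rejection, apply g, and count how often z lands exactly on the constant values 0 and 1. (Illustrative Python, not from the text.)

```python
import random

random.seed(0)

def sample_x():
    # rejection sampling: f_x(a) = (1 + a^2)/6 on (-1, 2), bounded by 5/6
    while True:
        a = random.uniform(-1, 2)
        if random.uniform(0, 5 / 6) <= (1 + a * a) / 6:
            return a

def g(x):
    if x <= 0:
        return x - 1
    if x <= 0.5:
        return 0.0
    return 1.0

N = 200_000
z = [g(sample_x()) for _ in range(N)]

jump0 = sum(v == 0.0 for v in z) / N   # theory: 13/144, about 0.0903
jump1 = sum(v == 1.0 for v in z) / N   # theory: 11/16 = 0.6875
assert abs(jump0 - 13 / 144) < 0.01
assert abs(jump1 - 11 / 16) < 0.01
```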
Example 6.1.5. Random variable x is uniformly distributed in the interval from 0 to 10. Find the
CDF for random variable z = g(x) = −ln(x).
Solution. The CDF for x is
F_x(α) = { 0,       α < 0
           α/10,    0 ≤ α < 10
           1,       10 ≤ α.
For any real γ, we find
A(γ) = {x : −ln(x) ≤ γ} = [e^{−γ}, ∞),
so that F_z(γ) = 1 − F_x((e^{−γ})⁻). Note that P(x ≤ 0) = 0, as required since g(x) = −ln(x) is not defined (or at least not real-valued) for x ≤ 0. We find
not deﬁned (or at least not real–valued) for x ≤ 0. We ﬁnd
F_z(γ) = { 0,                γ < ln(0.1)
           1 − 0.1e^{−γ},    ln(0.1) ≤ γ.
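A Monte Carlo sketch (ours, illustrative only) confirms this CDF; since F_x(e^{−γ}) = 0.1e^{−γ} for 0 ≤ e^{−γ} < 10, we have F_z(γ) = 1 − 0.1e^{−γ} for γ ≥ ln(0.1):

```python
import math
import random

random.seed(0)
# x uniform on (0, 10]; using 1 - random() avoids log(0)
z = [-math.log(10 * (1 - random.random())) for _ in range(200_000)]

def F_z(g):
    # F_z(gamma) = 1 - F_x(e^{-gamma}) = 1 - 0.1 e^{-gamma} for gamma >= ln(0.1)
    return 0.0 if g < math.log(0.1) else 1 - 0.1 * math.exp(-g)

for g in (-2.0, 0.0, 1.0, 3.0):
    emp = sum(v <= g for v in z) / len(z)
    assert abs(emp - F_z(g)) < 0.01
```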
The previous examples illustrated evaluating the probability distribution of a function
of a continuous random variable using the CDF technique. This technique is applicable for
all functions z = g(x), continuous and discontinuous. Additionally, the CDF technique is
applicable if random variable x is mixed or discrete. For mixed random variables, the CDF
technique is used without any changes or modiﬁcations as shown in the next example.
Example 6.1.6. Random variable x has PDF
f_x(α) = 0.5(u(α) − u(α − 1)) + 0.5δ(α − 0.5).
Find the CDF for z = g(x) = 1/x².
Solution. The mixed random variable x has CDF
F_x(α) = { 0,             α < 0
           0.5α,          0 ≤ α < 0.5
           0.5 + 0.5α,    0.5 ≤ α < 1
           1,             1 ≤ α.
For γ < 0, F_z(γ) = 0. For γ > 0,
A(γ) = {x : x⁻² ≤ γ} = (−∞, −1/√γ] ∪ [1/√γ, ∞),
so that
F_z(γ) = F_x(−1/√γ) + 1 − F_x((1/√γ)⁻).
Since F_x(−1/√γ) = 0 for all real γ, we have
F_z(γ) = 1 − { 0,                     (1/√γ)⁻ < 0
               0.5γ^{−1/2},           0 ≤ (1/√γ)⁻ < 0.5
               0.5 + 0.5γ^{−1/2},     0.5 ≤ (1/√γ)⁻ < 1
               1,                     1 ≤ (1/√γ)⁻.
After some algebra,
F_z(γ) = { 0,                      γ < 1
           0.5 − 0.5γ^{−1/2},      1 ≤ γ < 4
           1 − 0.5γ^{−1/2},        4 ≤ γ.
Drill Problem 6.1.1. Random variable x is uniformly distributed in the interval −1 to 4. Random variable z = 3x + 2. Determine: (a) F_z(0), (b) F_z(1), (c) f_z(0), (d) f_z(15).
Answers: 0, 2/15, 1/15, 1/15.
Drill Problem 6.1.2. Random variable x has the PDF
f_x(α) = 0.5α(u(α) − u(α − 2)).
Random variable z is defined by
z = { −1,    x < −1
      x,     −1 ≤ x ≤ 1
      1,     x > 1.
Determine: (a) F_z(−1/2), (b) F_z(1/2), (c) F_z(3/2), (d) f_z(1/2).
Answers: 0, 1, 1/16, 1/4.
Drill Problem 6.1.3. Random variable x has the PDF
f_x(α) = 0.5α(u(α) − u(α − 2)).
Random variable z is defined by
z = { −1,         x ≤ 0.5
      x + 0.5,    0.5 < x ≤ 1
      3,          x > 1.
Determine: (a) F_z(−1), (b) F_z(0), (c) F_z(3/2), (d) F_z(4).
Answers: 1/4, 1/16, 1/16, 1.
Drill Problem 6.1.4. Random variable x has PDF
f_x(α) = e^{−α−1} u(α + 1).
Random variable z = 1/x². Determine: (a) F_z(1/8), (b) F_z(1/2), (c) F_z(4), (d) f_z(4).
Answers: 0.0519, 0.617, 0.089, 0.022.
6.2 UNIVARIATE PDF TECHNIQUE
The previous section solved the problem of determining the probability distribution of a function of a random variable using the cumulative distribution function. Now, we introduce a second method for calculating the probability distribution of a function z = g(x) using the probability density function, called the PDF technique. The PDF technique, however, is only applicable for functions of random variables in which z = g(x) is continuous and does not equal a constant in any interval in which f_x is nonzero. We introduce the PDF technique for two reasons. First, in many situations it is much simpler to use than the CDF technique. Second, we will find the PDF method most useful in extensions to multivariate functions. In this section, we discuss a wide variety of situations using the PDF technique with functions of continuous random variables. Then, a method for handling mixed random variables with the PDF technique is introduced. Finally, we consider computing the conditional PDF of a function of a random variable using the PDF technique.
6.2.1 Continuous Random Variable
Theorem 6.2.1. Let x be a continuous RV with PDF f_x(α) and let the RV z = g(x). Assume g is continuous and not constant over any interval for which f_x ≠ 0. Let
α_i = α_i(γ) = g⁻¹(γ), i = 1, 2, . . . , (6.11)
denote the distinct solutions to g(α_i) = γ. Then
f_z(γ) = Σ_{i=1}^{∞} f_x(α_i(γ))/|g^{(1)}(α_i(γ))|, (6.12)
where we interpret
f_x(α_i(γ))/|g^{(1)}(α_i(γ))| = 0
if f_x(α_i(γ)) = 0.
Proof. Let h > 0 and define
I(γ, h) = {x : γ − h < g(x) ≤ γ}.
Partition I(γ, h) into disjoint intervals of the form
I_i(γ, h) = (a_i(γ, h), b_i(γ, h)), i = 1, 2, . . . ,
such that
I(γ, h) = ⋃_{i=1}^{∞} I_i(γ, h).
Then
F_z(γ) − F_z(γ − h) = Σ_{i=1}^{∞} (F_x(b_i(γ, h)) − F_x(a_i(γ, h))).
By hypothesis,
lim_{h→0} a_i(γ, h) = lim_{h→0} b_i(γ, h) = α_i(γ).
Note that (for all γ with f_x(α_i(γ)) ≠ 0)
lim_{h→0} (b_i(γ, h) − a_i(γ, h))/h = lim_{h→0} (b_i(γ, h) − a_i(γ, h))/|g(b_i(γ, h)) − g(a_i(γ, h))| = 1/|g^{(1)}(α_i(γ))|,
and that
lim_{h→0} (F_x(b_i(γ, h)) − F_x(a_i(γ, h)))/(b_i(γ, h) − a_i(γ, h)) = f_x(α_i(γ)).
The desired result follows by taking the product of the above limits. The absolute value appears because, by construction, we have b_i > a_i and h > 0, whereas g^{(1)} may be positive or negative.
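Theorem 6.2.1 can be demonstrated numerically. For g(x) = x² with x a standard Gaussian, the two roots are ±√γ and |g^{(1)}(±√γ)| = 2√γ; the sketch below (illustrative Python, ours) compares (6.12) with a histogram estimate of the density of z:

```python
import math
import random

random.seed(0)

def f_x(a):
    # standard Gaussian PDF
    return math.exp(-a * a / 2) / math.sqrt(2 * math.pi)

def f_z(g):
    # Theorem 6.2.1 with g(x) = x^2: roots at -sqrt(g) and sqrt(g),
    # |g'(root)| = 2*sqrt(g)
    if g <= 0:
        return 0.0
    r = math.sqrt(g)
    return (f_x(-r) + f_x(r)) / (2 * r)

# histogram estimate of the density of z = x^2 near gamma = 1
N, width = 400_000, 0.05
hits = sum(abs(random.gauss(0, 1) ** 2 - 1) < width / 2 for _ in range(N))
assert abs(hits / (N * width) - f_z(1.0)) < 0.02
```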
Example 6.2.1. Random variable x is uniformly distributed in the interval 0–4. Find the PDF for random variable z = g(x) = 2x + 1.

Solution. In this case, there is only one solution to the equation γ = 2α + 1, given by α_1 = (γ − 1)/2. We easily find g^{(1)}(α) = 2. Hence

f_z(γ) = f_x((γ − 1)/2)/2 =
  1/8, 1 < γ < 9
  0,   otherwise.
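The result is easy to sanity-check numerically. The sketch below (editorial, not part of the original text; it assumes NumPy is available) samples x, applies z = 2x + 1, and compares a histogram estimate against the constant density 1/8 on (1, 9):

```python
import numpy as np

# Monte Carlo check of Example 6.2.1: x ~ uniform(0, 4), z = 2x + 1,
# so the PDF technique predicts f_z(gamma) = 1/8 on (1, 9).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 4.0, size=200_000)
z = 2.0 * x + 1.0

# Histogram estimate of f_z; every bin height should be near 1/8.
density, edges = np.histogram(z, bins=20, range=(1.0, 9.0), density=True)
max_err = float(np.abs(density - 1.0 / 8.0).max())
```

With 200 000 samples the worst bin deviates from 1/8 by well under one percent.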
Example 6.2.2. Random variable x has PDF

f_x(α) = 0.75(1 − α²)(u(α + 1) − u(α − 1)).

Find the PDF for random variable z = g(x) = 1/x².

Solution. For γ < 0, there are no solutions to g(α_i) = γ, so that f_z(γ) = 0 for γ < 0. For γ > 0 there are two solutions to g(α_i) = γ:

α_1(γ) = −1/√γ, and α_2(γ) = 1/√γ.

Since γ = g(α) = α^{−2}, we have g^{(1)}(α) = −2α^{−3}; hence |g^{(1)}(α_i)| = 2/|α_i|³ = 2γ^{3/2}, and

f_z(γ) = (f_x(−γ^{−1/2}) + f_x(γ^{−1/2})) / (2γ^{3/2}) · u(γ).

Substituting,

f_z(γ) = 0.75(1 − γ^{−1})(u(1 − γ^{−1/2}) − 0 + 1 − u(γ^{−1/2} − 1)) / (2γ^{3/2}) · u(γ).

Simplifying,

f_z(γ) = 0.75(1 − γ^{−1})(u(γ − 1) − 0 + 1 − u(1 − γ)) / (2γ^{3/2}) · u(γ),

or

f_z(γ) = 0.75(γ^{−3/2} − γ^{−5/2}) u(γ − 1).
Example 6.2.3. Random variable x has PDF

f_x(α) = (1/6)(1 + α²)(u(α + 1) − u(α − 2)).

Find the PDF for random variable z = g(x) = x².

Solution. For γ < 0 there are no solutions to g(α_i) = γ, so that f_z(γ) = 0 for γ < 0. For γ > 0 there are two solutions to g(α_i) = γ:

α_1(γ) = −√γ, and α_2(γ) = √γ.

Since γ = g(α) = α², we have g^{(1)}(α) = 2α; hence |g^{(1)}(α_i)| = 2|α_i| = 2√γ, and

f_z(γ) = (f_x(−√γ) + f_x(√γ)) / (2√γ) · u(γ).

Substituting,

f_z(γ) = ((1 + γ)/(12√γ)) (u(1 − √γ) − u(−2 − √γ) + u(√γ + 1) − u(√γ − 2)) u(γ).

Simplifying,

f_z(γ) = ((1 + γ)/(12√γ)) (u(1 − γ) − 0 + 1 − u(γ − 4)) u(γ),

or

f_z(γ) =
  (γ^{−1/2} + γ^{1/2})/6,  0 < γ < 1
  (γ^{−1/2} + γ^{1/2})/12, 1 < γ < 4
  0,                       elsewhere.
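The two-root sum in (6.12) can be coded directly and compared with the closed-form answer. The sketch below is editorial (the helper names f_x, f_z, f_z_closed are ours, not the book's) and assumes NumPy:

```python
import numpy as np

# PDF technique for Example 6.2.3: z = x^2 with f_x(a) = (1 + a^2)/6
# on (-1, 2).  The roots a = -sqrt(g) and a = +sqrt(g) each contribute
# f_x(root)/|g'(root)|, with g'(a) = 2a.
def f_x(a):
    return np.where((a > -1.0) & (a < 2.0), (1.0 + a**2) / 6.0, 0.0)

def f_z(gamma):
    gamma = np.asarray(gamma, dtype=float)
    root = np.sqrt(np.where(gamma > 0, gamma, np.nan))
    out = (f_x(-root) + f_x(root)) / (2.0 * root)   # sum over both roots
    return np.where(gamma > 0, out, 0.0)

# Closed-form result from the text, for comparison.
def f_z_closed(gamma):
    g = np.asarray(gamma, dtype=float)
    a = (g**-0.5 + g**0.5) / 6.0
    return np.where(g < 1.0, a, np.where(g < 4.0, a / 2.0, 0.0))
```

Evaluating both at, say, γ = 0.5 and γ = 2 gives identical values, confirming the branch bookkeeping.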
6.2.2 Mixed Random Variable
Consider the problem where random variable x is mixed, and we wish to find the PDF for z = g(x). Here, we treat the discrete and continuous portions of f_x separately, and then combine the results to yield f_z. The continuous part of the PDF of x is handled by the PDF technique. To illustrate the use of this technique, consider the following example.
Example 6.2.4. Random variable x has PDF

f_x(α) = (3/8)(u(α + 1) − u(α − 1)) + (1/8)δ(α + 0.5) + (1/8)δ(α − 0.5).

Find the PDF for the RV z = g(x) = e^{−x}.

Solution. There is only one solution to g(α) = γ:

α_1(γ) = −ln(γ).

We have g^{(1)}(α_1) = −e^{ln(γ)} = −γ. The probability masses of 1/8 for x at −0.5 and 0.5 are mapped to probability masses of 1/8 for z at e^{0.5} and e^{−0.5}, respectively. For all γ > 0 such that α_1(γ) ± 0.5 ≠ 0 we have

f_z(γ) = f_x(−ln(γ))/γ.

Combining these results, we find

f_z(γ) = (3/(8γ))(u(γ − e^{−1}) − u(γ − e)) + (1/8)δ(γ − e^{0.5}) + (1/8)δ(γ − e^{−0.5}).
6.2.3 Conditional PDF Technique
Since a conditional PDF is also a PDF, the above techniques apply to find the conditional PDF for z = g(x), given event A. Consider the problem where random variable x has PDF f_x, and we wish to evaluate the conditional PDF for random variable z = g(x), given that event A occurred. Depending on whether the event A is defined on the range or domain of z = g(x), one of the following two methods may be used to determine the conditional PDF of z using the PDF technique.
(i) If A is an event defined for an interval on z, the conditional PDF, f_{z|A}, is computed by first evaluating f_z using the technique in this section. Then, by the definition of a conditional PDF, we have

f_{z|A}(γ|A) = f_z(γ)/P(A), γ ∈ A, (6.13)

and f_{z|A}(γ|A) = 0 for γ ∉ A.
(ii) If A is an event defined for an interval on x, we use the conditional PDF of x to evaluate the conditional PDF for z as

f_{z|A}(γ|A) = Σ_{i=1}^∞ f_{x|A}(α_i(γ)|A) / |g^{(1)}(α_i(γ))|. (6.14)
Example 6.2.5. Random variable x has the PDF

f_x(α) = (1/6)(1 + α²)(u(α + 1) − u(α − 2)).

Find the PDF for random variable z = g(x) = x², given A = {x : x > 0}.

Solution. First, we solve for the conditional PDF for x and then find the conditional PDF for z, based on f_{x|A}. We have

P(A) = (1/6) ∫_0^2 (1 + α²) dα = 7/9,

so that

f_{x|A}(α|A) = (3/14)(1 + α²)(u(α) − u(α − 2)).

There is only one solution to γ = g(α) = α² on the interval 0 < α < 2 where f_{x|A} ≠ 0. We have α_1(γ) = √γ and g^{(1)}(α_1(γ)) = 2√γ. Consequently,

f_{z|A}(γ|A) = (3/28)(√γ + 1/√γ)(u(γ) − u(γ − 4)).
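The conditional answer can be checked by simulation: sample x by rejection, keep only the event A, and compare a conditional probability with what the formula above implies. An editorial sketch assuming NumPy (the bound 5/6 is the maximum of f_x on (−1, 2)):

```python
import numpy as np

# Rejection-sampling check of Example 6.2.5: f_x(a) = (1 + a^2)/6 on
# (-1, 2); condition on A = {x > 0}; z = x^2.  The conditional PDF
# (3/28)(sqrt(g) + 1/sqrt(g)) on (0, 4) implies P(z <= 1 | A) = 2/7.
rng = np.random.default_rng(2)
cand = rng.uniform(-1.0, 2.0, size=500_000)
keep = rng.uniform(0.0, 5.0 / 6.0, size=cand.size) < (1.0 + cand**2) / 6.0
x = cand[keep]
x_given_a = x[x > 0.0]          # condition on A = {x > 0}
z = x_given_a**2
p_sim = float(np.mean(z <= 1.0))   # target: 2/7
```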
Drill Problem 6.2.1. Random variable x has a uniform PDF in the interval 0–8. Random variable z = 3x + 1. Use the PDF method to determine: (a) f_z(0), (b) f_z(6), (c) E(z), (d) σ_z².
Answers: 13, 48, 0, 1/24.
Drill Problem 6.2.2. Random variable x has the PDF

f_x(α) =
  9α²,        0 ≤ α < 0.5
  3(1 − α²),  0.5 ≤ α ≤ 1
  0,          otherwise.

Random variable z = x³. Use the PDF method to determine: (a) f_z(1/27), (b) f_z(1/4), (c) f_{z|z>1/8}(1/4|z > 1/8), (d) f_z(2).
Answers: 1.52, 0, 3, 2.43.
Drill Problem 6.2.3. Random variable x has the PDF

f_x(α) = (2/9)α(u(α) − u(α − 3)).

Random variable z = (x − 1)². Use the PDF method to determine: (a) f_z(1/4), (b) f_z(9/4), (c) f_{z|z≤1}(1/4|z ≤ 1), (d) E(z|z ≤ 1).
Answers: 4/9, 5/27, 1/3, 1.
Drill Problem 6.2.4. Random variable x has the PDF

f_x(α) = (2/9)(α + 1)(u(α + 1) − u(α − 2)).

Random variable z = 2x² and event A = {x : x ≥ 0}. Determine: (a) P(A), (b) f_{x|A}(1|A), (c) f_{z|A}(2|A), (d) f_{z|A}(9|A).
Answers: 0, 1/2, 1/8, 8/9.
6.3 ONE FUNCTION OF TWO RANDOM VARIABLES
Consider a random variable z = g(x, y) created from jointly distributed random variables x and y. In this section, the probability distribution of z = g(x, y) is computed using a CDF technique similar to the one at the start of this chapter. Because we are dealing with regions in a plane instead of intervals on a line, these problems are not as straightforward and tractable as before.
With z = g(x, y), we have

F_z(γ) = P(z ≤ γ) = P(g(x, y) ≤ γ) = P((x, y) ∈ A(γ)), (6.15)

where

A(γ) = {(x, y) : g(x, y) ≤ γ}. (6.16)

The CDF for the RV z can then be found by evaluating the integral

F_z(γ) = ∫_{A(γ)} dF_{x,y}(α, β). (6.17)

This result cannot be continued further until a specific F_{x,y} and g(x, y) are considered. Remember that in the case of a single random variable, our efforts primarily dealt with algebraic manipulations. Here, our efforts are concentrated on evaluating F_z through integrals, with the ease of solution critically dependent on g(x, y).
The ease in solution for F_z depends on transforming A(γ) into proper limits of integration. Sketching the support region for f_{x,y} (the region where f_{x,y} ≠ 0, or F_{x,y} is not constant) and the region A(γ) is often most helpful, even crucial, in the problem solution. Pay careful attention to the limits of integration to determine the range of integration in which the integrand is zero because f_{x,y} = 0. Let us consider several examples to illustrate the mechanics of the CDF technique and also to provide further insight.
Example 6.3.1. Random variables x and y have joint PDF

f_{x,y}(α, β) =
  1/4, 0 < α < 2, 0 < β < 2
  0,   otherwise.

Find the CDF for z = x + y.

Solution. We have A(γ) = {(α, β) : α + β ≤ γ}. We require the volume under the surface f_{x,y}(α, β) where α ≤ γ − β:

F_z(γ) = ∫_{−∞}^∞ ∫_{−∞}^{γ−β} f_{x,y}(α, β) dα dβ.

FIGURE 6.3: Plots for Example 6.3.1, showing the integration regions for (a) 0 < γ < 2 and (b) 2 < γ < 4.

For γ < 0 we have F_z(γ) = 0. For 0 ≤ γ < 2, with the aid of Figure 6.3(a) we obtain

F_z(γ) = ∫_0^γ ∫_0^{γ−β} (1/4) dα dβ = γ²/8.

For 2 ≤ γ < 4, referring to Figure 6.3(b), we consider the complementary region (to save some work):

F_z(γ) = 1 − ∫_{γ−2}^2 ∫_{γ−β}^2 (1/4) dα dβ = 1 − (4 − γ)²/8.

Finally, for 4 ≤ γ, F_z(γ) = 1.
Example 6.3.2. Random variables x and y have joint PDF

f_{x,y}(α, β) =
  1, 0 < α < 1, 0 < β < 1
  0, otherwise.

Find the PDF for z = x − y.

Solution. We have A(γ) = {(α, β) : α − β ≤ γ}. We require the volume under the surface f_{x,y}(α, β) where α ≤ γ + β:

F_z(γ) = ∫_{−∞}^∞ ∫_{−∞}^{γ+β} f_{x,y}(α, β) dα dβ.

For γ < −1 we have F_z(γ) = 0 and f_z(γ) = 0. With the aid of Figure 6.4(a), for −1 ≤ γ < 0,

F_z(γ) = ∫_{−γ}^1 ∫_0^{γ+β} dα dβ = (1 + γ)²/2,

FIGURE 6.4: Plots for Example 6.3.2, showing the integration regions for (a) −1 < γ < 0 and (b) 0 < γ < 1.

so that f_z(γ) = 1 + γ. For 0 ≤ γ < 1, we consider the complementary region shown in Figure 6.4(b) (to save some work):

F_z(γ) = 1 − ∫_0^{1−γ} ∫_{γ+β}^1 dα dβ = 1 − (1 − γ)²/2,

so that f_z(γ) = 1 − γ. Finally, for 1 ≤ γ, F_z(γ) = 1, so that f_z(γ) = 0.
Example 6.3.3. Find the CDF for z = x/y, where x and y have the joint PDF

f_{x,y}(α, β) =
  1/α, 0 < β < α < 1
  0,   otherwise.

Solution. We have A(γ) = {(α, β) : α/β ≤ γ}. Inside the support region for f_{x,y}, we have α/β > 1; hence, for γ < 1 we have F_z(γ) = 0. As shown in Figure 6.5, it is easiest to integrate with respect to β first: for 1 ≤ γ,

F_z(γ) = ∫_0^1 ∫_{α/γ}^α (1/α) dβ dα = 1 − 1/γ.

FIGURE 6.5: Integration region for Example 6.3.3.
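The joint PDF 1/α on 0 < β < α < 1 factors conveniently for simulation: the marginal of x is uniform on (0, 1), and given x, y is uniform on (0, x). That gives a quick editorial check of F_z(γ) = 1 − 1/γ (assumes NumPy):

```python
import numpy as np

# Monte Carlo check of Example 6.3.3: x ~ uniform(0,1), y | x ~
# uniform(0, x) reproduces the joint density 1/alpha on 0 < b < a < 1.
# Then z = x/y should satisfy F_z(2) = 1/2 and F_z(4) = 3/4.
rng = np.random.default_rng(4)
x = rng.uniform(size=300_000)
y = rng.uniform(size=300_000) * x      # y | x  ~ uniform(0, x)
z = x / y
F2 = float(np.mean(z <= 2.0))
F4 = float(np.mean(z <= 4.0))
```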
Example 6.3.4. Find the CDF for z = x² + y², where x and y have the joint PDF

f_{x,y}(α, β) =
  3α, 0 < β < α < 1
  0,  otherwise.

Solution. We have A(γ) = {(α, β) : α² + β² ≤ γ}. For γ < 0, we obtain F_z(γ) = 0. Transforming to polar coordinates: α = r cos(θ), β = r sin(θ),

F_z(γ) = ∫_0^{2π} ∫_{r² ≤ γ} f_{x,y}(r cos(θ), r sin(θ)) r dr dθ.

Referring to Figure 6.6(a), for 0 ≤ γ < 1,

F_z(γ) = ∫_0^{π/4} ∫_0^{√γ} 3r² cos(θ) dr dθ = γ^{3/2}/√2.

For 1 ≤ γ < 2, we split the integral into two parts: one with polar coordinates, the other using rectangular coordinates. With

sin(θ_1) = √(γ − 1)/√γ,

we find with the aid of Figure 6.6(b) that

F_z(γ) = ∫_{θ_1}^{π/4} ∫_0^{√γ} 3r² cos(θ) dr dθ + ∫_0^1 ∫_0^{α√(γ−1)} 3α dβ dα,

or

F_z(γ) = γ^{3/2}/√2 − (γ − 1)^{3/2}.

Finally, we have F_z(γ) = 1 for 2 ≤ γ.

FIGURE 6.6: Integration regions for Example 6.3.4: (a) 0 < γ < 1 and (b) 1 < γ < 2.
Drill Problem 6.3.1. Random variables x and y have joint PDF

f_{x,y}(α, β) = e^{−α} e^{−β} u(α) u(β).

Random variable z = x − y. Find (a) F_z(−1/3), (b) f_z(−1), (c) F_z(1), and (d) f_z(1).
Answers: (1/4)e^{−1}, (3/4)e^{−1}, 1 − (3/4)e^{−1}, (3/4)e^{−1}.
6.4 BIVARIATE TRANSFORMATIONS
In this section, we find the joint distribution of random variables z = g(x, y) and w = h(x, y) from jointly distributed random variables x and y. First, we consider a bivariate CDF technique. Then, the joint PDF technique for finding the joint PDF for random variables z and w formed from x and y is described. Next, the case of one function of two random variables is treated by using the joint PDF technique with an auxiliary random variable. Finally, the conditional joint PDF is presented.
6.4.1 Bivariate CDF Technique
Let x and y be jointly distributed RVs on the probability space (S, ℱ, P), and let z = g(x, y) and w = h(x, y). Define A(γ, ψ) to be the region of the x-y plane for which z = g(x, y) ≤ γ and w = h(x, y) ≤ ψ; i.e.,

A(γ, ψ) = {(x, y) : g(x, y) ≤ γ, h(x, y) ≤ ψ}. (6.18)

Note that

A(γ, ψ) = g^{−1}((−∞, γ]) ∩ h^{−1}((−∞, ψ]). (6.19)

Then

F_{z,w}(γ, ψ) = ∫∫_{A(γ,ψ)} dF_{x,y}(α, β). (6.20)

It is often difficult to perform the integration indicated in (6.20).
Example 6.4.1. Random variables x and y have the joint PDF

f_{x,y}(α, β) =
  0.25, 0 ≤ α ≤ 2, 0 ≤ β ≤ 2
  0,    otherwise.

With z = g(x, y) = x + y and w = h(x, y) = y, find the joint PDF f_{z,w}.

Solution. We have

g^{−1}((−∞, γ]) = {(x, y) : x + y ≤ γ}

and

h^{−1}((−∞, ψ]) = {(x, y) : y ≤ ψ}.

The intersection of these regions is

A(γ, ψ) = {(x, y) : y ≤ min(γ − x, ψ)},

which is illustrated in Figure 6.7. With the aid of Figure 6.7 and (6.20) we find

F_{z,w}(γ, ψ) = ∫_{−∞}^ψ ∫_{−∞}^{γ−β} dF_{x,y}(α, β).

FIGURE 6.7: Integration region for Example 6.4.1.

Instead of carrying out the above integration and then differentiating the result, we differentiate the above integral to obtain the PDF f_{z,w} directly. We find

∂F_{z,w}(γ, ψ)/∂γ = lim_{h_1→0} (1/h_1) ∫_{−∞}^ψ ∫_{γ−h_1−β}^{γ−β} dF_{x,y}(α, β),

and

∂²F_{z,w}(γ, ψ)/∂ψ ∂γ = lim_{h_2→0} lim_{h_1→0} (1/(h_1 h_2)) ∫_{ψ−h_2}^ψ ∫_{γ−h_1−β}^{γ−β} dF_{x,y}(α, β).

Performing the indicated limits, we find that f_{z,w}(γ, ψ) = f_{x,y}(γ − ψ, ψ); substituting, we obtain

f_{z,w}(γ, ψ) =
  0.25, 0 < γ − ψ < 2, 0 < ψ < 2
  0,    otherwise.

When the RVs x and y are jointly continuous, it is usually easier to find the joint PDF f_{z,w} than to carry out the integral indicated in (6.20).
6.4.2 Bivariate PDF Technique
A very important special case of bivariate transformations is when the RVs x and y are jointly continuous RVs and the mapping determined by g and h is continuous with g and h having continuous partial derivatives. Let h_1 > 0, h_2 > 0 and define

I(γ, h_1, ψ, h_2) = g^{−1}((γ − h_1, γ]) ∩ h^{−1}((ψ − h_2, ψ]). (6.21)

Partition I into disjoint regions so that

I(γ, h_1, ψ, h_2) = ∪_i I_i(γ, h_1, ψ, h_2). (6.22)

Let (α_i, β_i) denote the unique element of

lim_{h_1→0} lim_{h_2→0} I_i(γ, h_1, ψ, h_2).

Then for small h_1 and small h_2 we have

h_1 h_2 f_{z,w}(γ, ψ) ≈ Σ_i f_{x,y}(α_i, β_i) ∫_{I_i(γ,h_1,ψ,h_2)} dα dβ. (6.23)

Dividing both sides of the above by h_1 h_2 and letting h_1 → 0 and h_2 → 0, the approximation becomes an equality. The result is summarized by the theorem below.
Theorem 6.4.1. Let x and y be jointly continuous RVs with PDF f_{x,y}, and let z = g(x, y) and w = h(x, y). Let

(α_i(γ, ψ), β_i(γ, ψ)), i = 1, 2, . . . , (6.24)

be the distinct solutions to the simultaneous equations

g(α_i, β_i) = γ (6.25)

and

h(α_i, β_i) = ψ. (6.26)

Define the Jacobian

J(α, β) = det
  [ ∂g(α, β)/∂α  ∂g(α, β)/∂β ]
  [ ∂h(α, β)/∂α  ∂h(α, β)/∂β ], (6.27)

where the indicated partial derivatives are assumed to exist. Then

f_{z,w}(γ, ψ) = Σ_{i=1}^∞ f_{x,y}(α_i, β_i) / |J(α_i, β_i)|. (6.28)

The most difficult aspects of finding f_{z,w} using the joint PDF technique are solving the simultaneous equations γ = g(α_i, β_i) and ψ = h(α_i, β_i) for α_i and β_i, and determining the support region for f_{z,w}. Let us consider several examples to illustrate the mechanics of this PDF technique.
Example 6.4.2. Random variables x and y have the joint PDF

f_{x,y}(α, β) =
  1/4, 0 < α < 2, 0 < β < 2
  0,   otherwise.

Find the joint PDF for z = g(x, y) = x + y and w = h(x, y) = y.

Solution. There is only one solution to γ = α + β and ψ = β: α_1 = γ − ψ and β_1 = ψ. The Jacobian of the transformation is

J = det
  [ 1 1 ]
  [ 0 1 ] = 1 · 1 − 1 · 0 = 1.

Applying Theorem 6.4.1, we find

f_{z,w}(γ, ψ) = f_{x,y}(α_1, β_1)/|J| = f_{x,y}(γ − ψ, ψ).

Substituting, we obtain

f_{z,w}(γ, ψ) =
  1/4, 0 < γ − ψ < 2, 0 < ψ < 2
  0,   otherwise.

The region of support for f_{z,w} is illustrated in Figure 6.8.
Example 6.4.3. Random variables x and y have the joint PDF

f_{x,y}(α, β) =
  12αβ(1 − α), 0 < α < 1, 0 < β < 1
  0,           otherwise.

Find the joint PDF for z = g(x, y) = x²y and w = h(x, y) = y.

FIGURE 6.8: Support region for Example 6.4.2.

Solution. Solving γ = α²β and ψ = β, we find α_1 = √(γ/ψ) and β_1 = ψ. The other solution α_2 = −α_1 is not needed since f_{x,y}(α, β) = 0 for α < 0. The Jacobian is

J_1 = det
  [ 2α_1β_1  α_1² ]
  [ 0        1    ] = 2α_1β_1 = 2√(γψ).

Applying Theorem 6.4.1, we find

f_{z,w}(γ, ψ) = f_{x,y}(√(γ/ψ), ψ) / (2√(γψ)) =
  6(1 − √(γ/ψ)), 0 < √(γ/ψ) < 1, 0 < ψ < 1
  0,             otherwise.
Example 6.4.4. Random variables x and y have the joint PDF

f_{x,y}(α, β) =
  0.25(α + 1), −1 < α < 1, −1 < β < 1
  0,           otherwise.

Find the joint PDF for z = g(x, y) = xy and w = h(x, y) = y/x.

Solution. Solving γ = αβ and ψ = β/α, we find α = γ/β = β/ψ, so that β² = γψ. Letting β_1 = √(γψ) we have α_1 = √(γ/ψ). Then β_2 = −√(γψ) and α_2 = −√(γ/ψ). Note that the solution (α_1, β_1) is in the first quadrant of the α-β plane; the solution (α_2, β_2) is in the third quadrant of the α-β plane. We find

J = det
  [ β      α   ]
  [ −β/α²  1/α ] = 2β/α,

so that J_1 = 2ψ = J_2. Hence

f_{z,w}(γ, ψ) = f_{x,y}(√(γ/ψ), √(γψ))/|2ψ| + f_{x,y}(−√(γ/ψ), −√(γψ))/|2ψ|.

In the γ-ψ plane, the term f_{x,y}(√(γ/ψ), √(γψ))/|2ψ| has support region specified by

0 < √(γ/ψ) < 1 and 0 < √(γψ) < 1,

or

0 < γ/ψ < 1 and 0 < γψ < 1.

Similarly, the term f_{x,y}(−√(γ/ψ), −√(γψ))/|2ψ| has support region specified by

−1 < −√(γ/ψ) < 0 and −1 < −√(γψ) < 0,

or

0 < γ/ψ < 1 and 0 < γψ < 1.

Consequently, the two components of the PDF f_{z,w} have identical regions of support in the γ-ψ plane. In the first quadrant of the γ-ψ plane, this support region is easily seen to be 0 < γ < ψ < γ^{−1}. Similarly, in the third quadrant, the support region is γ^{−1} < ψ < γ < 0. This support region is illustrated in Figure 6.9. Finally, we find

f_{z,w}(γ, ψ) =
  1/(4|ψ|), 0 < γ < ψ < γ^{−1}, or γ^{−1} < ψ < γ < 0
  0,        otherwise.
Auxiliary Random Variables
The joint PDF technique can also be used to transform random variables x and y with joint PDF f_{x,y} to random variable z = g(x, y) by introducing an auxiliary random variable w = h(x, y) to find f_{z,w}, and then finding the marginal PDF f_z using

f_z(γ) = ∫_{−∞}^∞ f_{z,w}(γ, ψ) dψ. (6.29)

It is usually advisable to let the auxiliary random variable w equal a quantity which allows a convenient solution of the Jacobian and/or the inverse equations. This method is an alternative to the CDF technique presented earlier.
FIGURE 6.9: Support region for Example 6.4.4.
Example 6.4.5. Random variables x and y have the joint PDF

f_{x,y}(α, β) =
  4αβ, 0 < α < 1, 0 < β < 1
  0,   otherwise.

Find the PDF for z = g(x, y) = x².

Solution. Let auxiliary variable w = h(x, y) = y. Solving γ = α² and ψ = β, we find α = ±√γ and β = ψ. The only solution inside the support region for f_{x,y} is α = √γ and β = ψ. The Jacobian of the transformation is

J = det
  [ 2α 0 ]
  [ 0  1 ] = 2α,

so that

f_{z,w}(γ, ψ) = f_{x,y}(√γ, ψ)/(2√γ) =
  4√γ ψ/(2√γ) = 2ψ, 0 < γ < 1, 0 < ψ < 1
  0,                 otherwise.

We find the marginal PDF for z as

f_z(γ) = ∫_{−∞}^∞ f_{z,w}(γ, ψ) dψ = ∫_0^1 2ψ dψ = 1

for 0 < γ < 1, and f_z(γ) = 0 otherwise.
FIGURE 6.10: Support region for Example 6.4.6.
Example 6.4.6. Random variables x and y have the joint PDF

f_{x,y}(α, β) =
  4αβ, 0 < α < 1, 0 < β < 1
  0,   otherwise.

Find the PDF for z = g(x, y) = x + y.

Solution. Let auxiliary variable w = h(x, y) = y. Solving γ = α + β and ψ = β, we find α = γ − ψ and β = ψ. We find

J = det
  [ 1 1 ]
  [ 0 1 ] = 1,

so that

f_{z,w}(γ, ψ) = f_{x,y}(γ − ψ, ψ) =
  4(γ − ψ)ψ, 0 < γ − ψ < 1, 0 < ψ < 1
  0,         otherwise.

The support region for f_{z,w} is shown in Figure 6.10. Referring to Figure 6.10, for 0 < γ < 1,

f_z(γ) = ∫_0^γ 4(γ − ψ)ψ dψ = (2/3)γ³.

For 1 < γ < 2,

f_z(γ) = ∫_{γ−1}^1 4(γ − ψ)ψ dψ = −(2/3)γ³ + 4γ − 8/3.

Otherwise, f_z(γ) = 0.
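The piecewise result can be verified by sampling. A density of 2α on (0, 1) is obtained as the square root of a uniform variate, and the f_z above implies F_z(1) = ∫_0^1 (2/3)γ³ dγ = 1/6. An editorial sketch assuming NumPy:

```python
import numpy as np

# Monte Carlo check of Example 6.4.6: x and y iid with density 2a on
# (0, 1), sampled via inverse CDF as sqrt(U); z = x + y.
rng = np.random.default_rng(6)
x = np.sqrt(rng.uniform(size=300_000))
y = np.sqrt(rng.uniform(size=300_000))
z = x + y
F1 = float(np.mean(z <= 1.0))   # target: 1/6
```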
Conditional PDF Technique
Since a conditional PDF is also a PDF, the above techniques apply to find the conditional PDF for z = g(x, y), given event A. Consider the problem where random variables x and y have joint PDF f_{x,y}, and we wish to evaluate the conditional PDF for random variable z = g(x, y), given that event A occurred. Depending on whether the event A is defined on the range or domain of z = g(x, y), one of the following two methods may be used to determine the conditional PDF of z using the bivariate PDF technique.
i. If A is an event defined for an interval on z, the conditional PDF, f_{z|A}, is computed by first evaluating f_z using the technique in this section. Then, by the definition of a conditional PDF, we have

f_{z|A}(γ|A) = f_z(γ)/P(A), γ ∈ A. (6.30)

ii. If A is an event defined for a region in the x-y plane, we will use the conditional PDF f_{x,y|A} to evaluate the conditional PDF f_{z|A} as follows. First, introduce an auxiliary random variable w = h(x, y), and evaluate

f_{z,w|A}(γ, ψ|A) = Σ_{i=1}^∞ f_{x,y|A}(α_i(γ, ψ), β_i(γ, ψ)|A) / |J(α_i(γ, ψ), β_i(γ, ψ))|, (6.31)

where (α_i, β_i), i = 1, 2, . . . , are the solutions to γ = g(α, β) and ψ = h(α, β) in region A. Then evaluate the marginal conditional PDF

f_{z|A}(γ|A) = ∫_{−∞}^∞ f_{z,w|A}(γ, ψ|A) dψ. (6.32)
Example 6.4.7. Random variables x and y have joint PDF

f_{x,y}(α, β) =
  3α, 0 < β < α < 1
  0,  otherwise.

Find the PDF for z = x + y, given event A = {max(x, y) ≤ 0.5}.

Solution. We begin by finding the conditional PDF for x and y, given A. From Figure 6.11(a) we find

P(A) = ∫∫_A f_{x,y}(α, β) dα dβ = ∫_0^{0.5} ∫_0^α 3α dβ dα = 1/8;

consequently,

f_{x,y|A}(α, β|A) =
  f_{x,y}(α, β)/P(A) = 24α, 0 < β < α < 0.5
  0,                        otherwise.

FIGURE 6.11: Support region for Example 6.4.7.

Letting the auxiliary random variable w = h(x, y) = y, the only solution to γ = g(α, β) = α + β and ψ = h(α, β) = β is α = γ − ψ and β = ψ. The Jacobian of the transformation is J = 1 so that

f_{z,w|A}(γ, ψ|A) = f_{x,y|A}(γ − ψ, ψ|A)/1.

Substituting, we find

f_{z,w|A}(γ, ψ|A) =
  24(γ − ψ), 0 < ψ < γ − ψ < 0.5
  0,         otherwise.

The support region for f_{z,w|A} is illustrated in Figure 6.11(b). The conditional PDF for z, given A, can be found from

f_{z|A}(γ|A) = ∫_{−∞}^∞ f_{z,w|A}(γ, ψ|A) dψ.

The integration is easily carried out with the aid of Figure 6.11(b). For 0 < γ < 0.5 we have

f_{z|A}(γ|A) = ∫_0^{γ/2} 24(γ − ψ) dψ = 9γ².

For 0.5 < γ < 1 we obtain

f_{z|A}(γ|A) = ∫_{γ−0.5}^{γ/2} 24(γ − ψ) dψ = 3(1 − γ²).

For γ < 0 or γ > 1, f_{z|A}(γ|A) = 0.
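The conditional result is easy to check by simulation: the marginal of x is 3α² (so x can be sampled as a cube root of a uniform), y given x is uniform on (0, x), and the conditional PDF found above gives P(z ≤ 0.5 | A) = ∫_0^{1/2} 9γ² dγ = 3/8. An editorial sketch assuming NumPy:

```python
import numpy as np

# Monte Carlo check of Example 6.4.7: f_{x,y} = 3a on 0 < b < a < 1 is
# sampled as x = U^(1/3) (marginal 3a^2) and y | x ~ uniform(0, x).
# Since y < x, the event A = {max(x, y) <= 0.5} is just {x <= 0.5}.
rng = np.random.default_rng(8)
x = rng.uniform(size=600_000) ** (1.0 / 3.0)
y = rng.uniform(size=600_000) * x
a_mask = np.maximum(x, y) <= 0.5
z_a = (x + y)[a_mask]                     # z = x + y, conditioned on A
p_sim = float(np.mean(z_a <= 0.5))        # target: 3/8
```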
Drill Problem 6.4.1. Random variables x and y have joint PDF

f_{x,y}(α, β) = e^{−α−β} u(α) u(β).

Random variable z = x and w = xy. Find: (a) f_{z,w}(1, 1), (b) f_{z,w}(−1, 1).
Answers: 0, e^{−2}.
Drill Problem 6.4.2. Random variables x and y have joint PDF

f_{x,y}(α, β) =
  4αβ, 0 < α < 1, 0 < β < 1
  0,   otherwise.

Random variable z = x + y and w = y². Determine: (a) f_{z,w}(1, 1/4), (b) f_z(−1/2), (c) f_z(1/2), (d) f_z(3/2).
Answers: 0, 1/12, 1, 13/12.
Drill Problem 6.4.3. Random variables x and y have joint PDF

f_{x,y}(α, β) =
  1.2(α² + β), 0 < α < 1, 0 < β < 1
  0,           otherwise.

Random variable z = x²y. Determine: (a) f_z(−1), (b) f_z(1/4), (c) f_z(1/2), (d) f_z(3/2).
Answers: 0.7172, 0, 0, 1.3.
Drill Problem 6.4.4. Random variables x and y have joint PDF

f_{x,y}(α, β) =
  2β, 0 < α < 1, 0 < β < 1
  0,  otherwise.

Random variable z = x − y and event A = {(x, y) : x + y ≤ 1}. Determine: (a) f_{z|A}(−3/2|A), (b) f_{z|A}(−1/2|A), (c) F_{z|A}(0|A), (d) f_{z|A}(1/2|A).
Answers: 0, 3/4, 15/16, 3/16.
6.5 SUMMARY
This chapter presented a number of different approaches to find the probability distribution of functions of random variables.
Two methods are presented to find the probability distribution of a function of one random variable, z = g(x). The first, and the most general approach, is called the CDF technique. From the definition of the CDF, we write

F_z(γ) = P(z ≤ γ) = P(g(x) ≤ γ) = P(x ∈ A(γ)), (6.33)

where A(γ) = {x : g(x) ≤ γ}. By partitioning A(γ) into disjoint intervals, the CDF F_z may be found using the CDF F_x. The second approach, called the PDF technique, involves evaluating

f_z(γ) = Σ_{i=1}^∞ f_x(α_i(γ)) / |g^{(1)}(α_i(γ))|, (6.34)

where

α_i = α_i(γ) = g^{−1}(γ), i = 1, 2, . . . , (6.35)

denote the distinct solutions to g(α_i) = γ. Typically, the PDF technique is much simpler to use than the CDF technique. However, the PDF technique is applicable only when z = g(x) is continuous and does not equal a constant in any interval in which f_x is nonzero.
Next, we evaluated the probability distribution of a random variable z = g(x, y) created from jointly distributed random variables x and y using two approaches. The first approach, a CDF technique, involves evaluating

F_z(γ) = P(z ≤ γ) = P(g(x, y) ≤ γ) = P((x, y) ∈ A(γ)), (6.36)

where

A(γ) = {(x, y) : g(x, y) ≤ γ}. (6.37)

The CDF for the RV z can then be found by evaluating the integral

F_z(γ) = ∫_{A(γ)} dF_{x,y}(α, β). (6.38)

The ease of solution here involves transforming A(γ) into proper limits of integration. We wish to remind the reader that the special case of a convolution integral is obtained when random variables x and y are independent and z = x + y. The second approach involves introducing an auxiliary random variable and using the PDF technique applied to two functions of two random variables.
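The convolution special case mentioned above lends itself to a compact numerical illustration: discretizing f_x on a grid and convolving it with itself recovers the triangular density of the sum of two independent uniform(0, 1) variables. An editorial sketch assuming NumPy:

```python
import numpy as np

# For independent x and y with z = x + y, f_z is the convolution
# f_x * f_y.  Here both densities are uniform(0, 1), so f_z should be
# the triangle f_z(g) = g on (0, 1) and 2 - g on (1, 2).
grid = np.linspace(0.0, 2.0, 2001)
dg = grid[1] - grid[0]
f_x = np.where((grid >= 0.0) & (grid <= 1.0), 1.0, 0.0)
f_z = np.convolve(f_x, f_x) * dg            # Riemann-sum convolution
gamma = np.arange(f_z.size) * dg             # grid for the result
triangle = np.where(gamma <= 1.0, gamma,
                    np.where(gamma <= 2.0, 2.0 - gamma, 0.0))
err = float(np.abs(f_z - triangle).max())
```

The discretization error is of the order of the grid spacing.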
To find the joint probability distribution of random variables z = g(x, y) and w = h(x, y) from jointly distributed random variables x and y, a bivariate CDF as well as a joint PDF technique were presented. Using the joint PDF technique, the joint PDF for z and w can be found as

f_{z,w}(γ, ψ) = Σ_{i=1}^∞ f_{x,y}(α_i, β_i) / |J(α_i, β_i)|, (6.39)

where

(α_i(γ, ψ), β_i(γ, ψ)), i = 1, 2, . . . , (6.40)

are the distinct solutions to the simultaneous equations

g(α_i, β_i) = γ (6.41)

and

h(α_i, β_i) = ψ. (6.42)

The Jacobian is

J(α, β) = det
  [ ∂g(α, β)/∂α  ∂g(α, β)/∂β ]
  [ ∂h(α, β)/∂α  ∂h(α, β)/∂β ], (6.43)

where the indicated partial derivatives are assumed to exist. Typically, the most difficult aspect of the joint PDF technique is in solving the simultaneous equations. The joint PDF technique can also be used to find the probability distribution for z = g(x, y) from f_{x,y} by introducing an auxiliary random variable w = h(x, y) to find f_{z,w}, and then integrating to obtain the marginal PDF f_z.
6.6 PROBLEMS
1. Let random variable x be uniform between 0 and 2 with z = exp(x). Find F_z using the CDF technique.
2. Given

f_x(α) =
  0.5(1 + α), −1 < α < 1
  0,          otherwise,

and

z =
  x − 1, x < 0
  x + 1, x > 0.

Use the CDF technique to find F_z.
3. Suppose random variable x has the CDF

F_x(α) =
  0,  α < 0
  α², 0 ≤ α < 1
  1,  1 ≤ α

and z = exp(−x). Using the CDF technique, find F_z.
4. The PDF of random variable x is

f_x(α) = (1/2) e^{−(α+2)/2} u(α + 2).

Find F_z using the CDF technique when z = |x|.
5. Suppose random variable x has the PDF

f_x(α) =
  2α, 0 < α < 1
  0,  otherwise.

Random variable z is defined by

z =
  −1, x ≤ −1
  x,  −1 < x < 1
  1,  x ≥ 1.

Using the CDF method, determine F_z.
6. Random variable x represents the input to a half-wave rectifier and z represents the output, so that z = u(x). Given that x is uniformly distributed between −2 and 2, find: (a) E(z) using f_x, (b) F_z using the CDF method, (c) f_z, (d) E(z) using f_z (compare with the answer to part a).
7. Random variable x represents the input to a full-wave rectifier and z represents the output, so that z = |x|. Given that x is uniformly distributed between −2 and 2, find: (a) E(z) using f_x, (b) F_z using the CDF method, (c) f_z, (d) E(z) using f_z (compare with the answer to part a).
8. Random variable x has the PDF f_x(α) = e^{−α} u(α). Find F_z for z = x² using the CDF method.
9. Given

f_x(α) = a/(1 + α²)

and

z =
  −1, x < −1
  x,  −1 ≤ x ≤ 2
  2,  x > 2.

Determine: (a) a, (b) F_z.
10. Random variables x and z are the input and output of a quantizer. The relationship between them is defined by:

z =
  0, x < 0.5
  1, 0.5 ≤ x < 1.5
  2, 1.5 ≤ x < 2.5
  3, 2.5 ≤ x < 3.5
  4, x ≥ 3.5.

Given that the input follows a Gaussian distribution with x ∼ G(2.25, 0.49), find f_z using the CDF method.
11. Random variable x has the PDF f_x(α) = e^{−α} u(α). With z = 100 − 25x, find: (a) F_z using the CDF technique, (b) F_z using the PDF technique.
12. Random variable x has the following PDF

f_x(α) =
  4α³, 0 < α < 1
  0,   otherwise.

Find the PDFs for the following random variables: (a) z = x³, (b) z = (x − 1/2)², (c) z = (x − 1/4)².
13. Random variable x has the CDF

F_x(α) =
  0,               α < −1
  (α² + 2α + 1)/9, −1 ≤ α < 2
  1,               2 ≤ α

and

z =
  x, x < −0.5
  0, −0.5 ≤ x ≤ 0
  x, 0 < x.

Determine F_z.
14. Suppose

f_x(α) =
  (1 + α²)/6, −1 < α < 2
  0,          otherwise.

Let z = 1/x². Use the CDF technique to determine F_z.

FIGURE 6.12: Plot for Problem 15.
15. Random variable x has the CDF

F_x(α) =
  0,           α < −1
  0.25(α + 1), −1 ≤ α < 3
  1,           3 ≤ α.

Find the CDF F_z, with RV z = g(x), and g(x) shown in Figure 6.12. Assume that g(x) is a second degree polynomial for x ≥ 0.
16. The voltage x in Figure 6.13 is a random variable which is uniformly distributed from −1 to 2. Find the PDF f_z. Assume the diode is ideal.
17. The voltage x in Figure 6.14 is a Gaussian random variable with mean η_x = 0 and standard deviation σ_x = 3. Find the PDF f_z. Assume the diodes are ideal.
18. The voltage x in Figure 6.15 is a Gaussian random variable with mean η_x = 1 and standard deviation σ_x = 1. Find the PDF f_z. Assume the diode and the operational amplifier are ideal.
19. Random variable x is Gaussian with mean η_x = 0 and standard deviation σ_x = 1. With z = g(x) shown in Figure 6.16, find the PDF f_z.

FIGURE 6.13: Circuit for Problem 16 (an ideal diode with 3 kΩ and 1 kΩ resistors).

FIGURE 6.14: Circuit for Problem 17 (ideal diodes with resistor R and 2 V and 3 V sources).
20. Random variable x has the following PDF

f_x(α) =
  α + 0.5δ(α − 0.5), 0 < α < 1
  0,                 otherwise.

Find F_z if z = x².
21. Random variable x is uniform in the interval 0 to 12. Random variable z = 4x + 2. Find f_z using the PDF technique.
22. Find f_z if z = 1/x² and x is uniform on −1 to 2.
23. Random variable x has the PDF

f_x(α) =
  2(α + 1)/9, −1 < α < 2
  0,          otherwise.

Find the PDF of z = 2x² using the PDF technique.
24. Let z = cos(x). Find f_z if:
(a) f_x(α) =
  1/π, |α| < π/2
  0,   otherwise.

FIGURE 6.15: Circuit for Problem 18 (an op-amp stage with 1 kΩ and 5 kΩ resistors).
FIGURE 6.16: Transformation z = g(x) for Problem 19.
(b) f_x(α) =
  8α/π², 0 < α < π/2
  0,     otherwise.

25. Given that x has the CDF

F_x(α) =
  0, α < 0
  α, 0 ≤ α < 1
  1, 1 ≤ α.

Find the PDF of z = −2 ln(x) using the PDF technique.
26. Random variable x has the PDF

f_x(α) =
  2α/9, 0 < α < 3
  0,    otherwise.

Random variable z = (x − 1)² and event A = {x : x ≥ 1/2}. Find the PDF of random variable z, given event A.
27. Random variable x is uniform between −1 and 1. Random variable

z =
  x², x < 0
  x,  x ≥ 0.

Using the PDF technique, find f_z.
28. A voltage v is a Gaussian random variable with η_v = 0 and σ_v = 2. Random variable w = v²/R represents the power dissipated in a resistor of R ohms with v volts across the resistor. Find (a) f_w, (b) f_{w|A} if A = {v ≥ 0}.
29. Random variable x has an exponential distribution with mean one. If z = e
−x
, use
the PDF technique to determine: (a) f
z A
(γ  A) if A = {x : x > 2}, (b) f
zB
(γ B) if
B = {z : z < 1/2}.
30. Find f_z if z = 1/x and

f_x(α) = 1/(π(α² + 1)).

31. Given that random variable x has the PDF

f_x(α) = α for 0 < α < 1; 2 − α for 1 < α < 2; 0 otherwise,

use the PDF technique to find the PDF of z = x².

32. Suppose

f_x(α) = 2α for 0 < α < 1, and 0 otherwise,

and z = 8x³. Determine f_z using the PDF technique.
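For a strictly monotone transformation such as z = 8x³ in Problem 32, the PDF technique gives f_z(γ) = f_x(g⁻¹(γ)) |dg⁻¹(γ)/dγ|. A numeric sketch of that formula (the function names are illustrative, not from the text):

```python
def f_x(alpha):
    # f_x(alpha) = 2*alpha on (0, 1)
    return 2.0 * alpha if 0.0 < alpha < 1.0 else 0.0

def f_z(gamma):
    # f_z(gamma) = f_x(g_inv(gamma)) * |d g_inv / d gamma| for g(x) = 8 x^3,
    # with g_inv(gamma) = (gamma/8)^(1/3) and derivative gamma^(-2/3)/6
    if not 0.0 < gamma < 8.0:
        return 0.0
    g_inv = (gamma / 8.0) ** (1.0 / 3.0)
    dg_inv = (1.0 / 6.0) * gamma ** (-2.0 / 3.0)
    return f_x(g_inv) * dg_inv

print(f_z(1.0))  # closed form gives gamma^(-1/3)/6, i.e. 1/6 here
# total probability mass over (0, 8), midpoint rule
mass = sum(f_z((i + 0.5) * 8.0 / 10000) * 8.0 / 10000 for i in range(10000))
print(round(mass, 3))  # close to 1.0
```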
33. Given

f_x(α) = 9α² for 0 < α < 0.5; 3(1 − α²) for 0.5 < α < 1; 0 otherwise,

and z = −2 ln(x). Determine: (a) f_z, (b) F_z, (c) E(z), (d) E(z²).

34. Using the PDF technique, find f_z if z = |x| and x is a standard Gaussian random variable.

35. Suppose RV x has PDF f_x(α) = 0.5α(u(α) − u(α − 2)). Find a transformation g such that z = g(x) has PDF

f_z(γ) = cγ²(u(γ) − u(γ − 1)).

36. Let

f_x(α) = 0.75(1 − α²) for −1 < α < 1, and 0 otherwise,

and z = x². Determine f_z using the PDF technique.
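Transformations like z = x² in Problem 36 are two-to-one: both roots ±√γ contribute, so f_z(γ) = [f_x(√γ) + f_x(−√γ)]/(2√γ) for γ > 0. A short Python check of that formula (illustrative only):

```python
import math

def f_x(alpha):
    # f_x(alpha) = 0.75*(1 - alpha^2) on (-1, 1)
    return 0.75 * (1.0 - alpha * alpha) if -1.0 < alpha < 1.0 else 0.0

def f_z(gamma):
    # z = x^2 has two roots x = +/- sqrt(gamma); |dx/dz| = 1/(2 sqrt(gamma))
    if gamma <= 0.0:
        return 0.0
    r = math.sqrt(gamma)
    return (f_x(r) + f_x(-r)) / (2.0 * r)

print(f_z(0.25))  # 0.75*(1 - 0.25)/0.5 = 1.125
```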
37. Find f_z if z = sin(x) and

f_x(α) = 1/(2π) for 0 ≤ α < 2π, and 0 otherwise.

38. Random variable x has the PDF

f_x(α) = 0.5 for −1 < α < 0; 0.5 − 0.25α for 0 < α < 2; 0 otherwise.

(a) Find the transformation z = g(x) so that

f_z(γ) = 1 − 0.25γ for 0 < γ < 1; 0.5 − 0.25γ for 1 < γ < 2; 0 otherwise.

(b) Determine f_z if z = (2x + 2)u(x).

39. Let random variable x have the PDF

f_x(α) = 1/12 for −2 < α < 1; 1/4 for 1 < α < 2;

in addition, P(x = −2) = P(x = 2) = 0.25. If z = 1/x, find f_z.
40. Random variable x has PDF

f_x(α) = (1/4)δ(α + 1) + (1/4)δ(α) + (1/4)(u(α) − u(α − 2)).

Random variable z = g(x), with g shown in Figure 6.17. Find the PDF f_z.

FIGURE 6.17: Transformation for Problem 40.

41. Random variable x has PDF

f_x(α) = (α²/3)(u(α + 1) − u(α − 2)).

Random variable z = g(x), with g shown in Figure 6.18. Use the PDF technique to find: (a) f_z, (b) f_{z|A} with A = {x > 0}.

FIGURE 6.18: Transformation for Problem 41.
42. The PDF for random variable x is f_x(α) = e^{−α}u(α). With z = e^{−x}, determine: (a) f_{z|A}(γ|A) where A = {x : x > 2}, (b) f_{z|B}(γ|B) where B = {z : z < 1/2}.
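A useful sanity check for Problem 42: since F_x(α) = 1 − e^{−α}, the variable z = e^{−x} is uniform on (0, 1), and conditioning on B = {z < 1/2} simply rescales the density to 2 on (0, 1/2). A simulation sketch (not from the text; sample size arbitrary):

```python
import math
import random

random.seed(7)
n = 100_000
xs = [-math.log(1.0 - random.random()) for _ in range(n)]  # x ~ exponential(1)
zs = [math.exp(-x) for x in xs]                            # z = e^{-x} ~ uniform(0, 1)

# Conditioning on B = {z < 1/2} should leave z uniform on (0, 1/2), density 2
zB = [z for z in zs if z < 0.5]
p_B = len(zB) / n                                  # should be near 0.5
frac_quarter = sum(z < 0.25 for z in zB) / len(zB) # should be near 0.5 within B
print(round(p_B, 2), round(frac_quarter, 2))
```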
43. The joint PDF of x and y is

f_{x,y}(α, β) = e^{−α} for 0 < β < α, and 0 otherwise.

With z = x + y, write an expression for F_z(γ) (do not solve; just write the integral(s) necessary to find F_z).

44. If z = x/y, find F_z when x and y have the joint PDF

f_{x,y}(α, β) = 0.25α for 0 < β < α < 1, and 0 otherwise.
45. Resistors R1 and R2 have values r1 and r2, which are independent RVs uniformly distributed between 1 Ω and 2 Ω. With r denoting the equivalent resistance of a series connection of R1 and R2, find the PDF f_r using convolution.

46. Resistors R1 and R2 have values r1 and r2, which are independent RVs uniformly distributed between 1 Ω and 2 Ω. With g denoting the equivalent conductance of a parallel connection of R1 and R2, find the PDF f_g using convolution.
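For Problem 45, the series resistance r = r1 + r2 has the convolution density f_r(ρ) = ∫ f_{r1}(t) f_{r2}(ρ − t) dt, which for two uniform(1, 2) values is triangular on (2, 4) with peak 1 at ρ = 3. A numerical convolution sketch (midpoint rule; grid size arbitrary):

```python
def f_u(t):
    # PDF of a resistance uniform on (1, 2) ohms
    return 1.0 if 1.0 < t < 2.0 else 0.0

def f_r(rho, n=4000):
    # f_r(rho) = integral of f_u(t) * f_u(rho - t) dt over (1, 2), midpoint rule
    h = 1.0 / n
    return sum(
        f_u(1.0 + (i + 0.5) * h) * f_u(rho - 1.0 - (i + 0.5) * h)
        for i in range(n)
    ) * h

# Triangular density on (2, 4): f_r(2.5) = 0.5, f_r(3) = 1, f_r(3.5) = 0.5
print(round(f_r(2.5), 3), round(f_r(3.0), 3), round(f_r(3.5), 3))
```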
47. Suppose the voltage v across a resistor is a random variable with PDF

f_v(α) = 6α(1 − α) for 0 < α < 1, and 0 otherwise,

and that resistance r is a random variable with PDF

f_r(β) = 1/12 for 94 < β < 106, and 0 otherwise.

Moreover, suppose that v and r are independent random variables. Find the PDF for the power p = v²/r.
48. The joint PDF for random variables x and y is

f_{x,y}(α, β) = α + β for 0 < α < 1, 0 < β < 1, and 0 otherwise.

Let z = 2x + y and w = x + 2y. Find f_{z,w}.

49. The joint PDF for random variables x and y is

f_{x,y}(α, β) = 4αβ for 0 < α < 1, 0 < β < 1, and 0 otherwise.

Let z = x² and w = xy. Find: (a) f_{z,w}, (b) f_z.

50. The joint PDF for random variables x and y is

f_{x,y}(α, β) = 3(α² + β²) for 0 < β < α < 1, and 0 otherwise.

Find the joint PDF for random variables z = x + y and w = x − y.

51. The joint PDF for random variables x and y is

f_{x,y}(α, β) = 0.5βe^{−α} for 0 < α, 0 < β < 2, and 0 otherwise.

With z = y/x, find f_z.

52. The joint PDF for random variables x and y is

f_{x,y}(α, β) = 8αβ for 0 < α² + β² < 1, 0 < α, 0 < β, and 0 otherwise.

Let A = {(x, y) : x > y}, z = x and w = x² + y². Find: (a) f_{z,w|A}, (b) f_{z|A}.

53. The joint PDF for random variables x and y is

f_{x,y}(α, β) = 1/α for 0 < β < α < 1, and 0 otherwise.

Let z = x/y and w = y. Find f_{z,w}.
54. The joint PDF for random variables x and y is

f_{x,y}(α, β) = α sin(β) for 0 < α < 1, 0 < β < π, and 0 otherwise.

Let z = x² and w = cos(y). Find f_{z,w}.

55. The joint PDF for random variables x and y is

f_{x,y}(α, β) = 12αβ(1 − α) for 0 < α < 1, 0 < β < 1, and 0 otherwise.

Let z = x²y. Find f_z.

56. The joint PDF for random variables x and y is

f_{x,y}(α, β) = e^{−α−β} for 0 < α, 0 < β, and 0 otherwise.

Let z = x + y and w = x/(x + y). Find: (a) f_{z,w}, (b) f_z.
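Problem 56 has a well-known answer worth checking: with x and y independent unit exponentials, the bivariate PDF technique with Jacobian |∂(x, y)/∂(z, w)| = z gives f_{z,w}(γ, ψ) = γe^{−γ} for γ > 0, 0 < ψ < 1. That is, z is gamma-distributed with mean 2, w is uniform on (0, 1), and the two are independent. A Monte Carlo sketch (not from the text):

```python
import math
import random

random.seed(3)
n = 100_000
zw = []
for _ in range(n):
    x = -math.log(1.0 - random.random())  # x ~ exponential(1)
    y = -math.log(1.0 - random.random())  # y ~ exponential(1), independent
    zw.append((x + y, x / (x + y)))

# z = x + y should have mean 2; w = x/(x + y) should have mean 1/2
mean_z = sum(z for z, _ in zw) / n
mean_w = sum(w for _, w in zw) / n
print(round(mean_z, 1), round(mean_w, 2))  # near 2.0 and 0.50
```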
57. The joint PDF for random variables x and y is

f_{x,y}(α, β) = 0.25(|α| + |β|) for |α| < 1, |β| < 1, and 0 otherwise.

Let z = xy and w = y/x. Find f_{z,w} using the joint CDF technique.
APPENDIX A
Distribution Tables
TABLE A.1: Bernoulli CDF for n = 5 and n = 10
n = 5
p k +0 +1 +2 +3 +4
0.05 0 0.773781 0.977408 0.998842 0.999970 1.000000
0.1 0 0.590490 0.918540 0.991440 0.999540 0.999990
0.15 0 0.443705 0.835210 0.973388 0.997773 0.999924
0.2 0 0.327680 0.737280 0.942080 0.993280 0.999680
0.25 0 0.237305 0.632813 0.896484 0.984375 0.999023
0.3 0 0.168070 0.528220 0.836920 0.969220 0.997570
0.35 0 0.116029 0.428415 0.764831 0.945978 0.994748
0.4 0 0.077760 0.336960 0.682560 0.912960 0.989760
0.45 0 0.050328 0.256218 0.593127 0.868780 0.981547
0.5 0 0.031250 0.187500 0.500000 0.812500 0.968750
n = 10
p k +0 +1 +2 +3 +4
0.05 0 0.598737 0.913862 0.988496 0.998971 0.999936
5 0.999997 1.000000 1.000000 1.000000 1.000000
0.1 0 0.348678 0.736099 0.929809 0.987205 0.998365
5 0.999853 0.999991 1.000000 1.000000 1.000000
0.15 0 0.196874 0.544300 0.820197 0.950030 0.990126
5 0.998617 0.999865 0.999991 1.000000 1.000000
TABLE A.1: Bernoulli CDF for n = 5 and n = 10 (Continued)
n = 10
p k +0 +1 +2 +3 +4
0.2 0 0.107374 0.375810 0.677800 0.879126 0.967206
5 0.993631 0.999136 0.999922 0.999996 1.000000
0.25 0 0.056314 0.244025 0.525593 0.775875 0.921873
5 0.980272 0.996494 0.999584 0.999970 0.999999
0.3 0 0.028248 0.149308 0.382783 0.649611 0.849732
5 0.952651 0.989408 0.998410 0.999856 0.999994
0.35 0 0.013463 0.085954 0.261607 0.513827 0.751495
5 0.905066 0.973976 0.995179 0.999460 0.999972
0.4 0 0.006047 0.046357 0.167290 0.382281 0.633103
5 0.833761 0.945238 0.987705 0.998322 0.999895
0.45 0 0.002533 0.023257 0.099560 0.266038 0.504405
5 0.738437 0.898005 0.972608 0.995498 0.999659
0.5 0 0.000977 0.010742 0.054688 0.171875 0.376953
5 0.623047 0.828125 0.945313 0.989258 0.999023
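Entries in Table A.1 (the CDF of the number of successes in n Bernoulli trials) can be regenerated directly from the binomial sum; a Python sketch (not part of the original text):

```python
from math import comb

def bernoulli_cdf(k, n, p):
    # P(number of successes in n Bernoulli trials <= k), success probability p
    return sum(comb(n, j) * p**j * (1.0 - p) ** (n - j) for j in range(k + 1))

# Spot-checks against Table A.1:
print(round(bernoulli_cdf(4, 10, 0.5), 6))  # 0.376953 (n = 10, p = 0.5 row)
print(round(bernoulli_cdf(2, 5, 0.5), 6))   # 0.5      (n = 5,  p = 0.5 row)
```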
TABLE A.2: Bernoulli CDF for n = 15
n = 15
p k +0 +1 +2 +3 +4
0.05 0 0.463291 0.829047 0.963800 0.994533 0.999385
5 0.999947 0.999996 1.000000 1.000000 1.000000
10 1.000000 1.000000 1.000000 1.000000 1.000000
0.1 0 0.205891 0.549043 0.815939 0.944444 0.987280
5 0.997750 0.999689 0.999966 0.999997 1.000000
10 1.000000 1.000000 1.000000 1.000000 1.000000
TABLE A.2: Bernoulli CDF for n = 15 (Continued)
n = 15
p k +0 +1 +2 +3 +4
0.15 0 0.087354 0.318586 0.604225 0.822655 0.938295
5 0.983190 0.996394 0.999390 0.999919 0.999992
10 0.999999 1.000000 1.000000 1.000000 1.000000
0.2 0 0.035184 0.167126 0.398023 0.648162 0.835766
5 0.938949 0.981941 0.995760 0.999215 0.999887
10 0.999988 0.999999 1.000000 1.000000 1.000000
0.25 0 0.013363 0.080181 0.236088 0.461287 0.686486
5 0.851632 0.943380 0.982700 0.995807 0.999205
10 0.999885 0.999988 0.999999 1.000000 1.000000
0.3 0 0.004748 0.035268 0.126828 0.296868 0.515491
5 0.721621 0.868857 0.949987 0.984757 0.996347
10 0.999328 0.999908 0.999991 1.000000 1.000000
0.35 0 0.001562 0.014179 0.061734 0.172696 0.351943
5 0.564282 0.754842 0.886769 0.957806 0.987557
10 0.997169 0.999521 0.999943 0.999996 1.000000
0.4 0 0.000470 0.005172 0.027114 0.090502 0.217278
5 0.403216 0.609813 0.786897 0.904953 0.966167
10 0.990652 0.998072 0.999721 0.999975 0.999999
0.45 0 0.000127 0.001692 0.010652 0.042421 0.120399
5 0.260760 0.452160 0.653504 0.818240 0.923071
10 0.974534 0.993673 0.998893 0.999879 0.999994
0.5 0 0.000031 0.000488 0.003693 0.017578 0.059235
5 0.150879 0.303619 0.500000 0.696381 0.849121
10 0.940765 0.982422 0.996307 0.999512 0.999969
TABLE A.3: Bernoulli CDF for n = 20
n = 20
p k +0 +1 +2 +3 +4
0.05 0 0.358486 0.735840 0.924516 0.984098 0.997426
5 0.999671 0.999966 0.999997 1.000000 1.000000
10 1.000000 1.000000 1.000000 1.000000 1.000000
15 1.000000 1.000000 1.000000 1.000000 1.000000
0.1 0 0.121577 0.391747 0.676927 0.867047 0.956825
5 0.988747 0.997614 0.999584 0.999940 0.999993
10 0.999999 1.000000 1.000000 1.000000 1.000000
15 1.000000 1.000000 1.000000 1.000000 1.000000
0.15 0 0.038760 0.175558 0.404896 0.647725 0.829847
5 0.932692 0.978065 0.994079 0.998671 0.999752
10 0.999961 0.999995 0.999999 1.000000 1.000000
15 1.000000 1.000000 1.000000 1.000000 1.000000
0.2 0 0.011529 0.069175 0.206085 0.411449 0.629648
5 0.804208 0.913307 0.967857 0.990018 0.997405
10 0.999437 0.999898 0.999985 0.999998 1.000000
15 1.000000 1.000000 1.000000 1.000000 1.000000
0.25 0 0.003171 0.024313 0.091260 0.225156 0.414842
5 0.617173 0.785782 0.898188 0.959075 0.986136
10 0.996058 0.999065 0.999816 0.999970 0.999996
15 1.000000 1.000000 1.000000 1.000000 1.000000
0.3 0 0.000798 0.007637 0.035483 0.107087 0.237508
5 0.416371 0.608010 0.772272 0.886669 0.952038
10 0.982855 0.994862 0.998721 0.999739 0.999957
15 0.999994 0.999999 1.000000 1.000000 1.000000
0.35 0 0.000181 0.002133 0.012118 0.044376 0.118197
5 0.245396 0.416625 0.601027 0.762378 0.878219
10 0.946833 0.980421 0.993985 0.998479 0.999689
15 0.999950 0.999994 0.999999 1.000000 1.000000
TABLE A.3: Bernoulli CDF for n = 20 (Continued)
n = 20
p k +0 +1 +2 +3 +4
0.4 0 0.000037 0.000524 0.003611 0.015961 0.050952
5 0.125599 0.250011 0.415893 0.595599 0.755337
10 0.872479 0.943474 0.978971 0.993534 0.998388
15 0.999683 0.999953 0.999995 1.000000 1.000000
0.45 0 0.000006 0.000111 0.000927 0.004933 0.018863
5 0.055334 0.129934 0.252006 0.414306 0.591361
10 0.750711 0.869235 0.941966 0.978586 0.993566
15 0.998469 0.999723 0.999964 0.999997 1.000000
0.5 0 0.000001 0.000020 0.000201 0.001288 0.005909
5 0.020695 0.057659 0.131588 0.251722 0.411901
10 0.588099 0.748278 0.868412 0.942341 0.979305
15 0.994091 0.998712 0.999799 0.999980 0.999999
TABLE A.4: Poisson CDF for λ = 0.1, 0.2, . . . , 1, 1.5, 2, . . . , 4.5
λ k +0 +1 +2 +3 +4
0.1 0 0.904837 0.995321 0.999845 0.999996 1.000000
0.2 0 0.818731 0.982477 0.998852 0.999943 0.999998
0.3 0 0.740818 0.963064 0.996400 0.999734 0.999984
0.4 0 0.670320 0.938448 0.992074 0.999224 0.999939
0.5 0 0.606531 0.909796 0.985612 0.998248 0.999828
5 0.999986 0.999999 1.000000 1.000000 1.000000
0.6 0 0.548812 0.878099 0.976885 0.996642 0.999605
5 0.999961 0.999997 1.000000 1.000000 1.000000
0.7 0 0.496585 0.844195 0.965858 0.994247 0.999214
5 0.999910 0.999991 0.999999 1.000000 1.000000
TABLE A.4: Poisson CDF for λ = 0.1, 0.2, . . . , 1, 1.5, 2, . . . , 4.5 (Continued)
λ k +0 +1 +2 +3 +4
0.8 0 0.449329 0.808792 0.952577 0.990920 0.998589
5 0.999816 0.999979 0.999998 1.000000 1.000000
0.9 0 0.406570 0.772482 0.937143 0.986541 0.997656
5 0.999656 0.999957 0.999995 1.000000 1.000000
1 0 0.367879 0.735759 0.919699 0.981012 0.996340
5 0.999406 0.999917 0.999990 0.999999 1.000000
1.5 0 0.223130 0.557825 0.808847 0.934358 0.981424
5 0.995544 0.999074 0.999830 0.999972 0.999996
2 0 0.135335 0.406006 0.676676 0.857123 0.947347
5 0.983436 0.995466 0.998903 0.999763 0.999954
2.5 0 0.082085 0.287297 0.543813 0.757576 0.891178
5 0.957979 0.985813 0.995753 0.998860 0.999723
10 0.999938 0.999987 0.999998 1.000000 1.000000
3 0 0.049787 0.199148 0.423190 0.647232 0.815263
5 0.916082 0.966491 0.988095 0.996197 0.998897
10 0.999708 0.999929 0.999984 0.999997 0.999999
3.5 0 0.030197 0.135888 0.320847 0.536633 0.725445
5 0.857614 0.934712 0.973261 0.990126 0.996685
10 0.998981 0.999711 0.999924 0.999981 0.999996
4 0 0.018316 0.091578 0.238103 0.433470 0.628837
5 0.785130 0.889326 0.948866 0.978637 0.991868
10 0.997160 0.999085 0.999726 0.999924 0.999980
4.5 0 0.011109 0.061099 0.173578 0.342296 0.532104
5 0.702930 0.831051 0.913414 0.959743 0.982907
10 0.993331 0.997596 0.999195 0.999748 0.999926
15 0.999980 0.999995 0.999999 1.000000 1.000000
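The Poisson CDF tables can likewise be regenerated from the definition P(x ≤ k) = e^{−λ} Σ_{j=0}^{k} λ^j / j!; a Python sketch (illustrative only):

```python
from math import exp, factorial

def poisson_cdf(k, lam):
    # P(x <= k) for a Poisson random variable with parameter lambda
    return exp(-lam) * sum(lam**j / factorial(j) for j in range(k + 1))

# Spot-checks against Table A.4:
print(round(poisson_cdf(4, 2.0), 6))  # 0.947347 (lambda = 2 row)
print(round(poisson_cdf(0, 0.1), 6))  # 0.904837 (lambda = 0.1 row)
```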
TABLE A.5: Poisson CDF for λ = 5, 5.5, . . . , 8.5
λ k +0 +1 +2 +3 +4
5 0 0.006738 0.040428 0.124652 0.265026 0.440493
5 0.615961 0.762183 0.866628 0.931906 0.968172
10 0.986305 0.994547 0.997981 0.999302 0.999774
15 0.999931 0.999980 0.999995 0.999999 1.000000
5.5 0 0.004087 0.026564 0.088376 0.201699 0.357518
5 0.528919 0.686036 0.809485 0.894357 0.946223
10 0.974749 0.989012 0.995549 0.998315 0.999401
15 0.999800 0.999937 0.999981 0.999995 0.999999
6 0 0.002479 0.017351 0.061969 0.151204 0.285057
5 0.445680 0.606303 0.743980 0.847238 0.916076
10 0.957379 0.979908 0.991173 0.996372 0.998600
15 0.999491 0.999825 0.999943 0.999982 0.999995
6.5 0 0.001503 0.011276 0.043036 0.111850 0.223672
5 0.369041 0.526524 0.672758 0.791573 0.877384
10 0.933161 0.966120 0.983973 0.992900 0.997044
15 0.998840 0.999570 0.999849 0.999949 0.999984
7 0 0.000912 0.007295 0.029636 0.081765 0.172992
5 0.300708 0.449711 0.598714 0.729091 0.830496
10 0.901479 0.946650 0.973000 0.987189 0.994283
15 0.997593 0.999042 0.999638 0.999870 0.999956
7.5 0 0.000553 0.004701 0.020257 0.059145 0.132062
5 0.241436 0.378155 0.524639 0.661967 0.776408
10 0.862238 0.920759 0.957334 0.978435 0.989740
15 0.995392 0.998041 0.999210 0.999697 0.999889
20 0.999961 0.999987 0.999996 0.999999 1.000000
TABLE A.5: Poisson CDF for λ = 5, 5.5, . . . , 8.5 (Continued)
λ k +0 +1 +2 +3 +4
8 0 0.000335 0.003019 0.013754 0.042380 0.099632
5 0.191236 0.313374 0.452961 0.592547 0.716624
10 0.815886 0.888076 0.936203 0.965819 0.982743
15 0.991769 0.996282 0.998406 0.999350 0.999747
20 0.999906 0.999967 0.999989 0.999996 0.999999
8.5 0 0.000203 0.001933 0.009283 0.030109 0.074364
5 0.149597 0.256178 0.385597 0.523105 0.652974
10 0.763362 0.848662 0.909083 0.948589 0.972575
15 0.986167 0.993387 0.996998 0.998703 0.999465
20 0.999789 0.999921 0.999971 0.999990 0.999997
TABLE A.6: Poisson CDF for λ = 9, 9.5, 10, 11, 12, 13
λ k +0 +1 +2 +3 +4
9 0 0.000123 0.001234 0.006232 0.021226 0.054964
5 0.115691 0.206781 0.323897 0.455653 0.587408
10 0.705988 0.803008 0.875773 0.926149 0.958534
15 0.977964 0.988894 0.994680 0.997574 0.998944
20 0.999561 0.999825 0.999933 0.999975 0.999991
9.5 0 0.000075 0.000786 0.004164 0.014860 0.040263
5 0.088528 0.164949 0.268663 0.391824 0.521826
10 0.645328 0.751990 0.836430 0.898136 0.940008
15 0.966527 0.982273 0.991072 0.995716 0.998038
20 0.999141 0.999639 0.999855 0.999944 0.999979
10 0 0.000045 0.000499 0.002769 0.010336 0.029253
5 0.067086 0.130141 0.220221 0.332820 0.457930
10 0.583040 0.696776 0.791556 0.864464 0.916542
TABLE A.6: Poisson CDF for λ = 9, 9.5, 10, 11, 12, 13 (Continued)
λ k +0 +1 +2 +3 +4
15 0.951260 0.972958 0.985722 0.992813 0.996546
20 0.998412 0.999300 0.999704 0.999880 0.999953
11 0 0.000017 0.000200 0.001211 0.004916 0.015105
5 0.037520 0.078614 0.143192 0.231985 0.340511
10 0.459889 0.579267 0.688697 0.781291 0.854044
15 0.907396 0.944076 0.967809 0.982313 0.990711
20 0.995329 0.997748 0.998958 0.999536 0.999801
25 0.999918 0.999967 0.999987 0.999995 0.999998
12 0 0.000006 0.000080 0.000522 0.002292 0.007600
5 0.020341 0.045822 0.089505 0.155028 0.242392
10 0.347229 0.461597 0.575965 0.681536 0.772025
15 0.844416 0.898709 0.937034 0.962584 0.978720
20 0.988402 0.993935 0.996953 0.998527 0.999314
25 0.999692 0.999867 0.999944 0.999977 0.999991
13 0 0.000002 0.000032 0.000223 0.001050 0.003740
5 0.010734 0.025887 0.054028 0.099758 0.165812
10 0.251682 0.353165 0.463105 0.573045 0.675132
15 0.763607 0.835493 0.890465 0.930167 0.957331
20 0.974988 0.985919 0.992378 0.996028 0.998006
25 0.999034 0.999548 0.999796 0.999911 0.999962
TABLE A.7: Poisson CDF for λ = 14, 15, 16, 17, 18
λ k +0 +1 +2 +3 +4
14 0 0.000001 0.000012 0.000094 0.000474 0.001805
5 0.005532 0.014228 0.031620 0.062055 0.109399
10 0.175681 0.260040 0.358458 0.464448 0.570437
TABLE A.7: Poisson CDF for λ = 14, 15, 16, 17, 18 (Continued)
λ k +0 +1 +2 +3 +4
15 0.669360 0.755918 0.827201 0.882643 0.923495
20 0.952092 0.971156 0.983288 0.990672 0.994980
25 0.997392 0.998691 0.999365 0.999702 0.999864
30 0.999940 0.999974 0.999989 0.999996 0.999998
15 0 0.000000 0.000005 0.000039 0.000211 0.000857
5 0.002792 0.007632 0.018002 0.037446 0.069854
10 0.118464 0.184752 0.267611 0.363218 0.465654
15 0.568090 0.664123 0.748859 0.819472 0.875219
20 0.917029 0.946894 0.967256 0.980535 0.988835
25 0.993815 0.996688 0.998284 0.999139 0.999582
30 0.999803 0.999910 0.999960 0.999983 0.999993
16 0 0.000000 0.000002 0.000016 0.000093 0.000400
5 0.001384 0.004006 0.010000 0.021987 0.043298
10 0.077396 0.126993 0.193122 0.274511 0.367527
15 0.466745 0.565962 0.659344 0.742349 0.812249
20 0.868168 0.910773 0.941759 0.963314 0.977685
25 0.986881 0.992541 0.995895 0.997811 0.998869
30 0.999433 0.999724 0.999869 0.999940 0.999973
17 0 0.000000 0.000001 0.000007 0.000041 0.000185
5 0.000675 0.002062 0.005433 0.012596 0.026125
10 0.049124 0.084669 0.135024 0.200873 0.280833
15 0.371454 0.467738 0.564023 0.654958 0.736322
20 0.805481 0.861466 0.904728 0.936704 0.959354
25 0.974755 0.984826 0.991166 0.995016 0.997273
30 0.998552 0.999253 0.999626 0.999817 0.999913
TABLE A.7: Poisson CDF for λ = 14, 15, 16, 17, 18 (Continued)
λ k +0 +1 +2 +3 +4
18 0 0.000000 0.000000 0.000003 0.000018 0.000084
5 0.000324 0.001043 0.002893 0.007056 0.015381
10 0.030366 0.054887 0.091669 0.142598 0.208077
15 0.286653 0.375050 0.468648 0.562245 0.650916
20 0.730720 0.799124 0.855090 0.898890 0.931740
25 0.955392 0.971766 0.982682 0.989700 0.994056
30 0.996669 0.998187 0.999040 0.999506 0.999752
TABLE A.8: Marcum's Q function for α = 0, 0.01, . . . , 1.99
α +0.00 +0.01 +0.02 +0.03 +0.04
0.00 0.500000 0.496011 0.492022 0.488033 0.484046
0.05 0.480061 0.476078 0.472097 0.468119 0.464144
0.10 0.460172 0.456205 0.452242 0.448283 0.444330
0.15 0.440382 0.436441 0.432505 0.428576 0.424655
0.20 0.420740 0.416834 0.412936 0.409046 0.405165
0.25 0.401294 0.397432 0.393580 0.389739 0.385908
0.30 0.382089 0.378281 0.374484 0.370700 0.366928
0.35 0.363169 0.359424 0.355691 0.351973 0.348268
0.40 0.344578 0.340903 0.337243 0.333598 0.329969
0.45 0.326355 0.322758 0.319178 0.315614 0.312067
0.50 0.308538 0.305026 0.301532 0.298056 0.294598
0.55 0.291160 0.287740 0.284339 0.280957 0.277595
0.60 0.274253 0.270931 0.267629 0.264347 0.261086
0.65 0.257846 0.254627 0.251429 0.248252 0.245097
TABLE A.8: Marcum's Q function for α = 0, 0.01, . . . , 1.99 (Continued)
α +0.00 +0.01 +0.02 +0.03 +0.04
0.70 0.241964 0.238852 0.235762 0.232695 0.229650
0.75 0.226627 0.223627 0.220650 0.217695 0.214764
0.80 0.211855 0.208970 0.206108 0.203269 0.200454
0.85 0.197662 0.194894 0.192150 0.189430 0.186733
0.90 0.184060 0.181411 0.178786 0.176186 0.173609
0.95 0.171056 0.168528 0.166023 0.163543 0.161087
1.00 0.158655 0.156248 0.153864 0.151505 0.149170
1.05 0.146859 0.144572 0.142310 0.140071 0.137857
1.10 0.135666 0.133500 0.131357 0.129238 0.127143
1.15 0.125072 0.123024 0.121001 0.119000 0.117023
1.20 0.115070 0.113140 0.111233 0.109349 0.107488
1.25 0.105650 0.103835 0.102042 0.100273 0.098525
1.30 0.096801 0.095098 0.093418 0.091759 0.090123
1.35 0.088508 0.086915 0.085344 0.083793 0.082264
1.40 0.080757 0.079270 0.077804 0.076359 0.074934
1.45 0.073529 0.072145 0.070781 0.069437 0.068112
1.50 0.066807 0.065522 0.064256 0.063008 0.061780
1.55 0.060571 0.059380 0.058208 0.057053 0.055917
1.60 0.054799 0.053699 0.052616 0.051551 0.050503
1.65 0.049471 0.048457 0.047460 0.046479 0.045514
1.70 0.044565 0.043633 0.042716 0.041815 0.040929
1.75 0.040059 0.039204 0.038364 0.037538 0.036727
1.80 0.035930 0.035148 0.034379 0.033625 0.032884
1.85 0.032157 0.031443 0.030742 0.030054 0.029379
1.90 0.028716 0.028067 0.027429 0.026803 0.026190
1.95 0.025588 0.024998 0.024419 0.023852 0.023295
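The Q function tabulated here is Q(α) = P(x > α) for a standard Gaussian x, computable as 0.5 erfc(α/√2); a Python sketch (not part of the original text):

```python
from math import erfc, sqrt

def q(alpha):
    # Q(alpha) = P(x > alpha) for a standard Gaussian random variable x
    return 0.5 * erfc(alpha / sqrt(2.0))

# Spot-checks against Tables A.8 and A.9:
print(round(q(0.0), 6))  # 0.5
print(round(q(1.0), 6))  # 0.158655
print(round(q(2.0), 6))  # 0.02275
```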
TABLE A.9: Marcum's Q function for α = 2, 2.01, . . . , 3.99
α +0.00 +0.01 +0.02 +0.03 +0.04
2.00 0.022750 0.022216 0.021692 0.021178 0.020675
2.05 0.020182 0.019699 0.019226 0.018763 0.018309
2.10 0.017864 0.017429 0.017003 0.016586 0.016177
2.15 0.015778 0.015386 0.015003 0.014629 0.014262
2.20 0.013903 0.013553 0.013209 0.012874 0.012545
2.25 0.012224 0.011911 0.011604 0.011304 0.011011
2.30 0.010724 0.010444 0.010170 0.009903 0.009642
2.35 0.009387 0.009137 0.008894 0.008656 0.008424
2.40 0.008198 0.007976 0.007760 0.007549 0.007344
2.45 0.007143 0.006947 0.006756 0.006569 0.006387
2.50 0.006210 0.006037 0.005868 0.005703 0.005543
2.55 0.005386 0.005234 0.005085 0.004940 0.004799
2.60 0.004661 0.004527 0.004397 0.004269 0.004145
2.65 0.004025 0.003907 0.003793 0.003681 0.003573
2.70 0.003467 0.003364 0.003264 0.003167 0.003072
2.75 0.002980 0.002890 0.002803 0.002718 0.002635
2.80 0.002555 0.002477 0.002401 0.002327 0.002256
2.85 0.002186 0.002118 0.002052 0.001988 0.001926
2.90 0.001866 0.001807 0.001750 0.001695 0.001641
2.95 0.001589 0.001538 0.001489 0.001441 0.001395
3.00 0.001350 0.001306 0.001264 0.001223 0.001183
3.05 0.001144 0.001107 0.001070 0.001035 0.001001
3.10 0.000968 0.000936 0.000904 0.000874 0.000845
3.15 0.000816 0.000789 0.000762 0.000736 0.000711
3.20 0.000687 0.000664 0.000641 0.000619 0.000598
3.25 0.000577 0.000557 0.000538 0.000519 0.000501
3.30 0.000483 0.000467 0.000450 0.000434 0.000419
TABLE A.9: Marcum's Q function for α = 2, 2.01, . . . , 3.99 (Continued)
α +0.00 +0.01 +0.02 +0.03 +0.04
3.35 0.000404 0.000390 0.000376 0.000362 0.000350
3.40 0.000337 0.000325 0.000313 0.000302 0.000291
3.45 0.000280 0.000270 0.000260 0.000251 0.000242
3.50 0.000233 0.000224 0.000216 0.000208 0.000200
3.55 0.000193 0.000185 0.000179 0.000172 0.000165
3.60 0.000159 0.000153 0.000147 0.000142 0.000136
3.65 0.000131 0.000126 0.000121 0.000117 0.000112
3.70 0.000108 0.000104 0.000100 0.000096 0.000092
3.75 0.000088 0.000085 0.000082 0.000078 0.000075
3.80 0.000072 0.000070 0.000067 0.000064 0.000062
3.85 0.000059 0.000057 0.000054 0.000052 0.000050
3.90 0.000048 0.000046 0.000044 0.000042 0.000041
3.95 0.000039 0.000037 0.000036 0.000034 0.000033
We ﬁrst evaluate the probability distribution of a function of one random variable using the CDF and then the PDF. The theory material is presented in a logical manner—developing special mathematical skills as needed. Poisson distributions. The primary subjects of the ﬁnal chapter are methods for determining the probability distribution of a function of a random variable. The exponential. engineers and researchers presenting applications of this theory to a wide variety of problems—as well as pursuing these topics at a more advanced level. Many important properties of jointly Gaussian random variables are presented. Gaussian random variables . KEYWORDS Probability Theory. The aim of all three books is as an introduction to probability theory. the joint probability distribution is found from a function of two random variables using the joint PDF and the CDF. Random Processes. straightforward exercises designed to reinforce concepts and develop problem solution skills. Gaussian distributions Bernoulli PMF and Gaussian CDF. Pertinent biomedical engineering examples are throughout the text.iv ABSTRACT This is the third in a series of short books on probability theory and random processes for biomedical engineers. Next. as well as important approximations to the Bernoulli PMF and Gaussian CDF. Poisson and Gaussian distributions are introduced. Drill problems. Exponential distributions. Then. the probability distribution for a single random variable is determined from a function of two random variables using the CDF. Engineering Statistics. The audience includes students. This book focuses on standard probability distributions commonly encountered in biomedical engineering. follow most sections. The mathematical background required of the reader is basic knowledge of differential calculus. Probability and Statistics for Biomedical Engineers.
. . . . . . . . . . . . . . . . . . .18 5. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4 5. . . . . . . . . . . . . . . . . . . . . . . . . . . .4 Poisson Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .2. 26 5. . . . . . 75 6. . .6 Problems . . . . . 63 6. . . . . . . . . . . . . . . . . . . . . . . . . .3 Conditional PDF Technique . . . . . . . . . .4 Bivariate Transformations . . . . . . . . . . . . Standard Probability Distributions . . . . . . . . . . . . . .3 Bernoulli Trials . . . . . . . . . . . . . . . . . . .6 Bivariate Gaussian Random Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .2. . . . 36 5. . . . . . . . . . . . . . . . . . . . . .1 Bivariate CDF Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57 6. . . 20 5.2 Univariate PDF Technique . . . . . . . . . . . . . . . . . 56 6. . . . 53 6. . . . . . . . . . . .1 Constant Contours. . 53 6. . . .2 Mixed Random Variable . . . . . . . . . . . .1. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .3 One Function of Two Random Variables . . . . . . . . . . .4. . . . . . . . . . . . . . . . . . . . . . . .2 CDF Technique with Arbitrary Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .1 Univariate CDF Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .8 Problems . . . . . . . . . . . . .5 Univariate Gaussian Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .3. . . .6. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .1 Uniform Distributions . . . . . .2 Bivariate PDF Technique . . . .1. . . . . 1 5. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . . . . . 14 5. .1 Interarrival Times. . . . . . . . . . . . . . . . . . . . . . . .v Contents 5. . . . . . . . . . . .32 5. . . . . . . . . . . . .2. . . . . . . . . . . . . . . 59 6.1 Poisson Approximation to Bernoulli . . . . . . . . . . . . . . . . . . . . . . . . . . .5. . . . . . . . .2 Exponential Distributions . . . . . . . . . . . . . .1 CDF Technique with Monotonic Functions . . .4. . . . 12 5. . . . . . . . .2 Gaussian Approximation to Bernoulli . . . . . . . . . . . 45 6. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11 5. . . . . . . . . . . . . . . . . . . . 73 6. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65 6. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .1 Marcum’s Q Function . . . . . . . . . . . . . . . . . . . . . . . . . 45 6. . . . . . . . . . . . . . . . . . . . . . . . .3. . . . . . . . . . . . . . . . . . . 36 Transformations of Random Variables . .1 Continuous Random Variable . . . . . . . . . . . . . . . . . . . 25 5. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6 5. . . . . . . . . 45 6. . . . . . . . . . 1 5. . . . . . . . . 46 6. . . . . . . . . . .7 Summary . . . . . . . . . . . . . . . .5 Summary . . . . . . . . . . . . . . . . . . . . 63 6. . . . . . . . .
.
Preface

This is the third in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students at the sophomore, junior or senior level for the application of this theory to a wide variety of problems, as well as to pursue these topics at a more advanced level. Our approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability theory. These key concepts are all presented in the first chapter. Later chapters simply expand upon these key ideas and extend the range of application. The mathematical background required of the reader is basic knowledge of differential calculus. Every effort has been made to be consistent with commonly used notation and terminology, both within the engineering community as well as the probability and statistics literature.

The second chapter introduces the topic of random variables. The third chapter focuses on expectation, standard deviation, moments, and the characteristic function. In addition, conditional expectation, conditional moments and the conditional characteristic function are also discussed. The fourth chapter introduces jointly distributed random variables, along with joint expectation, joint moments, and the joint characteristic function. Convolution is also developed.

This short book focuses on standard probability distributions commonly encountered in biomedical engineering. Here in Chapter 5, the exponential, Poisson and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly distributed Gaussian random variables are presented. The primary subjects of Chapter 6 are methods for determining the probability distribution of a function of a random variable. We first evaluate the probability distribution of a function of one random variable using the CDF and then the PDF. Next, the probability distribution for a single random variable is determined from a function of two random variables using the CDF. Then, the joint probability distribution is found from a function of two random variables using the joint PDF and the CDF.

A considerable effort has been made to develop the theory in a logical manner, developing special mathematical skills as needed. We have found it best to introduce this material using simple examples such as dice and cards, rather than more complex biological and biomedical phenomena. However, we do introduce some pertinent biomedical engineering examples throughout the text. Students in other fields should also find the approach useful. Drill problems, straightforward exercises designed to reinforce concepts and develop problem solution skills, follow most sections. The answers to the drill problems follow the problem statement in random order. At the end of each chapter is a wide selection of problems, ranging from simple to difficult, presented in the same general order as covered in the textbook. Many of the examples and end of chapter problems are based on examples from the textbook by Drake [9]. The applications and examples given reflect the authors' background in teaching probability theory and random processes for many years. We acknowledge and thank William Pruehsner for the technical illustrations.
CHAPTER 5

Standard Probability Distributions

A surprisingly small number of probability distributions describe many natural probabilistic phenomena. This chapter presents some of these discrete and continuous probability distributions that occur often enough in a variety of problems to deserve special mention. We will see that many random variables and their corresponding experiments have similar properties and can be described by the same probability distribution. Each section introduces a new PMF or PDF. Following this, the mean, variance, and characteristic function are found. Additionally, special properties are pointed out along with relationships among other probability distributions. In some instances, the PMF or PDF is derived according to the characteristics of the experiment. Because of the vast number of probability distributions, we cannot possibly discuss them all here in this chapter.

5.1 UNIFORM DISTRIBUTIONS

Definition 5.1.1. The discrete RV x has a uniform distribution over n points (n > 1) on the interval [a, b] if x is a lattice RV with span h = (b − a)/(n − 1) and PMF

p_x(α) = 1/n, α = kh + a, k = 0, 1, …, n − 1; 0, otherwise.   (5.1)

The mean and variance of a discrete uniform RV are easily computed with the aid of Lemma 2.3.1:

η_x = (1/n) Σ_{k=0}^{n−1} (kh + a) = (h/n) (n(n − 1)/2) + a = (b − a)/2 + a = (b + a)/2,   (5.2)

and

σ_x² = (1/n) Σ_{k=0}^{n−1} (kh − (b − a)/2)² = ((b − a)²/(n − 1)²) (1/n) Σ_{k=0}^{n−1} (k² − k(n − 1) + (n − 1)²/4).   (5.3)

Simplifying,

σ_x² = ((b − a)²/12) ((n + 1)/(n − 1)).   (5.4)
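As a numerical cross-check of the mean and variance results above, the following sketch (not part of the text) compares a direct computation over the lattice points with the closed-form expressions; the values n = 20, a = 0, b = 1 are the parameters used in Figure 5.1 but are otherwise arbitrary.

```python
# Compare a direct computation over the lattice points with the closed-form
# discrete uniform results eta = (a + b)/2 and var = ((b - a)^2/12)(n + 1)/(n - 1).
n, a, b = 20, 0.0, 1.0
h = (b - a) / (n - 1)                      # lattice span
points = [k * h + a for k in range(n)]     # each point has probability 1/n

eta = sum(points) / n
var = sum((alpha - eta) ** 2 for alpha in points) / n

eta_formula = (a + b) / 2
var_formula = (b - a) ** 2 / 12 * (n + 1) / (n - 1)
```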
The characteristic function can be found using the sum of a geometric series:

φ_x(t) = (e^{jat}/n) Σ_{k=0}^{n−1} (e^{jht})^k = (e^{jat}/n) (1 − e^{jhnt})/(1 − e^{jht}).   (5.5)

Simplifying with the aid of Euler's identity,

φ_x(t) = exp(j (a + b) t/2) sin(n(b − a)t/(2(n − 1))) / (n sin((b − a)t/(2(n − 1)))).   (5.6)

Figure 5.1 illustrates the PMF and the magnitude of the characteristic function for a discrete RV which is uniformly distributed over 20 points on [0, 1], where the span h = 1/19. Recall from Section 3.3 that φ_x(−t) = φ_x*(t) and that φ_x(t) is periodic in t with period 2π/h. The characteristic function is plotted over [0, π/h]; Figure 5.1 thus illustrates one-half period of φ_x(·).

FIGURE 5.1: (a) PMF and (b) characteristic function magnitude for discrete RV with uniform distribution over 20 points on [0, 1].

Definition 5.1.2. The continuous RV x has a uniform distribution on the interval [a, b] if x has PDF

f_x(α) = 1/(b − a), a ≤ α ≤ b; 0, otherwise.   (5.7)

The mean and variance of a continuous uniform RV are easily computed directly:

η_x = (1/(b − a)) ∫_a^b α dα = (b² − a²)/(2(b − a)) = (b + a)/2,   (5.8)

and

σ_x² = (1/(b − a)) ∫_a^b (α − (b + a)/2)² dα = (b − a)²/12.   (5.9)

The characteristic function can be found as

φ_x(t) = (1/(b − a)) ∫_a^b e^{jαt} dα = (e^{j(b+a)t/2}/(b − a)) ∫_{−(b−a)/2}^{(b−a)/2} e^{jαt} dα.

Simplifying with the aid of Euler's identity,

φ_x(t) = exp(j (a + b) t/2) sin((b − a)t/2) / ((b − a)t/2).   (5.10)

Figure 5.2 illustrates the PDF and the magnitude of the characteristic function for a continuous RV uniformly distributed on [0, 1]. Note that the characteristic function in this case is not periodic but φ_x(−t) = φ_x*(t).

FIGURE 5.2: (a) PDF and (b) characteristic function magnitude for continuous RV with uniform distribution on [0, 1].

Drill Problem 5.1.1. A pentahedral die (with faces labeled 0, 1, 2, 3, 4) is tossed once. Let x be a random variable equaling ten times the number tossed. Determine: (a) p_x(20); (b) P(10 ≤ x ≤ 50); (c) E(x); (d) σ_x².

Answers: 20, 200, 0.2, 0.8.

Drill Problem 5.1.2. Random variable x is uniformly distributed on the interval [−1, 5]. Determine: (a) F_x(0); (b) F_x(5); (c) E(x); (d) σ_x².

Answers: 1, 1/6, 2, 3.
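Drill Problem 5.1.1 (the pentahedral die) is small enough to check by direct enumeration; the following is an illustrative sketch only.

```python
# Fair die with faces 0..4; x = 10 * (number tossed), each outcome has mass 1/5.
outcomes = [10 * face for face in range(5)]

p_20 = sum(1 for v in outcomes if v == 20) / len(outcomes)
p_10_to_50 = sum(1 for v in outcomes if 10 <= v <= 50) / len(outcomes)
mean_x = sum(outcomes) / len(outcomes)
var_x = sum((v - mean_x) ** 2 for v in outcomes) / len(outcomes)
```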
5.2 EXPONENTIAL DISTRIBUTIONS

Definition 5.2.1. The discrete RV x has a geometric distribution or discrete exponential distribution with parameter p (0 < p < 1) if x has PMF

p_x(α) = p(1 − p)^{α−1}, α = 1, 2, …; 0, otherwise.   (5.11)

The characteristic function can be found using the sum of a geometric series (q = 1 − p):

φ_x(t) = (p/q) Σ_{k=1}^∞ (q e^{jt})^k = p e^{jt}/(1 − q e^{jt}).   (5.12)

The mean and variance of a discrete RV with a geometric distribution can be computed using the moment generating property of the characteristic function. The results are

η_x = 1/p, and σ_x² = q/p².   (5.13)

It can be shown that a discrete exponentially distributed RV has a memoryless property:

p_{x|x>ℓ}(α|x > ℓ) = p_x(α − ℓ), ℓ ≥ 0.   (5.14)

Figure 5.3 illustrates the PMF and the characteristic function magnitude for a discrete RV with geometric distribution and parameter p = 0.18127.

FIGURE 5.3: (a) PMF and (b) characteristic function magnitude for discrete RV with geometric distribution [p = 0.18127].

Definition 5.2.2. The continuous RV x has an exponential distribution with parameter λ (λ > 0) if x has PDF

f_x(α) = λ e^{−λα} u(α),   (5.15)

where u(·) is the unit step function.
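The memoryless property of the geometric PMF can be illustrated numerically; this sketch uses the parameter p = 0.18127 of Figure 5.3, with ell and k chosen arbitrarily.

```python
# For a geometric RV, P(x = ell + k | x > ell) should equal P(x = k).
p = 0.18127
q = 1.0 - p

def geometric_pmf(alpha):
    # p (1 - p)^(alpha - 1), support alpha = 1, 2, ...
    return p * q ** (alpha - 1)

ell, k = 4, 7
tail = q ** ell                            # P(x > ell) for a geometric RV
conditional = geometric_pmf(ell + k) / tail
unconditional = geometric_pmf(k)
```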
Figure 5.4 illustrates the PDF and the magnitude of the characteristic function for a continuous RV with exponential distribution and parameter λ = 0.2.

FIGURE 5.4: (a) PDF and (b) characteristic function magnitude for continuous RV with exponential distribution and parameter λ = 0.2.

A continuous exponentially distributed RV, like its discrete counterpart, satisfies a memoryless property:

f_{x|x>τ}(α|x > τ) = f_x(α − τ), τ ≥ 0.   (5.16)

The mean and variance of a continuous exponentially distributed RV can be obtained using the moment generating property of the characteristic function. The results are

η_x = 1/λ, and σ_x² = 1/λ².   (5.17)

The characteristic function can be found as

φ_x(t) = λ ∫_0^∞ e^{α(jt−λ)} dα = λ/(λ − jt).   (5.18)

The exponential probability distribution is also a very important probability density function in biomedical engineering applications, arising in situations involving reliability theory and queuing problems. Reliability theory, which describes the time to failure for a system or component, grew primarily out of military applications and experiences with multicomponent systems. Queuing theory describes the waiting times between events.

Example 5.2.1. Suppose a system contains a component that has an exponential failure rate. Reliability engineers determined its reliability at 5000 hours to be 95%. Determine the number of hours reliable at 99%.

Solution. First, the parameter λ is determined from

0.95 = P(x > 5000) = ∫_{5000}^∞ λ e^{−λα} dα = e^{−5000λ}.

Thus

λ = −ln(0.95)/5000 = 1.03 × 10⁻⁵.

Then, to determine the number of hours reliable at 99%, we solve for α from

P(x > α) = e^{−λα} = 0.99,

or

α = −ln(0.99)/λ = 980 hours.

Drill Problem 5.2.1. Suppose a system has an exponential failure rate in years to failure with λ = 0.02. Determine the number of years reliable at: (a) 90%; (b) 95%; (c) 99%.

Answers: 5.3, 2.6, 0.5.

Drill Problem 5.2.2. Random variable x, representing the length of time in hours to complete an examination in Introduction to Random Processes, has PDF

f_x(α) = (4/3) e^{−4α/3} u(α).

The examination results are given by

g(x) = 75, 0 < x < 4/3; 75 + 39.44(x − 4/3), x ≥ 4/3; 0, otherwise.

Determine the average examination grade.

Answer: 80.
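Example 5.2.1 can be reproduced directly from the reliability function R(t) = e^{−λt}; a sketch with the values used in the example:

```python
import math

# Solve for lambda from R(5000) = 0.95, then find the time t with R(t) = 0.99.
lam = -math.log(0.95) / 5000.0
hours_at_99 = -math.log(0.99) / lam
```

Rounded as in the text, lam is approximately 1.03e-5 per hour and the 99% reliability point is about 980 hours.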
5.3 BERNOULLI TRIALS

A Bernoulli experiment consists of a number of repeated (independent) trials with only two possible events for each trial. The events for each trial can be thought of as any two events which partition the sample space, such as a head and a tail in a coin toss, a zero or one in a computer bit, or an even and odd number in a die toss. Let us call one of the events a success, the other a failure. The Bernoulli PMF describes the probability of k successes in n trials of a Bernoulli experiment. The first two chapters used this PMF repeatedly in problems dealing with games of chance and in situations where there were only two possible outcomes in any given trial. For biomedical engineers, the Bernoulli distribution is used in infectious disease problems and other applications. The Bernoulli distribution is also known as a binomial distribution.

Definition 5.3.1. A discrete RV x is Bernoulli distributed if the PMF for x is

p_x(k) = (n choose k) p^k q^{n−k}, k = 0, 1, …, n; 0, otherwise,   (5.19)

where p = probability of success and q = 1 − p.

The characteristic function can be found using the binomial theorem:

φ_x(t) = Σ_{k=0}^n (n choose k) (p e^{jt})^k q^{n−k} = (q + p e^{jt})^n.   (5.20)

Using the moment generating property of characteristic functions, the mean and variance of a Bernoulli RV can be shown to be

η_x = np, σ_x² = npq.   (5.21)

Figure 5.5 illustrates the PMF and the characteristic function magnitude for a discrete RV with Bernoulli distribution, p = 0.2 and n = 30.

FIGURE 5.5: (a) PMF and (b) characteristic function magnitude for discrete RV with Bernoulli distribution, p = 0.2 and n = 30.

Unlike the preceding distributions, a closed form expression for the Bernoulli CDF is not easily obtained. Tables A.1–A.3 in the Appendix list values of the Bernoulli CDF for p = 0.05, 0.1, 0.15, …, 0.5 and n = 5, 10, 15, and 20. Note that

(n choose n − m) = n!/(m!(n − m)!) = (n choose m).

Let k ∈ {0, 1, …, n − 1} and define

G(n, k, p) = Σ_{ℓ=0}^k (n choose ℓ) p^ℓ (1 − p)^{n−ℓ}.

Making the change of variable m = n − ℓ yields

G(n, k, p) = Σ_{m=n−k}^n (n choose m) p^{n−m} (1 − p)^m.

Now,

G(n, k, p) = Σ_{m=0}^n (n choose m) p^{n−m}(1 − p)^m − Σ_{m=0}^{n−k−1} (n choose m) p^{n−m}(1 − p)^m.

Using the Binomial Theorem, the first sum equals one; hence

G(n, k, p) = 1 − G(n, n − k − 1, 1 − p).   (5.22)

This result is easily applied to obtain values of the Bernoulli CDF for values of p > 0.5.

Example 5.3.1. The probability that Fargo Polytechnic Institute wins a game is 0.7. In a 15 game season, what is the probability that they win: (a) at least 10 games; (b) from 9 to 12 games; (c) exactly 11 games? (d) With x denoting the number of games won, find η_x and σ_x².

Solution. With x a Bernoulli random variable, we consult Table A.2, using (5.22) with n = 15, k = 9, and p = 0.7, and find

a) P(x ≥ 10) = 1 − F_x(9) = 1.0 − 0.2784 = 0.7216,
b) P(9 ≤ x ≤ 12) = F_x(12) − F_x(8) = 0.8732 − 0.1311 = 0.7421,
c) p_x(11) = F_x(11) − F_x(10) = 0.7031 − 0.4845 = 0.2186,
d) η_x = np = 10.5, σ_x² = np(1 − p) = 3.15.

We now consider the number of trials needed for k successes in a sequence of Bernoulli trials. Let

p(k, n) = P(k successes in n trials) = (n choose k) p^k q^{n−k}, k = 0, 1, …, n; 0, otherwise.   (5.23)

Let RV n_r represent the number of trials to obtain exactly r successes (r ≥ 1). Note that

P(success in ℓth trial | r − 1 successes in previous ℓ − 1 trials) = p,

so that, for ℓ = r, r + 1, …,

P(n_r = ℓ) = p(r − 1, ℓ − 1) p,   (5.24)

where p = p(1, 1) and q = 1 − p. Discrete RV n_r thus has PMF

p_{n_r}(ℓ) = (ℓ − 1 choose r − 1) p^r q^{ℓ−r}, ℓ = r, r + 1, …; 0, otherwise,   (5.25)

where the parameter r is a positive integer. The PMF for the RV n_r is called the negative binomial distribution, also known as the Pólya and the Pascal distribution. Note that with r = 1 the negative binomial PMF is the geometric PMF.

The moment generating function for n_r can be expressed as

M_{n_r}(λ) = Σ_{ℓ=r}^∞ ((ℓ − 1)(ℓ − 2) ⋯ (ℓ − r + 1)/(r − 1)!) p^r q^{ℓ−r} e^{λℓ}.   (5.26)

Letting m = ℓ − r, we obtain

M_{n_r}(λ) = e^{λr} p^r Σ_{m=0}^∞ ((m + r − 1)(m + r − 2) ⋯ (m + 1)/(r − 1)!) (q e^λ)^m.

With

s(x) = Σ_{k=0}^∞ x^k = 1/(1 − x), |x| < 1,

we have

s^{(ℓ)}(x) = Σ_{k=ℓ}^∞ k(k − 1) ⋯ (k − ℓ + 1) x^{k−ℓ} = Σ_{m=0}^∞ (m + ℓ)(m + ℓ − 1) ⋯ (m + 1) x^m = ℓ!/(1 − x)^{ℓ+1}.

Hence

M_{n_r}(λ) = (p e^λ/(1 − q e^λ))^r, q e^λ < 1.   (5.27)

The mean and variance for n_r are found to be

η_{n_r} = r/p, and σ_{n_r}² = rq/p².   (5.28)

We note that the characteristic function is simply

φ_{n_r}(t) = M_{n_r}(jt) = φ_x^r(t),   (5.29)

where RV x has a discrete geometric distribution.

Figure 5.6 illustrates the PMF and the characteristic function magnitude for a discrete RV with negative binomial distribution, r = 3 and p = 0.18127.

FIGURE 5.6: (a) PMF and (b) magnitude characteristic function for discrete RV with negative binomial distribution, r = 3, and p = 0.18127.
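The binomial manipulations of this section lend themselves to direct computation. The sketch below rebuilds the Bernoulli CDF, checks the reflection identity used with the tables, reproduces the numbers of Example 5.3.1, and confirms the negative binomial mean r/p and variance rq/p² by brute-force summation; the truncation point 2000 is an arbitrary choice with negligible tail mass.

```python
from math import comb

def bernoulli_cdf(n, k, p):
    # G(n, k, p) = P(x <= k) for a Bernoulli (binomial) RV
    return sum(comb(n, l) * p ** l * (1 - p) ** (n - l) for l in range(k + 1))

# Reflection identity: G(n, k, p) = 1 - G(n, n - k - 1, 1 - p).
lhs = bernoulli_cdf(15, 9, 0.7)
rhs = 1 - bernoulli_cdf(15, 5, 0.3)

# Example 5.3.1 (n = 15, p = 0.7).
p_at_least_10 = 1 - bernoulli_cdf(15, 9, 0.7)
p_9_to_12 = bernoulli_cdf(15, 12, 0.7) - bernoulli_cdf(15, 8, 0.7)
p_exactly_11 = bernoulli_cdf(15, 11, 0.7) - bernoulli_cdf(15, 10, 0.7)

# Negative binomial moments for r = 3, p = 0.18127 (the case of Figure 5.6).
r, p = 3, 0.18127
q = 1 - p

def nb_pmf(l):
    # (l - 1 choose r - 1) p^r q^(l - r), support l = r, r + 1, ...
    return comb(l - 1, r - 1) * p ** r * q ** (l - r)

mean_nr = sum(l * nb_pmf(l) for l in range(r, 2000))
var_nr = sum(l * l * nb_pmf(l) for l in range(r, 2000)) - mean_nr ** 2
```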
5.3.1 Poisson Approximation to Bernoulli

When n becomes large in the Bernoulli PMF in such a way that np = λ = constant, the Bernoulli PMF approaches another important PMF known as the Poisson PMF. The Poisson PMF is treated in the following section.

Lemma 5.3.1. We have

p(k) = lim_{n→∞, np=λ} (n choose k) p^k q^{n−k} = λ^k e^{−λ}/k!, k = 0, 1, …; 0, otherwise.   (5.30)

Proof. Substituting p = λ/n and q = 1 − λ/n,

p(k) = lim_{n→∞} (λ^k/k!) (1 − λ/n)^{n−k} (1/n^k) Π_{i=0}^{k−1} (n − i).

Note that

lim_{n→∞} (1 − λ/n)^{−k} (1/n^k) Π_{i=0}^{k−1} (n − i) = 1,

so that

p(k) = lim_{n→∞} (λ^k/k!) (1 − λ/n)^n.

Now,

lim_{n→∞} ln(1 − λ/n)^n = lim_{n→∞} ln(1 − λ/n)/(1/n) = −λ,

so that

lim_{n→∞} (1 − λ/n)^n = e^{−λ},

from which the desired result follows.

We note that the limiting value p(k) may be used as an approximation for the Bernoulli PMF when p is small by substituting λ = np. While there are no prescribed rules regarding the values of n and p for this approximation, the larger the value of n and the smaller the value of p, the better the approximation. Satisfactory results are obtained with np < 10. The motivation for using this approximation is that when n is large, Tables A.1–A.3 are useless for finding values for the Bernoulli CDF.
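The convergence in Lemma 5.3.1 can be observed numerically; in this sketch the values lam = 5 and k = 3 are arbitrary illustration choices.

```python
from math import comb, exp, factorial

lam, k = 5.0, 3
poisson_pk = lam ** k * exp(-lam) / factorial(k)

def bernoulli_pk(n):
    # Bernoulli PMF at k with p = lam / n, so that np = lam is held fixed
    p = lam / n
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

errors = [abs(bernoulli_pk(n) - poisson_pk) for n in (10, 100, 1000, 10000)]
```

The error shrinks roughly like 1/n as n grows.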
Example 5.3.2. Suppose x is a Bernoulli random variable with n = 5000 and p = 0.001. Find P(x ≤ 5).

Solution. Our solution involves approximating the Bernoulli PMF with the Poisson PMF, since n is quite large, p is very close to zero, and np < 10 (and the Bernoulli CDF table is useless). Since λ = np = 5, we find from Table A.5 (the Poisson CDF table is covered in Section 5.4) that P(x ≤ 5) = 0.6160.

5.3.2 Gaussian Approximation to Bernoulli

Previously, the Poisson PMF was used to approximate a Bernoulli PMF under certain conditions: n large, p small, and np < 10. Incidentally, if p is close to one, we can still use this approximation by reversing our definition of success and failure in the Bernoulli experiment, which results in a value of p close to zero (see (5.22)). The Gaussian PDF (see Section 5.5) is also used to approximate a Bernoulli PMF under certain conditions. The accuracy of this approximation is best when n is large, p is close to 1/2, and npq > 3. Notice that in some circumstances np < 10 and npq > 3; then either the Poisson or the Gaussian approximation will yield good results. The Gaussian approximation is quite useful since the Bernoulli table lists only CDF values for n up to 20.

Lemma 5.3.2. Let

y = (x − np)/√(npq),

where x is a Bernoulli RV. Then the characteristic function for y satisfies

φ(t) = lim_{n→∞} φ_y(t) = e^{−t²/2}.   (5.31)

Proof. We have

φ_y(t) = exp(−j np t/√(npq)) φ_x(t/√(npq)).

Substituting for φ_x(t) and simplifying,

φ_y(t) = (q exp(−j t p/√(npq)) + p exp(j t q/√(npq)))^n.   (5.32)

Letting β = √(q/p) and α = 1/√n, so that n = 1/α² and √(npq) = pβ/α, we obtain

ln φ_y(t) = ln(pβ² e^{−jtα/β} + p e^{jtβα})/α².

Applying L'Hôpital's Rule twice,

lim_{α→0} ln φ_y(t) = lim_{α→0} (−jtpβ e^{−jtα/β} + jtpβ e^{jtβα})/(2α) = (−t²p − t²β²p)/2 = −t²/2,

since p + pβ² = p + q = 1. Consequently,

lim_{n→∞} φ_y(t) = exp(lim_{n→∞} ln φ_y(t)) = e^{−t²/2}.

The limiting φ(t) in the above lemma is the characteristic function for a Gaussian RV with zero mean and unit variance. Hence, for large n and a < b,

P(a < x < b) = P(a′ < y < b′) ≈ F(b′) − F(a′),   (5.33)

where

a′ = (a − np)/√(npq), b′ = (b − np)/√(npq),   (5.34)

F(γ) = (1/√(2π)) ∫_{−∞}^γ e^{−α²/2} dα = 1 − Q(γ)   (5.35)

is the standard Gaussian CDF, and Q(·) is Marcum's Q function, which is tabulated in Tables A.8 and A.9 of the Appendix. Evaluation of the above integral, as well as the Gaussian PDF, is treated in Section 5.5.

Example 5.3.4. Suppose x is a Bernoulli random variable with n = 5000 and p = 0.4. Find P(x ≤ 2048).

Solution. The solution involves approximating the Bernoulli CDF with the Gaussian CDF, since npq = 1200 > 3. With np = 2000, npq = 1200 and b′ = (2048 − 2000)/34.641 = 1.39, we find from Table A.8 that

P(x ≤ 2048) ≈ F(1.39) = 1 − Q(1.39) = 0.91774.
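Example 5.3.4 can be cross-checked in software; the sketch below compares the Gaussian approximation F(b′) with an exact summation of the Bernoulli CDF (the error function is used here only as a stand-in for the tabulated F, and the PMF is evaluated in log space to avoid overflow at n = 5000).

```python
from math import erf, exp, lgamma, log, sqrt

n, p = 5000, 0.4
mu, sigma = n * p, sqrt(n * p * (1 - p))

def F(gamma):
    # standard Gaussian CDF via the error function
    return 0.5 * (1.0 + erf(gamma / sqrt(2.0)))

approx = F((2048 - mu) / sigma)

def log_pmf(k):
    # log of C(n, k) p^k q^(n - k)
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

exact = sum(exp(log_pmf(k)) for k in range(2049))
```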
0. Since npq = 5 > 3.5 − 10)/ 5 = −1.5 − 10)/ √ 5 = −0. Rensselaer is an avid albeit inaccurate marksman.67.7625. 4. From the Bernoulli table.4 POISSON DISTRIBUTION A Poisson PMF describes the number of successes occurring on a continuous line. (c) from two to ﬁve of the next twenty cars purchased will be white.25143 − 0. It is important to note that while the approximation is excellent in terms of the CDFs—the PDF of any discrete RV is never approximated with a continuous PDF. which is very close to the above approximation. or within a given region. A survey of residents in Fargo. . (b) the standard deviation for 15 shots.12. Find P (x = 8). to compute the probability that a Bernoulli RV takes an integer value using the Gaussian approximation we must round off to the nearest integer.14 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS When approximating the Bernoulli CDF with the Gaussian CDF. North Dakota revealed that 30% preferred a white automobile over all other colors. Answers: 2. we have p x (8) = P (7. Solution.67) − F(−1. p x (8) ≈ 0.1789. we found that the limit (as n → ∞ and constant mean np) of a Bernoulli PMF is a Poisson PMF.4. p x (8). Determine: (a) the expected number of hits scored in 15 shots. hence. (c) the number of times she must ﬁre so that the probability of hitting the target at least once is greater than 1/2. p x (8) = 0. 1. Suppose x is a Bernoulli random variable with n = 20 and p = 0. For example. and b = (8.12) = 0. Operationally. (b) at least ﬁve of the next twenty cars purchased will be white. 0.1.5) ≈ F(−0. In the previous section.3. The probability she will hit the target is only 0.12007. Example 5.5.7748. the Gaussian approximation is used to evaluate the Bernoulli √ PMF.3. 5.1201. or the number of errors per page in this textbook. Drill Problem 5. Drill Problem 5.3.3. typically a time interval. Determine the probability that: (a) exactly ﬁve of the next 20 cars purchased will be white. Answers: 0. In this section. npq = 5. 
we derive the Poisson probability distribution from two fundamental assumptions about the phenomenon based on physical characteristics.2. a Poisson random variable might represent the number of telephone calls per hour. Prof.13136.5.4088. a continuous distribution is used to calculate probabilities for a discrete RV. With np = 10. a = (7.5 < x < 8.
The following development makes use of the order notation o(h) to denote any function g(h) which satisfies

lim_{h→0} g(h)/h = 0.   (5.36)

For example, g(h) = 15h² + 7h³ = o(h). We use the notation

p(k, τ) = P(k successes in interval [0, τ]).   (5.37)

The Poisson probability distribution is characterized by the following two properties:

(1) The number of successes occurring in a time interval or region is independent of the number of successes occurring in any other nonoverlapping time interval or region. Thus, with

A = {k successes in interval I₁} and B = {ℓ successes in interval I₂},   (5.38)

we have

P(A ∩ B) = P(A)P(B), if I₁ ∩ I₂ = ∅.   (5.39)

(2) The probability of a single success during a very small time interval is proportional to the length of the interval; the longer the interval, the greater the probability of success. The probability of more than one success occurring during an interval vanishes as the length of the interval approaches zero. Hence

p(1, h) = λh + o(h)   (5.40)

and

p(0, h) = 1 − λh + o(h).   (5.41)

This second property indicates that for a series of very small intervals, the Poisson process is composed of a series of Bernoulli trials, each with a probability of success p = λh + o(h).

Since [0, τ + h] = [0, τ] ∪ (τ, τ + h] and [0, τ] ∩ (τ, τ + h] = ∅, we have

p(0, τ + h) = p(0, τ) p(0, h) = p(0, τ)(1 − λh + o(h)).   (5.42)
Noting that

(p(0, τ + h) − p(0, τ))/h = −λ p(0, τ) + p(0, τ) o(h)/h,

and taking the limit as h → 0,

dp(0, τ)/dτ = −λ p(0, τ), with p(0, 0) = 1.   (5.43)

This differential equation has solution

p(0, τ) = e^{−λτ} u(τ).   (5.44)

Applying the above properties,

p(k, τ + h) = p(k − 1, τ) p(1, h) + p(k, τ) p(0, h) + o(h), k = 1, 2, …,   (5.45)

so that

p(k, τ + h) = p(k − 1, τ) λh + p(k, τ)(1 − λh) + o(h),

or

(p(k, τ + h) − p(k, τ))/h = λ p(k − 1, τ) − λ p(k, τ) + o(h)/h.

Taking the limit as h → 0,

dp(k, τ)/dτ + λ p(k, τ) = λ p(k − 1, τ), k = 1, 2, …, with p(k, 0) = 0.   (5.46)

It can be shown ([7, 8]) that

p(k, τ) = λ e^{−λτ} ∫_0^τ e^{λt} p(k − 1, t) dt,   (5.47)

and hence, by induction, that

p(k, τ) = ((λτ)^k/k!) e^{−λτ} u(τ), k = 0, 1, 2, ….   (5.48)

The RV x = number of successes thus has a Poisson distribution with parameter λτ and PMF p_x(k) = p(k, τ). The rate of the Poisson process is λ and the interval length is τ. For ease in subsequent development, we replace the parameter λτ with λ, so that

p_x(k) = (λ^k/k!) e^{−λ}, k = 0, 1, 2, ….

The characteristic function for a Poisson RV x with parameter λ is found as

φ_x(t) = e^{−λ} Σ_{k=0}^∞ (λ e^{jt})^k/k! = e^{−λ} exp(λ e^{jt}) = exp(λ(e^{jt} − 1)).   (5.49)

Figure 5.7 illustrates the PMF and characteristic function magnitude for a discrete RV with Poisson distribution and parameter λ = 10.

FIGURE 5.7: (a) PMF and (b) magnitude characteristic function for Poisson distributed RV with parameter λ = 10.

It is of interest to note that if x₁ and x₂ are independent Poisson RVs with parameters λ₁ and λ₂, respectively, then

φ_{x₁+x₂}(t) = exp((λ₁ + λ₂)(e^{jt} − 1));   (5.50)

i.e., x₁ + x₂ is also a Poisson RV, with parameter λ₁ + λ₂.

The moments of a Poisson RV are tedious to compute using techniques we have seen so far. Consider the function

ψ_x(γ) = E(γ^x)   (5.51)

and note that

ψ_x^{(k)}(γ) = E(γ^{x−k} Π_{i=0}^{k−1} (x − i)),   (5.52)

so that

E(Π_{i=0}^{k−1} (x − i)) = ψ_x^{(k)}(1).

If x is Poisson distributed with parameter λ, then ψ_x(γ) = e^{λ(γ−1)}, so that

ψ_x^{(k)}(γ) = λ^k e^{λ(γ−1)}.
Hence

E(Π_{i=0}^{k−1} (x − i)) = λ^k.   (5.53)

In particular, E(x) = λ and E(x(x − 1)) = λ² = E(x²) − λ, so that

σ_x² = λ² + λ − λ² = λ.

While it is quite easy to calculate the value of the Poisson PMF for a particular number of successes, hand computation of the CDF is quite tedious. Therefore, the Poisson CDF is tabulated in Tables A.4–A.7 of the Appendix for selected values of λ ranging from 0.1 to 18. Note that the table is written with a finite number of entries for each value of λ because the PMF values are written with six decimal place accuracy, even though an infinite number of Poisson successes are theoretically possible. Incidentally, we note that the value of the Poisson PMF increases as the number of successes k increases from zero to the mean, and then decreases in value as k increases past the mean.

Example 5.4.1. On the average, Professor Rensselaer grades 10 problems per day. What is the probability that on a given day (a) 8 problems are graded; (b) 8–10 problems are graded; (c) at least 15 problems are graded?

Solution. With x a Poisson random variable, we consult the Poisson CDF table with λ = 10, and find

a) p_x(8) = F_x(8) − F_x(7) = 0.3328 − 0.2202 = 0.1126,
b) P(8 ≤ x ≤ 10) = F_x(10) − F_x(7) = 0.5830 − 0.2202 = 0.3628,
c) P(x ≥ 15) = 1 − F_x(14) = 1 − 0.9165 = 0.0835.

5.4.1 Interarrival Times

In many instances, the length of time between successes, known as an interarrival time, of a Poisson random variable is more important than the actual number of successes. For example, in evaluating the reliability of a medical device, the time to failure is far more significant to the biomedical engineer than the fact that the device failed. Indeed, the subject of reliability theory is so important that entire textbooks are devoted to the topic. Here, we will briefly examine the subject of interarrival times from the basis of the Poisson PMF.

Let RV t_r denote the length of the time interval from zero to the rth success. Then

p(τ − h < t_r ≤ τ) = p(r − 1, τ − h) p(1, h) = p(r − 1, τ − h)(λh + o(h)),
so that

(F_{t_r}(τ) − F_{t_r}(τ − h))/h = λ p(r − 1, τ − h) + o(h)/h.

Taking the limit as h → 0, we find that the PDF for the rth order interarrival time, that is, the time interval from any starting point to the rth success after it, is

f_{t_r}(τ) = (λ^r τ^{r−1}/(r − 1)!) e^{−λτ} u(τ), r = 1, 2, ….   (5.54)

This PDF is known as the Erlang PDF. Clearly, with r = 1, we have the exponential PDF:

f_t(τ) = λ e^{−λτ} u(τ).   (5.55)

The RV t is called the first-order interarrival time.

The Erlang PDF is a special case of the gamma PDF:

f_x(α) = (λ^r α^{r−1}/Γ(r)) e^{−λα} u(α),   (5.56)

for any real r > 0, λ > 0, where

Γ(r) = ∫_0^∞ α^{r−1} e^{−α} dα   (5.57)

is the gamma function. Straightforward integration reveals that Γ(1) = 1 and Γ(r + 1) = rΓ(r), so that if r is a positive integer then Γ(r) = (r − 1)!; for this reason the gamma function is often called the factorial function. Using the above definition for Γ(r), it is easily shown that the moment generating function for a gamma-distributed RV is

M_x(η) = (λ/(λ − η))^r, for η < λ.   (5.58)

The characteristic function is thus

φ_x(t) = (λ/(λ − jt))^r.   (5.59)

It follows that the mean and variance are

η_x = r/λ, and σ_x² = r/λ².   (5.60)
Figure 5.8 illustrates the PDF and magnitude of the characteristic function for a RV with gamma distribution with r = 3 and λ = 0.2.
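A numerical sketch of the gamma mean and variance r/λ and r/λ² for the parameters of Figure 5.8, using a plain Riemann-sum integration of the PDF; the integration range [0, 200] and the step size are arbitrary choices, and Γ(3) = 2! = 2 for this integer case.

```python
from math import exp, factorial

r, lam = 3, 0.2

def gamma_pdf(a):
    # lam^r a^(r-1) e^(-lam a) / Gamma(r), with Gamma(r) = (r - 1)! for integer r
    return lam ** r * a ** (r - 1) * exp(-lam * a) / factorial(r - 1)

step = 0.01
grid = [i * step for i in range(20001)]    # covers [0, 200]; tail mass is negligible
mean = sum(a * gamma_pdf(a) * step for a in grid)
var = sum(a * a * gamma_pdf(a) * step for a in grid) - mean ** 2
```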
FIGURE 5.8: (a) PDF and (b) magnitude characteristic function for gamma distributed RV with r = 3 and parameter λ = 0.2.
Drill Problem 5.4.1. On the average, Professor S. Rensselaer makes five blunders per lecture. Determine the probability that she makes (a) less than six blunders in the next lecture; (b) exactly five blunders in the next lecture; (c) from three to seven blunders in the next lecture; (d) zero blunders in the next lecture.

Answers: 0.6160, 0.0067, 0.7419, 0.1755.

Drill Problem 5.4.2. A process yields 0.001% defective items. If one million items are produced, determine the probability that the number of defective items exceeds twelve.

Answer: 0.2084.

Drill Problem 5.4.3. Professor S. Rensselaer designs her examinations so that the probability of at least one extremely difficult problem is 0.632. Determine the average number of extremely difficult problems on a Rensselaer examination.

Answer: 1.
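Both the Poisson moment results and Drill Problem 5.4.2 can be cross-checked numerically; a sketch (the 150-term truncation of the Poisson sums is an arbitrary cutoff with negligible tail mass):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    return lam ** k * exp(-lam) / factorial(k)

# Mean and variance of a Poisson RV are both lam (here lam = 10).
lam = 10.0
ks = range(150)
mean = sum(k * poisson_pmf(k, lam) for k in ks)
var = sum(k * k * poisson_pmf(k, lam) for k in ks) - mean ** 2

# Drill Problem 5.4.2: one million items at a 0.001% defect rate gives
# lam = np = 10, and P(more than twelve defectives) = 1 - F_x(12).
defect_lam = 1_000_000 * 0.00001
p_exceeds_12 = 1.0 - sum(poisson_pmf(k, defect_lam) for k in range(13))
```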
5.5 UNIVARIATE GAUSSIAN DISTRIBUTION
The Gaussian PDF is the most important probability distribution in the ﬁeld of biomedical engineering. Plentiful applications arise in industry, research, and nature, ranging from instrumentation errors to scores on examinations. The PDF is named in honor of Gauss (1777–1855), who derived the equation based on an error study involving repeated measurements of the same quantity. However, De Moivre is ﬁrst credited with describing the PDF in 1733. Applications also abound in other areas outside of biomedical engineering since the distribution ﬁts the observed data in many processes. Incidentally, statisticians refer to the Gaussian PDF as the normal PDF.
Definition 5.5.1. A continuous RV z is a standardized Gaussian RV if the PDF is

f_z(α) = (1/√(2π)) e^{−α²/2}.   (5.61)

The moment generating function for a standardized Gaussian RV can be found as follows:

M_z(λ) = (1/√(2π)) ∫_{−∞}^∞ e^{λα − α²/2} dα = (1/√(2π)) ∫_{−∞}^∞ e^{−((α−λ)² − λ²)/2} dα.

Making the change of variable β = α − λ we find

M_z(λ) = e^{λ²/2} ∫_{−∞}^∞ f_z(β) dβ = e^{λ²/2},   (5.62)

for all real λ. We have made use of the fact that the function f_z is a bona fide PDF, as treated in Problem 42. Using the Taylor series expansion for an exponential,

e^{λ²/2} = Σ_{k=0}^∞ λ^{2k}/(2^k k!) = Σ_{n=0}^∞ M_z^{(n)}(0) λ^n/n!,

so that all moments of z exist and

E(z^{2k}) = (2k)!/(2^k k!), k = 0, 1, 2, …,   (5.63)

and

E(z^{2k+1}) = 0, k = 0, 1, 2, ….   (5.64)

Consequently, a standardized Gaussian RV has zero mean and unit variance. Extending the range of definition of M_z(λ) to include the finite complex plane, we find that the characteristic function is

φ_z(t) = M_z(jt) = e^{−t²/2}.   (5.65)

Letting the RV x = σz + η, we find that E(x) = η and σ_x² = σ². For σ > 0,

F_x(α) = P(σz + η ≤ α) = F_z((α − η)/σ),
so that x has the general Gaussian PDF

f_x(α) = (1/√(2πσ²)) exp(−(α − η)²/(2σ²)).   (5.66)

Similarly, with σ > 0 and x = −σz + η, we find

F_x(α) = P(−σz + η ≤ α) = 1 − F_z((η − α)/σ),

so that f_x is as above. We will have occasion to use the shorthand notation x ∼ G(η, σ²) to denote that the RV x has a Gaussian PDF with mean η and variance σ². Note that if x ∼ G(η, σ²) then (with x = σz + η)

φ_x(t) = e^{jηt} e^{−σ²t²/2}.   (5.67)

The Gaussian PDF, illustrated with η = 75 and σ² = 25, as well as with η = 75 and σ² = 9, in Figure 5.9, is a bell-shaped curve completely determined by its mean and variance. As can be seen, the Gaussian PDF is symmetrical about the vertical axis through the expected value, and the area under the curve must equal one. For any combination of the mean and variance, the maximum value of the Gaussian PDF, (2πσ²)^{−1/2}, occurs at α = η. The PDF approaches zero asymptotically as α approaches ±∞. Naturally, the larger the value of the variance, the more spread in the distribution and the smaller the maximum value of the PDF. For any mean, identically shaped curves could be drawn, centered, for example, at 25 instead of 75.

FIGURE 5.9: Gaussian probability density function for η = 75 and σ² = 9.

Unfortunately, a closed form expression does not exist for the Gaussian CDF, which necessitates numerical integration. Rather than attempting to tabulate the general Gaussian CDF for every combination of mean and variance, a normalization is performed to obtain a standardized Gaussian RV (with zero mean and unit variance). If x ∼ G(η, σ²), the RV z = (x − η)/σ is a standardized Gaussian RV: z ∼ G(0, 1). This transformation is always applied when using standard tables for computing probabilities for Gaussian RVs. The probability P(α₁ < x ≤ α₂) can be obtained as

P(α₁ < x ≤ α₂) = F_x(α₂) − F_x(α₁),   (5.68)

using the fact that

F_x(α) = F_z((α − η)/σ).   (5.69)

Note that

F_z(α) = (1/√(2π)) ∫_{−∞}^α e^{−τ²/2} dτ = 1 − Q(α),   (5.70)

where Q(·) is Marcum's Q function:

Q(α) = (1/√(2π)) ∫_α^∞ e^{−τ²/2} dτ.   (5.71)

Marcum's Q function is tabulated in Tables A.8 and A.9 for 0 ≤ α < 4 using the approximation presented in Section 5.5.1. It is easy to show that

Q(−α) = 1 − Q(α) = F_z(α).   (5.72)

The error and complementary error functions, defined by

erf(α) = (2/√π) ∫_0^α e^{−t²} dt   (5.73)

and

erfc(α) = (2/√π) ∫_α^∞ e^{−t²} dt = 1 − erf(α),   (5.74)

are also often used to evaluate the standard normal integral. A simple change of variable reveals that

erfc(α) = 2Q(√2 α).   (5.75)

Example 5.5.1. Compute F_z(−1.74).

Solution. Using (5.72) and Table A.8, we find

F_z(−1.74) = 1 − Q(−1.74) = Q(1.74) = 0.04093.
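Where the tables are unavailable, the relationships above let Q and F_z be computed from the complementary error function; this sketch checks two values from this section's examples.

```python
from math import erfc, sqrt

def Q(gamma):
    # Q(gamma) = 0.5 * erfc(gamma / sqrt(2)); equivalently erfc(a) = 2 Q(a sqrt(2))
    return 0.5 * erfc(gamma / sqrt(2.0))

def Fz(gamma):
    return 1.0 - Q(gamma)

value_551 = Fz(-1.74)              # Example 5.5.1: F_z(-1.74) = Q(1.74)
value_552 = Q(0.2) - Q(1.6)        # P(0.2 <= z <= 1.6) for a standardized Gaussian
```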
24 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS

Example 5.5.1. Compute F_z(−1.74), where z ∼ G(0, 1).

Solution. To compute F_z(−1.74), we use (5.70) and Table A.9 to find F_z(−1.74) = 1 − Q(−1.74) = Q(1.74) = 0.04093.

While the value a Gaussian random variable takes on is any real number between negative infinity and positive infinity, the realistic range of values is much smaller. For example, we note that 99.73% of the area under the curve is contained between −3.0 and 3.0. From the transformation z = (x − η)/σ, the range of values random variable x takes on is then approximately η ± 3σ. This notion does not imply that random variable x cannot take on a value outside this interval, but the probability of it occurring is really very small (2Q(3) = 0.0027).

Example 5.5.2. Suppose x is a Gaussian random variable with η = 35 and σ = 10. Sketch the PDF and then find P(37 ≤ x ≤ 51). Indicate this probability on the sketch.

Solution. The sketch of this PDF is shown in Figure 5.10 along with the indicated probability. The PDF is essentially zero outside the interval [η − 3σ, η + 3σ] = [5, 65]. With

z = (x − 35)/10,

we have

P(37 ≤ x ≤ 51) = P(0.2 ≤ z ≤ 1.6) = F_z(1.6) − F_z(0.2) = Q(0.2) − Q(1.6) = 0.36594

from Table A.9.

FIGURE 5.10: PDF for Example 5.5.2.
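The table lookup in Example 5.5.2 can be reproduced directly in a few lines (an illustrative sketch, not from the text):

```python
import math

def Q(alpha):
    """Marcum's Q function."""
    return 0.5 * math.erfc(alpha / math.sqrt(2.0))

eta, sigma = 35.0, 10.0
# Standardize the limits: z = (x - eta) / sigma
z1 = (37.0 - eta) / sigma   # 0.2
z2 = (51.0 - eta) / sigma   # 1.6
prob = Q(z1) - Q(z2)        # P(37 <= x <= 51)
print(round(prob, 5))       # 0.36594
```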
STANDARD PROBABILITY DISTRIBUTIONS 25

Example 5.5.3. A machine makes capacitors with a mean value of 25 μF and a standard deviation of 6 μF. Assuming that capacitance follows a Gaussian distribution, find the probability that the value of capacitance exceeds 31 μF if capacitance is measured to the nearest μF.

Solution. Let the RV x denote the value of a capacitor. Since we are measuring to the nearest μF, the probability that the measured value exceeds 31 μF is

P(31.5 ≤ x) = P(1.083 ≤ z) = Q(1.083) = 0.13941,

where z = (x − 25)/6 ∼ G(0, 1). This result is determined by linear interpolation of the CDF between the tabulated values at 1.08 and 1.09.

5.5.1 Marcum's Q Function

Marcum's Q function, defined by

Q(γ) = (1/√(2π)) ∫ from γ to ∞ of e^(−α²/2) dα,  (5.76)

has been extensively studied. If the RV z ∼ G(0, 1) then Q(γ) = 1 − F_z(γ); i.e., Q(γ) is the complement of the standard Gaussian CDF. Note that Q(0) = 0.5, Q(∞) = 0, and that F_z(−γ) = Q(γ). A very accurate approximation to Q(γ) is presented in [1, p. 932]:

Q(γ) ≈ e^(−γ²/2) h(t),  γ > 0,  (5.77)

where

t = 1/(1 + 0.2316419 γ)  (5.78)

and

h(t) = (1/√(2π)) t(a₁ + t(a₂ + t(a₃ + t(a₄ + a₅ t)))).  (5.79)

The constants are

i = 1: a₁ = 0.31938153
i = 2: a₂ = −0.356563782
i = 3: a₃ = 1.781477937
i = 4: a₄ = −1.821255978
i = 5: a₅ = 1.330274429

The error in using this approximation is less than 7.5 × 10⁻⁸.
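The polynomial approximation (5.77)–(5.79) is straightforward to implement. The sketch below (ours, not from the text) compares it against an exact evaluation of Q via the complementary error function, confirming the stated 7.5 × 10⁻⁸ error bound.

```python
import math

A = (0.31938153, -0.356563782, 1.781477937, -1.821255978, 1.330274429)

def q_approx(gamma):
    """Approximation (5.77)-(5.79) to Marcum's Q function, gamma > 0."""
    t = 1.0 / (1.0 + 0.2316419 * gamma)
    h = t * (A[0] + t * (A[1] + t * (A[2] + t * (A[3] + A[4] * t))))
    h /= math.sqrt(2.0 * math.pi)
    return math.exp(-0.5 * gamma * gamma) * h

def q_exact(gamma):
    return 0.5 * math.erfc(gamma / math.sqrt(2.0))

# Scan gamma over (0, 10) and record the worst absolute error
worst = max(abs(q_approx(g / 100.0) - q_exact(g / 100.0)) for g in range(1, 1000))
assert worst < 7.5e-8   # error bound quoted in the text
print("max error over (0, 10):", worst)
```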
26 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS

A very useful bound for Q(α) is [1, p. 298]:

√(2/π) e^(−α²/2) / (α + √(α² + 4)) < Q(α) ≤ √(2/π) e^(−α²/2) / (α + √(α² + 0.5π)).  (5.81)

The ratio of the lower bound to the upper bound is 0.946 when α = 3 and 0.967 when α = 4. The bound improves as α increases.

Sometimes, it is desired to find the value of γ for which Q(γ) = q. Helstrom [14] offers an iterative procedure which begins with an initial guess γ₀ > 0. Then compute

t_i = 1/(1 + 0.2316419 γ_i)  (5.82)

and

γ_{i+1} = (2 ln(h(t_i)/q))^(1/2),  i = 0, 1, . . . .  (5.83)

The procedure is terminated when γ_{i+1} ≈ γ_i to the desired degree of accuracy.

Drill Problem 5.5.1. The quality point averages of 2500 freshmen at Fargo Polytechnic Institute follow a Gaussian distribution with a mean of 2.5 and a standard deviation of 0.5. Suppose grade point averages are computed to the nearest tenth. Determine the number of freshmen you would expect to score: (a) from 2.6 to 3.2, (b) less than 2.5, (c) between 3.0 and 3.5, (d) greater than 3.2.

Answers: 167, 322, 639, 1179.

Drill Problem 5.5.2. Students attend Fargo Polytechnic Institute for an average of four years with a standard deviation of one-half year. Let the random variable x denote the length of attendance and assume that x is Gaussian. Determine: (a) P(1 < x < 3), (b) P(x > 4), (c) P(x = 4), (d) F_x(4.721).

Answers: 0.02275, 0.5, 0, 0.92535.

Drill Problem 5.5.3. Professor Rensselaer loves the game of golf. She has determined that the distance the ball travels on her first shot follows a Gaussian distribution with a mean of 150 and a standard deviation of 17. Determine the value of d so that the range 150 ± d covers 95% of the shots.

Answer: 33.

5.6 BIVARIATE GAUSSIAN RANDOM VARIABLES

The previous section introduced the univariate Gaussian PDF along with some general characteristics. Now, we discuss the joint Gaussian PDF and its characteristics by drawing on our univariate Gaussian PDF experiences, significantly expanding the scope of applications.
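Before moving on, the iterative inversion of Q described in Section 5.5.1 can be sketched in a few lines. This is our illustration, not the text's; h(t) is the polynomial of (5.79).

```python
import math

A = (0.31938153, -0.356563782, 1.781477937, -1.821255978, 1.330274429)

def h(t):
    """Polynomial h(t) of (5.79)."""
    return (t * (A[0] + t * (A[1] + t * (A[2] + t * (A[3] + A[4] * t))))
            / math.sqrt(2.0 * math.pi))

def q_inverse(q, gamma0=1.0, tol=1e-9, max_iter=200):
    """Find gamma with Q(gamma) = q by iterating
    gamma_{i+1} = sqrt(2 ln(h(t_i)/q)), per (5.82)-(5.83)."""
    gamma = gamma0
    for _ in range(max_iter):
        t = 1.0 / (1.0 + 0.2316419 * gamma)
        nxt = math.sqrt(2.0 * math.log(h(t) / q))
        if abs(nxt - gamma) < tol:
            return nxt
        gamma = nxt
    return gamma

# Q(gamma) = 0.025 should give gamma close to 1.96
print(round(q_inverse(0.025), 3))
```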
STANDARD PROBABILITY DISTRIBUTIONS 27

Numerous applications of this joint PDF are found throughout the field of biomedical engineering and, like the univariate case, the joint Gaussian PDF is considered the most important joint distribution for biomedical engineers.

Definition 5.6.1. The bivariate RV z = (x, y) is a bivariate Gaussian RV if every linear combination of x and y has a univariate Gaussian distribution. In this case we also say that the RVs x and y are jointly distributed Gaussian RVs.

Let the RV w = ax + by, and let x and y be jointly distributed Gaussian RVs. Then w is a univariate Gaussian RV for all real constants a and b. In particular, x ∼ G(ηx, σx²) and y ∼ G(ηy, σy²); i.e., the marginal PDFs for a joint Gaussian PDF are univariate Gaussian. The above definition of a bivariate Gaussian RV is sufficient for determining the bivariate PDF, which we now proceed to do.

The following development is significantly simplified by considering the standardized versions of x and y. Below, we write ρ = ρx,y and assume that |ρx,y| < 1, σx ≠ 0, and σy ≠ 0. Finally, the special cases ρx,y = ±1, σx = 0, and σy = 0 are discussed. Let

z1 = (x − ηx)/σx and z2 = (y − ηy)/σy,  (5.84)

so that z1 ∼ G(0, 1) and z2 ∼ G(0, 1). First, we find the joint characteristic function for the standardized RVs z1 and z2; then the conditional PDF f_z2|z1 and the joint PDF f_z1,z2 are obtained. Next, the results for z1 and z2 are applied to obtain the corresponding quantities φx,y, f_y|x, and f_x,y.

Since z1 and z2 are jointly Gaussian, the RV t1 z1 + t2 z2 is univariate Gaussian:

t1 z1 + t2 z2 ∼ G(0, t1² + 2 t1 t2 ρ + t2²),

so that

φ_z1,z2(t1, t2) = E(e^(j t1 z1 + j t2 z2)) = e^(−(t1² + 2 t1 t2 ρ + t2²)/2).

Completing the square,

t1² + 2 t1 t2 ρ + t2² = (t1 + ρ t2)² + (1 − ρ²) t2²,

so that

φ_z1,z2(t1, t2) = e^(−(1 − ρ²) t2²/2) e^(−(t1 + ρ t2)²/2).  (5.85)

From (6) we have

f_z1,z2(α, β) = (1/2π) ∫ from −∞ to ∞ of I(α, t2) e^(−jβ t2) dt2,  (5.86)
28 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS

where

I(α, t2) = (1/2π) ∫ from −∞ to ∞ of φ_z1,z2(t1, t2) e^(−jα t1) dt1.

Substituting (5.85) and letting τ = t1 + t2 ρ, we obtain

I(α, t2) = e^(−(1 − ρ²) t2²/2) (1/2π) ∫ from −∞ to ∞ of e^(−τ²/2) e^(−jα(τ − ρ t2)) dτ,

or

I(α, t2) = φ(t2) f_z1(α),

where

φ(t2) = e^(jαρ t2) e^(−(1 − ρ²) t2²/2).

Substituting into (5.86) we find

f_z1,z2(α, β) = f_z1(α) (1/2π) ∫ from −∞ to ∞ of φ(t2) e^(−jβ t2) dt2,

and recognize that φ is the characteristic function for a Gaussian RV with mean αρ and variance 1 − ρ². Thus

f_z1,z2(α, β) = f_z1(α) (1/√(2π(1 − ρ²))) exp(−(β − ρα)²/(2(1 − ρ²))),  (5.87)

so that

f_z2|z1(β|α) = (1/√(2π(1 − ρ²))) exp(−(β − ρα)²/(2(1 − ρ²))),  (5.88)

E(z2|z1) = ρ z1, and

σ²_z2|z1 = 1 − ρ².  (5.89)

After some algebra, we find

f_z1,z2(α, β) = (1/(2π(1 − ρ²)^(1/2))) exp(−(α² − 2ραβ + β²)/(2(1 − ρ²))).  (5.90)

We now turn our attention to using the above results for z1 and z2 to obtain similar results for x and y.
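Equations (5.87)–(5.90) assert that the standardized joint PDF factors as f_z1(α) times a Gaussian in β with mean ρα and variance 1 − ρ². A quick numerical sketch (ours, not from the text) confirms the algebra:

```python
import math

def npdf(x, mean, var):
    """Univariate Gaussian PDF."""
    return math.exp(-(x - mean) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def joint(alpha, beta, rho):
    """Bivariate standardized Gaussian PDF, equation (5.90)."""
    quad = alpha * alpha - 2.0 * rho * alpha * beta + beta * beta
    return math.exp(-quad / (2.0 * (1.0 - rho * rho))) / (
        2.0 * math.pi * math.sqrt(1.0 - rho * rho))

rho = 0.6
for alpha, beta in [(0.0, 0.0), (1.0, -0.5), (-2.0, 1.5)]:
    # factorization f_{z1,z2} = f_{z1}(alpha) * f_{z2|z1}(beta|alpha)
    factored = npdf(alpha, 0.0, 1.0) * npdf(beta, rho * alpha, 1.0 - rho * rho)
    assert abs(joint(alpha, beta, rho) - factored) < 1e-12
print("factorization (5.87)-(5.90) verified")
```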
From (5.84) we find that x = σx z1 + ηx and y = σy z2 + ηy, so that the joint characteristic function for x and y is

φ_x,y(t1, t2) = E(e^(j t1 x + j t2 y)) = E(e^(j t1 σx z1 + j t2 σy z2)) e^(j t1 ηx + j t2 ηy).  (5.91)

Consequently, the joint characteristic function for x and y can be found from the joint characteristic function of z1 and z2 as

φ_x,y(t1, t2) = φ_z1,z2(σx t1, σy t2) e^(j ηx t1) e^(j ηy t2).  (5.92)

Using (4.66), the joint characteristic function φ_x,y can be transformed to obtain the joint PDF f_x,y(α, β) as

f_x,y(α, β) = (1/(2π)²) ∫∫ φ_z1,z2(σx t1, σy t2) e^(−j(α − ηx) t1) e^(−j(β − ηy) t2) dt1 dt2.  (5.93)

Making the change of variables τ1 = σx t1, τ2 = σy t2, we obtain

f_x,y(α, β) = (1/(σx σy)) f_z1,z2((α − ηx)/σx, (β − ηy)/σy).  (5.94)

Since f_x,y(α, β) = f_y|x(β|α) f_x(α) and

f_x(α) = (1/σx) f_z1((α − ηx)/σx),

we may apply (5.88) to obtain

f_y|x(β|α) = (1/σy) f_z2|z1((β − ηy)/σy | (α − ηx)/σx),

so that

f_y|x(β|α) = exp(−(β − ηy − ρσy(α − ηx)/σx)²/(2(1 − ρ²)σy²)) / √(2π(1 − ρ²)σy²).  (5.95)

Substituting (5.90) into (5.94), we find

f_x,y(α, β) = exp(−((α − ηx)²/σx² − 2ρ(α − ηx)(β − ηy)/(σx σy) + (β − ηy)²/σy²)/(2(1 − ρ²))) / (2π σx σy (1 − ρ²)^(1/2)).  (5.96)
30 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS

It follows that

E(y|x) = ηy + ρσy (x − ηx)/σx  (5.97)

and

σ²_y|x = σy²(1 − ρ²).  (5.98)

By interchanging x with y and α with β,

f_x|y(α|β) = exp(−(α − ηx − ρσx(β − ηy)/σy)²/(2(1 − ρ²)σx²)) / √(2π(1 − ρ²)σx²),  (5.99)

E(x|y) = ηx + ρσx (y − ηy)/σy,  (5.100)

and

σ²_x|y = σx²(1 − ρ²).  (5.101)

A three-dimensional plot of a bivariate Gaussian PDF is shown in Figure 5.11.

FIGURE 5.11: Bivariate Gaussian PDF f_x,y(α, β) with σx = σy = 1, ηx = ηy = 0, and ρ = −0.8.

The bivariate characteristic function for x and y is easily obtained as follows. Since x and y are jointly Gaussian, the RV t1 x + t2 y is a univariate Gaussian RV:

t1 x + t2 y ∼ G(t1 ηx + t2 ηy, t1²σx² + 2 t1 t2 σx,y + t2²σy²).
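Formulas (5.97) and (5.98) translate directly into code. The sketch below is ours, with parameter values chosen purely for illustration (the same style of numbers appears in the drill problems of this section).

```python
import math

def cond_mean_y(x, eta_x, eta_y, sig_x, sig_y, rho):
    """E(y|x) = eta_y + rho*sig_y*(x - eta_x)/sig_x, equation (5.97)."""
    return eta_y + rho * sig_y * (x - eta_x) / sig_x

def cond_var_y(sig_y, rho):
    """var(y|x) = sig_y^2 * (1 - rho^2), equation (5.98)."""
    return sig_y ** 2 * (1.0 - rho ** 2)

# illustrative parameters
eta_x, eta_y = -2.0, 3.0
sig_x, sig_y = math.sqrt(21.0), math.sqrt(31.0)
rho = -0.3394

print(round(cond_mean_y(0.0, eta_x, eta_y, sig_x, sig_y, rho), 4))  # 2.1753
print(round(cond_var_y(sig_y, rho), 3))
```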
STANDARD PROBABILITY DISTRIBUTIONS 31

Consequently, the joint characteristic function for x and y is

φ_x,y(t1, t2) = e^(−(t1²σx² + 2 t1 t2 σx,y + t2²σy²)/2) e^(j t1 ηx + j t2 ηy),  (5.102)

which is valid for all σx,y, σx and σy.

A very special property of jointly Gaussian RVs is presented in the following theorem.

Theorem 5.6.1. The jointly Gaussian RVs x and y are independent iff ρx,y = 0.

Proof. We showed previously that if x and y are independent, then ρx,y = 0. Suppose that ρ = ρx,y = 0. Then, from (5.95), f_y|x(β|α) = f_y(β). Hence f_x,y(α, β) = f_x(α) f_y(β); i.e., the RVs x and y are independent.

We now consider some special cases of the bivariate Gaussian PDF. As ρ → ±1, from (5.97) and (5.98) we find

E(y|x) → ηy ± σy (x − ηx)/σx  (5.103)

and σ²_y|x → 0. Hence,

y → ηy ± σy (x − ηx)/σx

in probability¹. We conclude that

f_x,y(α, β) = f_x(α) δ(β − ηy ∓ σy(α − ηx)/σx)  (5.104)

when ρ = ±1. Interchanging the roles of x and y, we find that the joint PDF for x and y may also be written as

f_x,y(α, β) = f_y(β) δ(α − ηx ∓ σx(β − ηy)/σy)  (5.105)

for ρ = ±1. These results can also be obtained "directly" from the joint characteristic function for x and y.

¹As the variance of a RV decreases to zero, the probability that the RV deviates from its mean by more than an arbitrarily small fixed amount approaches zero. This is an application of the Chebyshev Inequality.

Example 5.6.1. Let x and y be jointly Gaussian with zero means, σx = σy = 1, and ρ = ρx,y. Find constants a and b such that v = ax + by ∼ G(0, 1) and such that v and x are independent.
32 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS

Solution. We have E(v) = 0. We require

σv² = a² + b² + 2 a b ρx,y = 1

and

E(vx) = a + b ρx,y = 0.

Hence a = −b ρx,y and b² = 1/(1 − ρ²x,y), so that

v = (y − ρx,y x)/√(1 − ρ²x,y)

is independent of x and σv² = 1.

5.6.1 Constant Contours

Returning to the normalized jointly Gaussian RVs z1 and z2, we now investigate the shape of the joint PDF f_z1,z2(α, β) by finding the locus of points where the PDF is constant. We assume that |ρ| < 1. By inspection of (5.90), f_z1,z2(α, β) is constant for α and β satisfying

α² − 2ραβ + β² = c²,  (5.106)

where c is a positive constant. If ρ = 0, the contours where the joint PDF is constant form a circle of radius c centered at the origin. Along the line β = qα we find that α²(1 − 2ρq + q²) = c², so that the constant contours are parameterized by

α = ±c/√(1 − 2ρq + q²)  (5.107)

and

β = ±cq/√(1 − 2ρq + q²).  (5.108)

The square of the distance from a point (α, β) on the contour to the origin is given by

d²(q) = α² + β² = c²(1 + q²)/(1 − 2ρq + q²).  (5.109)
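The decorrelating combination found in Example 5.6.1 can be checked constructively: build y from x and an independent unit Gaussian w, then verify that v = (y − ρx)/√(1 − ρ²) recovers w exactly. The sketch below is ours, and the sampling is only for illustration.

```python
import math
import random

rho = 0.7
random.seed(1)

scale = math.sqrt(1.0 - rho * rho)
for _ in range(1000):
    x = random.gauss(0.0, 1.0)
    w = random.gauss(0.0, 1.0)      # independent of x
    y = rho * x + scale * w         # (x, y) jointly Gaussian with correlation rho
    v = (y - rho * x) / scale       # Example 5.6.1: v = (y - rho*x)/sqrt(1 - rho^2)
    assert abs(v - w) < 1e-12       # v is exactly the component independent of x
print("v = (y - rho*x)/sqrt(1 - rho^2) is G(0, 1) and independent of x")
```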
STANDARD PROBABILITY DISTRIBUTIONS 33

Differentiating, we find that d²(q) attains its extremal values at q = ±1; thus, the major and minor axes of the ellipse are along the lines β = ±α. The line β = α intersects the constant contour at

β = α = ±c/√(2(1 − ρ)),  (5.111)

and the line β = −α intersects the constant contour at

β = −α = ±c/√(2(1 + ρ)).  (5.112)

Consider the rotated coordinates α′ = (α + β)/√2 and β′ = (β − α)/√2, so that (α′ + β′)/√2 = β and (α′ − β′)/√2 = α. The rotated coordinate system is a rotation by π/4 counterclockwise. Thus α² − 2ραβ + β² = c² is transformed into

α′²/(1 + ρ) + β′²/(1 − ρ) = c²/(1 − ρ²).  (5.113)

The above equation represents an ellipse with major axis length 2c/√(1 − ρ) and minor axis length 2c/√(1 + ρ).

In the α−β plane, the constant contours for f_x,y(α, β) are solutions to

((α − ηx)/σx)² − 2ρ((α − ηx)/σx)((β − ηy)/σy) + ((β − ηy)/σy)² = c².  (5.116)

Using the transformation

α′ = (1/√2)((α − ηx)/σx + (β − ηy)/σy),  β′ = (1/√2)((β − ηy)/σy − (α − ηx)/σx),  (5.117)

the constant contour (5.116) is transformed into the ellipse

α′²/(1 + ρ) + β′²/(1 − ρ) = c²/(1 − ρ²)  (5.118)

in the α′−β′ plane.
34 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS

Points on this ellipse in the α−β plane satisfy (5.116). With α′ = 0 we find that one axis is along the line

(β − ηy)/σy = −(α − ηx)/σx,

with endpoints at

α = ±c σx/(√2 √(1 + ρ)) + ηx,  β = ∓c σy/(√2 √(1 + ρ)) + ηy;

the length of this axis in the α−β plane is √2 c √(σx² + σy²)/√(1 + ρ). With β′ = 0 we find that the other axis is along the line

(β − ηy)/σy = (α − ηx)/σx,

with endpoints at

α = ±c σx/(√2 √(1 − ρ)) + ηx,  β = ±c σy/(√2 √(1 − ρ)) + ηy;

the length of this axis in the α−β plane is √2 c √(σx² + σy²)/√(1 − ρ). The value of the joint PDF f_x,y on this contour is

(1/(2π σx σy √(1 − ρ²))) exp(−c²/(2(1 − ρ²))).

A further transformation,

α″ = α′/√(1 + ρ),  β″ = β′/√(1 − ρ),  (5.119)

transforms the ellipse in the α′−β′ plane to a circle in the α″−β″ plane:

α″² + β″² = c²/(1 − ρ²).

This transformation provides a straightforward way to compute the probability that the jointly Gaussian RVs x and y lie within the region bounded by the ellipse specified by (5.116). Letting A denote the region bounded by the ellipse in the α−β plane and A″ denote its image (a circle)
STANDARD PROBABILITY DISTRIBUTIONS 35

in the α″−β″ plane, we have

∫∫ over A of f_x,y(α, β) dα dβ = (1/2π) ∫∫ over A″ of e^(−(α″² + β″²)/2) dα″ dβ″,

where the Jacobian of the transformation is

J(α, β) = det [ ∂α″/∂α  ∂α″/∂β ; ∂β″/∂α  ∂β″/∂β ].

Computing the indicated derivatives, we have

∂α″/∂α = 1/(σx √2 √(1 + ρ)),  ∂α″/∂β = 1/(σy √2 √(1 + ρ)),
∂β″/∂α = −1/(σx √2 √(1 − ρ)),  ∂β″/∂β = 1/(σy √2 √(1 − ρ)),

so that

J(α, β) = 1/(σx σy √(1 − ρ²)).

Substituting and transforming to polar coordinates,

∫∫ over A of f_x,y(α, β) dα dβ = (1/2π) ∫ from 0 to 2π ∫ from 0 to c/√(1 − ρ²) of r e^(−r²/2) dr dθ
 = ∫ from 0 to c²/(2 − 2ρ²) of e^(−u) du
 = 1 − e^(−c²/(2 − 2ρ²)).

This bivariate Gaussian probability computation is one of the few which is "easily" accomplished. Additional techniques for treating these computations are given in [1, pp. 956–958].

Drill Problem 5.6.1. Given that x and y are jointly distributed Gaussian random variables with E(y|x) = 2 + 1.5x, E(x|y) = 7/6 + y/6, and σ²_x|y = 0.75, determine: (a) ηx, (b) ηy, (c) σx², (d) σy², (e) ρx,y.

Answers: 2, 5, 1, 9, 0.5.
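The closed-form result 1 − e^(−c²/(2 − 2ρ²)) is easy to sanity-check: for ρ = 0 the contour is the circle α² + β² = c², and a brute-force midpoint integration of the standardized joint PDF over that disk should agree. The sketch is ours, not from the text.

```python
import math

def ellipse_prob(c2, rho):
    """P((x, y) inside the constant contour with parameter c^2), per the text."""
    return 1.0 - math.exp(-c2 / (2.0 * (1.0 - rho * rho)))

# Closed form for rho = 0, c = 1: 1 - exp(-1/2)
c = 1.0
exact = ellipse_prob(c * c, 0.0)

# Brute-force midpoint integration of the standard bivariate normal over the disk
step = 0.005
total = 0.0
n = int(2.0 * c / step)
for i in range(n):
    for j in range(n):
        a = -c + (i + 0.5) * step
        b = -c + (j + 0.5) * step
        if a * a + b * b <= c * c:
            total += math.exp(-(a * a + b * b) / 2.0) / (2.0 * math.pi)
total *= step * step

assert abs(total - exact) < 5e-3
print(round(exact, 5), round(total, 5))
```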
Drill Problem 5.6.2. Random variables x and y are jointly Gaussian with ηx = −2, ηy = 3, σx² = 21, σy² = 31, and ρx,y = −0.3394. With c² = 0.2212 in (5.116), find: (a) the smallest angle that either the minor or major axis makes with the positive α axis in the α−β plane, (b) the length of the minor axis, (c) the length of the major axis.

Answers: 30°, 2, 3.

Drill Problem 5.6.3. Random variables x and y are jointly Gaussian with ηx = −2, ηy = 3, σx² = 21, σy² = 31, and ρx,y = −0.3394. Find: (a) E(y|x = 0), (b) P(1 < y ≤ 10 | x = 0), (c) P(−1 < x < 7).

Answers: 2.1753, 0.5212, 0.3889.

Drill Problem 5.6.4. The length of time William Smith plays a video game is given by random variable x distributed exponentially with a mean of four minutes. His play during each game is independent from all other games. Determine: (a) the probability that William is still playing after four minutes, (b) the probability that, out of five games, he has played at least one game for more than four minutes.

Answers: exp(−1), 0.899.

5.7 SUMMARY

This chapter introduces certain probability distributions commonly encountered in biomedical engineering. Special emphasis is placed on the exponential, Poisson and Gaussian distributions. Important approximations to the Bernoulli PMF and Gaussian CDF are developed: Bernoulli event probabilities may be approximated by the Poisson PMF when np < 10 or by the Gaussian PDF when npq > 3. For the Poisson approximation, use η = np; for the Gaussian approximation, use η = np and σ² = npq. Many important properties of jointly Gaussian random variables are presented.

5.8 PROBLEMS

1. Assume x is a Bernoulli random variable. Determine P(x ≤ 3) using the Bernoulli CDF table if: (a) n = 5, p = 0.1; (b) n = 10, p = 0.1; (c) n = 20, p = 0.1; (d) n = 5, p = 0.3; (e) n = 10, p = 0.3; (f) n = 20, p = 0.3; (g) n = 5, p = 0.5; (h) n = 10, p = 0.5; (i) n = 20, p = 0.5.
2. Suppose you are playing a game with a friend in which you roll a die 10 times. If the die comes up with an even number, your friend gives you a dollar and if the die comes up with an odd number you give your friend a dollar. The die is loaded
so that a 1 or a 3 are three times as likely to occur as a 2, a 4, a 5 or a 6. Determine: (a) how many dollars your friend can expect to win in this game; (b) the probability of your friend winning more than 4 dollars.

3. The probability of a man hitting a target is 0.3. (a) If he tries 15 times to hit the target, what is the probability of him hitting it at least 5 but less than 10 times? (b) What is the average number of hits in 30 tries? (c) What is the probability of him getting exactly the average number of hits in 30 tries? (d) How many times must the man try to hit the target if he wants the probability of hitting it to be at least 2/3? (e) What is the probability that no more than three tries are required to hit the target for the first time?

4. The probability that Professor Rensselaer bowls a strike is 0.2. Determine the probability that: (a) 3 of the next 20 rolls are strikes; (b) at least 4 of the next 20 rolls are strikes; (c) from 3 to 7 of the next 20 rolls are strikes. (d) She is to keep rolling the ball until she gets a strike; determine the probability it will take more than 5 rolls. Determine the: (e) expected number of strikes in 20 rolls; (f) variance for the number of strikes in 20 rolls; (g) standard deviation for the number of strikes in 20 rolls.

5. In Junior Bioinstrumentation Lab, one experiment introduces students to the transistor. Each student is given only one transistor to use. The probability of a student destroying a transistor is 0.2. One lab class has 5 students and they will perform this experiment next week. Let random variable x show the possible numbers of students who destroy transistors. (a) Sketch the PMF for x. Determine: (b) the expected number of destroyed transistors; (c) the probability that fewer than 2 transistors are destroyed.

6. The probability that a basketball player makes a basket is 0.5. If he makes 10 attempts, what is the probability he will make: (a) at least 4 baskets; (b) 4 baskets; (c) from 7 to 9 baskets; (d) less than 2 baskets; (e) the expected number of baskets?

7. On a frosty January morning in Fargo, North Dakota, the probability that a car parked outside will start is 0.6. (a) If we take a sample of 20 cars, what is the probability that exactly 12 cars will start and 8 will not? (b) What is the probability that the number of cars starting out of 20 is between 9 and 15?

8. Consider Problem 7. If there are 20,000 cars to be started, find the probability that: (a) at least 12,100 will start; (b) exactly 12,000 will start; (c) the number starting is between 11,900 and 12,150; (d) the number starting is less than 12,000.

9. A dart player has found that the probability of hitting the dart board in any one throw is 0.4.
How many times must he throw the dart so that the probability of hitting the dart board is at least 0.6?

10. Let random variable x be Bernoulli with n = 15 and p = 0.4. Determine E(x²).

11. Repeat Problem 1, using the Poisson approximation to the Bernoulli PMF, when appropriate.

12. An electronics manufacturer is evaluating its quality control program. The current procedure is to take a sample of 5 from 1000 and pass the shipment if not more than 1 component is found defective. What proportion of 20% defective components will be shipped?

13. Suppose x is a Bernoulli random variable with η = 10 and σ² = 10/3. Determine: (a) q; (b) n; (c) p.

14. A certain intersection averages 3 traffic accidents per week. What is the probability that more than 2 accidents will occur during any given week?

15. On the average, a student makes 6 mistakes per test. Determine the probability that the student makes: (a) at least 1 mistake; (b) from 3 to 5 mistakes; (c) exactly 2 mistakes; (d) more than the expected number of mistakes.

16. At Fargo Polytechnic Institute (FPI), a student may take a course as many times as desired. Suppose the average number of times a student takes Introduction to Random Processes is 1.5. (Professor Rensselaer, the course instructor, thinks so many students repeat the course because they enjoy it so much.) (a) Determine the probability that a student takes the course more than once. (b) The academic vice-president of FPI wants to ensure that on the average, 80% of the students take the course at most one time. To what value should the mean be adjusted to ensure this?

17. Suppose a typist makes an average of 30 mistakes per page. (a) If you give him a one page letter to type, what is the probability that he makes exactly 30 mistakes? (b) The typist decides to take typing lessons, and, after the lessons, he averages 5 mistakes per page. You give him another one page letter to type. What is the probability of him making fewer than 5 mistakes? (c) With the 5 mistakes per page average, what is the probability of him making fewer than 50 mistakes in a 25 page report?
18. Suppose that on the average, a sample of radioactive material emits 20 alpha particles per minute. What is the probability of 10 alpha particles being emitted in: (a) 1 min; (b) 10 min? (c) Many years later, the material averages 6 alpha particles emitted per minute. What is the probability of at least 6 alpha particles being emitted in 1 min?
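Several of the problems above and below call for Poisson probabilities. A minimal helper (ours, for illustration only) evaluates the PMF and CDF; for instance, with an average of 3 events per week, P(more than 2 events) = 1 − P(x ≤ 2).

```python
import math

def poisson_pmf(k, eta):
    """P(x = k) for a Poisson RV with mean eta."""
    return math.exp(-eta) * eta ** k / math.factorial(k)

def poisson_cdf(k, eta):
    """P(x <= k) for a Poisson RV with mean eta."""
    return sum(poisson_pmf(i, eta) for i in range(k + 1))

# e.g. an average of 3 events per week:
p_more_than_2 = 1.0 - poisson_cdf(2, 3.0)
print(round(p_more_than_2, 4))  # 0.5768
```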
19. On the average, Professor Rensselaer gives 11 quizzes per quarter in Introduction to Random Processes. Determine the probability that: (a) from 8 to 12 quizzes are given during the quarter; (b) exactly 11 quizzes are given during the quarter; (c) at least 10 quizzes are given during the quarter; (d) at most 9 quizzes are given during the quarter.
20. The distribution for the number of students seeking advising help from Professor Rensselaer during any particular day is given by

P(x = k) = 3^k e^(−3)/k!,  k = 0, 1, . . . .

The PDF for the time interval between students seeking help for Introduction to Random Processes from Professor Rensselaer during any particular day is given by f_t(τ) = e^(−τ) u(τ). If random variable z equals the total number of students Professor Rensselaer helps each day, determine: (a) E(z); (b) σz.

21. A perfect car is assembled with a probability of 2 × 10⁻⁵. If 15,000 cars are produced in a month, what is the probability that none are perfect?

22. (a) Every time a carpenter pounds in a nail, the probability that he hits his thumb is 0.002. If in building a house he pounds 1250 nails, what is the probability of him hitting his thumb at least once while working on the house? (b) If he takes five extra minutes off every time he hits his thumb, how many extra minutes can he expect to take off in building a house with 3000 nails?

23. The manufacturer of Leaping Lizards, a bran cereal with milk expanding (exploding) marshmallow lizards, wants to ensure that on the average, 95% of the spoonfuls will each have at least one lizard. Assuming that the lizards are randomly distributed in the cereal box, to what value should the mean of the lizards per spoonful be set at to ensure this?

24. The probability that a student will major in Bioengineering is 0.01. This year, FPI admits only 1000 freshmen. Determine the probability that fewer than 9 students major in Bioengineering.

25. Suppose 1% of the transistors in a box are defective. Determine the probability that there are: (a) 3 defective transistors in a sample of 200 transistors; (b) more than 15 defective transistors in a sample of 1000 transistors; (c) 0 defective transistors in a sample of 20 transistors.
26. This year, on its anniversary day, a computer store is going to run an advertising campaign in which the employees will telephone 5840 people selected at random from the population of North America. The caller will ask the person answering the phone if it's his or her birthday. If it is, then that lucky person will be mailed a brand new programmable calculator. Otherwise, that person will get nothing. Assuming that the person answering the phone won't lie and that there is no such thing as leap year, find the probability that: (a) the computer store mails out exactly 16 calculators; (b) the computer store mails out from 20 to 40 calculators.
27. Repeat Problem 1, using the Gaussian approximation to the Bernoulli PMF, when appropriate.

28. Assume x is a standard Gaussian random variable. Using Tables A.9 and A.10, determine: (a) P(x = 0); (b) P(x < 0); (c) P(x < 0.2); (d) P(−1.583 ≤ x < 1.471); (e) P(−2.1 < x ≤ −0.5); (f) P(x is an integer).

29. Random variable x is uniform between −2 and 3. Event A = {0 < x ≤ 2} and B = {−1 < x ≤ 0} ∪ {1 < x ≤ 2}. Find: (a) P(−1 < x < 0); (b) ηx; (c) σx; (d) f_x|A(α|A); (e) F_x|A(α|A); (f) f_x|B(α|B); (g) F_x|B(α|B).

30. The time it takes a runner to run a mile is equally likely to lie in an interval from 4.0 to 4.2 min. Determine: (a) the probability it takes the runner exactly 4 min to run a mile; (b) the probability it takes the runner from 4.1 to 4.15 min.

31. A light bulb manufacturer distributes light bulbs that have a length of life that is normally distributed with a mean equal to 1200 h and a standard deviation of 40 h. Find the probability that a bulb burns between 1000 and 1300 h.

32. A certain type of resistor has resistance values that are Gaussian distributed with a mean of 50 ohms and a variance of 3. If resistances are measured to the nearest ohm, find: (a) the probability that a particular resistor is within 2 ohms of the mean; (b) P(49 < r < 54).
33. Consider Problem 32. (a) Write the PDF for the resistance value. (b) Find the probability that a particular resistor is within 2 ohms of the mean. (c) Find P(49 < r < 54).

34. A battery manufacturer has found that 8.08% of their batteries last less than 2.3 years and 2.5% of their batteries last more than 3.98 years. Assuming the battery lives are Gaussian distributed, find: (a) the mean; (b) the variance.

35. Assume that the scores on an examination are Gaussian distributed with mean 75 and standard deviation 10. Grades are assigned as follows: A: 90–100, B: 80–90, C: 70–80, D: 60–70, and F: below 60. In a class of 25 students, what is the probability that grades will be equally distributed?

36. A certain transistor has a current gain, h, that is Gaussian distributed with a mean of 77 and a variance of 11. Find: (a) P(h > 74); (b) P(73 < h ≤ 80); (c) P(|h − ηh| < 3σh).

37. Consider Problem 36. Find the value of d so that the range 77 ± d covers 95% of the current gains.
38. The average golf score for Professor Rensselaer is 78 with a standard deviation of 3. Assuming a Gaussian distribution for random variable x describing her golf game, determine: (a) P(x = 78); (b) P(x ≤ 78); (c) P(70 < x ≤ 80); (d) the probability that x is less than 75 if the score is measured to the nearest unit.

39. A large box contains 10,000 resistors with resistances that are Gaussian distributed. If the average resistance is 1000 ohms with a standard deviation of 200 ohms, how many resistors have resistances that are within 10% of the average?

40. A 250 question multiple choice final exam is given. Each question has 5 possible answers and only one correct answer. Determine the probability that a student guesses the correct answers for 20–25 of 85 questions about which the student has no knowledge.

41. Suppose a system contains a component whose length is normally distributed with a mean of 2.0 and a standard deviation of 0.1. If 5 of these components are removed from different systems, what is the probability that at least 2 have a length greater than 2.1?

42. The RV x has PDF

f_x(α) = a exp(−(α − η)²/(2σ²)).

(a) Find the constant a. (Hint: assume RV y is independent of x and has PDF f_y(β) = f_x(β), and evaluate F_x,y(∞, ∞).) (b) Using the results of (a), find E(x). (c) Find σx² using direct integration.

43. Suppose x and y are jointly distributed Gaussian random variables with E(y|x) = 2.8 + 0.32x, E(x|y) = −1 + 0.5y, and σ²_y|x = 3.67. Determine: (a) ηx; (b) ηy; (c) σx; (d) σy; (e) ρx,y; (f) σx,y.

44. Consider Problem 43. Draw a sketch of the joint Gaussian constant contour for x and y.

45. Assume x and y are jointly distributed Gaussian random variables with x ∼ G(−2, 4), y ∼ G(3, 9), and ρx,y = 0. Find: (a) P(1 < y < 7 | x = 0);
(b) P(1 < y < 7).

46. Consider Problem 45. Draw a sketch of the joint Gaussian contour equation showing the original and the translated-rotated sets of axes.
47. Assume x and y are jointly Gaussian with x ∼ G(−1, 4), y ∼ G(1, 7), and ρx,y = 0.378. Determine: (a) E(y|x = 0); (b) f_y|x(β|0); (c) P(0 < y < 4 | x = 0); (d) P(−3 < x < 0).

48. Consider Problem 47. (a) Draw a sketch of the constant contour equation for the standardized RVs z1 and z2. (b) Using the results of (a), draw a sketch of the joint Gaussian contour for x and y.

49. Assume x and y are jointly Gaussian with x ∼ G(3, 4), y ∼ G(1, 7), and ρx,y = 0.5. Determine: (a) E(y|x = 0); (b) f_y|x(β|0); (c) P(0 < y < 4 | x = 0); (d) P(3 < x < 10).

50. Consider Problem 49. Draw a sketch of the joint Gaussian contour for x and y.

51. Random variable t is exponential with η = 2. Determine: (a) P(t > η); (b) f_z(α) if z = t − T, where T is a constant.

52. Suppose a system has an exponential failure rate in years to failure with η = 2.5. Determine the number of years reliable at: (a) 90%; (b) 95%; (c) 99%.

53. Consider Problem 52. If 20 of these systems are installed, determine the probability that 10 are operating at the end of 2.3 years.

54. In the circuit shown in Figure 5.12, each of the four components operate independently of one another and have an exponential failure rate (in hours) with η = 10⁵. For successful operation of the circuit, at least two of these components must connect A with B. Determine the probability that the circuit is operating successfully at 10,000 h.

FIGURE 5.12: Circuit for Problem 54.

55. The survival rate of individuals with a certain type of cancer is assumed to be exponential with η = 4 years. Five individuals have this cancer. Determine the probability that at most three will be alive at the end of 2.0433 years.

56. A component with an exponential failure rate is 90% reliable at 10,000 h. Determine the number of hours reliable at 95%.

57. William Smith is a varsity wrestler on his high school team. Without exception, if he does not pin his opponent with his trick move, he loses the match on points.
period 3 : f xA (αA) = 0. 9 months later)? . prints brochures describing their vacation packages as “Unique. Outer Space Adventures. The length of time it takes William to pin his opponent in each period of a wrestling match is given by: period 1 : f x (α) = 0. having an average length of 6 months.1149599 exp(−0.4598394 exp(−0. What is the probability that you will be back home in time for next fall quarter (that is. 62.2299197 exp(−0. 58. Suppose Fargo Community Hospital has a backup emergency generator to provide auxilary power during a power failure. What is the probability that the hospital will be without power during the next 24 h? 61. Moreover. Determine: (a) the probability that there will be at least one power failure during the coming year. Assume each period is 2 min and the match is 3 periods. Find the probability that William Smith wins: (a) at least 4 of his ﬁrst 5 matches.1149599α)u(α). (c) the probability that you will have more than ﬁve people in the queue ahead of you. where A = {Smith did not pin his opponent during the previous periods}. (b) the probability that you will have less than four people in the queue ahead of you.2299197α)u(α). In fact. the queue is 6.STANDARD PROBABILITY DISTRIBUTIONS 43 trick move also prevents him from ever getting pinned. Suppose that you have signed up for a vacation trip that starts at the end of this quarter. Determine: (a) your expected waiting time to make a purchase. Each customer takes 4 min to check out. Assume that a power failure will last at least 24 h. they are so unexpected that the length of the vacations is random. 60. The average time between power failures in a Coop utility is once every 1. (b) the probability that there will be at least two power failures during the coming year.” The vacations certainly live up to their advertisement. period 2 : f xA (αA) = 0. Inc.4 years.3 people long. (b) more matches than he is expected to during a 10 match season. Consider Problem 57. 
following an exponential distribution. the emergency generator has an expected time between failures of once every 200 h. Determine the probability that William Smith: (a) pins his opponent during the ﬁrst period: (b) pins his opponent during the second period: (c) pins his opponent during the third period: (d) wins the match. On the average. Unexpected and Unexplained. The queue for the cash register at a popular package store near Fargo Polytechnic Institute becomes quite long on Saturdays following football games.4598394α)u(α). Consider Problem 59 statement. 59.
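The package-store queue can be sketched numerically if we model the number of people ahead of you as a Poisson random variable with mean 6.3 (an assumption; the problem only states the average queue length) and charge 4 min per customer:

```python
import math

lam = 6.3          # assumed mean number of people ahead of you (Poisson model)
service_min = 4.0  # minutes each customer takes at the register

def poisson_cdf(k, lam):
    # P(N <= k) for a Poisson random variable with mean lam
    return sum(lam**i * math.exp(-lam) / math.factorial(i) for i in range(k + 1))

expected_wait = lam * service_min           # (a) expected waiting time, minutes
p_less_than_4 = poisson_cdf(3, lam)         # (b) P(N < 4) = P(N <= 3)
p_more_than_5 = 1.0 - poisson_cdf(5, lam)   # (c) P(N > 5)
```

Under the assumed Poisson model this gives an expected wait of about 25 min, P(fewer than four ahead) of about 0.126, and P(more than five ahead) of about 0.601.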
63. A certain brand of light bulbs has an average life expectancy of 750 h. The failure rate of these light bulbs follows an exponential PDF. Seven hundred and fifty of these bulbs were put in light fixtures in four rooms. The lights were turned on and left that way for a different length of time in each room as follows:

ROOM   TIME BULBS LEFT ON (HOURS)   NUMBER OF BULBS
1      1000                         125
2      750                          250
3      500                          150
4      1500                         225

After the specified length of time, the bulbs were taken from the fixtures and placed in a box. If a bulb is selected at random from the box, what is the probability that it is burnt out?
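Problem 63 is a total-probability mixture over the four rooms, with each room contributing P(bulb came from that room) times P(an exponential(η = 750 h) lifetime ends before that room's on-time). A minimal numeric sketch, assuming the numbers in the table:

```python
import math

eta = 750.0  # mean bulb lifetime in hours (exponential lifetime model)
# (hours the room's lights were on, number of bulbs), as in the table
rooms = [(1000, 125), (750, 250), (500, 150), (1500, 225)]
total_bulbs = sum(n for _, n in rooms)

# Total probability: P(burnt out) = sum over rooms of
#   P(bulb from room) * P(exponential lifetime < hours the lights were on)
p_burnt = sum((n / total_bulbs) * (1 - math.exp(-t / eta)) for t, n in rooms)
```

For the numbers in the table the mixture comes out near 0.69.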
The primary subjects of this chapter are methods for determining the probability distribution of a function of a random variable. The input/output characteristics of many physical devices can be represented by a nonlinear memoryless transformation of the input; for example, a full wave rectifier circuit produces an output that is the absolute value of the input. We first evaluate the probability distribution of a function of one random variable using the CDF and then the PDF. Next, the probability distribution for a single random variable is determined from a function of two random variables using the CDF. Then, the joint probability distribution is found from a function of two random variables using the joint PDF and the CDF.

6.1 UNIVARIATE CDF TECHNIQUE

This section introduces a method of computing the probability distribution of a function of a random variable using the CDF. We will refer to this method as the CDF technique. The CDF technique is applicable for all functions z = g(x), and for all types of continuous, discrete, and mixed random variables. To make the CDF technique easier to understand, we start the discussion of computing the probability distribution of z = g(x) with the simplest case, a continuous monotonic function g. (Recall that if g is a monotonic function then a one-to-one correspondence between g(x) and x exists.) Then, the difficulties associated with computing the probability distribution of z = g(x) are investigated when the restrictions on g(x) are relaxed.

6.1.1 CDF Technique with Monotonic Functions

Consider the problem where the CDF Fx is known for the RV x, and we wish to find the CDF for random variable z = g(x). Of course, we require that the function z : S → R*, with z(ζ) = g(x(ζ)), is a random variable on the probability space (S, ℱ, P); consequently, we require z to be a measurable function on the measurable space (S, ℱ) and P(z(ζ) ∈ {−∞, +∞}) = 0. Proceeding from the definition of the CDF for z, we obtain (6.1) and (6.2) below.
The ease of use of the CDF technique depends critically on the functional form of g(x).

CHAPTER 6

Transformations of Random Variables

Functions of random variables occur frequently in many applications of probability theory.
For a monotonic increasing function g(x),

   Fz(γ) = P(z = g(x) ≤ γ) = P(x ≤ g⁻¹(γ)) = Fx(g⁻¹(γ)),            (6.1)

where g⁻¹(γ) is the value of α for which g(α) = γ. Similarly, if z = g(x) and g is monotone decreasing, then

   Fz(γ) = P(z = g(x) ≤ γ) = P(x ≥ g⁻¹(γ)) = 1 − Fx(g⁻¹(γ)⁻).       (6.2)

The following example illustrates this technique.

Example 6.1.1. Random variable x is uniformly distributed in the interval 0 to 4. Find the CDF for random variable z = 2x + 1.

Solution. Since random variable x is uniformly distributed, the CDF of x is

   Fx(α) = 0,      α < 0,
         = α/4,    0 ≤ α < 4,
         = 1,      4 ≤ α.

Letting g(x) = 2x + 1, we see that g is monotone increasing and that g⁻¹(γ) = (γ − 1)/2. Applying (6.1), the CDF of random variable z is written in terms of Fx(α), with the argument α replaced by g⁻¹(γ):

   Fz(γ) = Fx((γ − 1)/2) = 0,            (γ − 1)/2 < 0,
                         = (γ − 1)/8,    0 ≤ (γ − 1)/2 < 4,
                         = 1,            4 ≤ (γ − 1)/2.

Simplifying, we obtain

   Fz(γ) = 0,            γ < 1,
         = (γ − 1)/8,    1 ≤ γ < 9,
         = 1,            9 ≤ γ,

which is also a uniform distribution.

6.1.2 CDF Technique with Arbitrary Functions

In general, more than one solution of z = g(x) can exist, resulting in a many-to-one mapping from x to z, and Fz(γ) is no longer found by simple substitution. The relationship between x and z can take on any form, including discontinuities; additionally, the function does not have to be monotonic. The only requirement on g is that z = g(x) be a random variable. However, under these conditions it is impossible to write a general expression for Fz(γ) using the CDF technique. In fact, this general case is conceptually no more difficult than the previous one; it simply requires more careful bookkeeping.
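Results like Example 6.1.1 are easy to spot-check by simulation. The sketch below (the sample size is an arbitrary choice) compares the empirical CDF of z = 2x + 1, with x uniform on (0, 4), against the derived Fz(γ) = (γ − 1)/8 on [1, 9):

```python
import random

random.seed(1)
n = 200_000
# x uniform on (0, 4); z = 2x + 1 should be uniform on (1, 9)
zs = [2 * random.uniform(0, 4) + 1 for _ in range(n)]

def empirical_F_z(gamma):
    # fraction of simulated z values at or below gamma
    return sum(z <= gamma for z in zs) / n
```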
Using this method, any function g and CDF Fx are amenable to a solution for Fz(γ). Let

   A(γ) = {x : g(x) ≤ γ}.                                            (6.3)

Note that A(γ) = g⁻¹((−∞, γ]). Then

   Fz(γ) = P(g(x) ≤ γ) = P(x ∈ A(γ)).                                (6.4)

Partition A(γ) into disjoint intervals {Ai(γ) : i = 1, 2, ...} so that

   A(γ) = ∪ᵢ₌₁^∞ Ai(γ).                                              (6.5)

Note that the intervals, as well as the number of nonempty intervals, depend on γ. Since the Ai's are disjoint,

   Fz(γ) = Σᵢ₌₁^∞ P(x ∈ Ai(γ)).                                      (6.6)

If interval Ai(γ) is of the form Ai(γ) = (ai(γ), bi(γ)], then

   P(x ∈ Ai(γ)) = Fx(bi(γ)) − Fx(ai(γ)).

Similarly, if interval Ai(γ) is of the form Ai(γ) = [ai(γ), bi(γ)], then

   P(x ∈ Ai(γ)) = Fx(bi(γ)) − Fx(ai(γ)⁻).                            (6.7)

The above probabilities are easily found from the CDF Fx. The success of this method clearly depends on our ability to partition A(γ) into disjoint intervals, and involves only careful bookkeeping. The following examples illustrate various aspects of this technique.

Example 6.1.2. Random variable x has the CDF

   Fx(α) = 0,                    α < −1,
         = (3α − α³ + 2)/4,      −1 ≤ α < 1,
         = 1,                    1 ≤ α.

Find the CDF for the RV z = x².
Solution. Letting g(x) = x², we find

   A(γ) = g⁻¹((−∞, γ]) = ∅,            γ < 0,
                       = [−√γ, √γ],    γ ≥ 0,

so that

   Fz(γ) = Fx(√γ) − Fx((−√γ)⁻).

Noting that Fx(α) is continuous and has the same functional form for −1 < α < 1, we obtain

   Fz(γ) = 0,                     γ < 0,
         = (3√γ − (√γ)³)/2,       0 ≤ γ < 1,
         = 1,                     1 ≤ γ.

Example 6.1.3. Random variable x is uniformly distributed in the interval −3 to 3, so that the CDF for random variable x is

   Fx(α) = 0,            α < −3,
         = (α + 3)/6,    −3 ≤ α < 3,
         = 1,            3 ≤ α.

Random variable z is defined by

   z = g(x) = −1,         x < −2,
            = 3x + 5,     −2 ≤ x < −1,
            = −3x − 1,    −1 ≤ x < 0,
            = 3x − 1,     0 ≤ x < 1,
            = 2,          1 ≤ x.

Find Fz(γ).

Solution. Let A(γ) = g⁻¹((−∞, γ]). Referring to Figure 6.1, we find

   A(γ) = ∅,                                               γ < −1,
        = (−∞, (γ − 5)/3] ∪ [(−γ − 1)/3, (γ + 1)/3],       −1 ≤ γ < 2,
        = R,                                               2 ≤ γ.

Consequently,

   Fz(γ) = 0,                                                         γ < −1,
         = Fx((γ − 5)/3) + Fx((γ + 1)/3) − Fx(((−γ − 1)/3)⁻),         −1 ≤ γ < 2,
         = 1,                                                         2 ≤ γ.
FIGURE 6.1: Plots for Example 6.1.3, showing g(x), Fx(α), and Fz(γ).

Substituting, for −1 ≤ γ < 2 we obtain

   Fz(γ) = (1/6)[(γ − 5)/3 + 3 + (γ + 1)/3 − (−γ − 1)/3] = (γ + 2)/6,

so that

   Fz(γ) = 0,             γ < −1,
         = (γ + 2)/6,     −1 ≤ γ < 2,
         = 1,             2 ≤ γ.

Note that Fz(γ) has jumps at γ = −1 and at γ = 2; random variable z is mixed due to the constant value of g(x) in the intervals −3 ≤ x < −2 and 1 ≤ x < 3. This results in a jump in Fz and an impulse function in fz.

As presented in the previous example, if random variable x is continuous, then random variable z can be continuous, discrete, or mixed, dependent on the CDF for random variable x. Random variable z is continuous if x is continuous and g(x) is not equal to a constant over any interval where fx(α) ≠ 0. Whenever random variable x is continuous and g(x) is constant over some interval or intervals where fx(α) ≠ 0, z is a mixed random variable. In fact, z = g(x) is a discrete random variable if g(α) changes values only on intervals where fx(α) = 0.
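Example 6.1.3 can also be checked by simulation, including the jumps that make z a mixed random variable. The sketch below assumes the piecewise g(x) of the example and compares the empirical CDF against the derived Fz(γ) = (γ + 2)/6 on [−1, 2):

```python
import random

random.seed(4)

def g(x):
    # piecewise transformation of Example 6.1.3 (as given in the text)
    if x < -2:
        return -1.0
    if x < -1:
        return 3 * x + 5
    if x < 0:
        return -3 * x - 1
    if x < 1:
        return 3 * x - 1
    return 2.0

n = 200_000
zs = [g(random.uniform(-3, 3)) for _ in range(n)]  # x uniform on (-3, 3)

def empirical_F_z(gamma):
    return sum(z <= gamma for z in zs) / n
```

The jump at γ = −1 corresponds to the mass P(x < −2) = 1/6, and the jump at γ = 2 to P(x ≥ 1) = 1/3.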
The CDF for random variable x is ⎧ ⎪ 0. To ﬁnd f z.2: Transformation for Example 6. −1 < α < 2 otherwise. x Fz(γ ) = ⎪ Fx (0. Solution. −1 −1 −2 1 2 FIGURE 6. g( x ) 1 x γ ≤ −1 −1 ≤ γ < 0 0≤γ <1 1 ≤ γ. γ + 1]. ⎪ ⎩ 1. ⎨ z = g (x) = 0. ⎪ ⎪ ⎨ F (0). Consequently.1. . Random variable x has the PDF f x (α) = (1 + α 2 )/6. ∞). 0. then differentiate this result.5 0. γ ≤ −1 ⎪ ⎪ ⎨ (−∞.50 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS Example 6. Find the PDF of random variable z deﬁned by ⎧ ⎪ x − 1.4.1. 0. 0]. −1 ≤ α < 2 ⎪ ⎩ 1. x≤0 0 < x ≤ 0.4.5]. 0≤γ <1 ⎪ ⎪ ⎩ (−∞. we evaluate Fz ﬁrst. −1 ≤ γ < 0 A(γ ) = {x : g (x) ≤ γ } = ⎪ (−∞.5 < x.2. With the aid of Figure 6. ⎧ ⎪ Fx (γ + 1). 1 ≤ γ. 2 ≤ α.5). Figure 6. ⎪ ⎪ ⎩ 1. α < −1 ⎨ 3 Fx (α) = (α + 3α + 4)/18.2 shows a plot of g (x). we ﬁnd ⎧ ⎪ (−∞.
1) ln(0. ⎪ ⎨ Fz(γ ) = 2/9. the CDF technique is used without any changes or modiﬁcations as shown in the next example.TRANSFORMATIONS OF RANDOM VARIABLES 51 Substituting. Example 6. ⎪ ⎪ ⎩ 1. the CDF technique is applicable if random variable x is mixed or discrete. The previous examples illustrated evaluating the probability distribution of a function of a continuous random variable using the CDF technique. The CDF for x is ⎧ ⎪ 0.1. ⎪ ⎩ 1.5δ(α − 0. Differentiating Fz. We ﬁnd Fz(γ ) = 0.1γ .5). γ < −2 −2 ≤ γ < −1 −1 ≤ γ < 0 0≤γ <1 1 ≤ γ. For γ > 0. α<0 0 ≤ α < 10 10 ≤ α. we ﬁnd A(γ ) = {x : − ln(x) ≤ γ } = (e −γ . 18 144 16 Example 6. as required since g (x) = − ln(x) is not deﬁned (or at least not real–valued) for x ≤ 0. ⎧ 0. Random variable x is uniformly distributed in the interval from 0 to 10. 1 − e −0. ⎪ ⎪ ⎪ 5/16. Find the CDF for random variable z = g (x) = − ln(x). respectively. γ < ln(0.6. Note that P (x ≤ 0) = 0. . continuous and discontinuous. f z(γ ) = 13 11 3γ 2 + 6γ + 6 (u(γ + 2) − u(γ + 1)) + δ(γ ) + δ(γ − 1).1) ≤ γ . Note that Fz(γ ) has jumps at γ = 0 and γ = 1 of heights 13/144 and 11/16. ⎪ ⎪ ⎪ ⎪ ((γ + 1)3 + 3γ + 7)/18. Random variable x has PDF f x (α) = 0. ∞). Additionally.1.5(u(α) − u(α − 1)) + 0. so that Fz(γ ) = 1 − Fx ((e −γ )− ). ⎨ Fx (α) = α/10. Find the CDF for z = g (x) = 1/x 2 . For mixed random variables. Solution. This technique is applicable for all functions z = g (x).5.
5 + 0. . (d ) f z(1/2).5γ −1/2 . (b)Fz(1). 1≤γ <4 ⎪ ⎩ 1 − 0.5γ −1/2 . γ √ Since Fx (−1/ γ ) = 0 for all real γ .5 ≤ (1/ γ )− < 1 √ − 1 ≤ (1/ γ ) . After some algebra.5 − 0. Fz(γ ) = 0. 1 ≤ α. For γ > 0. The mixed random variable x has CDF ⎧ ⎪ 0. 0. 0 ≤ α < 0. α<0 ⎪ ⎪ ⎨ 0. Random variable x has the PDF f x (α) = 0. ⎨ z= x. Random variable z is deﬁned by ⎧ ⎪ −1.1. − √ γ so that 1 ∪ √ .5γ −1/2 . (c )Fz(3/2). 1. √ (1/ γ )− < 0 √ 0 ≤ (1/ γ )− < 0. Answers: 0. 1/16. ⎪ ⎪ ⎨ 0. 1/4. 1 A(γ ) = {x : x −2 ≤ γ } = −∞. 1/15.5 + 0.1.1. Random variable x is uniformly distributed in the interval −1 to 4. Fz(γ ) = 1 − ⎪ 0. (c ) f z(15). ⎪ ⎪ ⎩ 1.5α.5 ≤ α < 1 ⎪ ⎪ ⎩ 1. x<1 −1 ≤ x ≤ 1 x > 1. 1/15. (b)Fz(1/2). For γ < 0. 0. Determine: (a) Fz(0).5γ . Random variable z = 3x + 2. Answers: 0. ⎧ ⎪ ⎨ √ √ Fz(γ ) = Fx (−1/ γ ) + 1 − Fx ((1/ γ )− ). 2/15. we have ⎧ ⎪ 0. 4 ≤ γ.∞ .5 √ 0.5α. ⎪ ⎩ 1.2. Determine: (a) Fz(−1/2). Drill Problem 6.5 Fx (α) = ⎪ 0.5α(u(α) − u(α − 2)). Drill Problem 6.52 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS Solution. (c ) f z(0). γ <1 −1/2 Fz(γ ) = 0.
Then. in many situations it is much simpler to use than the CDF technique.TRANSFORMATIONS OF RANDOM VARIABLES 53 Drill Problem 6. First. Drill Problem 6. x + 0. called the PDF technique. Answers: 0.1. is only applicable for functions of random variables in which z = g (x) is continuous and does not equal a constant in any interval in which f x is nonzero. Random variable z = 1/x 2 .089.022. The PDF technique. 2. 6. we will ﬁnd the PDF method most useful in extensions to multivariate functions. we introduce a second method for calculating the probability distribution of a function z = g (x) using the probability density function. 0. 0. Random variable x has the PDF f x (α) = 0. (c )Fz(4). we discuss a wide variety of situations using the PDF technique with functions of continuous random variables. Answers: 1/4. (c )Fz(3/2). Determine: (a) Fz(1/8). however. Second. ⎪ ⎩ 3. . 1. 6.5α(u(α) − u(α − 2)). Random variable x has PDF f x (α) = e −α−1 u(α + 1).5 0. 1/16. Finally. Let x be a continuous RV with PDF f x (α) and let the RV z = g (x) . (d )Fz(4).1. Let αi = αi (γ ) = g −1 (γ ).11) . i = 1.2.1.1 Continuous Random Variable Theorem 6. In this section. (d ) f z(4). a method for handling mixed random variables with the PDF technique is introduced. (6. x ≤ 0. (b)Fz(0). . (b)Fz(1/2). We introduce the PDF technique for two reasons.0519. 0.2. Assume g is continuous and not constant over any interval for which f x = 0 .617.4. .5 < x ≤ 1 x > 1.2 UNIVARIATE PDF TECHNIQUE The previous section solved the problem of determining the probability distribution of a function of a random variable using the cumulative distribution function. Now. 1/16. Random variable z is deﬁned by z= ⎧ ⎪ ⎨ −1.3. we consider computing the conditional PDF of a function of a random variable using the PDF technique. . Determine: (a) Fz(−1).5.
h) = αi (γ ). h) into disjoint intervals of the form Ii (γ . h) The desired result follows by taking the product of the above limits. h)) − g (a i (γ . Let h > 0 and deﬁne I (γ . g (1) (αi (γ )) if f x (αi (γ ) = 0. h) = {x : γ − h < g (x) ≤ γ }. . b i (γ . h)) − Fx (a i (γ . . h). b i (γ . g (1) (αi (γ )) (6. such that I (γ . h) . The absolute value appears because by construction we have b i > a i and h > 0. h) = (a i (γ . h)) h g (αi (γ )) lim and that h→0 lim Fx (b i (γ . h) b i (γ . Proof. h) − a i (γ . . Partition I (γ . h→0 ∞ i=1 ∞ i=1 ∞ i=1 f x (αi (γ )) . h) = lim b i (γ . h). = lim = (1) h→0 h→0 g (b i (γ . h) − a i (γ . Ii (γ . .12) i = 1. h)). whereas g (1) may be positive or negative. . Then f z(γ ) = where we interpret f x (αi (γ )) = 0. h→0 Note that (for all γ with f x (αi (γ )) = 0) 1 b i (γ . (Fx (b i (γ . h)) − Fx (a i (γ .54 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS denote the distinct solutions to g (αi ) = γ . h))) lim a i (γ . 2. h)) = f x (αi (γ )). h) = Then Fz(γ ) − Fz(γ − h) = By hypothesis. h) − a i (γ .
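Theorem 6.2.1 translates directly into code. The sketch below applies it to an assumed concrete case, z = x² with x uniform on (0, 4), where the solutions of g(α) = γ are α = ±√γ and |g⁽¹⁾(α)| = 2√γ:

```python
import math

def f_x(a):
    # assumed input density: x uniform on (0, 4)
    return 0.25 if 0.0 < a < 4.0 else 0.0

def f_z(gamma):
    # Theorem 6.2.1 for z = x**2:
    #   f_z(gamma) = sum_i f_x(alpha_i) / |g'(alpha_i)|
    # with alpha_1 = -sqrt(gamma), alpha_2 = +sqrt(gamma), |g'| = 2*sqrt(gamma)
    if gamma <= 0.0:
        return 0.0
    s = math.sqrt(gamma)
    return (f_x(-s) + f_x(s)) / (2.0 * s)
```

Here f_z(γ) = 1/(8√γ) on (0, 16); integrating it over that interval and checking that the result is 1 makes a quick sanity check.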
we have g (1) (α) = −2α −3 . .2. 0.75(1 − α 2 )(u(α + 1) − u(α − 1)). 6 Find the PDF for random variable z = g (x) = x 2 .2. Find the PDF for random variable z = g (x) = 2x + 1. f z(γ ) = or f z(γ ) = 0. hence. there is only one solution to the equation γ = 2α + 1. In this case. 2γ 3/2 0. so that f z(γ ) = 0 for γ < 0.3.75(γ − 2 − γ − 2 )u(γ − 1). For γ > 0 there are two solutions to g (αi ) = γ : 1 α1 (γ ) = − √ . Since γ = g (α) = α −2 . Find the PDF for random variable z = g (x) = 1/x 2 . Example 6. Random variable x has PDF f x (α) = 0. γ and 1 α2 (γ ) = √ .75(1 − γ −1 )(u(γ − 1) − 0 + 1 − u(1 − γ )) u(γ ). We easily ﬁnd g (1) (α) = 2. g (1) (αi ) = 2/αi 3 = 2γ 3/2 .2. 1<γ <9 otherwise.TRANSFORMATIONS OF RANDOM VARIABLES 55 Example 6. and f z(γ ) = Substituting. 2γ 3/2 1 f x (α) = (1 + α 2 )(u(α + 1) − u(α − 2)). Random variable x is uniformly distributed in the interval 0–4.1. Solution. there are no solutions to g (αi ) = γ . γ 1/8.2. f z(γ ) = Simplifying.75(1 − γ −1 )(u(1 − γ −1/2 ) − 0 + 1 − u(γ −1/2 − 1)) u(γ ). 2γ 3/2 0. Solution. given by α1 = (γ − 1)/2. For γ < 0. Random variable x has PDF 3 5 f x (−γ −1/2 ) + f x (γ −1/2 ) u(γ ). Hence f z(γ ) = f x ((γ − 1)/2)/2 = Example 6.
We have g (1) (α1 ) = −e ln(γ ) = −γ . respectively. The continuous part of the PDF of x is handled by the PDF technique. For all γ > 0 such that . and √ √ f x (− γ ) + f x ( γ ) u(γ ).5) + δ(α − 0. f z(γ ) = √ 2 γ Substituting.56 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS Solution.5 and e −0. The probability masses of 1/8 for x at −0. ⎨ f z(γ ) = (γ −1/2 + γ 1/2 )/12. 12 γ ⎧ ⎪ (γ −1/2 + γ 1/2 )/6.4. √ Since γ = g (α) = α 2 .2. so that f z(γ ) = 0 for γ < 0. f z(γ ) = or 1+γ √ (u(1 − γ ) − 0 + 1 − u(γ − 4))u(γ ). To illustrate the use of this technique. consider the following example.2. we treat the discrete and continuous portions of f x separately. and we wish to ﬁnd the PDF for z = g (x). 1+γ √ √ √ √ √ (u(1 − γ ) − u(−2 − γ ) + u( γ + 1) − u( γ − 2))u(γ ).5 are mapped to probability masses of 1/8 for z at e 0. 12 γ 0<γ <1 1<γ <4 elsewhere. hence. ⎪ ⎩ 0. For γ < 0 there are no solutions to g (αi ) = γ . g (1) (αi ) = 2αi  = 2 γ . and then combine the results to yield f z. Here.2 Mixed Random Variable Consider the problem where random variable x is mixed. Solution. 8 8 8 Find the PDF for the RV z = g (x) = e −x . we have g (1) (α) = 2α. For γ > 0 there are two solutions to g (αi ) = γ : √ √ α1 (γ ) = − γ . 6. Example 6.5 and 0.5). Random variable x has PDF 1 1 3 f x (α) = (u(α + 1) − u(α − 1)) + δ(α + 0. There is only one solution to g (α) = γ : α1 (γ ) = − ln(γ ). f z(γ ) = Simplifying.5 . and α2 (γ ) = γ .
one of the following two methods may be used to determine the conditional PDF of z using the PDF technique.5 ) + δ(γ − e −0. given event A. (6. given that event A occurred. g (1) (αi (γ )) (6.5 > 0 we have f z(γ ) = Combining these results. we will use the conditional PDF of x to evaluate the conditional PDF for z as f zA (γ A) = ∞ i=1 f zA (αi (γ )A) . the above techniques apply to ﬁnd the conditional PDF for z = g (x).5 ).2. 9 . Consider the problem where random variable x has PDF f x . γ 6. (i) If A is an event deﬁned for an interval on z.14) Example 6.TRANSFORMATIONS OF RANDOM VARIABLES 57 α1 (γ ) ± 0. (ii) If A is an event deﬁned for an interval on x. We have P (A) = 1 6 2 0 7 (1 + α 2 )d α = .5. Depending on whether the event A is deﬁned on the range or domain of z = g (x). 6 Find the PDF for random variable z = g (x) = x 2 . is computed by ﬁrst evaluating f z using the technique in this section.2.3 Conditional PDF Technique Since a conditional PDF is also a PDF. by the deﬁnition of a conditional PDF. P (A) γ ∈ A. 8γ 8 8 f x (− ln(γ )) . based on f xA . and we wish to evaluate the conditional PDF for random variable z = g (x). the conditional PDF. Solution. f zA . we ﬁnd f z(γ ) = 3 1 1 (u(γ − e −1 ) − u(γ − e )) + δ(γ − e 0. Random variable x has the PDF 1 f x (α) = (1 + α 2 )(u(α + 1) − u(α − 2)).13) and f zA (γ A) = 0 for γ ∈A. we have f zA (γ A) = f z(γ ) . given A = {x : x > 0}. we solve for the conditional PDF for x and then ﬁnd the conditional PDF for z. First. Then.
Determine: (a) P (A). 8/9.5 ≤ α ≤ 1 ⎪ ⎩ 0. Answers: 13. 1/3. Consequently.2. Drill Problem 6. 0 ≤ α < 0. otherwise. 0. 0. 14 There is only one solution to γ = g (α) = α 2 on the interval 0 < α < 2 where f xA = 0. Answers: 4/9.1. (d ) f z(2).2. f zA (γ A) = 3 √ 1 γ+√ 28 γ (u(γ ) − u(γ − 4)). 9 Random variable z = 2x 2 and event A = {x : x ≥ 0}.Use the PDF method to determine: (a) f z(1/27). We √ √ have α1 (γ ) = γ and g (1) (α1 (γ )) = 2 γ . 1/2. (c ) f zA (2A). Drill Problem 6.5 ⎨ 2 f x (α) = 3(1 − α ). 3.(d ) f zA (9A).58 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS so that f xA (αA) = 3 (1 + α 2 )(u(α) − u(α − 2)). 5/27. 1/8. (c )E(z).2. Random variable x has the PDF 2 f x (α) = (α + 1)(u(α + 1) − u(α − 2)). 0.Use the PDF method to determine: (a) f z(0). (d )σz . Random variable z = x 3 . Answers: 1.43.52. Random variable x has the PDF 2 f x (α) = α(u(α) − u(α − 3)). (d )E(zz ≤ 1). Random variable x has the PDF ⎧ ⎪ 9α 2 . (b) f xA (1A). 1.2. 48.3. (b) f z(1/4). Random variable 2 z = 3x + 1 . 9 Random variable z = (x − 1)2 . (c ) f zz>1/8 (1/4z > 1/8). 2. Use the PDF method to determine: (a) f z(1/4). (b) f z(9/4). (c ) f zz≤1 (1/4z ≤ 1). (b) f z(6).2. Answers: 0. Random variable x has a uniform PDF in the interval 0–8. 1/24. Drill Problem 6. Drill Problem 6. .4.
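The point-mass bookkeeping of Example 6.2.4 (the mixed x with a uniform continuous part of weight 3/4 on (−1, 1) and masses of 1/8 at ±1/2, mapped through z = e^(−x)) can be verified by simulation; each mass of 1/8 for x should reappear as a mass for z:

```python
import math
import random

random.seed(7)
n = 200_000

def sample_x():
    u = random.random()
    if u < 0.75:
        return random.uniform(-1.0, 1.0)  # continuous part, weight 3/4
    return -0.5 if u < 0.875 else 0.5     # point masses of 1/8 at -1/2 and +1/2

zs = [math.exp(-sample_x()) for _ in range(n)]
# the mass for x at -1/2 maps to a mass for z at e**0.5
p_mass = sum(z == math.exp(0.5) for z in zs) / n
```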
Remember that in the case of a single random variable. Sketching the support region for f x. y).3. β) : α + β ≤ γ }. y) created from jointly distributed random variables x and y.y (the region where f x.y (α.y = 0. In this section. With z = g (x. y) are considered. . β).TRANSFORMATIONS OF RANDOM VARIABLES 59 6.1. y) is computed using a CDF technique similar to the one at the start of this chapter.y (α. The CDF for the RV z can then be found by evaluating the integral Fz(γ ) = A(γ ) (6. our efforts are concentrated on evaluating Fz through integrals. Solution. where A(γ ) = {(x. y) ≤ γ }. y) ≤ γ ) = P ((x. Example 6. we have Fz(γ ) = P (z ≤ γ ) = P (g (x.15) (6. Random variables x and y have joint PDF f x.y is not constant) and the region A(γ ) is often most helpful. β) where α ≤ γ − β: Fz(γ ) = ∞ −∞ γ −β −∞ 1/4. in the problem solution. β) = Find the CDF for z = x + y. 0 < α < 2.y = 0. y) ∈ A(γ )). f x. Because we are dealing with regions in a plane instead of intervals on a line. y) : g (x. these problems are not as straightforward and tractable as before.3 ONE FUNCTION OF TWO RANDOM VARIABLES Consider a random variable z = g (x. even crucial. the probability distribution of z = g (x. our efforts primarily dealt with algebraic manipulations. We have A(γ ) = {(α. Let us consider several examples to illustrate the mechanics of the CDF technique and also to provide further insight. Pay careful attention to the limits of integration to determine the range of integration in which the integrand is zero because f x. or Fx.17) This result cannot be continued further until a speciﬁc Fx.y (α. Here. We require the volume under the surface f x. (6. 0. β)d α dβ. with the ease of solution critically dependent on g (x. y). The ease in solution for Fz is dependent on transforming A(γ ) into proper limits of integration.16) dFx.y and g (x.y (α. 0 < β < 2 otherwise.
For 0 ≤ γ < 2. for −1 ≤ γ < 0. β)d αdβ. with the aid of Figure 6. for 4 ≤ γ . We have A(γ ) = {(α. Solution. With the aid of Figure 6.3. f x.3(b).4(a).y (α.1. Fz(γ ) = 1.y (α.2. we consider the complementary region (to save some work): Fz(γ ) = 1 − Finally. We require the volume under the surface f x. 0 < β < 1 otherwise. β) : α − β ≤ γ }. For γ < −1 we have Fz(γ ) = 0 and f z(γ ) = 0. 0 < α < 1.3: Plots for Example 6. 2 . β) = Find the PDF for z = x − y. For γ < 0 we have Fz(γ ) = 0.60 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS γ β γ 2 2 β γ −2 2 (a)0 < γ < 2 0 γ α 0 γ −2 2 (b)2 < γ < 4 γ α FIGURE 6. Random variables x and y have joint PDF f x. Fz(γ ) = 1 −γ 0 γ +β 1 d αdβ = (1 + γ )2 .3(a) we obtain Fz(γ ) = 0 2 0 γ −β 1 1 d αdβ = γ 2 . β) where α ≤ γ + β: Fz(γ ) = ∞ −∞ γ +β −∞ 2 γ −2 2 γ −β 1 1 d αdβ = 1 − (4 − γ )2 . referring to Figure 6. 0. 4 8 1.y (α. Example 6. 4 8 For 2 ≤ γ < 4.3.
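The piecewise-quadratic CDF of Example 6.3.1 for z = x + y, with x and y independent and uniform on (0, 2), is easy to confirm by simulation:

```python
import random

random.seed(3)
n = 200_000
zs = [random.uniform(0, 2) + random.uniform(0, 2) for _ in range(n)]

def empirical_F_z(gamma):
    return sum(z <= gamma for z in zs) / n

def F_z(gamma):
    # result of Example 6.3.1
    if gamma < 0:
        return 0.0
    if gamma < 2:
        return gamma * gamma / 8.0
    if gamma < 4:
        return 1.0 - (4.0 - gamma) ** 2 / 8.0
    return 1.0
```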
3.3. Fz(γ ) = 1. Inside the support region for f x. so that f z(γ ) = 1 + γ .TRANSFORMATIONS OF RANDOM VARIABLES 61 β 1 1 1− γ 1+ γ β −γ 0 1 (a) −1 < γ < 0 α 0 1 (b) 0 < γ < 1 γ α FIGURE 6. Find the CDF for z = x/y. β) = 1/α. Example 6. for 1 ≤ γ . Fz(γ ) = 0 1 α α/γ 1 1 dβd α = 1 − . for γ < 1 we have Fz(γ ) = 0. For 0 ≤ γ < 1. β) : α/β ≤ γ }.y . we consider the complementary region shown in Figure 6. α γ β 1 1 γ 0 1 α FIGURE 6. 2 γ +β 1 so that f z(γ ) = 1 − γ . hence. we have α/β > 1.2. 0. Solution.3. . We have A(γ ) = {(α. Finally. 0<β<α<1 otherwise.4(b) (to save some work): Fz(γ ) = 1 − 0 1−γ 1 d αdβ = 1 − (1 − γ )2 . so that f z(γ ) = 0.3.4: Plots for Example 6.5 it is easiest to integrate with respect to β ﬁrst: for 1 ≤ γ . where x and y have the joint PDF f x. As shown in Figure 6.y (α.5: Integration region for Example 6.3.
β) = 3α. we obtain Fz(γ ) = 0. where x and y have the joint PDF f x. β) : α 2 + β 2 ≤ γ }. Fz(γ ) = 0 π/4 0 √ γ 1 3 3r 2 cos(θ)dr d θ = √ γ 2 . Fz(γ ) = 0 2π r 2 ≤γ f x.6(b) that Fz(γ ) = or 3 1 3 Fz(γ ) = √ γ 2 − (γ − 1) 2 . β = r sin(θ). for 0 ≤ γ < 1. 0 < β < α < 1 0.62 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS Example 6.y (r cos(θ). γ we ﬁnd with the aid of Figure 6. 2 π/4 θ1 0 √ γ √ α γ −1 0 3r cos(θ)dr d θ + 2 0 1 3αdβ d α.6(a). Solution. We have A(γ ) = {(α. . Find the CDF for z = x 2 + y 2 . Referring to Figure 6.4. the other using rectangular coordinates. With √ γ −1 sin(θ1 ) = √ .4. 2 For 1 ≤ γ < 2.3. For γ < 0.y (α.3. Transforming to polar coordinates: α = r cos(θ ). r sin(θ))r dr d θ. we split the integral into two parts: one with polar coordinates.6: Integration regions for Example 6. Finally. otherwise. β γ γ −1 1 β γ 1 0 γ 1 α 0 (b) 1 < γ < 2 1 γ α (a) 0 < γ < 1 FIGURE 6. we have Fz(γ ) = 1 for 2 ≤ γ .
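The polar-coordinate result of Example 6.3.4 can be sanity-checked by sampling from fx,y(α, β) = 3α on 0 < β < α < 1: the marginal of x is 3α², so x = U^(1/3), and y given x is uniform on (0, x). For 0 ≤ γ < 1 the empirical CDF should track Fz(γ) = γ^(3/2)/√2:

```python
import random

random.seed(5)
n = 200_000
zs = []
for _ in range(n):
    a = random.random() ** (1.0 / 3.0)  # marginal density of x is 3*a**2 on (0, 1)
    b = random.uniform(0.0, a)          # y given x is uniform on (0, x)
    zs.append(a * a + b * b)            # z = x**2 + y**2

def empirical_F_z(gamma):
    return sum(z <= gamma for z in zs) / n
```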
w .3. ﬁnd the joint PDF f z. . β). Note that A(γ . (c )Fz(1). y) ≤ γ . We have g −1 ((−∞. A(γ . Answers: 1 e −1 . the case of one function of two random variables is treated by using the joint PDF technique with an auxiliary random variable. the conditional joint PDF is presented.y (α. y) and w = h(x. Then Fz. 6.4 BIVARIATE TRANSFORMATIONS In this section. With z = g (x. (6. ψ) = g −1 ((−∞. 0 ≤ α ≤ 2. γ ]) ∩ h −1 ((−∞. Find (a) Fz(−1/3). 0 ≤ β ≤ 2 otherwise. 1 − 3 e −1 .4. y) and w = h(x. we ﬁnd the joint distribution of random variables z = g (x. Example 6. Random variable z = x − y . y) ≤ ψ}.4.ψ) (6. Random variables x and y have the joint PDF f x. Finally. i. Next. y) ≤ γ and w = h(x.y (α.TRANSFORMATIONS OF RANDOM VARIABLES 63 Drill Problem 6. the joint PDF technique for ﬁnding the joint PDF for random variables z and w formed from x and y is described. Solution. y) = x + y and w = h(x. First. Then. we consider a bivariate CDF technique.. β) = 0. y) : x + y ≤ γ } . β) = e −α e −β u(α)u(β). 3 e −1 .1. 0. ψ]).1. y) = y.18) (6.w (γ . ψ) = {(x. and let z = g (x. y). y) ≤ ψ. Random variables x and y have joint PDF f x.1 Bivariate CDF Technique Let x and y be jointly distributed RVs on the probability space (S. (b) f z(−1). 4 4 4 4 6.25. 3 e −1 .y (α.20).e. ψ) = A(γ . and (d) f z(1). h(x.19) dFx.20) It is often difﬁcult to perform the integration indicated in (6. y) : g (x. P ). y) from jointly distributed random variables x and y. γ ]) = {(x. ψ) to be the region of the x − y plane for which z = g (x. Deﬁne A(γ .
w (γ . The intersection of these regions is A(γ . y) : y ≤ ψ}. 0. ψ). With the aid of Figure 6.w When the RVs x and y are jointly continuous.y (α. We ﬁnd 1 ∂ Fz. ψ) = {(x. ψ)}.20) we ﬁnd Fz.7 and (6. which is illustrated in Figure. ψ) = ψ −∞ γ −β −∞ d Fx.w (γ .w (γ . and h −1 ((−∞. β). ψ]) = {(x. ψ) = lim h 1 →0 h 1 ∂γ and ∂ 2 Fz. ψ) = f x.4. ψ) = 0. dFx. we obtain f z.y (γ − ψ.y (α.7: Integration region for Example 6.7. ψ) 1 = lim lim h 2 →0 h 1 →0 h 1 h 2 ∂ψ ∂γ ψ ψ−h 2 γ −β γ −h 1 −β ψ −∞ γ −β γ −h 1 −β d Fx. we differentiate the above integral to obtain the PDF f z. β). we ﬁnd that f z.y (α. β). 0 < ψ < 2 otherwise.20). Performing the indicated limits.w (γ .25.w (γ .w directly. f z. 6. substituting. 0 < γ − ψ < 2.64 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS y ψ γ γ −ψ x FIGURE 6. . Instead of carrying out the above integration and then differentiating the result. y) : y ≤ min(γ − x. it is usually easier to ﬁnd the joint PDF than to carry out the integral indicated in (6.1.
γ ]) ∩ h −1 ((ψ − h 2 . h 1 . (6.25) i = 1. βi ) denote the unique element of h 1 →0 h 2 →0 lim lim Ii (γ . Let h 1 > 0. β) ∂β ∂h(α. ψ).h 1 . ψ.23) Dividing both sides of the above by h 1 h 2 and letting h 1 → 0 and h 2 → 0. . h 1 .27) .1. Then for small h 1 and small h 2 we have h 1 h 2 f v. (6. y) and w = h(x. .21) Ii (γ .4. .22) Let (αi .26) (6. βi ) = ψ. ψ.h 2 ) d α dβ. (6. βi ) Ii (γ . ψ]). the approximation becomes an equality.2 Bivariate PDF Technique A very important special case of bivariate transformations is when the RVs x and y are jointly continuous RVs and the mapping determined by g and h is continuous with g and h having continuous partial derivatives. β) ∂α ∂h(α. h 2 ).4. h 2 > 0 and deﬁne I (γ . Deﬁne the Jacobian ∂g (α. ψ) ≈ i f x. ψ. ψ. h 2 ). h 1 . . Partition I into disjoint regions so that I (γ . βi (γ .24) J (α. β) ∂β (6. ψ)).y . h 1 .ψ.w (γ . Let (αi (γ .y (αi . y) . Let x and y be jointly continuous RVs with PDF f x. be the distinct solutions to the simultaneous equations g (αi . h 2 ) = i (6. The result is summarized by the theorem below. β) ∂α ∂g (α.TRANSFORMATIONS OF RANDOM VARIABLES 65 6. Theorem 6. 2. (6. βi ) = γ and h(αi . β) = . and let z = g (x. h 2 ) = g −1 ((γ − h 1 .
y) = x 2 y and w = h(x. y) = x + y and w = h(x. . Random variables x and y have the joint PDF f x. 0 < ψ < 2 otherwise.4. Then f z. ψ) = ∞ i=1 f x. ψ) = Substituting.w (γ . The Jacobian of the transformation is J = Applying Theorem 1. we obtain f z. Let us consider several examples to illustrate the mechanics of this PDF technique. ψ).2. Random variables x and y have the joint PDF f x.3. Example 6.66 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS where the indicated partial derivatives are assumed to exist. 1 The region of support for f z. 0.y (γ − ψ.y (α.y (αi . f x. βi ) and ψ = h(αi . 0. ψ) = 1/4. J  1 0 1 = 1 · 1 − 1 · 0 = 1. Find the joint PDF for z = g (x. y) = y.4. 0 < α < 2. There is only one solution to γ = α + β and ψ = β: α1 = γ − ψ and β1 = ψ. 0 < α < 1. we ﬁnd f z. 0.w . y) = y. 0 < β < 2 otherwise. βi ) for αi and βi . β) = 1/4. Solution. β) = 12αβ(1 − α).w is illustrated in Figure 6. 0 < β < 1 otherwise. β1 ) = f x.y (α. Find the joint PDF for z = g (x. 0 < γ − ψ < 2.w (γ . J (αi .w using the joint PDF technique are solving the simultaneous equations γ = g (αi . and determining the support region for f z.28) The most difﬁcult aspects to ﬁnding f z.8.w (γ . Example 6. βi ) . βi ) (6.y (α1 .
We ﬁnd J = so that J 1 = 2ψ = J 2 . Find the joint PDF for z = g (x. 0. 1 √ f x. −1 < β < 1 otherwise. γ ψ) f x.w (γ .y ( γ /ψ.y (α. we ﬁnd α = γ /β = β/ψ. y) = xy and w = h(x. 0< √ γ /ψ < 1.4. Solving γ = αβ and ψ = β/α. y) = y/x. we ﬁnd 2α1 β1 0 2 α1 = 2α1 β1 = 2 γ ψ.4. so that β 2 = γ ψ. β) = 0 for α < 0. ψ) = + . β) = 0. β1 ) is in the ﬁrst quadrant of the α − β plane. − γ ψ) f z. The other solution α2 = −α1 is not needed since f x.w (γ . Solving γ = α 2 β and ψ = β. Then β2 = − γ ψ and α2 = − γ /ψ. Solution.y ( γ /ψ. 0 < ψ < 1 otherwise. 1/α α √ √ √ √ f x.8: Support region for Example 6. −1 < α < 1. we ﬁnd α1 = γ /ψ and β1 = ψ. the solution (α2 . β2 ) is in the third quadrant of the α − β plane.TRANSFORMATIONS OF RANDOM VARIABLES 67 ψ 2 1 0 1 2 3 4 γ FIGURE 6. Example 6. Hence β −β/α 2 β α =2 .25(α + 1). The Jacobian is J1 = Applying Theorem 1. = 0. ψ) f z. 2ψ 2ψ . ψ) = √ 2 γψ √ 6(1 − γ /ψ). Random variables x and y have the joint PDF f x.4. √ Solution. Let√ √ √ √ ting β1 = γ ψ we have α1 = γ /ψ.2.y (α. Note that the solution (α1 .y (− γ /ψ.
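The Jacobians in this section can be spot-checked numerically. The sketch below uses a central-difference determinant (a hypothetical helper, not from the text) on the transformation z = xy, w = y/x of Example 6.4.4, for which |J| = 2ψ:

```python
def jacobian_det(g, h, a, b, eps=1e-6):
    # central-difference approximation of the Jacobian determinant of (g, h) at (a, b)
    dg_da = (g(a + eps, b) - g(a - eps, b)) / (2 * eps)
    dg_db = (g(a, b + eps) - g(a, b - eps)) / (2 * eps)
    dh_da = (h(a + eps, b) - h(a - eps, b)) / (2 * eps)
    dh_db = (h(a, b + eps) - h(a, b - eps)) / (2 * eps)
    return dg_da * dh_db - dg_db * dh_da

g = lambda a, b: a * b   # z = xy
h = lambda a, b: b / a   # w = y/x
```

At (α, β) = (0.5, 1.0) we have ψ = β/α = 2, so the determinant should be 2ψ = 4, matching the analytic J = 2β/α.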
The two components of the PDF f_{z,w} have identical regions of support in the γ−ψ plane. The first component,

f_{x,y}((γ/ψ)^{1/2}, (γψ)^{1/2}) / (2|ψ|),

has support region specified by 0 < (γ/ψ)^{1/2} < 1 and 0 < (γψ)^{1/2} < 1, or 0 < γ/ψ < 1 and 0 < γψ < 1. In the first quadrant of the γ−ψ plane, this support region is easily seen to be 0 < γ < ψ < γ^{−1}. Similarly, the second component,

f_{x,y}(−(γ/ψ)^{1/2}, −(γψ)^{1/2}) / (2|ψ|),

has support region specified by −1 < −(γ/ψ)^{1/2} < 0 and −1 < −(γψ)^{1/2} < 0, or 0 < γ/ψ < 1 and 0 < γψ < 1; in the third quadrant, this support region is γ^{−1} < ψ < γ < 0. Consequently, we find

f_{z,w}(γ, ψ) = 1/(4|ψ|),  0 < γ < ψ < γ^{−1} or γ^{−1} < ψ < γ < 0;  0, otherwise.

This support region is illustrated in Figure 6.9.

Auxiliary Random Variables
The joint PDF technique can also be used to transform random variables x and y with joint PDF f_{x,y} to random variable z = g(x, y) by introducing auxiliary random variable w = h(x, y), applying the joint PDF technique to find f_{z,w}, and then finding the marginal PDF f_z using

f_z(γ) = ∫_{−∞}^{∞} f_{z,w}(γ, ψ) dψ.   (6.29)

This method is an alternative to the CDF technique presented earlier. It is usually advisable to let the auxiliary random variable w equal a quantity which allows a convenient solution of the Jacobian and/or the inverse equations.
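The marginalization in (6.29) can be carried out numerically when the integral is awkward by hand. As a sketch (the function names and grid size are illustrative, and the integrand is the Example 6.4.4 result, for which the marginal of z = xy works out to −(1/2) ln γ on 0 < γ < 1):

```python
import math

# Eq. (6.29): recover the marginal f_z by integrating the joint f_{z,w} over psi.
# Demonstration with f_{z,w}(g, p) = 1/(4p) on 0 < g < p < 1/g (Example 6.4.4).
def f_zw(g, p):
    return 1.0 / (4.0 * p) if 0 < g < p < 1.0 / g else 0.0

def f_z(g, n=20_000):
    lo, hi = g, 1.0 / g                       # support of psi for this g
    h = (hi - lo) / n
    return sum(f_zw(g, lo + (i + 0.5) * h) for i in range(n)) * h  # midpoint rule

print(f_z(0.25), -0.5 * math.log(0.25))       # the two values should agree
```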
y (α. Let auxiliary variable w = h(x. for 0 < γ < 1 and f z(γ ) = 0. The only solution inside the support region for f x. β) = Find the PDF for z = g (x. The Jacobian of the transformation is J = so that 2α 0 0 = 2α. ψ)d ψ = 0 1 2ψ d ψ. We ﬁnd the marginal PDF for z as f z(γ ) = ∞ −∞ f z. Solving γ = α 2 and ψ = β.y ( γ .w (γ . Random variables x and y have the joint PDF f x. we ﬁnd α = √ √ ± γ and β = ψ.4.9: Support region for Example 6. . Solution. y) = x 2 .5. 0 < ψ < 1 otherwise. ψ) ⎨ √ = 2ψ. 0 < γ < 1.y is α = γ and β = ψ. y) = y.w (γ . 1 4αβ. 0. 0 < α < 1. 0 < β < 1 otherwise. 2 γ f z. Example 6. otherwise.TRANSFORMATIONS OF RANDOM VARIABLES 69 ψ 3 2 1 −3 −2 −1 1 2 3 γ −1 −2 −3 FIGURE 6. ψ) = = √ ⎩ 2 γ 0.4. ⎧ √ 4 γψ √ f x.4.
FIGURE 6.10: Support region for Example 6.4.6.

Example 6.4.6. Random variables x and y have the joint PDF

f_{x,y}(α, β) = 4αβ,  0 < α < 1, 0 < β < 1;  0, otherwise.

Find the PDF for z = g(x, y) = x + y.

Solution. Let auxiliary variable w = h(x, y) = y. Solving γ = α + β and ψ = β, we find α = γ − ψ and β = ψ. We find

J = | 1  1 |
    | 0  1 | = 1,

so that

f_{z,w}(γ, ψ) = f_{x,y}(γ − ψ, ψ) = 4(γ − ψ)ψ,  0 < γ − ψ < 1, 0 < ψ < 1;  0, otherwise.

The support region for f_{z,w} is shown in Figure 6.10. Referring to Figure 6.10, for 0 < γ < 1,

f_z(γ) = ∫_0^γ 4(γ − ψ)ψ dψ = (2/3)γ³.

For 1 < γ < 2,

f_z(γ) = ∫_{γ−1}^1 4(γ − ψ)ψ dψ = −(2/3)γ³ + 4γ − 8/3.

Otherwise, f_z = 0.

Conditional PDF Technique
Since a conditional PDF is also a PDF, the above techniques apply to find the conditional PDF for z = g(x, y), given event A. Consider the problem where random variables x and y have joint
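A useful habit with piecewise marginals like the one in Example 6.4.6 is to confirm that the pieces integrate to one. A minimal numerical sketch (function name and grid size are illustrative):

```python
# Example 6.4.6 marginal: (2/3)g^3 on (0,1) and -(2/3)g^3 + 4g - 8/3 on (1,2).
def f_z(g):
    if 0 < g < 1:
        return (2.0 / 3.0) * g ** 3
    if 1 <= g < 2:
        return -(2.0 / 3.0) * g ** 3 + 4.0 * g - 8.0 / 3.0
    return 0.0

n = 20_000
h = 2.0 / n
total = sum(f_z((i + 0.5) * h) for i in range(n)) * h   # midpoint rule on (0, 2)
print(total)   # ~1.0
```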
PDF f_{x,y}, and we wish to evaluate the conditional PDF for random variable z = g(x, y), given that event A occurred. Depending on whether the event A is defined on the range or domain of z = g(x, y), one of the following two methods may be used to determine the conditional PDF of z using the bivariate PDF technique.

i. If A is an event defined for an interval on z, the conditional PDF, f_{z|A}, is computed by first evaluating f_z using the technique in this section. Then, by the definition of a conditional PDF, we have

f_{z|A}(γ|A) = f_z(γ) / P(A),  γ ∈ A.   (6.30)

ii. If A is an event defined for a region in the x−y plane, introduce an auxiliary random variable w = h(x, y), and evaluate

f_{z,w|A}(γ, ψ|A) = Σ_{i=1}^∞ f_{x,y|A}(α_i(γ, ψ), β_i(γ, ψ)|A) / |J(α_i(γ, ψ), β_i(γ, ψ))|,   (6.31)

where (α_i, β_i), i = 1, 2, . . . , are the solutions to γ = g(α, β) and ψ = h(α, β) in region A. Then evaluate the marginal conditional PDF

f_{z|A}(γ|A) = ∫_{−∞}^{∞} f_{z,w|A}(γ, ψ|A) dψ.   (6.32)

Example 6.4.7. Random variables x and y have joint PDF

f_{x,y}(α, β) = 3α,  0 < β < α < 1;  0, otherwise.

Find the PDF for z = x + y, given event A = {max(x, y) ≤ 0.5}.

Solution. We begin by finding the conditional PDF for x and y, given A. From Figure 6.11(a) we find

P(A) = ∫∫_A f_{x,y}(α, β) dα dβ = ∫_0^{0.5} ∫_0^α 3α dβ dα = 1/8;

consequently, by the definition of a conditional PDF,

f_{x,y|A}(α, β|A) = f_{x,y}(α, β) / P(A) = 24α,  0 < β < α < 0.5;  0, otherwise.
FIGURE 6.11: Support region for Example 6.4.7.

Letting the auxiliary random variable w = h(x, y) = y, the only solution to γ = g(α, β) = α + β and ψ = h(α, β) = β is α = γ − ψ and β = ψ. The Jacobian of the transformation is J = 1, so that

f_{z,w|A}(γ, ψ|A) = f_{x,y|A}(γ − ψ, ψ|A) = 24(γ − ψ),  0 < ψ < γ − ψ < 0.5;  0, otherwise.

The support region for f_{z,w|A} is illustrated in Figure 6.11(b). The conditional PDF for z, given A, can be found from

f_{z|A}(γ|A) = ∫_{−∞}^{∞} f_{z,w|A}(γ, ψ|A) dψ.

The integration is easily carried out with the aid of Figure 6.11(b). For 0 < γ < 0.5 we have

f_{z|A}(γ|A) = ∫_0^{0.5γ} 24(γ − ψ) dψ = 9γ².

For 0.5 < γ < 1 we obtain

f_{z|A}(γ|A) = ∫_{γ−0.5}^{0.5γ} 24(γ − ψ) dψ = 3(1 − γ²).

For γ < 0 or γ > 1, f_{z|A}(γ|A) = 0.
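Conditioning on a region event like A = {max(x, y) ≤ 0.5} maps directly onto rejection sampling: draw from f_{x,y}, keep only samples in A, and look at z. The sketch below (an illustrative check; seed and sample size are arbitrary) compares P(z ≤ 0.5 | A) with the value from the derived conditional PDF, ∫_0^{0.5} 9γ² dγ = 0.375.

```python
import random

random.seed(4)

# f_{x,y}(a, b) = 3a on 0 < b < a < 1: marginal of x is 3a^2 (x = u^(1/3)),
# and y given x is uniform on (0, x).  Event A = {max(x, y) <= 0.5}.
kept = hits = 0
while kept < 100_000:
    x = random.random() ** (1.0 / 3.0)
    y = random.uniform(0.0, x)
    if max(x, y) <= 0.5:          # rejection step implements the conditioning
        kept += 1
        if x + y <= 0.5:
            hits += 1

est = hits / kept
print(est)   # ~0.375
```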
Drill Problem 6.4.1. Random variables x and y have joint PDF f_{x,y}(α, β) = 4αβ for 0 < α < 1, 0 < β < 1, and zero otherwise. Random variable z = x + y and w = y². Determine: (a) f_{z,w}(1, 1/4); (b) f_{z,w}(1, 1); (c) f_z(1/2); (d) f_z(3/2).
Answers: 1, 0, 1/12, 13/12.

Drill Problem 6.4.2. Random variables x and y have joint PDF f_{x,y}(α, β) = e^{−α−β} u(α)u(β). Random variable z = x and w = xy. Determine: (a) f_{z,w}(1, 1); (b) f_z(−1/2); (c) f_z(1/2); (d) f_z(3/2).
Answers: e^{−2}, 0, 0.6065, 0.2231.

Drill Problem 6.4.3. Random variables x and y have joint PDF f_{x,y}(α, β) = 1.2(α² + β) for 0 < α < 1, 0 < β < 1, and zero otherwise. Random variable z = x²y. Determine: (a) f_z(−1); (b) f_z(1/4); (c) f_z(1/2); (d) f_z(3/2).
Answers: 0, 1.3, 0.7172, 0.

Drill Problem 6.4.4. Random variables x and y have joint PDF f_{x,y}(α, β) = 2β for 0 < α < 1, 0 < β < 1, and zero otherwise. Random variable z = x − y and event A = {(x, y) : x + y ≤ 1}. Determine: (a) f_{z|A}(−3/2|A); (b) f_{z|A}(−1/2|A); (c) F_{z|A}(0|A); (d) f_{z|A}(1/2|A).
Answers: 0, 15/16, 3/4, 3/16.

6.5 SUMMARY
This chapter presented a number of different approaches to find the probability distribution of functions of random variables. Two methods are presented to find the probability distribution of a function of one random variable, z = g(x). The first, and the most general approach, is called the CDF technique. From the definition of the CDF, we write

F_z(γ) = P(z ≤ γ) = P(g(x) ≤ γ) = P(x ∈ A(γ)),   (6.33)
where

A(γ) = {x : g(x) ≤ γ}.   (6.34)

By partitioning A(γ) into disjoint intervals, the CDF F_z may be found using the CDF F_x. The second approach, called the PDF technique, involves evaluating

f_z(γ) = Σ_{i=1}^∞ f_x(α_i(γ)) / |g^{(1)}(α_i(γ))|,   (6.35)

where

α_i = α_i(γ) = g^{−1}(γ), i = 1, 2, . . . ,   (6.36)

denote the distinct solutions to g(α_i) = γ. Typically, the PDF technique is much simpler to use than the CDF technique. However, the PDF technique is applicable only when z = g(x) is continuous and does not equal a constant in any interval in which f_x is nonzero.

Next, we evaluated the probability distribution of a random variable z = g(x, y) created from jointly distributed random variables x and y using two approaches. The first approach, a CDF technique, involves evaluating

F_z(γ) = P(z ≤ γ) = P(g(x, y) ≤ γ) = P((x, y) ∈ A(γ)),   (6.37)

where A(γ) = {(x, y) : g(x, y) ≤ γ}. The CDF for the RV z can then be found by evaluating the integral

F_z(γ) = ∫∫_{A(γ)} dF_{x,y}(α, β).   (6.38)

The ease of solution here involves transforming A(γ) into proper limits of integration. We wish to remind the reader that the special case of a convolution integral is obtained when random variables x and y are independent and z = x + y.

The second approach involves introducing an auxiliary random variable and using the PDF technique applied to two functions of two random variables. To find the joint probability distribution of random variables z = g(x, y) and w = h(x, y), the joint PDF for z and w can be found as

f_{z,w}(γ, ψ) = Σ_{i=1}^∞ f_{x,y}(α_i, β_i) / |J(α_i, β_i)|,   (6.39)

where

(α_i(γ, ψ), β_i(γ, ψ)), i = 1, 2, . . . ,   (6.40)
41) (6. Find Fz using the CDF technique. Given f x (α) = and z= Use the CDF technique to ﬁnd Fz. ⎪ ⎩ 1. ⎨ Fx (α) = α 2 . βi ) = ψ. βi ) = γ and h(αi . x − 1.42) (6.TRANSFORMATIONS OF RANDOM VARIABLES 75 are the distinct solutions to the simultaneous equations g (αi . Let random variable x be uniform between 0 and 2 with z = exp(x).y by introducing an auxiliary random variable w = h(x. β) ∂β . β) ∂α ∂g (α. β) ∂β (6. The joint PDF technique can also be used to ﬁnd the probability distribution for z = g (x. β) ∂α J (α. α<0 0≤α<1 1≤α and z = exp(−x).5(1 + α). −1 < α < 1 otherwise. 0. y) from f x.43) where the indicated partial derivatives are assumed to exist. ﬁnd Fz. 6. x<0 x > 0. 2. Using the CDF technique. Suppose random variable x has the CDF ⎧ ⎪ 0. the most difﬁcult aspect of the joint PDF technique is in solving the simultaneous equations. . and then integrating to obtain the marginal PDF f z. Typically. The Jacobian ∂g (α. x + 1.6 PROBLEMS 1. 0. β) = ∂h(α. 3. ∂h(α.w . y) to ﬁnd f z.
4. Random variable x represents the input to a full-wave rectifier and z represents the output, so that z = |x|. Given that x is uniformly distributed between −2 and 2, find: (a) E(z) using f_x; (b) F_z using the CDF method; (c) f_z; (d) E(z) using f_z (compare with the answer to part a).
5. Random variable x represents the input to a half-wave rectifier and z represents the output, so that z = xu(x). Given that x is uniformly distributed between −2 and 2, find: (a) E(z) using f_x; (b) F_z using the CDF method; (c) f_z; (d) E(z) using f_z (compare with the answer to part a).
6. The PDF of random variable x is f_x(α) = (1/2) e^{−(α+2)/2} u(α + 2). Find F_z for z = x² using the CDF method.
7. Random variable x has the PDF f_x(α) = e^{−α}u(α). Find F_z using the CDF technique when z = x².
8. Suppose random variable x has the PDF f_x(α) = 2α for 0 < α < 1, and 0 otherwise. Random variable z is defined by z = −1 for x < −1, z = x for −1 ≤ x ≤ 2, and z = 2 for x > 2. Using the CDF method, determine F_z.
9. Given f_x(α) = a/(1 + α²), and z = −1 for x ≤ −1, z = x for −1 < x < 1, and z = 1 for x ≥ 1. Determine: (a) a; (b) F_z using the CDF method; (c) f_z.
10. Random variable x has the following PDF: f_x(α) = 4α³ for 0 < α < 1, and 0 otherwise. Find the PDFs for the following random variables: (a) z = x³; (b) z = (x − 1/2)²; (c) z = (x − 1/4)².
11. Random variable x has the PDF f_x(α) = e^{−α}u(α). Let z = 1/x². Determine F_z.
12. Suppose f_x(α) = (1 + α²)/6 for −1 < α < 2, and 0 otherwise. Random variable z is defined by z = x for x < −0.5, z = 0 for −0.5 ≤ x ≤ 0, and z = x for 0 < x. Use the CDF technique to determine F_z.
13. Random variable x has the CDF F_x(α) = 0 for α < −1, (α² + 2α + 1)/9 for −1 ≤ α < 2, and 1 for 2 ≤ α. With z = 100 − 25x, find: (a) F_z using the CDF technique; (b) f_z using the PDF technique.
14. Random variables x and z are the input and output of a quantizer. The relationship between them is defined by: z = 0 for x < 0.5; z = 1 for 0.5 ≤ x < 1.5; z = 2 for 1.5 ≤ x < 2.5; z = 3 for 2.5 ≤ x < 3.5; z = 4 for x ≥ 3.5. Given that the input follows a Gaussian distribution with x ∼ G(2, 0.25), determine F_z.
15. Random variable x is Gaussian with mean η_x = 0 and standard deviation σ_x = 1. With RV z = g(x), and g(x) shown in Figure 6.12, find the CDF F_z. Assume that g(x) is a second degree polynomial for x ≥ 0.

FIGURE 6.12: Plot for Problem 15.

16. The voltage x in Figure 6.13 is a random variable which is uniformly distributed from −1 to 2. Find the PDF f_z. Assume the diode is ideal.

FIGURE 6.13: Circuit for Problem 16.

17. The voltage x in Figure 6.14 is a Gaussian random variable with mean η_x = 0 and standard deviation σ_x = 3. Find the PDF f_z. Assume the diode and the operational amplifier are ideal.
18. The voltage x in Figure 6.15 is a Gaussian random variable with mean η_x = 1 and standard deviation σ_x = 1. Find the PDF f_z. Assume the diodes are ideal.
19. Random variable x has the CDF F_x(α) = 0 for α < −1, 0.25(α + 1) for −1 ≤ α < 3, and 1 for 3 ≤ α. With z = g(x) shown in Figure 6.16, find the PDF f_z.
FIGURE 6.14: Circuit for Problem 17.

FIGURE 6.15: Circuit for Problem 18.

20. Find f_z if z = 1/x² and x is uniform on −1 to 2.
21. Random variable x has the following PDF: f_x(α) = α + 0.5δ(α − 0.5) for 0 < α < 1, and 0 otherwise. Find F_z if z = x².
22. Random variable x has the PDF f_x(α) = 2(α + 1)/9 for −1 < α < 2, and 0 otherwise. Find the PDF of z = 2x² using the PDF technique.
23. Random variable x is uniform in the interval 0 to 12. Random variable z = 4x + 2. Find f_z using the PDF technique.
24. Let z = cos(x). Find f_z if: (a) f_x(α) = 1/π for |α| < π/2, and 0 otherwise;
FIGURE 6.16: Transformation for Problem 19.

(b) f_x(α) = 8α/π² for 0 < α < π/2, and 0 otherwise.

25. Random variable x is uniform between −1 and 1. Random variable z is defined by z = x for x < 0 and z = x² for x ≥ 0. Using the PDF technique, find f_z.
26. A voltage v is a Gaussian random variable with η_v = 0 and σ_v = 2. Random variable w = v²/R represents the power dissipated in a resistor of R ohms with v volts across the resistor. Find: (a) f_w; (b) f_{w|A} if A = {v ≥ 0}.
27. Random variable z = (x − 1)² and event A = {x : x ≥ 1/2}. Find the PDF of random variable z, given event A.
28. Given that x has the CDF F_x(α) = 0 for α < 0, α for 0 ≤ α < 1, and 1 for 1 ≤ α, find the PDF of z = −2 ln(x) using the PDF technique.
29. Random variable x has an exponential distribution with mean one. If z = e^{−x}, use the PDF technique to determine: (a) f_{z|A}(γ|A) if A = {x : x > 2}; (b) f_{z|B}(γ|B) if B = {z : z < 1/2}; (c) E(z); (d) E(z²).
30. Find f_z if z = 1/x and f_x(α) = 1/(π(α² + 1)).
31. Let f_x(α) = 9α² for 0 < α < 0.5, 3(1 − α²) for 0.5 < α < 1, and 0 otherwise, and z = 8x³. Determine f_z using the PDF technique.
32. Suppose f_x(α) = 0.75(1 − α²) for −1 < α < 1, and 0 otherwise. Using the PDF technique, find the PDF of z = x².
33. Given that random variable x has the PDF f_x(α) = α for 0 < α < 1, 2 − α for 1 < α < 2, and 0 otherwise, and z = x², determine f_z using the PDF technique.
34. Suppose f_x(α) = 2α for 0 < α < 1, and 0 otherwise, and z = −2 ln(x). Determine f_z using the PDF technique.
35. Using the PDF technique, find f_z if z = |x| and x is a standard Gaussian random variable.
36. Suppose RV x has PDF f_x(α) = 0.5α(u(α) − u(α − 2)). Find a transformation g such that z = g(x) has PDF f_z(γ) = cγ²(u(γ) − u(γ − 1)).
37. Random variable x has PDF f_x(α) = (α²/3)(u(α + 1) − u(α − 2)). If z = 1/x, find f_z.
38. Find f_z if z = sin(x) and f_x(α) = 1/(2π) for 0 ≤ α < 2π, and 0 otherwise.
39. Let random variable x have the PDF f_x(α) = 1/12 for −2 < α < 1, 1/4 for 1 < α < 2, and 0 otherwise; in addition, P(x = −2) = P(x = 2) = 0.25. (a) Find the transformation z = g(x) so that f_z(γ) = 1 − 0.25γ for 0 < γ < 1, 0.5 − 0.25γ for 1 < γ < 2, and 0 otherwise. (b) Determine f_z if z = (2x + 2)u(x).
40. Random variable x has PDF f_x(α) = (1/4)δ(α + 1) + (1/4)δ(α) + (1/4)(u(α) − u(α − 2)). Random variable z = g(x), with g shown in Figure 6.17. Find the PDF f_z.

FIGURE 6.17: Transformation for Problem 40.

41. Random variable x has the PDF f_x(α) = 0.5 for −1 < α < 0, 0.25 for 0 < α < 2, and 0 otherwise.
FIGURE 6.18: Transformation for Problem 41.

Random variable z = g(x), with g shown in Figure 6.18. Use the PDF technique to find: (a) f_z; (b) f_{z|A} with A = {x > 0}.

42. Resistors R1 and R2 have values r1 and r2 which are independent RVs uniformly distributed between 1 and 2 Ω. With r denoting the equivalent resistance of a series connection of R1 and R2, find the PDF f_r using convolution.
43. Resistors R1 and R2 have values r1 and r2 which are independent RVs uniformly distributed between 1 and 2 Ω. With g denoting the equivalent conductance of a parallel connection of R1 and R2, find the PDF f_g using convolution.
44. The PDF for random variable x is f_x(α) = e^{−α}u(α). With z = e^{−x}, determine: (a) f_{z|A}(γ|A) where A = {x : x > 2}; (b) f_{z|B}(γ|B) where B = {z : z < 1/2}.
45. Suppose the voltage v across a resistor is a random variable with PDF f_v(α) = 6α(1 − α) for 0 < α < 1, and 0 otherwise, and that resistance r is a random variable with PDF f_r(β) = 1/12 for 94 < β < 106, and 0 otherwise. Moreover, suppose that v and r are independent random variables. Find the PDF for power, p = v²/r.
46. The joint PDF of x and y is f_{x,y}(α, β) = e^{−α} for 0 < β < α, and 0 otherwise. With z = x + y, write an expression(s) for F_z(γ) (do not solve; just write the integral(s) necessary to find F_z).
47. If z = x/y, find F_z when x and y have the joint PDF f_{x,y}(α, β) = 8αβ for 0 < β < α < 1, and 0 otherwise.
y) : x > y}. The joint PDF for random variables x and y is f x. β) = 3(α 2 + β 2 ).y (α. 0 < α < 1. 48. 0. β) = 8αβ. 0.y (α. (b) f zA . 0. 0 < α. 0 < β < 1 otherwise. The joint PDF for random variables x and y is f x.y (α.y (α. 52. 0 < β otherwise. 0.y (α. The joint PDF for random variables x and y is f x.5βe −α . 0. The joint PDF for random variables x and y is f x. β) = Let z = x/y and w = y. Let z = 2x + y and w = x + 2y. Find: (a) f z. 0. Let z = x 2 and w = xy. 0 < α < 1. . 50. Find f z. β) = With z = y/x. 0<β<α<1 otherwise. The joint PDF for random variables x and y is f x.84 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS and that resistance r is a random variable with PDF fr (β) = 1/12. z = x and w = x 2 + y 2 . Find f z. 51.wA . β) = α + β.y (α. 1/α. p = v2 /r .w .w . 94 < β < 106 otherwise. 0 < α 2 + β 2 < 1. 0 < β < 2 otherwise. Find: (a) f z. Find the joint PDF for random variables z = x + y and w = x − y. 0 < α. Moreover. 0<β<α<1 otherwise. β) = 4αβ. 49. ﬁnd f z. 0. The joint PDF for random variables x and y is f x. 0 < β < 1 otherwise. suppose that v and r are independent random variables. 0. Find the PDF for power. 53. Let A = {(x. (b) f z.w .
(b) f z. 12αβ(1 − α). The joint PDF for random variables x and y is f x. 0. 0. The joint PDF for random variables x and y is f x. 57. 0 < β < π otherwise. 0 < β < 1 otherwise. β) = α sin(β).y (α.w . 0 < β otherwise. β) = 0.y (α. Find: (a) f z.y (α. Let z = x 2 and w = cos(y). β < 1 otherwise.w using the joint CDF technique. Find f z.w . β) = Let z = x 2 y. Find f z. 0 < α. . Let z = x + y and w = x/(x + y). Let z = xy and w = y/x. Find f z. 55.TRANSFORMATIONS OF RANDOM VARIABLES 85 54.25(α + β). 56. β) = e −α−β . The joint PDF for random variables x and y is f x. 0 < α < 1. The joint PDF for random variables x and y is f x. 0 < α < 1. 0.y (α. α < 1. 0.
428415 0.981547 0.737280 0.05 0.836920 0.997773 0.991440 0.1 0.000000 0.25 0.998617 +1 0.764831 0.544300 0.3 0.999990 0.000000 0.256218 0.35 0.773781 0.590490 0.977408 0.999680 0.999970 0.918540 0.500000 +3 0.868780 0.5 p 0.187500 n = 10 +1 0.835210 0.968750 .528220 0.950030 1.168070 0.077760 0.999865 +2 0.15 0.116029 0.000000 0.997570 0.973388 0.000000 0.987205 1.632813 0.000000 (Continued ) +2 0.999853 0.4 0.998365 1.990126 1.000000 +4 0.896484 0.593127 0.2 0.994748 0.998971 1.443705 0.736099 0.945978 0.45 0.998842 0.000000 0.682560 0.999991 +3 0.000000 0.05 0.999924 0.031250 +0 0.87 APPENDIX A Distribution Tables TABLE A.812500 +4 1.999997 0.999540 0.050328 0.999936 1.15 k 0 0 0 0 0 0 0 0 0 0 k 0 5 0 5 0 5 +0 0.598737 0.989760 0.999023 0.1: Bernoulli CDF for n = 5 and n = 10 n=5 p 0.336960 0.000000 0.196874 0.942080 0.929809 1.1 0.327680 0.913862 1.988496 1.993280 0.984375 0.348678 0.000000 0.969220 0.912960 0.999991 0.820197 0.237305 0.
921873 0.1: Bernoulli CDF for n = 5 and n = 10 (Continued) n = 10 p 0.993631 0.987705 0.999460 0.023257 0.525593 0.999972 0.002533 0.999999 0.987280 1.149308 0.999385 1.244025 0.999895 0.944444 0.000000 0.013463 0.513827 0.2: Bernoulli CDF for n = 15 n = 15 p 0.998410 0.4 0.999856 0.999584 0.999689 1.171875 0.000977 0.989408 0.633103 0.88 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS TABLE A.999659 0.463291 0.996494 0.999023 TABLE A.085954 0.649611 0.980272 0.046357 0.000000 0.000000 0.382783 0.006047 0.998322 0.000000 1.999996 0.999996 1.261607 0.999136 0.995498 0.905066 0.010742 0.000000 +1 0.952651 0.963800 1.972608 0.000000 0.054688 0.995179 0.999997 1.829047 0.382281 0.000000 +2 0.028248 0.751495 0.000000 +3 0.623047 +1 0.967206 1.056314 0.775875 0.989258 +4 0.2 0.205891 0.000000 1.828125 +2 0.45 0.000000 .376953 0.549043 0.000000 +4 0.000000 1.3 0.35 0.999970 0.973976 0.677800 0.997750 1.999922 0.738437 0.000000 0.05 k 0 5 10 0.25 0.815939 0.000000 0.833761 0.999947 1.107374 0.375810 0.504405 0.945313 +3 0.266038 0.945238 0.000000 1.898005 0.1 0 5 10 +0 0.849732 0.5 k 0 5 0 5 0 5 0 5 0 5 0 5 0 5 +0 0.994533 1.999966 1.999994 0.167290 0.879126 0.099560 0.
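The Bernoulli (binomial) CDF entries in these tables can be regenerated directly from the defining sum; each column "+m" holds F(k + m) for the row's base value of k. A short sketch (the function name is illustrative):

```python
from math import comb

# Binomial CDF: F(k) = sum_{j<=k} C(n, j) p^j (1 - p)^(n - j).
def binom_cdf(k, n, p):
    return sum(comb(n, j) * p ** j * (1.0 - p) ** (n - j) for j in range(k + 1))

# Spot checks against Table A.1 (n = 5):
print(round(binom_cdf(0, 5, 0.1), 6))   # -> 0.59049   (table: 0.590490)
print(round(binom_cdf(2, 5, 0.5), 6))   # -> 0.5       (table: 0.500000)
```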
236088 0.949987 0.15 k 0 5 10 0.2 0 5 10 0.999988 0.818240 0.017578 0.999999 0.999975 0.835766 0.999969 .999521 0.982422 +2 0.999328 0.648162 0.4 0 5 10 0.126828 0.172696 0.5 0 5 10 +0 0.999919 1.996347 1.080181 0.999390 1.943380 0.923071 0.999205 1.984757 1.013363 0.998072 0.000127 0.849121 0.061734 0.938949 0.3 0 5 10 0.001692 0.609813 0.DISTRIBUTION TABLES 89 TABLE A.999988 0.2: Bernoulli CDF for n = 15 (Continued) n = 15 p 0.990652 0.999908 0.982700 0.035268 0.966167 0.000031 0.983190 0.999879 0.000000 0.059235 0.005172 0.452160 0.721621 0.987557 1.999721 0.217278 0.398023 0.997169 0.938295 0.786897 0.940765 +1 0.150879 0.974534 0.45 0 5 10 0.403216 0.868857 0.993673 0.604225 0.996394 1.000000 0.999943 0.995760 1.957806 0.886769 0.000000 0.25 0 5 10 0.010652 0.999991 0.999512 +4 0.999994 0.004748 0.000000 0.014179 0.167126 0.999885 0.000000 0.000000 0.999996 0.000000 0.000000 0.999992 1.998893 0.754842 0.653504 0.087354 0.035184 0.686486 0.851632 0.696381 0.981941 0.999887 1.003693 0.904953 0.000000 0.999999 0.027114 0.318586 0.999999 0.995807 1.000470 0.351943 0.042421 0.120399 0.296868 0.000488 0.515491 0.822655 0.999215 1.000000 0.000000 0.35 0 5 10 0.001562 0.564282 0.999999 0.090502 0.000000 0.303619 0.260760 0.461287 0.500000 0.996307 +3 0.
785782 0.952038 0.629648 0.007637 0.000000 0.000000 1.002133 0.772272 0.999898 1.000000 1.867047 0.000000 0.000000 1.000000 0.000000 1.999957 1.024313 0.999995 1.647725 0.1 0 5 10 15 0.999961 1.000000 0.175558 0.980421 0.898188 0.011529 0.999739 1.411449 0.999966 1.913307 0.15 0 5 10 15 0.924516 0.000000 0.000000 0.003171 0.984098 1.996058 1.000000 1.000000 0.000000 0.932692 0.416625 0.391747 0.959075 0.994079 0.999816 1.999999 1.676927 0.000000 1.237508 0.121577 0.998721 1.000000 0.000000 +4 0.000000 1.358486 0.997405 1.000000 0.999689 1.000000 1.000000 0.012118 0.988747 0.206085 0.091260 0.735840 0.3: Bernoulli CDF for n = 20 n = 20 p 0.999999 1.999997 1.956825 0.038760 0.601027 0.993985 0.000000 0.999994 0.999985 1.946833 0.05 k 0 5 10 15 0.000798 0.878219 0.999998 1.000000 0.414842 0.000000 0.000000 0.000000 0.3 0 5 10 15 0.829847 0.999993 1.967857 0.997614 1.990018 0.035483 0.999065 1.982855 0.978065 0.998479 1.998671 1.90 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS TABLE A.404896 0.997426 1.000181 0.000000 .044376 0.416371 0.999999 +3 0.25 0 5 10 15 0.000000 1.000000 0.000000 1.000000 0.999950 +1 0.000000 0.000000 0.225156 0.000000 1.000000 0.000000 0.999999 0.886669 0.118197 0.617173 0.069175 0.804208 0.000000 1.000000 0.999584 1.35 0 5 10 15 +0 0.986136 0.999752 1.000000 0.999671 1.999940 1.999970 1.2 0 5 10 15 0.999994 +2 0.762378 0.999437 1.994862 0.000000 1.000000 1.000000 0.000000 0.107087 0.000000 0.608010 0.999996 1.000000 0.245396 0.
020695 0.45 0 5 10 15 0.000000 0.844195 0.000111 0.414306 0. .5.868412 0.965858 0.999998 0.252006 0.000000 0.904837 0.979305 0.999828 1.992074 0.978971 0.996400 0.999997 0.000524 0.998712 +2 0. .5 0. 1.942341 0.999991 +2 0.748278 0. .548812 0.999939 0.755337 0.251722 0.000001 0. 0.496585 0.3 0. . 4.1 0.963064 0.999943 0.985612 1.999605 1.000000 0.055334 0.998852 0.978586 0.5 0 5 10 15 +0 0.2 0.000020 0. .000000 0.DISTRIBUTION TABLES 91 TABLE A.999999 TABLE A.2.994247 1.998248 1.000000 0.591361 0. 2.999953 0.131588 0.909796 0.057659 0.015961 0.999986 0.004933 0.588099 0.000201 0.4 k 0 5 10 15 0.999723 0. 1.999910 +1 0.000006 0.999964 0.750711 0. . .999995 0.996642 1.999996 0.595599 0.4 0.125599 0.6 0.740818 0.411901 0.999999 0.000927 0.018863 0.818731 0.003611 0.129934 0.999214 1.982477 0.000000 0.999999 +3 0.999845 0.250011 0. .1.999980 +4 0.943474 0.000037 0.7 k 0 0 0 0 0 5 0 5 0 5 +0 0.050952 0.999984 0.000000 0.999997 0.606531 0.938448 0.878099 0.872479 0.976885 1.000000 +4 1.000000 0.999799 +3 0.995321 0.999224 0.415893 0.005909 0.5 λ 0.998388 1.999961 0.998469 0.3: Bernoulli CDF for n = 20 (Continued) n = 20 p 0.993566 1.000000 (Continued ) .670320 0.994091 +1 0.993534 1.4: Poisson CDF for λ = 0.999683 0.000000 0.941966 0.001288 0.999734 0.000000 0.869235 0.
757576 0.998589 1. 2.995753 0.287297 0.999406 0.999924 0.999997 0.999996 0. .999984 0.990920 1.997656 1.628837 0.433470 0.92 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS TABLE A.199148 0.082085 0.808792 0.999816 0.952577 0.973261 0.702930 0.000000 3 0 5 10 3.999995 +2 0.536633 0.991868 0.937143 0.406006 0.173578 0.999990 0.981012 0.808847 0.8 0.986541 1.000000 +4 0.342296 0.947347 0.018316 0.999926 1.831051 0.4: Poisson CDF for λ = 0.999726 0.889326 0.647232 0.000000 0.011109 0.978637 0.998981 0.999987 0. .999980 +1 0.998903 0.959743 0.934358 0.857123 0. .999708 0.423190 0.000000 0.999972 0.735759 0.815263 0.676676 0.957979 0.999996 0. .999763 0.557825 0.996340 1.000000 0.999195 0.985813 0.999954 0.367879 0.966491 0. .999711 0.981424 0.5 0 5 10 15 .996197 0.999998 0.999999 0.934712 0.532104 0.000000 0.1.785130 0.998860 1.999924 0.135335 0.238103 0.406570 0.061099 0.223130 0.997160 0.000000 0.999830 0. .5 k 0 5 0 5 0 5 0 5 0 5 0 5 10 +0 0.995466 0.999981 0.999938 0.9 1 1.916082 0.982907 0.948866 0.2.5 2 2. 4.049787 0. .543813 0.913414 0.999085 0. 1.999748 1.772482 0.449329 0.999957 0.091578 0.999999 0.996685 0.000000 0.999999 +3 0.999998 0. 1. 0.999917 0.990126 0.857614 0.998897 0.999995 0.999656 0.320847 0.030197 0.983436 0.993331 0.999929 0.999074 0.999979 0.725445 0.5.988095 0.999723 1.5 0 5 10 4 0 5 10 4.997596 0.919699 0.135888 0.995544 0.5 (Continued) λ 0.891178 0.000000 0. .999980 0.
974749 0. 5.999825 0.5: Poisson CDF for λ = 5. .998600 0.043036 0.992900 0.847238 0.999889 1.920759 0.378155 0.999999 +4 0.946650 0.999949 0.995392 0.999042 0.999996 +3 0.999999 0.999984 0.000912 0.598714 0.997593 0. .999981 0.111850 0.017351 0.997044 0.151204 0.445680 0.999999 0.997981 0.894357 0. 8.020257 0.DISTRIBUTION TABLES 93 TABLE A.440493 0.916076 0.369041 0.999982 0.5 0 5 10 15 20 .968172 0.983973 0.528919 0.931906 0.999995 0.998041 0.999491 0.172992 0.124652 0. .132062 0.5 0 5 10 15 6 0 5 10 15 6.686036 0.729091 0.004701 0.300708 0.357518 0.524639 0.994283 0.029636 0.978435 0.999956 0.999961 +1 0.059145 0.241436 0.002479 0.973000 0.999774 1.999849 0.081765 0.946223 0.791573 0.5.809485 0. .999995 0.991173 0.989012 0.004087 0.000000 0.661967 0.996372 0.449711 0.999302 0.001503 0.285057 0.999995 0.606303 0.998840 0.743980 0.999870 0.615961 0.026564 0.999210 0.088376 0.999570 0.5 0 5 10 15 7 0 5 10 15 7.5 λ 5 k 0 5 10 15 +0 0.201699 0.223672 0.999937 0.998315 0.999638 0.995549 0.989740 0.999401 0.830496 0.999987 +2 0.007295 0.999800 0.999943 0.000000 (Continued ) 5.877384 0.999980 0.986305 0.979908 0.999931 0.762183 0.265026 0.933161 0.672758 0.526524 0.776408 0.011276 0.966120 0.994547 0.987189 0.866628 0.901479 0.999697 0.040428 0.957334 0.061969 0.957379 0.862238 0.000553 0.006738 0.
5 0 5 10 15 20 TABLE A.998038 0. 9.999997 8.94 ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS TABLE A.014860 0.996282 0.999975 0.042380 0.323897 0.000786 0.592547 0.999350 0.999999 0.010336 0.995716 0. 12.523105 0.6: Poisson CDF for λ = 9.926149 0.999906 0.206781 0.220221 0. 13 λ 9 k 0 5 10 15 20 +0 0.999639 0.999990 +4 0. .385597 0.021226 0.998944 0.099632 0.009283 0.001933 0.836430 0.999989 0. 5.999933 0.999944 0.004164 0.521826 0.5.256178 0.391824 0.994680 0.999465 0.452961 0.999979 0. .587408 0.115691 0.191236 0.991072 0.000203 0.074364 0.696776 +2 0.716624 0.130141 0.986167 0.000335 0.965819 0.002769 0.982743 0.006232 0.998703 0.5 (Continued) λ 8 k 0 5 10 15 20 +0 0.001234 0.645328 0.5.652974 0. 11.982273 0.999789 +1 0.966527 0.999747 0.803008 0.149597 0.999921 +2 0.999855 0.972575 0.993387 0.000499 0.977964 0. 10.000123 0.583040 +1 0.5: Poisson CDF for λ = 5.999996 0.763362 0.999991 0.029253 0.958534 0.000075 0.067086 0.751990 0.864464 +4 0.999971 +3 0.909083 0.991769 0.003019 0.998406 0.313374 0.705988 0. .875773 0.888076 0.940008 0.791556 +3 0.898136 0.164949 0.997574 0.030109 0.988894 0.815886 0.999825 0.996998 0.999141 0.457930 0.948589 0.268663 0. 8.999967 0.455653 0.088528 0.916542 9.5 0 5 10 15 20 10 0 5 10 . .999561 0.000045 0.013754 0.936203 0.054964 0.040263 0.848662 0.332820 0.
TABLE A.6: Poisson CDF for λ = 9, 9.5, 10, 11, 12, 13 (Continued)

λ    k     +0         +1         +2         +3         +4
10   15    0.951260   0.972958   0.985722   0.992813   0.996546
10   20    0.998412   0.999300   0.999704   0.999880   0.999953
11    0    0.000017   0.000200   0.001211   0.004916   0.015105
11    5    0.037520   0.078614   0.143192   0.231985   0.340511
11   10    0.459889   0.579267   0.688697   0.781291   0.854044
11   15    0.907396   0.944076   0.967809   0.982313   0.990711
11   20    0.995329   0.997748   0.998958   0.999536   0.999801
11   25    0.999918   0.999967   0.999987   0.999995   0.999998
12    0    0.000006   0.000080   0.000522   0.002292   0.007600
12    5    0.020341   0.045822   0.089505   0.155028   0.242392
12   10    0.347229   0.461597   0.575965   0.681536   0.772025
12   15    0.844416   0.898709   0.937034   0.962584   0.978720
12   20    0.988402   0.993935   0.996953   0.998527   0.999314
12   25    0.999692   0.999867   0.999944   0.999977   0.999991
13    0    0.000002   0.000032   0.000223   0.001050   0.003740
13    5    0.010734   0.025887   0.054028   0.099758   0.165812
13   10    0.251682   0.353165   0.463105   0.573045   0.675132
13   15    0.763607   0.835493   0.890465   0.930167   0.957331
13   20    0.974988   0.985919   0.992378   0.996028   0.998006
13   25    0.999034   0.999548   0.999796   0.999911   0.999962
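The Poisson CDF tables can likewise be regenerated from the defining sum, which is how individual entries are best checked. A short sketch (the function name is illustrative; terms are accumulated recursively to avoid large factorials):

```python
import math

# Poisson CDF: F(k) = exp(-lam) * sum_{j<=k} lam^j / j!.
def poisson_cdf(k, lam):
    term, total = math.exp(-lam), 0.0
    for j in range(k + 1):
        total += term
        term *= lam / (j + 1)
    return total

print(round(poisson_cdf(0, 0.5), 6))     # -> 0.606531  (= exp(-0.5))
print(round(poisson_cdf(10, 11.0), 6))   # -> 0.459889  (matches Table A.6)
```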
TABLE A.7: Poisson CDF for λ = 14, 15, 16, 17, 18
λ 14
k 0 5 10
+0 0.000001 0.005532 0.175681
+1 0.000012 0.014228 0.260040
+2 0.000094 0.031620 0.358458
+3 0.000474 0.062055 0.464448
+4 0.001805 0.109399 0.570437 (Continued )
96
ADVANCED PROBABILITY THEORY FOR BIOMEDICAL ENGINEERS
TABLE A.7: Poisson CDF for λ = 14, 15, 16, 17, 18 (Continued)
λ 14
k 15 20 25 30
+0 0.669360 0.952092 0.997392 0.999940
+1 0.755918 0.971156 0.998691 0.999974
+2 0.827201 0.983288 0.999365 0.999989
+3 0.882643 0.990672 0.999702 0.999996
+4 0.923495 0.994980 0.999864 0.999998
λ 15
k 0 5 10 15 20 25 30
+0 0.000000 0.002792 0.118464 0.568090 0.917029 0.993815 0.999803
+1 0.000005 0.007632 0.184752 0.664123 0.946894 0.996688 0.999910
+2 0.000039 0.018002 0.267611 0.748859 0.967256 0.998284 0.999960
+3 0.000211 0.037446 0.363218 0.819472 0.980535 0.999139 0.999983
+4 0.000857 0.069854 0.465654 0.875219 0.988835 0.999582 0.999993
λ 16
k 0 5 10 15 20 25 30
+0 0.000000 0.001384 0.077396 0.466745 0.868168 0.986881 0.999433
+1 0.000002 0.004006 0.126993 0.565962 0.910773 0.992541 0.999724
+2 0.000016 0.010000 0.193122 0.659344 0.941759 0.995895 0.999869
+3 0.000093 0.021987 0.274511 0.742349 0.963314 0.997811 0.999940
+4 0.000400 0.043298 0.367527 0.812249 0.977685 0.998869 0.999973
λ 17
k 0 5 10 15 20 25 30
+0 0.000000 0.000675 0.049124 0.371454 0.805481 0.974755 0.998552
+1 0.000001 0.002062 0.084669 0.467738 0.861466 0.984826 0.999253
+2 0.000007 0.005433 0.135024 0.564023 0.904728 0.991166 0.999626
+3 0.000041 0.012596 0.200873 0.654958 0.936704 0.995016 0.999817
+4 0.000185 0.026125 0.280833 0.736322 0.959354 0.997273 0.999913
DISTRIBUTION TABLES
97
TABLE A.7: Poisson CDF for λ = 14, 15, 16, 17, 18 (Continued)
λ 18
k 0 5 10 15 20 25 30
+0 0.000000 0.000324 0.030366 0.286653 0.730720 0.955392 0.996669
+1 0.000000 0.001043 0.054887 0.375050 0.799124 0.971766 0.998187
+2 0.000003 0.002893 0.091669 0.468648 0.855090 0.982682 0.999040
+3 0.000018 0.007056 0.142598 0.562245 0.898890 0.989700 0.999506
+4 0.000084 0.015381 0.208077 0.650916 0.931740 0.994056 0.999752
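Every entry in Tables A.6 and A.7 is the Poisson cumulative distribution function F(k; λ) = Σᵢ₌₀ᵏ e^{−λ} λⁱ/i!. The following short Python sketch (not part of the original text) regenerates any table entry from that definition, accumulating the terms with the stable recurrence p(i) = p(i−1)·λ/i:

```python
from math import exp

def poisson_cdf(k, lam):
    """Cumulative Poisson probability P(X <= k) for rate lam,
    summed via the term recurrence p(i) = p(i-1) * lam / i."""
    p = exp(-lam)        # P(X = 0)
    total = p
    for i in range(1, k + 1):
        p *= lam / i     # next term e^-lam * lam^i / i!
        total += p
    return total

# Spot-check against Table A.7, lambda = 18:
print(round(poisson_cdf(10, 18), 6))  # 0.030366
print(round(poisson_cdf(20, 18), 6))  # 0.730720
```

The recurrence avoids computing λⁱ and i! separately, which would overflow for the larger λ values tabulated here.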
TABLE A.8: Marcum’s Q function for α = 0, 0.01, . . . , 1.99
α 0.00 0.05 0.10 0.15 0.20 0.25 0.30 0.35 0.40 0.45 0.50 0.55 0.60 0.65
+0.00 0.500000 0.480061 0.460172 0.440382 0.420740 0.401294 0.382089 0.363169 0.344578 0.326355 0.308538 0.291160 0.274253 0.257846
+0.01 0.496011 0.476078 0.456205 0.436441 0.416834 0.397432 0.378281 0.359424 0.340903 0.322758 0.305026 0.287740 0.270931 0.254627
+0.02 0.492022 0.472097 0.452242 0.432505 0.412936 0.393580 0.374484 0.355691 0.337243 0.319178 0.301532 0.284339 0.267629 0.251429
+0.03 0.488033 0.468119 0.448283 0.428576 0.409046 0.389739 0.370700 0.351973 0.333598 0.315614 0.298056 0.280957 0.264347 0.248252
+0.04 0.484046 0.464144 0.444330 0.424655 0.405165 0.385908 0.366928 0.348268 0.329969 0.312067 0.294598 0.277595 0.261086 0.245097 (Continued )
TABLE A.8: Marcum's Q function for α = 0, 0.01, . . . , 1.99 (Continued)
α 0.70 0.75 0.80 0.85 0.90 0.95 1.00 1.05 1.10 1.15 1.20 1.25 1.30 1.35
+0.00 0.241964 0.226627 0.211855 0.197662 0.184060 0.171056 0.158655 0.146859 0.135666 0.125072 0.115070 0.105650 0.096801 0.088508
+0.01 0.238852 0.223627 0.208970 0.194894 0.181411 0.168528 0.156248 0.144572 0.133500 0.123024 0.113140 0.103835 0.095098 0.086915
+0.02 0.235762 0.220650 0.206108 0.192150 0.178786 0.166023 0.153864 0.142310 0.131357 0.121001 0.111233 0.102042 0.093418 0.085344
+0.03 0.232695 0.217695 0.203269 0.189430 0.176186 0.163543 0.151505 0.140071 0.129238 0.119000 0.109349 0.100273 0.091759 0.083793
+0.04 0.229650 0.214764 0.200454 0.186733 0.173609 0.161087 0.149170 0.137857 0.127143 0.117023 0.107488 0.098525 0.090123 0.082264
α 1.40 1.45 1.50 1.55 1.60 1.65 1.70 1.75 1.80 1.85 1.90 1.95
+0.00 0.080757 0.073529 0.066807 0.060571 0.054799 0.049471 0.044565 0.040059 0.035930 0.032157 0.028716 0.025588
+0.01 0.079270 0.072145 0.065522 0.059380 0.053699 0.048457 0.043633 0.039204 0.035148 0.031443 0.028067 0.024998
+0.02 0.077804 0.070781 0.064256 0.058208 0.052616 0.047460 0.042716 0.038364 0.034379 0.030742 0.027429 0.024419
+0.03 0.076359 0.069437 0.063008 0.057053 0.051551 0.046479 0.041815 0.037538 0.033625 0.030054 0.026803 0.023852
+0.04 0.074934 0.068112 0.061780 0.055917 0.050503 0.045514 0.040929 0.036727 0.032884 0.029379 0.026190 0.023295
TABLE A.9: Marcum's Q function for α = 2.00, 2.01, . . . , 3.99
α 2.00 2.05 2.10 2.15 2.20 2.25 2.30 2.35 2.40 2.45 2.50 2.55 2.60 2.65
+0.00 0.022750 0.020182 0.017864 0.015778 0.013903 0.012224 0.010724 0.009387 0.008198 0.007143 0.006210 0.005386 0.004661 0.004025
+0.01 0.022216 0.019699 0.017429 0.015386 0.013553 0.011911 0.010444 0.009137 0.007976 0.006947 0.006037 0.005234 0.004527 0.003907
+0.02 0.021692 0.019226 0.017003 0.015003 0.013209 0.011604 0.010170 0.008894 0.007760 0.006756 0.005868 0.005085 0.004397 0.003793
+0.03 0.021178 0.018763 0.016586 0.014629 0.012874 0.011304 0.009903 0.008656 0.007549 0.006569 0.005703 0.004940 0.004269 0.003681
+0.04 0.020675 0.018309 0.016177 0.014262 0.012545 0.011011 0.009642 0.008424 0.007344 0.006387 0.005543 0.004799 0.004145 0.003573
α 2.70 2.75 2.80 2.85 2.90 2.95 3.00 3.05 3.10 3.15 3.20 3.25 3.30
+0.00 0.003467 0.002980 0.002555 0.002186 0.001866 0.001589 0.001350 0.001144 0.000968 0.000816 0.000687 0.000577 0.000483
+0.01 0.003364 0.002890 0.002477 0.002118 0.001807 0.001538 0.001306 0.001107 0.000936 0.000789 0.000664 0.000557 0.000467
+0.02 0.003264 0.002803 0.002401 0.002052 0.001750 0.001489 0.001264 0.001070 0.000904 0.000762 0.000641 0.000538 0.000450
+0.03 0.003167 0.002718 0.002327 0.001988 0.001695 0.001441 0.001223 0.001035 0.000874 0.000736 0.000619 0.000519 0.000434
+0.04 0.003072 0.002635 0.002256 0.001926 0.001641 0.001395 0.001183 0.001001 0.000845 0.000711 0.000598 0.000501 0.000419 (Continued )
TABLE A.9: Marcum's Q function for α = 2.00, 2.01, . . . , 3.99 (Continued)
α 3.35 3.40 3.45 3.50 3.55 3.60 3.65 3.70 3.75 3.80 3.85 3.90 3.95
+0.00 0.000404 0.000337 0.000280 0.000233 0.000193 0.000159 0.000131 0.000108 0.000088 0.000072 0.000059 0.000048 0.000039
+0.01 0.000390 0.000325 0.000270 0.000224 0.000185 0.000153 0.000126 0.000104 0.000085 0.000070 0.000057 0.000046 0.000037
+0.02 0.000376 0.000313 0.000260 0.000216 0.000179 0.000147 0.000121 0.000100 0.000082 0.000067 0.000054 0.000044 0.000036
+0.03 0.000362 0.000302 0.000251 0.000208 0.000172 0.000142 0.000117 0.000096 0.000078 0.000064 0.000052 0.000042 0.000034
+0.04 0.000350 0.000291 0.000242 0.000200 0.000165 0.000136 0.000112 0.000092 0.000075 0.000062 0.000050 0.000041 0.000033
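As the tabulated values show (Q(0.00) = 0.500000, Q(1.00) = 0.158655), the Q function in Tables A.8 and A.9 is the standard normal tail probability Q(α) = 1 − Φ(α). A short Python sketch (not part of the original text) reproduces any entry via the complementary error function:

```python
from math import erfc, sqrt

def q_function(x):
    """Standard normal tail probability Q(x) = 1 - Phi(x),
    computed as 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / sqrt(2.0))

# Spot-check against Tables A.8 and A.9:
print(round(q_function(1.00), 6))  # 0.158655
print(round(q_function(2.04), 6))  # 0.020675
```

Using `erfc` rather than `1 - Phi(x)` avoids cancellation error in the far tail, where the α ≈ 4 entries are on the order of 10⁻⁵.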