
CHAPTER 3

Additional Topics in Probability

3.1 Introduction
3.2 Some special distribution functions
3.3 Joint distributions
3.4 Functions of random variables
3.5 Limit Theorems
3.6 Chapter summary
3.7 Computer examples
Projects for Chapter 3

Exercises 3.2

3.2.1.

3.2.2.

3.2.3.

3.2.4.
3.2.5.

3.2.6.
Let X = the number of complete passes; then n = 16 and p = 0.62.

(c) There is a 76.8% chance that he will complete more than half of his passes.

3.2.7.
Let X = the number of people satisfied with their health coverage; then n = 15 and
p = 0.7.
There is a 20.6% chance that exactly 10 people are satisfied with their health coverage.

There is a 48.5% chance that no more than 10 people are satisfied with their health
coverage.
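
As a quick numerical check (a sketch added here, not part of the original solution), these binomial probabilities can be recomputed with SciPy, using n = 15 and p = 0.7 from the setup above.

    from scipy.stats import binom

    n, p = 15, 0.7
    print(binom.pmf(10, n, p))  # P(X = 10), about 0.206
    print(binom.cdf(10, n, p))  # P(X <= 10), about 0.485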

3.2.8.
Let X = the number of times that he hits the target; then n = 6 and p = 0.4.

(b) & (c)


Find n such that 1 - (0.6)^n > 0.77, i.e., (0.6)^n < 0.23. Since (0.6)^2 = 0.36 > 0.23 and
(0.6)^3 = 0.216 < 0.23, the smallest such n is 3.

He must fire 3 times so that the probability of hitting the target at least once is greater than
0.77.
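
A short sketch of the same search, assuming the hit probability p = 0.4 from the setup; it scans n upward until the at-least-one-hit probability exceeds 0.77.

    # Find the smallest n with P(at least one hit) = 1 - (1 - p)^n > 0.77.
    p, target = 0.4, 0.77
    n = 1
    while 1 - (1 - p) ** n <= target:
        n += 1
    print(n)  # 3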

3.2.9.
Let X = the number of defective tubes in a certain box of 400; then n = 400 and
p = 3/100 = 0.03.

(d) Part (c) shows that the probability of at most one defective is 0.0000684, which is very
small.
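
For reference, a sketch comparing the exact binomial value from part (c) with a Poisson(np) approximation; the Poisson comparison is an illustrative addition, not part of the original solution.

    from scipy.stats import binom, poisson

    n, p = 400, 0.03
    print(binom.cdf(1, n, p))     # exact P(X <= 1), about 6.83e-05
    print(poisson.cdf(1, n * p))  # Poisson(12) approximation, about 7.99e-05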

3.2.10.

Assuming the number of errors on the page follows a Poisson distribution with mean λ = 0.5,
the probability of at least one error is P(X ≥ 1) = 1 - e^(-0.5) ≈ 0.393.

3.2.11.
3.2.12.
The probability density function is given by

Hence,

Hence, there is a 20% chance that the arrival time measurement will be in error by less than 0.01.

3.2.13.
The probability density function is given by

Hence,

Hence, there is a 40% chance that a piece chosen at random will be suitable for kitchen use.

3.2.14.
P(Wrong Decision) = 1 - P(Right Decision).
3.2.15.
The probability density function is given by

(a)

(b)
(c) There is a 20% chance that the efficiency is between 60 and 80 units; there is a 10%
chance that the efficiency is greater than 90 units.

3.2.16.

(b) If .

(c)

Note that the failure rate is an increasing function of t.

3.2.17.
Let X = the failure time of the component, where X follows an exponential distribution with
rate 0.05. Then the p.d.f. of X is given by f(x) = 0.05e^(-0.05x) for x > 0, and zero otherwise.

Hence,

3.2.18.

Using the normal table, we have . Then, for the given , we have the expected
life as .

3.2.19.
The uniform probability density function is given by
.
Hence,
3.2.20.

3.2.21.

The minimum score that a student needs to earn an “A” grade is 78.22.

3.2.22.

(d) There is a 6% chance that a score chosen at random will be between 80 and 85. There is a
24% chance that a score will be greater than 75. There is a 99.8% chance that a score will be
less than 90, i.e. almost all the scores are less than 90.

3.2.23.

13.4% of the balls manufactured by the company are defective.

3.2.24.

.
There is a 5.3% chance that an individual picked at random will need to consult a physician.

3.2.25.

(d) There is a 16% chance that a child chosen at random will have a systolic pressure greater
than 125 mm Hg. There is a 2.3% chance that a child will have a systolic pressure less than 95
mm Hg. 95% of this population have a systolic blood pressure below 131.45.

3.2.26.

Hence, about 48.9% of the freshmen are physically fit.

3.2.27.

Using the standard normal table, we can find that , and .

Then ; similarly, we can obtain and .

For the probability of surviving 0.2, 0.5 and 0.8 the experimenter should choose doses 0.58, 1
and 1.73, respectively.

3.2.28.

For a > 1,


3.2.29.

, and

Then .

3.2.30.

Let . Then,
3.2.31.
(a) First consider the following product

Transforming to polar coordinates with and

Hence, we have shown that

.
3.2.32.

The probability that the percentage of major accidents is less than 80% but greater than 60%
is 0.432.

3.2.33.
In this case, the number of breakdowns per month can be assumed to have a Poisson
distribution with mean 3.

P(X = 1) = e^(-3)·3^1/1! ≈ 0.1494. There is a 14.94% chance that there will be just one
network breakdown during December.

P(X ≥ 4) = 1 - P(X ≤ 3) = 1 - e^(-3)(1 + 3 + 3^2/2! + 3^3/3!) ≈ 0.3528. There is a 35.28%
chance that there will be at least 4 network breakdowns during December.

P(X ≤ 7) = Σ_{k=0}^{7} e^(-3)·3^k/k! ≈ 0.9881. There is a 98.81% chance that there will be
at most 7 network breakdowns during December.
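
These three Poisson probabilities can be checked with SciPy; a minimal sketch, assuming the Poisson(3) model stated above.

    from scipy.stats import poisson

    mu = 3  # mean number of breakdowns per month
    print(poisson.pmf(1, mu))      # P(X = 1), about 0.1494
    print(1 - poisson.cdf(3, mu))  # P(X >= 4), about 0.3528
    print(poisson.cdf(7, mu))      # P(X <= 7), about 0.9881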

3.2.34.
In the time interval , we have

Then, considering n and λ as fixed parameters, and are random in the above expression.
Then we see that

, which is the kernel in the pdf of


.

3.2.35.

The probability that an acid solution made by this procedure will satisfactorily etch a tray is
0.6442.

The probability that an acid solution made by this procedure will satisfactorily etch a tray is
0.4712.

3.2.36.

3.2.37.

3.2.38.
Exercises 3.3

3.3.1.
(a) The joint probability function is

where .

(d) The joint probabilities p(x, y) (blank cells are zero):

        y=0     y=1     y=2     y=3     y=4     Sum
x=0    0.020   0.068   0.064   0.019   0.001   0.172
x=1    0.090   0.203   0.113   0.015           0.421
x=2    0.119   0.158   0.040                   0.317
x=3    0.053   0.032                           0.085
x=4    0.007                                   0.007
Sum    0.289   0.461   0.217   0.034   0.001   1.00
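
A small sketch that reproduces the marginal pmfs from this table; the zeros stand in for the blank (impossible) cells, which is consistent with the row and column sums.

    import numpy as np

    # Joint pmf from the table above: rows are x = 0..4, columns are y = 0..4.
    joint = np.array([
        [0.020, 0.068, 0.064, 0.019, 0.001],
        [0.090, 0.203, 0.113, 0.015, 0.000],
        [0.119, 0.158, 0.040, 0.000, 0.000],
        [0.053, 0.032, 0.000, 0.000, 0.000],
        [0.007, 0.000, 0.000, 0.000, 0.000],
    ])
    print(joint.sum(axis=1))  # marginal pmf of X: 0.172, 0.421, 0.317, 0.085, 0.007
    print(joint.sum(axis=0))  # marginal pmf of Y: 0.289, 0.461, 0.217, 0.034, 0.001
    print(joint.sum())        # 1.002, i.e. 1 up to the table's rounding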

3.3.2.
By substituting the relation we can rewrite the bivariate function into a
univariate function of x as

First, we see that for all x. Next consider


.

Then by definition the function is a probability density function.

3.3.3.

Thus, if c = 1/4, then . And we also see that for all x and

y. Hence, is a joint probability density function.

3.3.4.
First we see that for all x and y, and

Thus, is a joint probability density function.

3.3.5.
By definition, the marginal pdf of X is given by the row sums, and the marginal pdf of Y is
obtained by the column sums. Hence,

x_i         -1     3     5    otherwise
p_X(x_i)    0.6   0.3   0.1   0

y_j         -2     0     1     4    otherwise
p_Y(y_j)    0.4   0.3   0.1   0.2   0

3.3.6.
The marginal pdf of X is
.

The marginal pdf of Y is

3.3.7.
From Exercise 3.3.5 we can calculate the following.

3.3.8.

Thus, let to make the integral equal to 1.

3.3.9.
(a) The marginal pdf of X is

3.3.10.
(ii) Given , we have the conditional density as

3.3.11.
Using the joint density in Exercise 3.3.9 we can obtain the joint mgf of as

where

After simplification we then have

3.3.12.
(b) Given , we have the conditional density as

3.3.13.

Given , we have

(b) Given , we have

3.3.14.
For given , we have

Thus, follows .
For given , we have

Hence, is proportional to the kernel pdf of . Thus,


follows .

3.3.15.

, and

Then, .

, and

Then, .

3.3.16.

(b) From Exercise 3.3.10 we have computed and ; then we have

, and

Then, .

, and
.

Then, .

3.3.17.
Assume that a and c are nonzero.
,

, and .

3.3.18.

3.3.19.
We first state the famous Cauchy-Schwarz inequality:

and equality holds if and only if there exist constants α and β, not both zero, such that .

Now, consider

By the Cauchy-Schwarz inequality we have

for some constants a and b.

3.3.20.
(a) .

For , we have the conditional density as

Then, .

, and

Then, .

3.3.21.
(a) First, we compute the marginal densities.

, and

For given , we have the conditional density as

Then, follows Uniform(0, y). Thus, .

, and

Then, .
(c) To check for independence of X and Y
.

Hence, X and Y are not independent.

3.3.22.

Thus, if c = , then . And we also see that for all x

and y. Hence, is the joint probability density function of .


We can find the marginal densities as

, and

We then see that for all x and y, therefore, X and Y are independent.

3.3.23.

Let . Since X and Y are independent, we have

. Then,

, and

Thus, .

Exercises 3.4

3.4.1.

The pdf of X is if and zero otherwise.


Then, if and zero otherwise.

3.4.2.
Let and .
Then and , and

Then the joint pdf of U and V is given by

Then we see that the support can be divided into two parts: one part is when ,
and the other part is when , . Then we have the marginal pdf of as

3.4.3.
Let and .
Then and , and

Then the joint pdf of U and V is given by

Then the pdf of U is given by

3.4.4.
The joint pdf of is

Let and . Applying the result in Exercise 3.4.3, we then have the pdf of
as
.

Note that the integral does not have a closed-form solution.

3.4.5.
The joint pdf of is

We can easily show that the marginal densities are

, and

This implies that and .

Also, notice that for all x and y, thus X and Y are independent.

By the definition of the chi-square distribution and the independence of X and Y, we know
that U = X^2 + Y^2 follows a chi-square distribution with 2 degrees of freedom. Therefore,
the pdf of U is f(u) = (1/2)e^(-u/2) for u > 0, and zero otherwise.
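
A simulation sketch of this fact, assuming independent standard normal X and Y as above; the sum of squares should then have the chi-square(2) moments, mean 2 and variance 4.

    import numpy as np

    # Simulate U = X^2 + Y^2 for independent standard normals X and Y.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(100_000)
    y = rng.standard_normal(100_000)
    u = x**2 + y**2
    print(u.mean(), u.var())  # both close to the chi-square(2) values 2 and 4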

3.4.6.
The moment generating function of is

Note that each Xi follows an exponential distribution with mean θ, which is equivalent to
Gamma(1, θ). Thus, the mgf of each Xi is M(t) = (1 - θt)^(-1) for t < 1/θ.

By the independence of the Xi's, we have the mgf of the sum as follows


.

Thus, by the uniqueness of the mgf we know that the sum follows Gamma(n, θ), with pdf
f(y) = y^(n-1) e^(-y/θ) / (Γ(n) θ^n) for y > 0, and zero otherwise.
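
A simulation sketch of this conclusion; n = 5 and θ = 2 are arbitrary illustrative values, since the exercise keeps them symbolic.

    import numpy as np

    # Sum n iid Exponential(mean theta) draws and compare the sample moments
    # with the Gamma(n, theta) values: mean n*theta, variance n*theta^2.
    rng = np.random.default_rng(0)
    n, theta = 5, 2.0
    sums = rng.exponential(scale=theta, size=(100_000, n)).sum(axis=1)
    print(sums.mean(), sums.var())  # close to 10 and 20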

3.4.7.
Let and .
Then and , and

Then the joint pdf of U and V is given by

Thus, the pdf of U is given by .

3.4.8.

, , and

. Thus, .

To check the independence of X and Y, we first consider the cdf of Y.

Then, the pdf of Y is and zero otherwise. On the other hand, the

conditional density of Y given is and zero

otherwise. Thus, we see that . This implies that X and Y are not

independent.

3.4.9.
(a) Here let , and hence, . Thus, .

Also, . Therefore, the pdf of Z is

, which is the pdf of

.
(b) The cdf of U is given by

Hence, the pdf of U is

and zero

otherwise, which is the pdf of .

3.4.10.

Here let , and hence, . Thus, .

Also, . Therefore, the pdf of Y is

3.4.11.

Since the support of the pdf of V is , then is a one-to-one function on

the support. Hence, . Thus, .

Therefore, the pdf of E is given by

3.4.12.
Let and .

Let and .
Then and , and

Then the joint pdf of U and V is given by

Then the pdf of U is given by

3.4.13.

Let and . Here U is considered to be the radius and V is the

angle. Hence this is a polar transformation, which is one-to-one.


Then and , and

Then the joint pdf of U and V is given by

3.4.14.
Let and .

Then and , and

Then the joint pdf of U and V is given by

3.4.15.
The joint pdf of is

Applying the result in Exercise 3.4.14 with , we have the joint pdf of and

as . Thus, the pdf of U is given by

3.4.16.
The joint pdf of is

Let and .

Then and , and

Then the joint pdf of U and V is given by


Exercises 3.5

3.5.1.
(a) Note that X follows ; then, applying the result in Exercise 3.2.31, we have
and . From Chebyshev's theorem

Equating to 0.2 and to 0.8 with and , we


obtain . Hence,

(b) .

3.5.2.
Since X follows a Poisson distribution with λ = 100, then μ = 100 and σ = √100 = 10. From
Chebyshev's theorem,

P(|X - μ| < kσ) ≥ 1 - 1/k^2.

Equating μ - kσ to 70 and μ + kσ to 130 with μ = 100 and σ = 10, we obtain k = 3. Hence,

P(70 < X < 130) ≥ 1 - 1/3^2 = 8/9 ≈ 0.889.
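
For comparison, the exact Poisson probability can be computed with SciPy (a sketch, assuming λ = 100 as above); it far exceeds the Chebyshev bound of 8/9, which shows how conservative the bound is.

    from scipy.stats import poisson

    # Exact P(70 < X < 130) = P(X <= 129) - P(X <= 70) for X ~ Poisson(100).
    mu = 100
    print(poisson.cdf(129, mu) - poisson.cdf(70, mu))  # about 0.997, vs. the 8/9 bound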

3.5.3.

Note that for or . Then the above inequality can

be written as
This implies that or .

3.5.4.
Since X follows a Poisson distribution with λ = 120, then μ = 120 and σ = √120. From
Chebyshev's theorem,

P(|X - μ| < kσ) ≥ 1 - 1/k^2.

Equating μ - kσ to 100 and μ + kσ to 140 with μ = 120 and σ = √120, we obtain k = 20/√120.
Hence,

P(100 < X < 140) ≥ 1 - 120/400 = 0.7.

3.5.5.
Applying Chebyshev's theorem, we have

This implies that we want to find n such that

3.5.6.
Applying Chebyshev's theorem, we have

Thus, equating to 0.9, we obtain .

3.5.7.

Let denote each toss of the coin, with value 1 if a head occurs and 0 otherwise. Then,

are independent variables that follow a Bernoulli distribution with .

Thus, , and . For any , from the law of large numbers we


have

, i.e. will be near to for large n.

If the coin is not fair, then the fraction of heads, , will be near the true probability of

getting a head for large n.
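
A simulation sketch of this behavior; the fair-coin probability p = 1/2 is the illustrative assumption.

    import numpy as np

    # Running fraction of heads in repeated fair-coin tosses (1 = head).
    rng = np.random.default_rng(0)
    tosses = rng.integers(0, 2, size=100_000)
    for n in (100, 1_000, 10_000, 100_000):
        print(n, tosses[:n].mean())  # drifts toward 0.5 as n grows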

3.5.8.

Note that is the pdf of a Bernoulli distribution. Then

follows . Thus, , and

. Applying Chebyshev's theorem, we have

This implies that we want to find n such that

Therefore, the smallest sample size needed is 32.

3.5.9.

Note that , and . Hence, are not identically

distributed. Thus, the conditions of the law of large numbers stated in the text are not
satisfied.

There is a weaker version of the weak law of large numbers which requires only

, and . However, in this case

Therefore, the conditions of the weaker version are not satisfied, either.

3.5.10.

Let denote the kth moment of a random variable W; then the mgf of W can be written as
.

Let . Notice that and . Thus, the first two moments of

are 0 and 1, respectively. Then the mgf of can be written as

Also, we see that .

Since the random variables are independent, it follows that the random variables are

independent, for i = 1,2,…,n. Thus, the mgf of is given by

Now we want to take the limit of the as . To do this we first consider the

natural logarithm of as

The Taylor expansion for is

Taking we have

We see that only the first term does not involve n while all the other terms have n to a positive
power in the denominator. Thus it can be shown that
, or , which is the mgf of a standard normal

random variable. Therefore, we can conclude that the limiting distribution of is the

standard normal distribution. This completes a brief proof of the central limit theorem.
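
A simulation sketch of the theorem; uniform summands (mean 1/2, variance 1/12) are an arbitrary choice, since any iid distribution with finite variance works.

    import numpy as np

    # Standardized sums of n iid Uniform(0, 1) variables should look N(0, 1).
    rng = np.random.default_rng(0)
    n, reps = 1_000, 50_000
    x = rng.random((reps, n))
    z = (x.sum(axis=1) - n * 0.5) / np.sqrt(n / 12.0)
    print(z.mean(), z.std())   # close to 0 and 1
    print(np.mean(z <= 1.96))  # close to 0.975 = Phi(1.96)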

3.5.11.

First note that . Then, by the CLT we know that

approximately follows for large n.

3.5.12.

First note that and . Then, by the CLT we know that

approximately follows for large n.

3.5.13.

Let denote the success of the ith customer. Then each follows a Bernoulli distribution

with probability 0.03, and and . Let .

From the CLT, follows approximately . Hence, we have

3.5.14.
From Chebyshev's theorem

Equating to 104 and to 140 with and , we obtain .


Hence,

.
3.5.15.
(a) From Chebyshev's theorem

Equating to 53.1 and to 108.3 with and , we obtain


. Hence,

3.5.16.

Let if the ith person in the sample is color blind and 0 otherwise. Then each

follows a Bernoulli distribution with estimated probability 0.02, and and

. Let .

We want . By the CLT, follows approximately

. Then,

Using the normal table, . Solving this equation, we have

. Thus, the sample size must be at least 360.

3.5.17.

Let if the ith shirt in the lot is defective and 0 otherwise. Then each follows

a Bernoulli distribution with estimated probability 0.02, and and

. Let .

We want . By the CLT, follows approximately

. Then,

.
Using the normal table, . Solving this equation, we have .

Thus, the greatest number of shirts constituting a lot to have less than five defectives with
probability 0.95 is 122.

3.5.18.
We know that 500 out of 10,000 eyedroppers have missing labels.

Now,

Hence, the probability that at most two defective droppers will be detected in a random
sample of 125 is 0.0477.
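
A one-line check of this value (a sketch, assuming the binomial model X ~ Bin(125, 0.05) implied by the 500/10,000 defect rate):

    from scipy.stats import binom

    print(binom.cdf(2, 125, 0.05))  # P(X <= 2), about 0.0477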
3.5.19.

for , then

for , then

Therefore, this assumes independence or that the covariance is


zero.

3.5.20.

for , then

for , then

Therefore, this assumes independence or that the covariance

is zero.
Therefore, by the central limit theorem,

3.5.21.
We have and . Then, by the CLT we know that

approximately follows . Then,

3.5.22.

3.5.23.

We have and .

Sample size n = 75 and sample mean x̄ = 2.4.

3.5.24.

(Use the z table to find the corresponding z score.)

3.5.25.

We have and .

Sample size n = 60.

3.5.26.

We have and .

Sample size n = 50.
