Ch-3 Solution
3.1 Introduction
3.2 Some special distribution functions
3.3 Joint distributions
3.4 Functions of random variables
3.5 Limit Theorems
3.6 Chapter summary
3.7 Computer examples
Projects for Chapter 3
Chapter 3
Exercises 3.2
3.2.1.
3.2.2.
3.2.3.
3.2.4.
3.2.5.
3.2.6.
Let X = the number of complete passes; then n = 16 and p = 0.62.
(c) There is a 76.8% chance that he will complete more than half of his passes.
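The stated binomial probability can be checked directly; a minimal sketch (the exact sum comes out near the stated 76.8%):

```python
from math import comb

# Exercise 3.2.6: X ~ Binomial(n=16, p=0.62) counts complete passes.
# "More than half" of 16 passes means X >= 9.
n, p = 16, 0.62
prob = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(9, n + 1))
print(f"P(X >= 9) = {prob:.3f}")  # close to the stated 76.8%
```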
3.2.7.
Let X = the number of people satisfied with their health coverage; then n = 15 and
p = 0.7.
There is a 20.6% chance that exactly 10 people are satisfied with their health coverage.
There is a 48.5% chance that no more than 10 people are satisfied with their health
coverage.
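Both binomial values can be verified with a short computation:

```python
from math import comb

# Exercise 3.2.7: X ~ Binomial(n=15, p=0.7) counts people satisfied
# with their health coverage.
n, p = 15, 0.7

def binom_pmf(k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

p_exactly_10 = binom_pmf(10)
p_at_most_10 = sum(binom_pmf(k) for k in range(11))
print(f"P(X = 10)  = {p_exactly_10:.3f}")   # 0.206
print(f"P(X <= 10) = {p_at_most_10:.3f}")   # 0.485
```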
3.2.8.
Let X = the number of times that he hits the target; then n = 6 and p = 0.4.
He must fire 3 times so that the probability of hitting the target at least once is greater than
0.77.
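The required number of shots can be found by searching for the smallest n with 1 − (0.6)^n > 0.77:

```python
# Exercise 3.2.8: each shot hits with p = 0.4, so n independent shots all
# miss with probability 0.6**n.  Find the smallest n with
# P(at least one hit) = 1 - 0.6**n > 0.77.
n = 1
while 1 - 0.6**n <= 0.77:
    n += 1
print(n, 1 - 0.6**n)  # n = 3, since 1 - 0.6**3 = 0.784
```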
3.2.9.
Let X = the number of defective tubes in a certain box of 400; then n = 400 and
p = 3/100 = 0.03.
(d) Part (c) shows that the probability of at most one defective is 0.0000684, which is very
small.
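The exact binomial sum reproduces the stated value:

```python
from math import comb

# Exercise 3.2.9: X ~ Binomial(n=400, p=0.03) counts defective tubes.
n, p = 400, 0.03
p_at_most_1 = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(2))
print(f"P(X <= 1) = {p_at_most_1:.7f}")  # about 0.0000684
```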
3.2.10.
The probability of at least one error on a certain page of the book is 0.393.
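The stated 0.393 is consistent with the number of errors per page following a Poisson distribution with mean 0.5; that parameter is an assumption here (the exercise statement is not reproduced above), chosen to match the answer:

```python
from math import exp

# Assuming the number of errors per page is Poisson with mean 0.5
# (an assumption consistent with the stated answer 0.393):
lam = 0.5
p_at_least_one = 1 - exp(-lam)   # P(X >= 1) = 1 - P(X = 0)
print(f"{p_at_least_one:.3f}")   # 0.393
```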
3.2.11.
3.2.12.
The probability density function is given by
Hence,
Hence, there is a 20% chance that the arrival time measurement will be in error by less than 0.01.
3.2.13.
The probability density function is given by
Hence,
Hence, there is a 40% chance that a piece chosen at random will be suitable for kitchen use.
3.2.14.
P(Wrong Decision) = 1 − P(Right Decision).
3.2.15.
The probability density function is given by
(a)
(b)
(c) There is a 20% chance that the efficiency is between 60 and 80 units; there is a 10% chance
that the efficiency is greater than 90 units.
3.2.16.
(b) If .
(c)
3.2.17.
Let X = the failure time of the component; X follows an exponential distribution with rate
0.05. Then the p.d.f. of X is given by f(x) = 0.05e^(-0.05x) for x > 0, and zero otherwise.
Hence,
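With rate 0.05, tail probabilities for X follow directly from the survival function P(X > t) = e^(−0.05t); a minimal sketch (the time t = 20 below is only an illustrative value, not taken from the exercise):

```python
from math import exp

# Exercise 3.2.17: X ~ Exponential(rate = 0.05), so
# f(x) = 0.05 * exp(-0.05 x) for x > 0 and P(X > t) = exp(-0.05 t).
rate = 0.05

def survival(t):
    return exp(-rate * t)

# e.g. probability the component survives past t = 20 time units:
print(f"P(X > 20) = {survival(20):.4f}")  # exp(-1), about 0.3679
```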
3.2.18.
Using the normal table, we have . Then for given we have the expected
life as .
3.2.19.
The uniform probability density function is given by
.
Hence,
3.2.20.
3.2.21.
The minimum score that a student has to get an “A” grade is 78.22.
3.2.22.
(d) There is a 6% chance that a score chosen at random will be between 80 and 85. There is a
24% chance that a score will be greater than 75. There is a 99.8% chance that a score will be
less than 90, i.e. almost all the scores are less than 90.
3.2.23.
3.2.24.
.
There is a 5.3% chance that an individual picked at random will need to consult a physician.
3.2.25.
(d) There is a 16% chance that a child chosen at random will have a systolic pressure greater
than 125 mm Hg. There is a 2.3% chance that a child will have a systolic pressure less than 95
mm Hg. 95% of this population have a systolic blood pressure below 131.45.
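The three stated answers are mutually consistent with systolic pressure being normal with mean 115 and standard deviation 10; those parameters are inferred from the answers (P(X > 125) ≈ 0.16 puts 125 one sd above the mean, and P(X < 95) ≈ 0.023 puts 95 two sds below), not quoted from the exercise:

```python
from math import erf, sqrt

# Exercise 3.2.25: assume X ~ N(mu = 115, sigma = 10), parameters
# inferred from the stated answers.
mu, sigma = 115, 10

def norm_cdf(x):
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

print(f"P(X > 125) = {1 - norm_cdf(125):.3f}")        # about 0.159 (~16%)
print(f"P(X <  95) = {norm_cdf(95):.3f}")             # about 0.023
print(f"95th percentile = {mu + 1.645 * sigma:.2f}")  # 131.45
```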
3.2.26.
3.2.27.
Then
, similarly we can
obtain and .
For the probability of surviving 0.2, 0.5 and 0.8 the experimenter should choose doses 0.58, 1
and 1.73, respectively.
3.2.28.
, and
Then .
3.2.30.
Let . Then,
3.2.31.
(a) First consider the following product
.
3.2.32.
The probability that the percentage of major accidents is less than 80% but greater than 60%
is 0.432.
3.2.33.
In this case, the number of breakdowns per month can be assumed to have a Poisson
distribution with mean 3.
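Probabilities for the monthly breakdown count then come from the Poisson pmf; a minimal sketch (the event X = 0 below is only an illustrative case, not taken from the exercise):

```python
from math import exp, factorial

# Exercise 3.2.33: monthly breakdowns X ~ Poisson(mean = 3).
lam = 3

def pois_pmf(k):
    return exp(-lam) * lam**k / factorial(k)

# e.g. probability of no breakdowns in a given month:
print(f"P(X = 0) = {pois_pmf(0):.4f}")  # exp(-3), about 0.0498
```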
3.2.34.
In the time interval , we have
Then, considering that n and λ are fixed parameters, and are random in the above expression.
Then we see that
3.2.35.
The probability that an acid solution made by this procedure will satisfactorily etch a tray is
0.6442.
The probability that an acid solution made by this procedure will satisfactorily etch a tray is
0.4712.
3.2.36.
3.2.37.
3.2.38.
Exercises 3.3
3.3.1.
(a) The joint probability function is
where .
(d)
                                y
x       0      1      2      3      4      Sum
0       0.020  0.068  0.064  0.019  0.001  0.172
1       0.090  0.203  0.113  0.015         0.421
2       0.119  0.158  0.040                0.317
3       0.053  0.032                       0.085
4       0.007                              0.007
Sum     0.289  0.461  0.217  0.034  0.001  1.00
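As a sanity check on the tabulated joint probabilities, the entries (rounded to three decimals) should total 1 up to rounding error:

```python
# Exercise 3.3.1(d): sanity-check the tabulated joint probabilities.
# Keys are (x, y); entries are rounded to three decimals, so the total
# matches 1 only to within rounding error.
table = {
    (0, 0): 0.020, (0, 1): 0.068, (0, 2): 0.064, (0, 3): 0.019, (0, 4): 0.001,
    (1, 0): 0.090, (1, 1): 0.203, (1, 2): 0.113, (1, 3): 0.015,
    (2, 0): 0.119, (2, 1): 0.158, (2, 2): 0.040,
    (3, 0): 0.053, (3, 1): 0.032,
    (4, 0): 0.007,
}
total = sum(table.values())
print(f"total probability = {total:.3f}")  # approximately 1
```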
3.3.2.
By substituting the relation we can rewrite the bivariate function into a
univariate function of x as
3.3.3.
Thus, if c = 1/4, then . And we also see that for all x and
3.3.4.
First we see that for all x and y, and
3.3.5.
By definition, the marginal pdf of X is given by the row sums, and the marginal pdf of Y is
obtained by the column sums. Hence,

xi      -1    3    5    otherwise

yi      -2    0    1    4    otherwise
P(yi)   0.4   0.3  0.1  0.2  0
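From the surviving row of probabilities, which lines up with the y-values, the moments of Y can be computed directly (the probabilities for X did not survive extraction, so only Y is treated here):

```python
# Exercise 3.3.5: marginal probabilities of Y, read from the column sums.
y_pmf = {-2: 0.4, 0: 0.3, 1: 0.1, 4: 0.2}

mean_y = sum(y * p for y, p in y_pmf.items())
var_y = sum(y**2 * p for y, p in y_pmf.items()) - mean_y**2
print(f"E(Y) = {mean_y:.2f}, Var(Y) = {var_y:.2f}")  # 0.10 and 4.89
```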
3.3.6.
The marginal pdf of X is
.
3.3.7.
From Exercise 3.3.5 we can calculate the following.
3.3.8.
3.3.9.
(a) The marginal of X is
3.3.10.
(ii) Given , we have the conditional density as
3.3.11.
Using the joint density in Exercise 3.3.9 we can obtain the joint mgf of as
where
3.3.12.
(b) Given , we have the conditional density as
3.3.13.
Given , we have
3.3.14.
For given , we have
Thus, follows .
For given , we have
3.3.15.
, and
Then, .
, and
Then, .
3.3.16.
, and
Then, .
, and
.
Then, .
3.3.17.
Assume that a and c are nonzero.
,
, and .
3.3.18.
3.3.19.
We first state the famous Cauchy–Schwarz inequality: [E(XY)]² ≤ E(X²) E(Y²).
Now, consider
3.3.20.
(a) .
Then, .
, and
Then, .
3.3.21.
(a) First, we compute the marginal densities.
, and
, and
Then, .
(c) To check for independence of X and Y
.
3.3.22.
, and
We then see that for all x and y, therefore, X and Y are independent.
3.3.23.
. Then,
, and
Thus, .
Exercises 3.4
3.4.1.
3.4.2.
Let and .
Then and , and
Then we see that the support can be divided into two parts: one part is when ,
and the other part is when , . Then we have the marginal pdf of as
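The exact transformation in Exercise 3.4.2 is not reproduced above, so as an illustration of the Jacobian/marginalization technique, take X, Y i.i.d. Exponential(1) and U = X + Y (an assumed example, not the exercise's variables). The method gives f_U(u) = u·e^(−u) for u > 0, i.e. Gamma(2, 1); a quick Monte Carlo check of the implied CDF:

```python
import random
from math import exp

# Assumed example: X, Y i.i.d. Exponential(1), U = X + Y.
# The transformation technique gives f_U(u) = u * exp(-u), u > 0.
random.seed(1)
n = 200_000
u_samples = [random.expovariate(1) + random.expovariate(1) for _ in range(n)]

u0 = 2.0
empirical = sum(u <= u0 for u in u_samples) / n
theoretical = 1 - exp(-u0) * (1 + u0)   # Gamma(2, 1) CDF at u0
print(f"empirical {empirical:.3f} vs theoretical {theoretical:.3f}")
```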
3.4.3.
Let and .
Then and , and
3.4.4.
The joint pdf of is
Let and . Applying the result in Exercise 3.4.3, we then have the pdf of
as
.
3.4.5.
The joint pdf of is
, and
Also, notice that for all x and y; thus, X and Y are independent.
By the definition of the chi-square distribution and the independence of X and Y, we know that
3.4.6.
The moment generating function of is
Note that each Xi follows an exponential distribution with mean θ, which is equivalent to
as .
3.4.7.
Let and .
Then and , and
3.4.8.
, , and
. Thus, .
Then, the pdf of Y is and zero otherwise. On the other hand, the
otherwise. Thus, we see that . This implies that X and Y are not
independent.
3.4.9.
(a) Here let , and hence, . Thus, .
.
(b) The cdf of U is given by
and zero
3.4.10.
3.4.11.
3.4.12.
Let and .
Let and .
Then and , and
3.4.13.
3.4.14.
Let and .
3.4.15.
The joint pdf of is
Applying the result in Exercise 3.4.14 with , we have the joint pdf of and
3.4.16.
The joint pdf of is
Let and .
Exercises 3.5
3.5.1.
(a) Note that X follows ; then, applying the result in Exercise 3.2.31, we have
and . From Chebyshev’s theorem
(b) .
3.5.2.
Since X follows a Poisson distribution with , then . From
Chebyshev’s theorem
3.5.3.
be written as
This implies that or .
3.5.4.
Since X follows a Poisson distribution with , then . From
Chebyshev’s theorem
3.5.5.
Applying Chebyshev’s theorem, we have
3.5.6.
Applying Chebyshev’s theorem, we have
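Chebyshev's theorem bounds P(|X − μ| ≥ kσ) by 1/k² for any distribution with finite variance; a quick numerical illustration, using an exponential distribution as an arbitrary example (not the distribution of this exercise):

```python
import random

# Chebyshev's theorem: P(|X - mu| >= k*sigma) <= 1/k**2.
# Illustration with Exponential(1), which has mean = sd = 1
# (an arbitrary example distribution).
random.seed(0)
mu = sigma = 1.0
k = 2
n = 100_000
xs = [random.expovariate(1) for _ in range(n)]
tail = sum(abs(x - mu) >= k * sigma for x in xs) / n
print(f"P(|X - mu| >= {k} sigma) ~ {tail:.3f} <= bound {1 / k**2}")
```

The simulated tail (about e^(−3) ≈ 0.05 for this example) sits well below the Chebyshev bound of 0.25, as it must.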
3.5.7.
Let denote each toss of the coin, with value 1 if a head occurs and 0 otherwise. Then,
If the coin is not fair, then the fraction of heads, , will be near the true probability of
3.5.8.
3.5.9.
distributed. Thus, the conditions of the law of large numbers stated in the text are not
satisfied.
There is a weaker version of the weak law of large numbers which requires only
Therefore, the conditions of the weaker version are not satisfied, either.
3.5.10.
Let denote the kth moment of a random variable, W, then the mgf of W can be written as
.
Since the random variables are independent, it follows that the random variables are
Now we want to take the limit of the as . To do this we first consider the
natural logarithm of as
Taking we have
We see that only the first term does not involve n while all the other terms have n to a positive
power in the denominator. Thus it can be shown that
, or , which is the mgf of a standard normal
random variable. Therefore, we can conclude that the limiting distribution of is the
standard normal distribution. This completes a brief proof of the central limit theorem.
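The conclusion can also be seen numerically: standardized sums of i.i.d. Uniform(0,1) variables (mean 1/2, variance 1/12, an arbitrary example distribution) look standard normal for moderate n:

```python
import random
from math import sqrt

# Numerical illustration of the central limit theorem: standardized
# sums of n i.i.d. Uniform(0,1) variables should be close to N(0,1).
random.seed(0)
m, n = 20_000, 30   # m replications of a sample of size n
zs = []
for _ in range(m):
    s = sum(random.random() for _ in range(n))
    zs.append((s - n * 0.5) / sqrt(n / 12))

mean_z = sum(zs) / m
var_z = sum(z * z for z in zs) / m - mean_z**2
print(f"mean ~ {mean_z:.3f}, variance ~ {var_z:.3f}")  # near 0 and 1
```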
3.5.11.
3.5.12.
3.5.13.
Let denote the success of the ith customer. Then each follows a Bernoulli distribution
3.5.14.
From Chebyshev’s theorem
.
3.5.15.
(a) From Chebyshev’s theorem
3.5.16.
Let if the ith person in the sample is color blind and 0 otherwise. Then each
. Let .
. Then,
3.5.17.
Let if the ith shirt in the lot is defective and 0 otherwise. Then each follows
. Let .
. Then,
.
Using the normal table, . Solving this equation, we have .
Thus, the greatest number of shirts constituting a lot to have less than five defectives with
probability 0.95 is 122.
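The stated answer (122) is consistent with each shirt being defective independently with p = 0.02; that parameter is inferred from the answer, not quoted from the exercise. Under that assumption, a normal-approximation search reproduces the cutoff:

```python
from math import erf, sqrt

# Exercise 3.5.17 sketch, assuming defect probability p = 0.02
# (inferred from the stated answer).  Find the largest n with
# P(fewer than 5 defectives) >= 0.95 under the normal approximation.
p = 0.02

def prob_less_than_5(n):
    mu, sd = n * p, sqrt(n * p * (1 - p))
    return 0.5 * (1 + erf((5 - mu) / (sd * sqrt(2))))

n = 1
while prob_less_than_5(n + 1) >= 0.95:
    n += 1
print(n)  # 122
```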
3.5.18.
We know that 500 out of 10,000 eyedroppers have missing labels.
Now,
Hence, the probability that at most two defective droppers will be detected in a random
sample of 125 is 0.0477.
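With p = 500/10000 = 0.05, approximating the draw of 125 droppers as binomial reproduces the stated value:

```python
from math import comb

# Exercise 3.5.18: 500 of 10000 droppers are defective, so p = 0.05;
# approximate the sample of 125 as Binomial(125, 0.05).
n, p = 125, 0.05
p_at_most_2 = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(3))
print(f"P(X <= 2) = {p_at_most_2:.4f}")  # 0.0477
```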
3.5.19.
for , then
for , then
3.5.20.
for , then
for , then
is zero.
Therefore, by Central Limit theorem,
3.5.21.
We have, and . Then, by CLT we know that
3.5.22.
3.5.23.
We have, and .
3.5.24.
3.5.25.
We have, and .
Sample size (n) = 60
3.5.26.
We have,
and
Sample size (n) = 50