Mathematical Statistics (Mendenhall), Chapter 6 Solutions
6.1 a. F_U1(u) = P(U1 ≤ u) = P(2Y − 1 ≤ u) = P(Y ≤ (u+1)/2) = F_Y((u+1)/2), so that
f_U1(u) = F′_U1(u) = (1/2) f_Y((u+1)/2) = (1 − u)/2, −1 ≤ u ≤ 1.
b. F_U2(u) = P(1 − 2Y ≤ u) = P(Y ≥ (1−u)/2) = 1 − F_Y((1−u)/2), so that
f_U2(u) = F′_U2(u) = (1/2) f_Y((1−u)/2) = (u + 1)/2, −1 ≤ u ≤ 1.
c. F_U3(u) = P(Y² ≤ u) = F_Y(√u), so that
f_U3(u) = F′_U3(u) = (1/(2√u)) f_Y(√u) = 1/√u − 1, 0 ≤ u ≤ 1.
d. E(U1) = −1/3, E(U2) = 1/3, E(U3) = 1/6.
e. E(2Y − 1) = −1/3, E(1 − 2Y) = 1/3, E(Y²) = 1/6, agreeing with part d.
6.2 b. Thus, f_U2(u) = F′_U2(u) = (3/2)(3 − u)², 2 ≤ u ≤ 4.
c. F_U3(u) = P(U3 ≤ u) = P(Y² ≤ u) = P(−√u ≤ Y ≤ √u) = F_Y(√u) − F_Y(−√u) = u^{3/2}.
Thus, f_U3(u) = F′_U3(u) = (3/2)√u, 0 ≤ u ≤ 1.

6.3
The distribution function for Y is F_Y(y) = y²/2 for 0 ≤ y ≤ 1, and F_Y(y) = y − 1/2 for 1 < y ≤ 1.5.
a. With U = 10Y − 4, F_U(u) = P(10Y − 4 ≤ u) = P(Y ≤ (u+4)/10) = F_Y((u+4)/10). So,
F_U(u) = 0 for u ≤ −4; (u+4)²/200 for −4 < u ≤ 6; (u−1)/10 for 6 < u ≤ 11; 1 for u > 11,
and f_U(u) = F′_U(u) = (u+4)/100 for −4 < u ≤ 6; 1/10 for 6 < u ≤ 11; 0 elsewhere.
b. E(U) = 5.583.
c. E(10Y − 4) = 10(23/24) − 4 = 5.583.
6.4 a. F_U(u) = P(3Y + 1 ≤ u) = P(Y ≤ (u−1)/3) = 1 − e^{−(u−1)/12}, so that
f_U(u) = F′_U(u) = (1/12) e^{−(u−1)/12}, u ≥ 1.
b. E(U) = 13.
6.5
1 / 2
6.6 With Y uniform on (1, 5) and U = 2Y² + 3,
F_U(u) = P(2Y² + 3 ≤ u) = P(Y ≤ √((u−3)/2)) = F_Y(√((u−3)/2)) = (√((u−3)/2) − 1)/4.
Differentiating, f_U(u) = 1/(8√(2(u − 3))), 5 ≤ u ≤ 53.
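As a numerical cross-check of Ex. 6.6, the following Python sketch (added here, not part of the manual; it assumes Y uniform on (1, 5) and U = 2Y² + 3 as reconstructed above) compares the empirical and analytic distribution functions:

```python
import random
import math

def analytic_cdf(u):
    # F_U(u) = (sqrt((u - 3)/2) - 1)/4 on 5 <= u <= 53
    return (math.sqrt((u - 3) / 2) - 1) / 4

def empirical_cdf(u, n=200_000, seed=1):
    # seeded Monte Carlo estimate of P(2Y^2 + 3 <= u) for Y ~ Uniform(1, 5)
    rng = random.Random(seed)
    count = 0
    for _ in range(n):
        y = rng.uniform(1, 5)
        if 2 * y * y + 3 <= u:
            count += 1
    return count / n

for u in (10.0, 25.0, 45.0):
    assert abs(empirical_cdf(u) - analytic_cdf(u)) < 0.01
```

With 200,000 seeded draws the Monte Carlo error is well under the 0.01 tolerance used above.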
For 0 ≤ u < 1, F_U(u) = ∫₀^u ∫₀^{u−y2} 1 dy1 dy2 = u²/2, and for 1 ≤ u ≤ 2,
F_U(u) = 1 − ∫_{u−1}^1 ∫_{u−y2}^1 1 dy1 dy2 = 1 − (2 − u)²/2.
Thus, f_U(u) = F′_U(u) = u for 0 ≤ u < 1; 2 − u for 1 ≤ u ≤ 2; 0 elsewhere.
b. E(U) = 1.
6.7 Let F_Z(z) and f_Z(z) denote the standard normal distribution and density functions, respectively.
a. F_U(u) = P(U ≤ u) = P(Z² ≤ u) = P(−√u ≤ Z ≤ √u) = F_Z(√u) − F_Z(−√u). The density function for U is then
f_U(u) = F′_U(u) = (1/(2√u)) f_Z(√u) + (1/(2√u)) f_Z(−√u) = (1/√u) f_Z(√u), u ≥ 0.
Evaluating, we find f_U(u) = (1/√(2π)) u^{−1/2} e^{−u/2}, u ≥ 0.
b. This is a gamma density with α = 1/2 and β = 2 (the chi-square density with one degree of freedom).
6.8 Let F_Y(y) and f_Y(y) denote the beta distribution and density functions, respectively.
a. F_U(u) = P(U ≤ u) = P(1 − Y ≤ u) = P(Y ≥ 1 − u) = 1 − F_Y(1 − u). The density function for U is then
f_U(u) = F′_U(u) = f_Y(1 − u) = [Γ(α+β)/(Γ(α)Γ(β))] u^{β−1}(1 − u)^{α−1}, 0 ≤ u ≤ 1,
a beta density with the parameters reversed.
b. E(U) = 1 − E(Y) = 1 − α/(α+β) = β/(α+β).
c. V(U) = V(Y).
6.9 a. F_U(u) = P(Y1 + Y2 ≤ u) = ∫∫ 2 dy1 dy2 = u². Thus, f_U(u) = F′_U(u) = 2u, 0 ≤ u ≤ 1.
b. E(U) = 2/3.
c. (found in an earlier exercise in Chapter 5) E(Y1 + Y2) = 2/3, which agrees with part b.
Instructor's Solutions Manual
6.10 F_U(u) = ∫∫ f(y1, y2) dy1 dy2 = 1 − e^{−u}, so that f_U(u) = F′_U(u) = e^{−u}, u ≥ 0, the exponential density with mean 1.
6.11 F_U(u) = P((Y1 + Y2)/2 ≤ u) = P(Y1 ≤ 2u − Y2) = ∫₀^{2u} ∫₀^{2u−y2} e^{−(y1+y2)} dy1 dy2 = 1 − e^{−2u} − 2u e^{−2u}, u ≥ 0.
Thus, f_U(u) = F′_U(u) = 4u e^{−2u}, u ≥ 0.
6.12 Let F_Y(y) and f_Y(y) denote the gamma distribution and density functions, respectively.
a. F_U(u) = P(U ≤ u) = P(cY ≤ u) = P(Y ≤ u/c) = F_Y(u/c). The density function for U is then
f_U(u) = F′_U(u) = (1/c) f_Y(u/c) = [1/(Γ(α)(cβ)^α)] u^{α−1} e^{−u/(cβ)}, u ≥ 0,
which is a gamma density.
b. The shape parameter is the same (α), but the scale parameter is cβ.
6.13 F_U(u) = P(Y1 + Y2 ≤ u) = ∫₀^u ∫₀^{u−y2} e^{−(y1+y2)} dy1 dy2 = 1 − e^{−u} − u e^{−u}.
Thus, f_U(u) = F′_U(u) = u e^{−u}, u ≥ 0.
6.14 Integrating the joint density over the region {y1 y2 ≤ u} gives
F_U(u) = 9u² − 8u³ + 6u³ ln u, so that
f_U(u) = F′_U(u) = 18u(1 − u + u ln u), 0 ≤ u ≤ 1.
6.15 Let U have a uniform distribution on (0, 1). The distribution function for U is
F_U(u) = P(U ≤ u) = u, 0 ≤ u ≤ 1. For a function G, we require G(U) = Y where Y has distribution function F_Y(y). Note that then
F_Y(y) = P(Y ≤ y) = P(G(U) ≤ y) = P(U ≤ G⁻¹(y)) = G⁻¹(y).
So it must be true that G⁻¹(y) = F_Y(y), or equivalently G(u) = F_Y⁻¹(u).
6.16 The power family distribution function is F_Y(y) = (y/θ)^α, with density f(y) = αy^{α−1}/θ^α, 0 ≤ y ≤ θ.

6.17 b. Following Ex. 6.15 and 6.16, let u = (y/θ)^α so that y = θu^{1/α}. Thus, the random variable Y = θU^{1/α} has distribution function F_Y(y).
c. From part (b) with α = 2 and θ = 4, the transformation is y = 4√u. The values are 2.0785, 3.229, 1.5036, 1.5610, 2.403.
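The inverse-transform recipe of Ex. 6.15 applied to the power family can be sketched as follows (a hedged illustration, not from the manual; α = 2 and θ = 4 are the values used in part (c)):

```python
import random

ALPHA, THETA = 2.0, 4.0

def G(u):
    # quantile function of the power family: G(u) = theta * u**(1/alpha)
    return THETA * u ** (1 / ALPHA)   # equals 4*sqrt(u) here

def F(y):
    # distribution function F(y) = (y/theta)**alpha, 0 <= y <= theta
    return (y / THETA) ** ALPHA

# G inverts F: F(G(u)) = u for every u in (0, 1)
for u in (0.1, 0.27, 0.5, 0.9):
    assert abs(F(G(u)) - u) < 1e-12

# simulate a few values from F by transforming uniforms
rng = random.Random(0)
sample = [G(rng.random()) for _ in range(5)]
assert all(0 <= y <= THETA for y in sample)
```

The specific simulated values depend on the random seed, so they will not match the manual's five values.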
6.18 The Pareto distribution function is F(y) = 1 − (β/y)^α, y ≥ β.
b. Let u = 1 − (β/y)^α so that y = β(1 − u)^{−1/α}. Thus, Y = β(1 − U)^{−1/α}.
c. From part (b) with α = 2 and β = 3, y = 3/√(1 − u). The values are 3.0087, 3.3642, 6.2446, 3.4583, 4.7904.
6.19
6.20
6.21 By definition, P(X = i) = P[F(i−1) < U ≤ F(i)] = F(i) − F(i−1), for i = 1, 2, …, since P(U ≤ a) = a for any 0 ≤ a ≤ 1. From Ex. 4.5, P(Y = i) = F(i) − F(i−1), for i = 1, 2, …. Thus, X and Y have the same distribution.
6.22 Let U have a uniform distribution on the interval (0, 1). For a geometric distribution with parameter p and distribution function F, define the random variable X as:
X = k if and only if F(k−1) < U ≤ F(k), k = 1, 2, ….
Since F(k) = 1 − q^k, we have that:
X = k if and only if 1 − q^{k−1} < U ≤ 1 − q^k, OR
X = k if and only if q^k ≤ 1 − U < q^{k−1}, OR
X = k if and only if k ln q ≤ ln(1 − U) < (k − 1) ln q, OR
X = k if and only if k − 1 < [ln(1 − U)]/ln q ≤ k.
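The final inequality says X is the integer ceiling of ln(1 − U)/ln q. A short Python sketch (added here as an illustration) verifies the defining event F(k − 1) < U ≤ F(k):

```python
import math
import random

def geometric_from_uniform(u, p):
    # X = ceil(ln(1-u)/ln(q)), the inverse-transform geometric generator
    q = 1 - p
    return math.ceil(math.log(1 - u) / math.log(q))

def geometric_cdf(k, p):
    # F(k) = 1 - q**k for the geometric on {1, 2, ...}
    return 1 - (1 - p) ** k

p = 0.3
rng = random.Random(42)
for _ in range(1000):
    u = rng.random()
    k = geometric_from_uniform(u, p)
    assert geometric_cdf(k - 1, p) < u <= geometric_cdf(k, p)
```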
6.23 a. If U = 2Y − 1, then Y = (U + 1)/2. Thus, |dy/du| = 1/2 and
f_U(u) = (1/2) f_Y((u+1)/2) = (1/2)·2(1 − (u+1)/2) = (1 − u)/2, −1 ≤ u ≤ 1.
b. If U = 1 − 2Y, then Y = (1 − U)/2. Thus, |dy/du| = 1/2 and
f_U(u) = (1/2) f_Y((1−u)/2) = (1 + u)/2, −1 ≤ u ≤ 1.
c. If U = Y², then Y = √U. Thus, |dy/du| = 1/(2√u) and
f_U(u) = (1/(2√u))·2(1 − √u) = (1 − √u)/√u = 1/√u − 1, 0 ≤ u ≤ 1.
6.24 If U = 3Y + 1, then Y = (U − 1)/3. Thus, |dy/du| = 1/3 and
f_U(u) = (1/3)·(1/4) e^{−(u−1)/12} = (1/12) e^{−(u−1)/12}, u ≥ 1.

6.25 With U = (Y1 + Y2)/2, fix Y2 = y2; then y1 = 2u − y2 and |dy1/du| = 2. The joint density of U and Y2 is g(u, y2) = 2e^{−2u}, u ≥ 0, 0 ≤ y2 ≤ 2u, so that
f_U(u) = ∫₀^{2u} 2e^{−2u} dy2 = 4u e^{−2u}, u ≥ 0.
6.26 a. With U = Y^m, y = u^{1/m} and |dy/du| = (1/m)u^{1/m−1}, so that f_U(u) = (1/α) e^{−u/α}, u ≥ 0: U has an exponential distribution with mean α.
b. E(Y^k) = E(U^{k/m}) = ∫₀^∞ u^{k/m}(1/α) e^{−u/α} du = α^{k/m} Γ(k/m + 1).
6.27 With Y = W², dy/dw = 2w. Then, f_W(w) = 2w e^{−w²}, w ≥ 0, a Weibull density.

6.28 Here y = e^{−u/2} and |dy/du| = (1/2) e^{−u/2}, so f_U(u) = (1/2) e^{−u/2}, u ≥ 0, the exponential density with mean 2 (the chi-square density with 2 degrees of freedom).
6.29 a. With W = mV²/2, V = √(2W/m) and |dv/dw| = 1/√(2mw). Then,
f_W(w) = a(2w/m) e^{−2bw/m} · 1/√(2mw) = (a√2/m^{3/2}) w^{1/2} e^{−w/(kT)}, w ≥ 0,
since b = m/(2kT). The above expression is in the form of a gamma density, so the constant a must be chosen so that the density integrates to 1, or simply
a√2/m^{3/2} = 1/[Γ(3/2)(kT)^{3/2}].
Thus, f_W(w) = [1/(Γ(3/2)(kT)^{3/2})] w^{1/2} e^{−w/(kT)}, w ≥ 0, a gamma density with α = 3/2 and β = kT.
b. E(W) = (3/2) kT.
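A quick quadrature check of the normalizing constant and mean in Ex. 6.29 (a sketch added here; kT is treated as a single constant, arbitrarily set to 2):

```python
import math

KT = 2.0

def f(w):
    # gamma(3/2, KT) density: w**0.5 * exp(-w/KT) / (Gamma(3/2) * KT**1.5)
    return math.sqrt(w) * math.exp(-w / KT) / (math.gamma(1.5) * KT ** 1.5)

# midpoint rule on (0, 60); the tail beyond 60 is negligible for KT = 2
n = 200_000
h = 60.0 / n
total = sum(f((i + 0.5) * h) for i in range(n)) * h
mean = sum((i + 0.5) * h * f((i + 0.5) * h) for i in range(n)) * h

assert abs(total - 1.0) < 1e-3
assert abs(mean - 1.5 * KT) < 1e-2   # E(W) = (3/2) kT
```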
6.31 The joint density of Y1 and Y2 is f(y1, y2) = (1/8) y1 e^{−(y1+y2)/2}, y1 ≥ 0, y2 ≥ 0. With U = Y2/Y1, fix Y1 = y1; then y2 = uy1 and |dy2/du| = y1. The joint density of Y1 and U is then
g(y1, u) = (1/8) y1² e^{−y1(1+u)/2}, y1 ≥ 0, u ≥ 0,
so that f_U(u) = ∫₀^∞ (1/8) y1² e^{−y1(1+u)/2} dy1 = (1/8)·Γ(3)[2/(1+u)]³ = 2/(1+u)³, u ≥ 0.
6.32 With U = 2Y² + 3, Y = ((U − 3)/2)^{1/2} and |dy/du| = (1/4)(2/(u − 3))^{1/2}. Thus,
f_U(u) = (1/4)·(1/4)√(2/(u − 3)) = 1/(8√(2(u − 3))), 5 ≤ u ≤ 53.
6.33 If U = 5 − (Y/2), then Y = 2(5 − U). Thus, |dy/du| = 2 and f_U(u) = 4(80 − 31u + 3u²), 4.5 ≤ u ≤ 5.
6.34 a. If U = Y², then Y = √U. Thus, |dy/du| = 1/(2√u) and f_U(u) = (1/(2√u)) f_Y(√u).

6.35 With U = Y1Y2, fix Y1 = y1; then y2 = u/y1 and |dy2/du| = 1/y1. The joint density of Y1 and U is
g(y1, u) = 1/y1, 0 < u ≤ y1 ≤ 1,
so that f_U(u) = ∫_u^1 (1/y1) dy1 = −ln u, 0 < u ≤ 1.
6.36
By independence, f(y1, y2) = (4y1y2/θ²) e^{−(y1²+y2²)/θ}, y1 ≥ 0, y2 ≥ 0. With U = Y1² + Y2², fix Y1 = y1; then y2 = √(u − y1²) and |dy2/du| = 1/(2√(u − y1²)), so the joint density of Y1 and U is
g(y1, u) = (4y1√(u − y1²)/θ²) e^{−u/θ} · 1/(2√(u − y1²)) = (2y1/θ²) e^{−u/θ}, 0 ≤ y1 ≤ √u.
Then, f_U(u) = ∫₀^{√u} (2y1/θ²) e^{−u/θ} dy1 = (1/θ²) u e^{−u/θ}, u ≥ 0, a gamma density with α = 2 and β = θ.
6.37 a. m_Y1(t) = E(e^{tY1}) = Σ_{y=0}^{1} e^{ty} p(y) = 1 − p + pe^t.
b. m_W(t) = E(e^{tW}) = Π_{i=1}^n m_Yi(t) = [1 − p + pe^t]^n.
c. Since the mgf for W is in the form of a binomial mgf with n trials and success probability p, this is the distribution for W.
6.38 Let Y1 and Y2 have mgfs as given, and let U = a1Y1 + a2Y2. The mgf for U is
m_U(t) = E(e^{Ut}) = E(e^{(a1Y1+a2Y2)t}) = E(e^{(a1t)Y1}) E(e^{(a2t)Y2}) = m_Y1(a1t)·m_Y2(a2t).
6.39 The mgf for the exponential distribution with β = 1 is m(t) = (1 − t)^{−1}, t < 1. Let Y1 and Y2 each have this distribution, and let U = (Y1 + Y2)/2. Using the result from Ex. 6.38 with a1 = a2 = 1/2, the mgf for U is m_U(t) = m(t/2)·m(t/2) = (1 − t/2)^{−2}. Note that this is the mgf for a gamma random variable with α = 2, β = 1/2, so the density function for U is f_U(u) = 4u e^{−2u}, u ≥ 0.
6.40 It has been shown that the distribution of both Y1² and Y2² is chi-square with ν = 1. Thus, both have mgf m(t) = (1 − 2t)^{−1/2}, t < 1/2. With U = Y1² + Y2², use the result from Ex. 6.38 with a1 = a2 = 1 so that m_U(t) = m(t)·m(t) = (1 − 2t)^{−1}. Note that this is the mgf for an exponential random variable with β = 2, so the density function for U is f_U(u) = (1/2) e^{−u/2}, u ≥ 0 (this is also the chi-square distribution with ν = 2).
6.41 (Special case of Theorem 6.3) The mgf for the normal distribution with parameters μ and σ² is m(t) = e^{μt + σ²t²/2}. Since the Yi's are independent, the mgf for U = Σ_{i=1}^n aiYi is given by
m_U(t) = Π_{i=1}^n m_Yi(ai t) = exp(t Σ_{i=1}^n aiμi + (t²/2) Σ_{i=1}^n ai²σi²),
which is the mgf of a normal variable with mean Σ aiμi and variance Σ ai²σi².
6.42 The probability of interest is P(Y2 > Y1) = P(Y2 − Y1 > 0). By Theorem 6.3, the distribution of Y2 − Y1 is normal with μ = 4000 − 5000 = −1000 and σ² = 400² + 300² = 250,000. Thus,
P(Y2 − Y1 > 0) = P(Z > (0 − (−1000))/√250,000) = P(Z > 2) = .0228.
6.43 a. From Ex. 6.41, Ȳ has a normal distribution with mean μ and variance σ²/n.
b. For the given values, Ȳ has a normal distribution with variance σ²/n = 16/25. Thus, the standard deviation is 4/5, so that
P(|Ȳ − μ| ≤ 1) = P(−1 ≤ Ȳ − μ ≤ 1) = P(−1.25 ≤ Z ≤ 1.25) = .7888.
c. Similar to the above, the probabilities are .8664, .9544, .9756. So, as the sample size increases, so does the probability P(|Ȳ − μ| ≤ 1).
6.44 By Theorem 6.3, U has a normal distribution with mean 15n and variance 4n. We require that .05 = P(U > 140) = P(Z > (140 − 15n)/√(4n)). Thus, (140 − 15n)/(2√n) = z.05 = 1.645. Solving this nonlinear expression for n, we see that n ≈ 8.687. Therefore, the maximum number of watermelons that should be put in the container is 8 (note that with this value of n, we have P(U > 140) = .0002).
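The nonlinear equation in Ex. 6.44 can be solved numerically; the following sketch (added; bisection plus the normal tail via erfc) assumes the equation (140 − 15n)/(2√n) = 1.645 as stated above:

```python
import math

def g(n):
    # root of g gives the n solving (140 - 15n)/(2*sqrt(n)) = 1.645
    return (140 - 15 * n) / (2 * math.sqrt(n)) - 1.645

lo, hi = 1.0, 9.0          # g(1) > 0 and g(9) < 0, so a root lies between
for _ in range(60):
    mid = (lo + hi) / 2
    if g(mid) > 0:
        lo = mid
    else:
        hi = mid
n_star = (lo + hi) / 2
assert abs(n_star - 8.687) < 0.01

# with n = 8 whole melons: P(U > 140) = P(Z > (140 - 120)/(2*sqrt(8)))
z = (140 - 15 * 8) / (2 * math.sqrt(8))
tail = 0.5 * math.erfc(z / math.sqrt(2))
assert tail < 0.001   # about .0002, as the solution notes
```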
6.45 By Theorem 6.3, we have that U = 100 + 7Y1 + 3Y2 is a normal random variable with mean μ = 100 + 7(10) + 3(4) = 182 and variance σ² = 49(.5)² + 9(.2)² = 12.61. We require a value c such that P(U > c) = P(Z > (c − 182)/√12.61). So, (c − 182)/√12.61 = 2.33 and c = $190.27.
6.46
6.47
By Ex. 6.46, U = 2Y/4.2 has a chi-square distribution with ν = 7. So, by Table III,
P(Y > 33.627) = P(U > 2(33.627)/4.2) = P(U > 16.0128) = .025.
6.48 From Ex. 6.40, we know that V = Y1² + Y2² has a chi-square distribution with ν = 2. The density function for V is f_V(v) = (1/2) e^{−v/2}, v ≥ 0. The distribution function of U = √V is
F_U(u) = P(√V ≤ u) = P(V ≤ u²) = 1 − e^{−u²/2}, so that f_U(u) = u e^{−u²/2}, u ≥ 0.

6.49 The mgfs for Y1 and Y2 are, respectively, m_Y1(t) = [1 − p + pe^t]^{n1} and m_Y2(t) = [1 − p + pe^t]^{n2}.
Since Y1 and Y2 are independent, the mgf for Y1 + Y2 is m_Y1(t)·m_Y2(t) = [1 − p + pe^t]^{n1+n2}.
This is the mgf of a binomial with n1 + n2 trials and success probability p.
6.50
6.51 From Ex. 6.50, the distribution of n2 − Y2 is binomial with n2 trials and success probability 1 − .8 = .2. Thus, by Ex. 6.49, the distribution of Y1 + (n2 − Y2) is binomial with n1 + n2 trials and success probability p = .2.
6.52 The mgf for a Poisson variable Yi with mean λi is m_Yi(t) = e^{λi(e^t − 1)}, so the mgf for Y1 + Y2 is e^{(λ1+λ2)(e^t − 1)}: Y1 + Y2 is Poisson with mean λ1 + λ2. (Conditioned on Y1 + Y2 = m, Y1 is binomial with m trials and success probability λ1/(λ1 + λ2).)
6.53 The mgf for a binomial variable Yi with ni trials and success probability pi is given by m_Yi(t) = [1 − pi + pi e^t]^{ni}. Thus, the mgf for U = Σ_{i=1}^n Yi is m_U(t) = Π_i [1 − pi + pi e^t]^{ni}.
a. Let pi = p and ni = m for all i. Here, U is binomial with mn trials and success probability p.
b. Let pi = p. Here, U is binomial with Σ_{i=1}^n ni trials and success probability p.
c. (Similar to Ex. 5.40) The conditional distribution is hypergeometric with r = ni and N = Σ_{i=1}^n ni.
d. By definition,
P(Y1 + Y2 = k | Σ_{i=1}^n Yi = m) = P(Y1 + Y2 = k, Σ_{i=1}^n Yi = m)/P(Σ Yi = m) = P(Y1 + Y2 = k, Σ_{i=3}^n Yi = m − k)/P(Σ Yi = m)
= C(n1 + n2, k)·C(Σ_{i=3}^n ni, m − k)/C(Σ_{i=1}^n ni, m),
a hypergeometric probability.
e. No, the mgf for U does not simplify into a recognizable form.
6.54 a. By Ex. 6.52, Σ_{i=1}^n Yi is Poisson with mean Σ λi, so the mgf for U = Σ_{i=1}^n Yi is m_U(t) = e^{Σλi(e^t − 1)}.
c. Following the same steps as in part d of Ex. 6.53, it is easily shown that the conditional distribution is binomial with m trials and success probability λi/Σ_j λj.
6.55 Let Y = Y1 + Y2. Then, by Ex. 6.52, Y is Poisson with mean 7 + 7 = 14. Thus,
P(Y ≥ 20) = 1 − P(Y ≤ 19) = .077.
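The tail probability in Ex. 6.55 follows directly from the exact Poisson pmf (an added check):

```python
import math

# Y = Y1 + Y2 is Poisson with mean 14; P(Y >= 20) = 1 - P(Y <= 19)
lam = 14.0
cdf19 = sum(math.exp(-lam) * lam ** k / math.factorial(k) for k in range(20))
p = 1 - cdf19
assert abs(p - 0.077) < 0.002   # .077 to the precision quoted in the solution
```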
6.56 Let U = total service time for two cars. Similar to Ex. 6.13, U has a gamma distribution with α = 2 and β = 1/2. Then,
P(U > 1.5) = ∫_{1.5}^∞ 4u e^{−2u} du = .1991.
6.57 For each Yi, the mgf is m_Yi(t) = (1 − βt)^{−αi}, t < 1/β. Since the Yi are independent, the mgf for U = Σ_{i=1}^n Yi is m_U(t) = Π_i (1 − βt)^{−αi} = (1 − βt)^{−Σαi}, the mgf of a gamma variable with shape parameter Σ_{i=1}^n αi and scale parameter β.
6.58 The geometric mgf is m(t) = pe^t/(1 − qe^t). By independence, the mgf for the sum of r independent geometric variables is
Π_{i=1}^r pe^t/(1 − qe^t) = [pe^t/(1 − qe^t)]^r,
which is the mgf of a negative binomial variable with parameters r and p. Differentiating,
m′(t) = r[pe^t/(1 − qe^t)]^{r−1} · pe^t/(1 − qe^t)², so that m′(0) = r/p = E(Y).
Similarly, m″(0) = [pr² + r(r + 1)q]/p² = E(Y²).
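The two moment formulas in Ex. 6.58 can be confirmed by numerically differentiating the mgf at t = 0 (an added sketch; r = 3 and p = 0.4 are arbitrary test values):

```python
import math

r, p = 3, 0.4
q = 1 - p

def m(t):
    # negative-binomial mgf: (p*e^t / (1 - q*e^t))**r
    return (p * math.exp(t) / (1 - q * math.exp(t))) ** r

h = 1e-5
first = (m(h) - m(-h)) / (2 * h)            # central difference ~ m'(0)
second = (m(h) - 2 * m(0) + m(-h)) / h**2   # ~ m''(0)

assert abs(first - r / p) < 1e-4                              # E(Y) = r/p = 7.5
assert abs(second - (p * r**2 + r * (r + 1) * q) / p**2) < 1e-2   # E(Y^2) = 67.5
```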
6.59 P(W1 = k | Σ Wi = m) = P(W1 = k, Σ Wi = m)/P(Σ Wi = m) = P(W1 = k, Σ_{i=2}^r Wi = m − k)/P(Σ Wi = m)
= [pq^{k−1} · C(m − k − 1, r − 2) p^{r−1} q^{m−k−r+1}]/[C(m − 1, r − 1) p^r q^{m−r}]
= C(m − k − 1, r − 2)/C(m − 1, r − 1).
6.60 Note that since Y1 and Y2 are independent, m_W(t) = m_Y1(t)·m_Y2(t). Therefore, it must be so that m_Y2(t) = m_W(t)/m_Y1(t). Given the mgfs for W and Y1, we can solve for m_Y2(t):
m_Y2(t) = (1 − 2t)^{−ν/2}/(1 − 2t)^{−ν1/2} = (1 − 2t)^{−(ν − ν1)/2}.
This is the mgf for a chi-square variable with ν − ν1 degrees of freedom.
6.61 Similar to Ex. 6.60. Since Y1 and Y2 are independent, m_W(t) = m_Y1(t)·m_Y2(t). Therefore, it must be so that m_Y2(t) = m_W(t)/m_Y1(t). Given the mgfs for W and Y1,
m_Y2(t) = e^{λ(e^t − 1)}/e^{λ1(e^t − 1)} = e^{(λ − λ1)(e^t − 1)}.
This is the mgf for a Poisson variable with mean λ − λ1.
6.62 E{exp[t1(Y1 + Y2) + t2(Y1 − Y2)]} = E{exp[(t1 + t2)Y1 + (t1 − t2)Y2]} = m_Y1(t1 + t2)·m_Y2(t1 − t2).
6.63 a. By independence, the joint distribution of Y1 and Y2 is the product of the two marginal densities:
f(y1, y2) = [1/(Γ(α1)Γ(α2)β^{α1+α2})] y1^{α1−1} y2^{α2−1} e^{−(y1+y2)/β}, y1 ≥ 0, y2 ≥ 0.
With U1 and U2 as defined, we have that y1 = u1u2 and y2 = u2(1 − u1). Thus, the Jacobian of transformation is J = u2 (see Example 6.14), and the joint density of U1 and U2 is
f(u1, u2) = [1/(Γ(α1)Γ(α2)β^{α1+α2})] u1^{α1−1}(1 − u1)^{α2−1} u2^{α1+α2−1} e^{−u2/β}, 0 ≤ u1 ≤ 1, u2 ≥ 0.
b. f_U1(u1) = [u1^{α1−1}(1 − u1)^{α2−1}/(Γ(α1)Γ(α2)β^{α1+α2})] ∫₀^∞ u2^{α1+α2−1} e^{−u2/β} du2
= [Γ(α1 + α2)/(Γ(α1)Γ(α2))] u1^{α1−1}(1 − u1)^{α2−1}, 0 ≤ u1 ≤ 1,
a beta density with parameters α1 and α2.
c. f_U2(u2) = ∫₀^1 f(u1, u2) du1 = [1/(Γ(α1 + α2)β^{α1+α2})] u2^{α1+α2−1} e^{−u2/β}, u2 ≥ 0,
a gamma density with shape α1 + α2 and scale β. Since f(u1, u2) = f_U1(u1)·f_U2(u2), U1 and U2 are independent.
6.65 a. By independence, the joint distribution of Z1 and Z2 is the product of the two marginal densities:
f(z1, z2) = (1/2π) e^{−(z1²+z2²)/2}.
With U1 = Z1 and U2 = Z1 + Z2, we have that z1 = u1 and z2 = u2 − u1. Thus, the Jacobian of transformation is
J = det[[1, 0], [−1, 1]] = 1.
Thus, the joint density of U1 and U2 is
f(u1, u2) = (1/2π) e^{−[u1² + (u2−u1)²]/2} = (1/2π) e^{−(2u1² − 2u1u2 + u2²)/2}.
b. E(U1) = E(Z1) = 0, E(U2) = E(Z1 + Z2) = 0, V(U1) = V(Z1) = 1,
V(U2) = V(Z1 + Z2) = V(Z1) + V(Z2) = 2, Cov(U1, U2) = E(Z1²) = 1.
c. Not independent since ρ ≠ 0.
d. This is the bivariate normal distribution with μ1 = μ2 = 0, σ1² = 1, σ2² = 2, and ρ = 1/√2.
6.66 a. Similar to Ex. 6.65, we have that y1 = u1 − u2 and y2 = u2. So, the Jacobian of transformation is
J = det[[1, −1], [0, 1]] = 1.
Thus, by definition the joint density is as given.
b. By definition of a marginal density, the marginal density for U1 is as given.
c. If Y1 and Y2 are independent, their joint density factors into the product of the marginal densities, so we have the given form.
6.67
6.68 The joint density factors into the product of the marginal densities; hence the variables are independent.
6.69 a. f(y1, y2) = 1/(y1²y2²), y1 > 1, y2 > 1.
b. We have that y1 = u1u2 and y2 = u2(1 − u1). The Jacobian of transformation is u2, so
f(u1, u2) = u2/[(u1u2)²(u2(1 − u1))²] = 1/[u1²u2³(1 − u1)²], with u1u2 > 1 and u2(1 − u1) > 1.
c. For 0 < u1 ≤ 1/2, f_U1(u1) = ∫_{1/u1}^∞ 1/[u1²u2³(1 − u1)²] du2 = 1/[2(1 − u1)²];
for 1/2 ≤ u1 < 1, f_U1(u1) = ∫_{1/(1−u1)}^∞ 1/[u1²u2³(1 − u1)²] du2 = 1/(2u1²).
e. Not independent, since the joint density does not factor. Also note that the support is not rectangular.
6.70 The inverse transformations are y1 = (u1 + u2)/2 and y2 = (u1 − u2)/2, so that
J = det[[1/2, 1/2], [1/2, −1/2]] = −1/2, |J| = 1/2.
For 0 ≤ u1 ≤ 1: f_U1(u1) = ∫_{−u1}^{u1} (1/2) du2 = u1; for 1 ≤ u1 ≤ 2: f_U1(u1) = ∫_{u1−2}^{2−u1} (1/2) du2 = 2 − u1.
For −1 ≤ u2 ≤ 0: f_U2(u2) = ∫_{−u2}^{2+u2} (1/2) du1 = 1 + u2; for 0 ≤ u2 ≤ 1: f_U2(u2) = ∫_{u2}^{2−u2} (1/2) du1 = 1 − u2.

6.71 a. The inverse transformations are y1 = u1u2/(1 + u2) and y2 = u1/(1 + u2), with Jacobian |J| = u1/(1 + u2)². Thus, the joint density of U1 and U2 is
f(u1, u2) = (1/β²) e^{−u1/β} · u1/(1 + u2)², u1 > 0, u2 > 0.
b. Yes, U1 and U2 are independent since the joint density factors and the support is rectangular (Theorem 5.5).
6.72
6.73
6.74 E(Y(n)) = nθ/(n + 1), V(Y(n)) = nθ²/[(n + 1)²(n + 2)].
6.75
Following Ex. 6.74, the required probability is P(Y(n) < 10) = (10/15)⁵ = .1317.
6.76 a. The density of Y(k) is
g(k)(y) = [n!/((k − 1)!(n − k)!)] (y/θ)^{k−1}(1 − y/θ)^{n−k}(1/θ), 0 ≤ y ≤ θ.
b. E(Y(k)) = [n!/((k − 1)!(n − k)!)] ∫₀^θ y (y/θ)^{k−1}(1 − y/θ)^{n−k}(1/θ) dy. To evaluate this integral, apply the transformation z = y/θ and relate the resulting integral to that of a beta density with α = k + 1 and β = n − k + 1. Thus, E(Y(k)) = kθ/(n + 1).
c. Using the same techniques as in part b, it can be shown that E(Y(k)²) = k(k + 1)θ²/[(n + 1)(n + 2)], so that V(Y(k)) = k(n − k + 1)θ²/[(n + 1)²(n + 2)].
d. E(Y(k) − Y(k−1)) = E(Y(k)) − E(Y(k−1)) = kθ/(n + 1) − (k − 1)θ/(n + 1) = θ/(n + 1). Note that this is constant for all k, so that the expected order statistics are equally spaced.
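The formula E(Y(k)) = kθ/(n + 1) from Ex. 6.76(b) can be checked by a seeded simulation (an added sketch; n = 5, k = 2, θ = 1 are arbitrary illustrative values giving E(Y(2)) = 1/3):

```python
import random

n, k, theta = 5, 2, 1.0
rng = random.Random(7)
reps = 100_000
total = 0.0
for _ in range(reps):
    # k-th smallest of n uniforms on (0, theta)
    sample = sorted(rng.uniform(0, theta) for _ in range(n))
    total += sample[k - 1]
emp = total / reps
assert abs(emp - k * theta / (n + 1)) < 0.005
```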
6.77 a. Using Theorem 6.5, the joint density of Y(j) and Y(k) is given by
g(j)(k)(yj, yk) = [n!/((j − 1)!(k − 1 − j)!(n − k)!)] (yj/θ)^{j−1}((yk − yj)/θ)^{k−1−j}(1 − yk/θ)^{n−k}(1/θ²), 0 ≤ yj ≤ yk ≤ θ.
b. Cov(Y(j), Y(k)) = E(Y(j)Y(k)) − E(Y(j))E(Y(k)). The expectations E(Y(j)) and E(Y(k)) were derived in Ex. 6.76. To find E(Y(j)Y(k)), let u = yj/θ and v = yk/θ and write
E(Y(j)Y(k)) = cθ² ∫₀^1 ∫₀^v u^j (v − u)^{k−1−j} v (1 − v)^{n−k} du dv,
where c = n!/[(j − 1)!(k − 1 − j)!(n − k)!]. Substituting w = u/v, this becomes
cθ² ∫₀^1 v^{k+1}(1 − v)^{n−k} dv ∫₀^1 w^j(1 − w)^{k−1−j} dw = cθ² [B(k + 2, n − k + 1)][B(j + 1, k − j)].
Simplifying, this is j(k + 1)θ²/[(n + 1)(n + 2)]. Thus,
Cov(Y(j), Y(k)) = j(k + 1)θ²/[(n + 1)(n + 2)] − jkθ²/(n + 1)² = j(n − k + 1)θ²/[(n + 1)²(n + 2)].
c. V(Y(k) − Y(j)) = V(Y(k)) + V(Y(j)) − 2Cov(Y(j), Y(k))
= k(n − k + 1)θ²/[(n + 1)²(n + 2)] + j(n − j + 1)θ²/[(n + 1)²(n + 2)] − 2j(n − k + 1)θ²/[(n + 1)²(n + 2)]
= (k − j)(n − k + j + 1)θ²/[(n + 1)²(n + 2)].
6.78
6.79 The joint density of Y(1) and Y(n) is given by (see Ex. 6.77 with j = 1, k = n)
g(1)(n)(y1, yn) = n(n − 1)(1/θ)^n (yn − y1)^{n−2}, 0 ≤ y1 ≤ yn ≤ θ.
Applying the transformation U = Y(1)/Y(n) and V = Y(n), we have that y1 = uv, yn = v, and the Jacobian of transformation is v. Thus,
f(u, v) = n(n − 1)(1/θ)^n (v − uv)^{n−2} v = n(n − 1)(1/θ)^n (1 − u)^{n−2} v^{n−1}, 0 ≤ u ≤ 1, 0 ≤ v ≤ θ.
Since this joint density factors into separate functions of u and v and the support is rectangular, Y(1)/Y(n) and Y(n) are independent.
6.80
6.81 a. With f(y) = (1/θ)e^{−y/θ} and F(y) = 1 − e^{−y/θ}, y ≥ 0:
g(1)(y) = n[1 − F(y)]^{n−1} f(y) = n(1/θ)e^{−y/θ}(e^{−y/θ})^{n−1} = (n/θ) e^{−ny/θ}, y ≥ 0.
This is the exponential density with mean θ/n.
b. With n = 5 and θ = 2, Y(1) has an exponential distribution with mean .4. Thus,
P(Y(1) ≤ 3.6) = 1 − e^{−9} = .99988.
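Both claims in Ex. 6.81(b) can be checked numerically (an added sketch; it assumes five independent exponentials with mean 2, as in the exercise):

```python
import math
import random

# analytic: P(Y(1) <= 3.6) = 1 - e^{-3.6/0.4} = 1 - e^{-9}
assert abs((1 - math.exp(-9)) - 0.99988) < 1e-4

# seeded simulation: the mean of the minimum should be theta/n = 0.4
rng = random.Random(3)
reps = 100_000
total = sum(min(rng.expovariate(1 / 2) for _ in range(5)) for _ in range(reps))
assert abs(total / reps - 0.4) < 0.01
```

Note that `random.expovariate` takes the rate, so mean 2 corresponds to rate 1/2.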
6.82
Note that the distribution function for the largest order statistic is
n
n
G( n ) ( y ) = [F ( y )] = 1 e y / , y 0.
6.83
6.84 G(1)(y) = 1 − [1 − F(y)]^n = 1 − [e^{−y^m/α}]^n = 1 − e^{−ny^m/α}, y > 0.
This is the Weibull distribution function with shape parameter m and scale parameter α/n.
6.85 Using Theorem 6.5, the joint density of Y(1) and Y(2) is given by
g(1)(2)(y1, y2) = 2, 0 ≤ y1 ≤ y2 ≤ 1.
The required probability is then ∫∫ 2 dy2 dy1 over the region where y2 ≤ 2y1, which equals .5.
6.86 a. g(k)(y) = [n!/((k − 1)!(n − k)!)] (1 − e^{−y/θ})^{k−1}(e^{−y/θ})^{n−k}(1/θ)e^{−y/θ}, y ≥ 0.
b. g(j)(k)(yj, yk) = [n!/((j − 1)!(k − 1 − j)!(n − k)!)] (1 − e^{−yj/θ})^{j−1}(e^{−yj/θ} − e^{−yk/θ})^{k−1−j}(e^{−yk/θ})^{n−k}(1/θ²)e^{−(yj+yk)/θ}, 0 ≤ yj ≤ yk < ∞.
6.87 For this problem, we need the distribution of Y(1) (similar to Ex. 6.72). The distribution function of Y is
F(y) = P(Y ≤ y) = ∫₄^y (1/2)e^{−(1/2)(t−4)} dt = 1 − e^{−(1/2)(y−4)}, y ≥ 4.
a. g(1)(y) = 2[1 − F(y)]f(y) = 2·(1/2)e^{−(1/2)(y−4)}·e^{−(1/2)(y−4)} = e^{−(y−4)}, y ≥ 4.
b. E(Y(1)) = 5.
6.88 F(y) = P(Y ≤ y) = ∫_β^y e^{−(t−β)} dt = 1 − e^{−(y−β)}, y > β.
a. g(1)(y) = n e^{−(y−β)}[e^{−(y−β)}]^{n−1} = n e^{−n(y−β)}, y > β.
b. E(Y(1)) = 1/n + β.

6.89
Theorem 6.5 gives the joint density of Y(1) and Y(n) (also see Ex. 6.79):
g(1)(n)(y1, yn) = n(n − 1)(yn − y1)^{n−2}, 0 ≤ y1 ≤ yn ≤ 1.
Using the method of transformations, let R = Y(n) − Y(1) and S = Y(1). The inverse transformations are y1 = s and yn = r + s, and the Jacobian of transformation is 1. Thus, the joint density of R and S is given by
f(r, s) = n(n − 1)(r + s − s)^{n−2} = n(n − 1)r^{n−2}, 0 ≤ s ≤ 1 − r, 0 ≤ r ≤ 1.
(Note that since r = yn − y1, r ≤ 1 − y1, or equivalently r ≤ 1 − s and then s ≤ 1 − r.)
The marginal density of R is then
f_R(r) = ∫₀^{1−r} n(n − 1)r^{n−2} ds = n(n − 1)r^{n−2}(1 − r), 0 ≤ r ≤ 1.
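The range density just derived is a beta(n − 1, 2) density, so it integrates to 1 and has mean (n − 1)/(n + 1). A quick midpoint-rule check (added; n = 6 is an arbitrary test value):

```python
n = 6

def f(r):
    # density of the range of n uniforms on (0, 1)
    return n * (n - 1) * r ** (n - 2) * (1 - r)

m = 100_000
h = 1.0 / m
grid = [(i + 0.5) * h for i in range(m)]
total = sum(f(r) for r in grid) * h
mean = sum(r * f(r) for r in grid) * h

assert abs(total - 1.0) < 1e-6
assert abs(mean - (n - 1) / (n + 1)) < 1e-6
```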
6.90 Since the points on the interval (0, t) at which the calls occur are uniformly distributed, we have that F(w) = w/t, 0 ≤ w ≤ t.
a. The distribution of W(4) is G(4)(w) = [F(w)]⁴ = w⁴/t⁴, 0 ≤ w ≤ t. Thus, with t = 2, P(W(4) ≤ 1) = G(4)(1) = 1/16.
6.91 Using Theorem 6.5, the joint density of W(j−1) and W(j) is
g(w_{j−1}, w_j) = [n!/((j − 2)!(n − j)!)] (1 − e^{−w_{j−1}/θ})^{j−2}(e^{−w_j/θ})^{n−j}(1/θ²) e^{−(w_{j−1}+w_j)/θ}, 0 ≤ w_{j−1} ≤ w_j < ∞.
Define the random variables S = W(j−1) and T_j = W(j) − W(j−1). The inverse transformations are w_{j−1} = s and w_j = t_j + s, and the Jacobian of transformation is 1. Thus, the joint density of S and T_j is given by
f(s, t_j) = [n!/((j − 2)!(n − j)!)] (1 − e^{−s/θ})^{j−2}(e^{−(t_j+s)/θ})^{n−j}(1/θ²) e^{−(2s+t_j)/θ}
= [n!/((j − 2)!(n − j)!)] (1/θ²)(1 − e^{−s/θ})^{j−2} e^{−(n−j+2)s/θ} e^{−(n−j+1)t_j/θ}, s ≥ 0, t_j ≥ 0.
This factors into a function of s alone times [(n − j + 1)/θ] e^{−(n−j+1)t_j/θ}, so T_j has an exponential distribution with mean θ/(n − j + 1) and is independent of W(j−1). Then, since
Σ_{j=1}^r (n − j + 1)T_j = W(1) + W(2) + ··· + W(r−1) + (n − r + 1)W(r) = Σ_{j=1}^r W(j) + (n − r)W(r) = U_r,
we have E(U_r) = Σ_{j=1}^r (n − j + 1)E(T_j) = rθ.
6.92 By Theorem 6.3, U will have a normal distribution with mean (1/2)(μ − 3μ) = −μ and variance (1/4)(σ² + 9σ²) = 2.5σ².
6.93 With W = I²R, fix R = r; then i = √(w/r) and |di/dw| = 1/(2√(wr)). The joint density of R and W is
g(r, w) = 2r · 1/(2√(wr)) = √(r/w), w ≤ r ≤ 1,
so that f(w) = ∫_w^1 √(r/w) dr = (2/3)(1/√w − w), 0 ≤ w ≤ 1.
6.94 Note that Y1 and Y2 have identical gamma distributions with α = 2, β = 2. The mgf is
m(t) = (1 − 2t)^{−2}, t < 1/2.
The mgf for U = (Y1 + Y2)/2 is
m_U(t) = E(e^{tU}) = E(e^{t(Y1+Y2)/2}) = m(t/2)·m(t/2) = (1 − t)^{−4}.
This is the mgf for a gamma distribution with α = 4 and β = 1, so that is the distribution of U.
6.95 By independence, f(y1, y2) = 1, 0 ≤ y1 ≤ 1, 0 ≤ y2 ≤ 1.
a. Consider the joint distribution of U1 = Y1/Y2 and V = Y2. Fixing V at v, we can write U1 = Y1/v. Then, Y1 = vU1 and |dy1/du| = v, so the joint density of U1 and V is g(u, v) = v. The ranges of u and v are as follows: if y1 ≤ y2, then 0 ≤ u ≤ 1 and 0 ≤ v ≤ 1; if y1 > y2, then u > 1 and 0 ≤ v ≤ 1/u. Thus,
f_U1(u) = ∫₀^1 v dv = 1/2 for 0 ≤ u ≤ 1, and f_U1(u) = ∫₀^{1/u} v dv = 1/(2u²) for u > 1.
b. With U2 = −ln(Y1Y2) and V = Y2, we have y1 = e^{−u}/v, and the joint density of U2 and V is g(u, v) = e^{−u}/v, with −ln v ≤ u < ∞ and 0 ≤ v ≤ 1. Or, written another way, e^{−u} ≤ v ≤ 1.
So, the marginal distribution of U2 is given by
f_U2(u) = ∫_{e^{−u}}^1 (e^{−u}/v) dv = u e^{−u}, 0 ≤ u.
6.96 Note that P(Y1 > Y2) = P(Y1 − Y2 > 0). By Theorem 6.3, Y1 − Y2 has a normal distribution with mean 5 − 4 = 1 and variance 1 + 3 = 4. Thus,
P(Y1 − Y2 > 0) = P(Z > −1/2) = .6915.
6.97 The mass function for Y2 is:
y2:      0     1     2     3
p2(y2): .125  .375  .375  .125
Note that W = Y1 + Y2 is a random variable with support (0, 1, 2, 3, 4, 5, 6, 7). Using the hint given in the problem, the mass function for W is given by
p(0) = p1(0)p2(0) = .4096(.125) = .0512
p(1) = p1(0)p2(1) + p1(1)p2(0) = .4096(.375) + .4096(.125) = .2048
p(2) = p1(0)p2(2) + p1(2)p2(0) + p1(1)p2(1) = .4096(.375) + .1536(.125) + .4096(.375) = .3264
p(3) = p1(0)p2(3) + p1(3)p2(0) + p1(1)p2(2) + p1(2)p2(1) = .4096(.125) + .0256(.125) + .4096(.375) + .1536(.375) = .2656
p(4) = p1(1)p2(3) + p1(3)p2(1) + p1(2)p2(2) + p1(4)p2(0) = .4096(.125) + .0256(.375) + .1536(.375) + .0016(.125) = .1186
p(5) = p1(2)p2(3) + p1(3)p2(2) + p1(4)p2(1) = .1536(.125) + .0256(.375) + .0016(.375) = .0294
p(6) = p1(4)p2(2) + p1(3)p2(3) = .0016(.375) + .0256(.125) = .0038
p(7) = p1(4)p2(3) = .0016(.125) = .0002
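The convolution table of Ex. 6.97 can be generated directly from the two binomial mass functions (an added sketch; the p1 values correspond to binomial(4, .2) and the p2 values to binomial(3, .5)):

```python
from math import comb

def binom_pmf(k, n, p):
    # binomial mass function C(n, k) p^k (1-p)^(n-k)
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

p1 = [binom_pmf(k, 4, 0.2) for k in range(5)]
p2 = [binom_pmf(k, 3, 0.5) for k in range(4)]

# p(w) = sum over i of p1(i) * p2(w - i)
pw = [sum(p1[i] * p2[w - i] for i in range(5) if 0 <= w - i <= 3)
      for w in range(8)]

assert abs(sum(pw) - 1.0) < 1e-12
assert abs(pw[0] - 0.0512) < 1e-4
assert abs(pw[2] - 0.3264) < 1e-4
assert abs(pw[3] - 0.2656) < 1e-4
```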
6.98 Let U1 = Y1/(Y1 + Y2) and U2 = Y2. The inverse transformations are y1 = u1u2/(1 − u1) and y2 = u2, so the Jacobian of transformation is
J = det[[u2/(1 − u1)², u1/(1 − u1)], [0, 1]] = u2/(1 − u1)².
With Y1 and Y2 independent exponentials with mean 1, the joint density of U1 and U2 is
f(u1, u2) = e^{−u2/(1−u1)} u2/(1 − u1)², 0 ≤ u1 < 1, u2 > 0,
and f_U1(u1) = ∫₀^∞ e^{−u2/(1−u1)} u2/(1 − u1)² du2 = 1, 0 ≤ u1 ≤ 1: U1 is uniform on (0, 1).
6.100 Recall that by Ex. 6.81, Y(1) is exponential with mean 15/5 = 3.
a. P(Y(1) > 9) = e3.
b. P(Y(1) < 12) = 1 e4.
6.101 If we let (A, B) = (−1, 1) and T = 0, the density function for X, the landing point, is
f(x) = 1/2, −1 < x < 1.
We must find the distribution of U = |X|. Therefore,
F_U(u) = P(U ≤ u) = P(|X| ≤ u) = P(−u ≤ X ≤ u) = [u − (−u)]/2 = u, 0 ≤ u ≤ 1.
6.103 f_U1(u1) = (1/2π) ∫_{−∞}^∞ |u2| e^{−[u2²(1+u1²)]/2} du2 = 2·(1/2π) ∫₀^∞ u2 e^{−[u2²(1+u1²)]/2} du2 = 1/[π(1 + u1²)], −∞ < u1 < ∞.
The last expression above comes from noting the integrand is related to an exponential density with mean 2/(1 + u1²). The distribution of U1 is called the Cauchy distribution.
6.104 a. The event {Y1 = Y2} occurs if
{(Y1 = 1, Y2 = 1), (Y1 = 2, Y2 = 2), (Y1 = 3, Y2 = 3), …}.
So, since the probability mass function for the geometric is given by p(y) = p(1 − p)^{y−1}, we can find the probability of this event by
P(Y1 = Y2) = p(1)² + p(2)² + p(3)² + ··· = p² + p²(1 − p)² + p²(1 − p)⁴ + ···
= p² Σ_{j=0}^∞ (1 − p)^{2j} = p²/[1 − (1 − p)²] = p/(2 − p).
b. P(U = u) = P(Y1 − Y2 = u) = Σ_{y2=1}^∞ P(Y1 = u + y2) P(Y2 = y2) = Σ_{y2=1}^∞ p(1 − p)^{u+y2−1} p(1 − p)^{y2−1}
= p²(1 − p)^u Σ_{y2=1}^∞ (1 − p)^{2(y2−1)} = p(1 − p)^u/(2 − p), for u ≥ 0.
If u < 0, proceed similarly with y1 = y2 + u to obtain P(U = u) = p(1 − p)^{−u}/(2 − p). These two results can be combined to yield
p_U(u) = P(U = u) = p(1 − p)^{|u|}/(2 − p), u = 0, ±1, ±2, ….
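The combined mass function p(1 − p)^{|u|}/(2 − p) for the difference of two independent geometrics can be verified by brute force (an added sketch; p = 0.3 is an arbitrary test value, and the double sum is truncated where the geometric tail is negligible):

```python
p = 0.3
q = 1 - p

def pmf_u(u):
    # closed form: P(Y1 - Y2 = u) = p * q**|u| / (2 - p)
    return p * q ** abs(u) / (2 - p)

def pmf_u_direct(u, N=400):
    # direct double sum over the (truncated) joint geometric support
    total = 0.0
    for y1 in range(1, N):
        for y2 in range(1, N):
            if y1 - y2 == u:
                total += p * q ** (y1 - 1) * p * q ** (y2 - 1)
    return total

for u in (-3, -1, 0, 2, 5):
    assert abs(pmf_u(u) - pmf_u_direct(u)) < 1e-10
```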
6.107 The density function for Y is f(y) = 1/4, −1 ≤ y ≤ 3. For U = Y², the density function for U is given by
f_U(u) = (1/(2√u))[f(√u) + f(−√u)].
For 0 ≤ u ≤ 1, both √u and −√u lie in (−1, 3), so f_U(u) = (1/(2√u))(1/4 + 1/4) = 1/(4√u).
For 1 < u ≤ 9, only √u lies in (−1, 3), so f_U(u) = (1/(2√u))(1/4 + 0) = 1/(8√u).
6.108 The system will operate provided that C1 and C2 function and C3 or C4 functions. That is, defining the system as S and using set notation, we have
S = (C1 ∩ C2) ∩ (C3 ∪ C4) = (C1 ∩ C2 ∩ C3) ∪ (C1 ∩ C2 ∩ C4).
At some y, the probability that a component is operational is given by 1 − F(y). Since the components are independent, we have
P(S) = P(C1 ∩ C2 ∩ C3) + P(C1 ∩ C2 ∩ C4) − P(C1 ∩ C2 ∩ C3 ∩ C4).
Therefore, the reliability of the system is given by
2[1 − F(y)]³ − [1 − F(y)]⁴.
P(1/3 ≤ U ≤ 2/3) = ∫_{1/3}^{2/3} 20y³(1 − y) dy = .4156.
f_Y(y) = (1/y) f_U(ln y) = (ln y)³ e^{−(ln y)/10}/(y Γ(4)·10⁴), y ≥ 1.

6.111 a. Let U = ln Y. So, |du/dy| = 1/y and f_Y(y) = (1/y) f_U(ln y).
b. E(Y) = E(e^U) = m_U(1) = e^{μ+σ²/2}, where m_U(t) denotes the mgf for U.
c. E(Y²) = E(e^{2U}) = m_U(2) = e^{2μ+2σ²}, so V(Y) = e^{2μ+2σ²} − e^{2μ+σ²} = e^{2μ+σ²}(e^{σ²} − 1).
6.112 a. As in Ex. 6.111,
f_Y(y) = (1/y) f_U(ln y) = [1/(y Γ(α)β^α)] (ln y)^{α−1} e^{−(ln y)/β}, y ≥ 1.
b. Similar to Ex. 6.111: E(Y) = E(e^U) = m_U(1) = (1 − β)^{−α}, β < 1, where m_U(t) denotes the mgf for U.
c. E(Y²) = E(e^{2U}) = m_U(2) = (1 − 2β)^{−α}, β < .5, so that V(Y) = (1 − 2β)^{−α} − (1 − β)^{−2α}.
6.113 a. The inverse transformations are y1 = u1/u2 and y2 = u2, so that the Jacobian of transformation is 1/|u2|. Thus, the joint density of U1 and U2 is given by
f_{U1,U2}(u1, u2) = f_{Y1,Y2}(u1/u2, u2) · 1/|u2|.
b. The marginal density is found using standard techniques.
c. If Y1 and Y2 are independent, the joint density will factor into the product of the marginals, and this is applied to part b above.
6.114 The volume of the sphere is V = (4/3)πR³, so R = (3V/(4π))^{1/3} and
|dr/dv| = (1/3)(3/(4π))^{1/3} v^{−2/3}.
Thus, f_V(v) = (2/3)(3/(4π))^{2/3} v^{−1/3}, 0 ≤ v ≤ 4π/3.
6.115 a. Let R = the distance from a randomly chosen point to the nearest particle. Therefore,
P(R > r) = P(no particles in the sphere of radius r) = P(Y = 0 for volume (4/3)πr³).
Since Y = # of particles in a volume v has a Poisson distribution with mean λv, we have
P(R > r) = P(Y = 0) = e^{−λ(4π/3)r³}, r > 0.
Therefore, the distribution function for R is F(r) = 1 − P(R > r) = 1 − e^{−λ(4π/3)r³}, and the density function is
f(r) = F′(r) = 4λπr² e^{−λ(4π/3)r³}, r > 0.
b. Let U = R³. Then, R = U^{1/3} and |dr/du| = (1/3)u^{−2/3}. Thus,
f_U(u) = (4λπ/3) e^{−(4λπ/3)u}, u > 0.
This is the exponential density with mean 3/(4λπ).
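Both conclusions of Ex. 6.115 can be verified numerically (an added sketch; λ = 0.5 is an arbitrary test value for the particle intensity):

```python
import math

lam = 0.5

def f(r):
    # density of the nearest-particle distance
    return 4 * lam * math.pi * r ** 2 * math.exp(-lam * (4 * math.pi / 3) * r ** 3)

# midpoint rule on (0, 5); the exponent at r = 5 is astronomically negative
n = 100_000
h = 5.0 / n
total = sum(f((i + 0.5) * h) for i in range(n)) * h
assert abs(total - 1.0) < 1e-5

# E(U) = E(R^3) should equal 3/(4*lam*pi), the exponential mean
mean_u = sum(((i + 0.5) * h) ** 3 * f((i + 0.5) * h) for i in range(n)) * h
assert abs(mean_u - 3 / (4 * lam * math.pi)) < 1e-5
```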