
Chapter 4

Problems

1.

P{X = 4} = C(4,2)/C(14,2) = 6/91

P{X = 2} = C(4,1)C(2,1)/C(14,2) = 8/91

P{X = 1} = C(4,1)C(8,1)/C(14,2) = 32/91

P{X = 0} = C(2,2)/C(14,2) = 1/91

P{X = -1} = C(2,1)C(8,1)/C(14,2) = 16/91

P{X = -2} = C(8,2)/C(14,2) = 28/91
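The following Python sketch is an addition (not part of the original solution); it enumerates all C(14,2) = 91 equally likely pairs from an urn holding 8 white, 4 black, and 2 orange balls, scoring +2 per black and -1 per white ball drawn as in the problem statement, and reproduces the probabilities above.

from itertools import combinations
from collections import Counter
from fractions import Fraction

balls = ['W'] * 8 + ['B'] * 4 + ['O'] * 2      # 8 white, 4 black, 2 orange
payoff = {'W': -1, 'B': 2, 'O': 0}             # winnings per ball drawn

counts = Counter(payoff[a] + payoff[b] for a, b in combinations(balls, 2))
total = sum(counts.values())                   # C(14, 2) = 91 equally likely pairs
for x in sorted(counts, reverse=True):
    print(x, Fraction(counts[x], total))       # 4: 6/91, 2: 8/91, 1: 32/91, ...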

2.

p(1) = 1/36, p(2) = 2/36, p(3) = 2/36, p(4) = 3/36, p(5) = 2/36, p(6) = 4/36
p(7) = 0, p(8) = 2/36, p(9) = 1/36, p(10) = 2/36, p(11) = 0, p(12) = 4/36
p(15) = 2/36, p(16) = 1/36, p(18) = 2/36, p(20) = 2/36, p(24) = 2/36
p(25) = 1/36, p(30) = 2/36, p(36) = 1/36
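As a cross-check (an addition, assuming, as the listed values indicate, that X is the product of two fair dice), the table can be generated by enumerating the 36 equally likely rolls:

from collections import Counter

counts = Counter(a * b for a in range(1, 7) for b in range(1, 7))
for v in sorted(counts):
    print(f"p({v}) = {counts[v]}/36")          # matches the table above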

4.

P{X = 1} = 1/2, P{X = 2} = (5·5)/(10·9) = 5/18, P{X = 3} = (5·4·5)/(10·9·8) = 5/36,
P{X = 4} = (5·4·3·5)/(10·9·8·7) = 10/168, P{X = 5} = (5·4·3·2·5)/(10·9·8·7·6) = 5/252,
P{X = 6} = (5·4·3·2·1)/(10·9·8·7·6) = 1/252

5.

n - 2i, i = 0, 1, ..., n

6.

P{X = -3} = 1/8, P{X = -1} = 3/8, P{X = 1} = 3/8, P{X = 3} = 1/8

8.

(a) p(6) = 1 - (5/6)^2 = 11/36, p(5) = 2(1/6)(4/6) + (1/6)^2 = 9/36
p(4) = 2(1/6)(3/6) + (1/6)^2 = 7/36, p(3) = 2(1/6)(2/6) + (1/6)^2 = 5/36
p(2) = 2(1/6)(1/6) + (1/6)^2 = 3/36, p(1) = 1/36
(d) p(5) = 1/36, p(4) = 2/36, p(3) = 3/36, p(2) = 4/36, p(1) = 5/36
p(0) = 6/36, p(-j) = p(j), j > 0


11.

(a) P{divisible by 3} = 333/1000, P{divisible by 7} = 142/1000,
P{divisible by 15} = 66/1000, P{divisible by 105} = 9/1000
In the limit, these probabilities converge to 1/3, 1/7, 1/15, and 1/105.

(b) P{μ(N) ≠ 0} = P{N is not divisible by p_i^2 for any i ≥ 1}
= Π_i P{N is not divisible by p_i^2}
= Π_i (1 - 1/p_i^2) = 6/π^2

13.

p(0) = P{no sale on first and no sale on second}


= (.7)(.4) = .28
p(500) = P{1 sale and it is for standard}
= P{1 sale}/2
=[P{sale, no sale} + P{no sale, sale}]/2
= [(.3)(.4) + (.7)(.6)]/2 = .27
p(1000) = P{2 standard sales} + P{1 sale for deluxe}
= (.3)(.6)(1/4) + P{1 sale}/2
= .045 + .27 = .315
p(1500) = P{2 sales, one deluxe and one standard}
= (.3)(.6)(1/2) = .09
p(2000) = P{2 sales, both deluxe} = (.3)(.6)(1/4) = .045

14.

P{X = 0} = P{1 loses to 2} = 1/2


P{X = 1} = P{of 1, 2, 3: 3 has largest, then 1, then 2}
= (1/3)(1/2) = 1/6
P{X = 2} = P{of 1, 2, 3, 4: 4 has largest and 1 has next largest}
= (1/4)(1/3) = 1/12
P{X = 3} = P{of 1, 2, 3, 4, 5: 5 has largest then 1}
= (1/5)(1/4) = 1/20
P{X = 4} = P{1 has largest} = 1/5

15.

P{X = 1} = 11/66

P{X = 2} = Σ_{j≠1} [(12 - j)/66][11/(54 + j)]

P{X = 3} = Σ_{j≠1} Σ_{k≠1,j} [(12 - j)/66][(12 - k)/(54 + j)][11/(42 + j + k)]

P{X = 4} = 1 - Σ_{i=1}^{3} P{X = i}

16.

P{Y1 = i} = (12 - i)/66

P{Y2 = i} = Σ_{j≠i} [(12 - j)/66][(12 - i)/(54 + j)]

P{Y3 = i} = Σ_{j≠i} Σ_{k≠i,j} [(12 - j)/66][(12 - k)/(54 + j)][(12 - i)/(42 + j + k)]

All sums go from 1 to 11, except for the prohibited values.
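A short Python sketch (an addition, not part of the original solution) that evaluates these sums exactly; it assumes the lottery model used above: 66 balls, team i holding 12 - i of them, and winning balls drawn without replacement.

from fractions import Fraction

balls = {i: 12 - i for i in range(1, 12)}   # team i holds 12 - i of the 66 balls

def pick_prob(order):
    """Probability that the first len(order) picks go, in order, to the listed teams."""
    remaining, p = 66, Fraction(1)
    for team in order:
        p *= Fraction(balls[team], remaining)
        remaining -= balls[team]
    return p

teams = range(1, 12)
P1 = pick_prob([1])                                        # P{X = 1} = 11/66 = 1/6
P2 = sum(pick_prob([j, 1]) for j in teams if j != 1)
P3 = sum(pick_prob([j, k, 1]) for j in teams for k in teams
         if len({1, j, k}) == 3)
P4 = 1 - P1 - P2 - P3
print(P1, P2, P3, P4)                                      # the four sum to 1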


20.

(a) P{X > 0} = P{win first bet} + P{lose, win, win}
= 18/38 + (20/38)(18/38)^2 ≈ .5918
(b) No, because if the gambler wins then he or she wins $1, whereas a loss is either $1 or $3.
(c) E[X] = 1[18/38 + (20/38)(18/38)^2] - 1[2(20/38)^2(18/38)] - 3(20/38)^3 ≈ -.108
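A small Python check (added, not part of the original solution) of the strategy's distribution, assuming the bet-$1-on-red scheme described in the problem: quit after a first-spin win, otherwise bet on the next two spins.

from fractions import Fraction
from itertools import product

p_red = Fraction(18, 38)

dist = {}
dist[1] = p_red                                  # win the first spin: quit with +1
for s2, s3 in product([1, -1], repeat=2):        # lose the first spin, then bet twice more
    x = -1 + s2 + s3
    pr = (1 - p_red) * (p_red if s2 == 1 else 1 - p_red) * (p_red if s3 == 1 else 1 - p_red)
    dist[x] = dist.get(x, Fraction(0)) + pr

print(float(sum(p for x, p in dist.items() if x > 0)))     # ≈ 0.5918
print(float(sum(x * p for x, p in dist.items())))          # ≈ -0.108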

21.

(a) E[X] > E[Y] because, whereas the selected bus driver is equally likely to be from any of the 4 buses, the selected student is more likely to have come from a bus carrying a large number of students.
(b) P{X = i} = i/148, i = 40, 33, 25, 50
E[X] = [(40)^2 + (33)^2 + (25)^2 + (50)^2]/148 ≈ 39.28
E[Y] = (40 + 33 + 25 + 50)/4 = 37

22.

Let N denote the number of games played.

(a) E[N] = 2[p^2 + (1 - p)^2] + 3[2p(1 - p)] = 2 + 2p(1 - p)
The final equality could also have been obtained by using that N = 2 + I, where I is 0 if two games are played and 1 if three are played. Differentiation yields

d/dp E[N] = 2 - 4p

and so the maximum occurs when 2 - 4p = 0, or p = 1/2.

(b) E[N] = 3[p^3 + (1 - p)^3] + 4[3p^2(1 - p)p + 3p(1 - p)^2(1 - p)] + 5[6p^2(1 - p)^2]
= 6p^4 - 12p^3 + 3p^2 + 3p + 3
Differentiation yields

d/dp E[N] = 24p^3 - 36p^2 + 6p + 3

Its value at p = 1/2 is easily seen to be 0.
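A numeric check (added, not part of the original solution): the two expected-length formulas above, evaluated on a grid of p values, are largest at p = 1/2.

def e_games_best_of_3(p):
    q = 1 - p
    return 2 * (p**2 + q**2) + 3 * (2 * p * q)

def e_games_best_of_5(p):
    q = 1 - p
    return (3 * (p**3 + q**3)
            + 4 * (3 * p**3 * q + 3 * p * q**3)
            + 5 * (6 * p**2 * q**2))

grid = [i / 100 for i in range(101)]
print(max(grid, key=e_games_best_of_3))   # 0.5
print(max(grid, key=e_games_best_of_5))   # 0.5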


23.

(a) Use all of your money to buy 500 ounces of the commodity and then sell after one week. The expected amount of money you will get is

E[money] = (1/2)(500) + (1/2)(2000) = 1250

(b) Do not buy immediately; use your money to buy after one week. Then

E[ounces of commodity] = (1/2)(1000) + (1/2)(250) = 625

24.

(a) p - (3/4)(1 - p) = (7/4)p - 3/4

(b) -(3/4)p + 2(1 - p) = 2 - (11/4)p
Setting (7/4)p - 3/4 = 2 - (11/4)p gives p = 11/18, with maximin value 23/72.

(c) q - (3/4)(1 - q)

(d) -(3/4)q + 2(1 - q); minimax value = 23/72, attained when q = 11/18

25.

(a) P(X = 1) = .6(.3) + .4(.7) = .46

(b) E[X] = 1(.46) + 2(.42) = 1.3

27.

C - Ap = A/10, so C = A(p + 1/10)

28.

3(4/20) = 3/5

29.

If check 1, then (if desired) 2: Expected Cost = C1 + (1 - p)C2 + pR1 + (1 - p)R2;
if check 2, then 1: Expected Cost = C2 + pC1 + pR1 + (1 - p)R2, so checking 1 first is best if
C1 + (1 - p)C2 ≤ C2 + pC1, or C1 ≤ C2 p/(1 - p)


30.

E[X] = Σ_{n=1}^{∞} 2^n (1/2)^n = ∞

(a) probably not


(b) yes, if you could play an arbitrarily large number of games
31.

E[score] = p*[1 - (1 - p)^2] + (1 - p*)(1 - p^2)

d/dp E[score] = 2(1 - p)p* - 2p(1 - p*) = 0 ⇒ p = p*

32.

If T is the number of tests needed for a group of 10 people, then


E[T] = (.9)^10 + 11[1 - (.9)^10] = 11 - 10(.9)^10
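A one-line numeric check (added) of this expected value, under the problem's assumption that each of the 10 people is independently negative with probability .9:

p_all_negative = 0.9 ** 10
expected_tests = 1 * p_all_negative + 11 * (1 - p_all_negative)
print(expected_tests, 11 - 10 * 0.9 ** 10)     # both ≈ 7.513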

35.

If X is the amount that you win, then


P{X = 1.10} = 4/9 = 1 - P{X = -1}
E[X] = (1.1)(4/9) - 5/9 = -.6/9 ≈ -.067
Var(X) = (1.1)^2(4/9) + 5/9 - (.6/9)^2 ≈ 1.089

36.

Using the representation

N = 2 + I

where I is 0 if the first two games are won by the same team and 1 otherwise, we have that

Var(N) = Var(I) = E[I^2] - E^2[I]

Now, E[I^2] = E[I] = P{I = 1} = 2p(1 - p), and so

Var(N) = 2p(1 - p)[1 - 2p(1 - p)] = 8p^3 - 4p^4 - 6p^2 + 2p

Differentiation yields

d/dp Var(N) = 24p^2 - 16p^3 - 12p + 2

and it is easy to verify that this is equal to 0 when p = 1/2.

37.

E[X^2] = [(40)^3 + (33)^3 + (25)^3 + (50)^3]/148 ≈ 1625.4
Var(X) = E[X^2] - (E[X])^2 ≈ 82.2
E[Y^2] = [(40)^2 + (33)^2 + (25)^2 + (50)^2]/4 = 1453.5, Var(Y) = 84.5

38.

(a) E[(2 + X)^2] = Var(2 + X) + (E[2 + X])^2 = Var(X) + 9 = 14
(b) Var(4 + 3X) = 9 Var(X) = 45

39.

C(4,2)(1/2)^4 = 3/8

40.

Σ_{i=7}^{10} C(10, i)(1/2)^10

41.

C(5,4)(1/3)^4(2/3) + (1/3)^5 = 11/243

42.

(a) Because each question will, independently, be answered correctly by both A and B with probability .28, the mean number is 10(.28) = 2.8.
(b) Because each question will, independently, be answered correctly by either A or B with probability 1 - .18 = .82, the number of questions so answered is binomial with parameters n = 10, p = .82, so its variance is np(1 - p) = 1.476.

43.

C(5,3)(.2)^3(.8)^2 + C(5,4)(.2)^4(.8) + (.2)^5

44.

α Σ_{i=k}^{n} C(n, i) p1^i (1 - p1)^(n-i) + (1 - α) Σ_{i=k}^{n} C(n, i) p2^i (1 - p2)^(n-i)

45.

with 3: P{pass} = (1/3)[C(3,2)(.8)^2(.2) + (.8)^3] + (2/3)[C(3,2)(.4)^2(.6) + (.4)^3] = .533

with 5: P{pass} = (1/3) Σ_{i=3}^{5} C(5, i)(.8)^i(.2)^(5-i) + (2/3) Σ_{i=3}^{5} C(5, i)(.4)^i(.6)^(5-i) ≈ .526
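A quick numerical check (added, not part of the original solution); the weights 1/3 (on day) and 2/3 (off day) and the per-examiner pass probabilities .8 and .4 follow the reconstruction above.

from math import comb

def p_pass(n_examiners, p_correct):
    """P(a majority of the examiners pass the student), each passing independently."""
    need = n_examiners // 2 + 1
    return sum(comb(n_examiners, i) * p_correct**i * (1 - p_correct)**(n_examiners - i)
               for i in range(need, n_examiners + 1))

for n in (3, 5):
    total = (1/3) * p_pass(n, .8) + (2/3) * p_pass(n, .4)
    print(n, round(total, 4))    # 3 -> 0.5333, 5 -> 0.5257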

46.

Let C be the event that the jury is correct, and let G be the event that the defendant is guilty. Then

P(C) = P(C | G)P(G) + P(C | G^c)P(G^c)
= Σ_{i=0}^{3} C(12, i)(.2)^i(.8)^(12-i)(.65) + Σ_{i=0}^{8} C(12, i)(.1)^i(.9)^(12-i)(.35)

Let CV be the event that the defendant is convicted. Then

P(CV) = P(CV | G)P(G) + P(CV | G^c)P(G^c)
= Σ_{i=0}^{3} C(12, i)(.2)^i(.8)^(12-i)(.65) + Σ_{i=9}^{12} C(12, i)(.1)^i(.9)^(12-i)(.35)

47.

(a) and (b):
(i) Σ_{i=4}^{7} C(7, i) p^i (1 - p)^(7-i)
(ii) Σ_{i=5}^{8} C(8, i) p^i (1 - p)^(8-i)
(iii) Σ_{i=5}^{9} C(9, i) p^i (1 - p)^(9-i)
where p = .7 in (a) and p = .3 in (b).
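A short Python sketch (added, not part of the original solution) evaluating the two probabilities in Problem 46, assuming, as in the reconstruction above, 12 jurors, at least 9 votes needed to convict, error probabilities .2 (guilty defendant) and .1 (innocent defendant), and 65% of defendants guilty.

from math import comb

def binom_tail(n, p, lo, hi):
    # P(lo <= Bin(n, p) <= hi)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(lo, hi + 1))

# i counts the jurors who vote incorrectly (guilty case) or who vote guilty (innocent case)
p_correct = binom_tail(12, .2, 0, 3) * .65 + binom_tail(12, .1, 0, 8) * .35
p_convict = binom_tail(12, .2, 0, 3) * .65 + binom_tail(12, .1, 9, 12) * .35
print(round(p_correct, 4), round(p_convict, 4))   # ≈ 0.8665 and ≈ 0.5165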

48.

The probability that a package will be returned is p = 1 - (.99)^10 - 10(.99)^9(.01). Hence, if someone buys 3 packages, the probability that exactly 1 is returned is 3p(1 - p)^2.

49.

(a) (1/2) C(10,7)(.4)^7(.6)^3 + (1/2) C(10,7)(.7)^7(.3)^3

(b) [(1/2) C(9,6)(.4)^7(.6)^3 + (1/2) C(9,6)(.7)^7(.3)^3] / .55
where .55 = (1/2)(.4) + (1/2)(.7)

50.

(a) P{H, T, T | 6 heads} = P{H, T, T and 6 heads}/P{6 heads}
= P{H, T, T}P{6 heads | H, T, T}/P{6 heads}
= pq^2 C(7,5) p^5 q^2 / [C(10,6) p^6 q^4]
= C(7,5)/C(10,6) = 1/10

(b) P{T, H, T | 6 heads} = P{T, H, T and 6 heads}/P{6 heads}
= P{T, H, T}P{6 heads | T, H, T}/P{6 heads}
= q^2 p C(7,5) p^5 q^2 / [C(10,6) p^6 q^4]
= C(7,5)/C(10,6) = 1/10


51.

(a) e^{-.2}
(b) 1 - e^{-.2} - .2e^{-.2} = 1 - 1.2e^{-.2}
Since each letter has a small probability of being a typo, the number of errors should approximately have a Poisson distribution.

52.

(a) 1 - e^{-3.5} - 3.5e^{-3.5} = 1 - 4.5e^{-3.5}
(b) 4.5e^{-3.5}
Since each flight has a small probability of crashing it seems reasonable to suppose that
the number of crashes is approximately Poisson distributed.

53.

(a) The probability that an arbitrary couple were both born on April 30 is, assuming independence and an equal chance of having been born on any given date, (1/365)^2. Hence, the number of such couples is approximately Poisson with mean 80,000/(365)^2 ≈ .6. Therefore, the probability that at least one pair were both born on this date is approximately 1 - e^{-.6}.
(b) The probability that an arbitrary couple were born on the same day of the year is 1/365. Hence, the number of such couples is approximately Poisson with mean 80,000/365 ≈ 219.18. Hence, the probability of at least one such pair is 1 - e^{-219.18} ≈ 1.

54.

(a) e^{-2.2}
(b) 1 - e^{-2.2} - 2.2e^{-2.2} = 1 - 3.2e^{-2.2}

55.

(1/2)e^{-3} + (1/2)e^{-4.2}

56.

The number of people in a random collection of size n that have the same birthday as yourself is approximately Poisson distributed with mean n/365. Hence, the probability that at least one person has the same birthday as you is approximately 1 - e^{-n/365}. Now, e^{-x} = 1/2 when x = log 2. Thus, 1 - e^{-n/365} ≥ 1/2 when n/365 ≥ log 2. That is, there must be at least 365 log 2 (≈ 253) people.
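A tiny numeric check (added) of the threshold in Problem 56:

import math

n = math.ceil(365 * math.log(2))     # smallest n with n/365 >= log 2
print(n, 1 - math.exp(-n / 365))     # 253, probability ≈ 0.50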
57.

(a) 1 - e^{-3} - 3e^{-3} - (3^2/2)e^{-3} = 1 - (17/2)e^{-3}

(b) P{X ≥ 3 | X ≥ 1} = P{X ≥ 3}/P{X ≥ 1} = [1 - (17/2)e^{-3}]/(1 - e^{-3})

59.

(a) 1 - e^{-1/2}
(b) (1/2)e^{-1/2}
(c) 1 - e^{-1/2} - (1/2)e^{-1/2}


60.

P{beneficial | 2} = P{2 | beneficial}(3/4) / [P{2 | beneficial}(3/4) + P{2 | not beneficial}(1/4)]

= [e^{-3}(3^2/2)(3/4)] / [e^{-3}(3^2/2)(3/4) + e^{-5}(5^2/2)(1/4)]

61.

1 - e^{-1.4} - 1.4e^{-1.4}

62.

For i ≠ j, say that trial pair (i, j) is a success if the same outcome occurs on trials i and j. Then (i, j) is a success with probability Σ_{k=1}^{n} p_k^2. By the Poisson paradigm, the number of trial pairs that result in successes will approximately have a Poisson distribution with mean

Σ_{i<j} Σ_{k=1}^{n} p_k^2 = n(n - 1) Σ_{k=1}^{n} p_k^2 / 2

and so the probability that none of the trial pairs results in a success is approximately

exp{-n(n - 1) Σ_{k=1}^{n} p_k^2 / 2}

63.

(a) e^{-2.5}
(b) 1 - e^{-2.5} - 2.5e^{-2.5} - [(2.5)^2/2!]e^{-2.5} - [(2.5)^3/3!]e^{-2.5}

64.

(a) 1 - Σ_{i=0}^{7} e^{-4} 4^i/i! ≡ p
(b) 1 - (1 - p)^12 - 12p(1 - p)^11
(c) (1 - p)^{i-1} p
65.

(a) 1 - e^{-1/2}
(b) P{X ≥ 2 | X ≥ 1} = [1 - e^{-1/2} - (1/2)e^{-1/2}]/(1 - e^{-1/2})
(c) 1 - e^{-1/2}
(d) 1 - exp{-(500 - i)/1000}
66.

Assume n > 1.
(a) 2/(2n - 1)
(b) 2/(2n - 2)
(c) exp{-2n/(2n - 1)} ≈ e^{-1}


67.

Assume n > 1.
(a) 2/n
(b) Conditioning on whether the man of couple j sits next to the woman of couple i gives the result:

[1/(n - 1)][1/(n - 1)] + [(n - 2)/(n - 1)][2/(n - 1)] = (2n - 3)/(n - 1)^2

(c) e^{-2}

68.

exp{-10e^{-5}}

69.

With P_j equal to the probability that 4 consecutive heads occur within j flips of a fair coin, P_1 = P_2 = P_3 = 0, and
P_4 = 1/16
P_5 = (1/2)P_4 + 1/16 = 3/32
P_6 = (1/2)P_5 + (1/4)P_4 + 1/16 = 1/8
P_7 = (1/2)P_6 + (1/4)P_5 + (1/8)P_4 + 1/16 = 5/32
P_8 = (1/2)P_7 + (1/4)P_6 + (1/8)P_5 + (1/16)P_4 + 1/16 = 6/32
P_9 = (1/2)P_8 + (1/4)P_7 + (1/8)P_6 + (1/16)P_5 + 1/16 = 111/512
P_10 = (1/2)P_9 + (1/4)P_8 + (1/8)P_7 + (1/16)P_6 + 1/16 = 251/1024 ≈ .2451
The Poisson approximation gives
P_10 ≈ 1 - exp{-6/32 - 1/16} = 1 - e^{-.25} ≈ .2212
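A short Python sketch (added) that reproduces this recursion with exact fractions and compares P_10 with the Poisson approximation.

from fractions import Fraction
import math

P = {j: Fraction(0) for j in range(0, 4)}   # P_0 = P_1 = P_2 = P_3 = 0
for j in range(4, 11):
    # Condition on the flip where the first tail occurs (flip m, probability (1/2)^m);
    # if the first four flips are all heads (probability 1/16), the run has already occurred.
    P[j] = sum(Fraction(1, 2 ** m) * P[j - m] for m in range(1, 5)) + Fraction(1, 16)

print(P[10], float(P[10]))                  # 251/1024 ≈ 0.2451
print(1 - math.exp(-(6 / 32 + 1 / 16)))     # Poisson approximation ≈ 0.2212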

70.

e^{-λt} + (1 - e^{-λt})p

71.

(a) 26/38

(b) (26/38)(12/38)

72.

P{wins in i games} = C(i - 1, 3)(.6)^4(.4)^(i-4)

73.

Let N be the number of games played. Then


P{N = 4} = 2(1/2)^4 = 1/8
P{N = 5} = 2 C(4,1)(1/2)(1/2)^4 = 1/4
P{N = 6} = 2 C(5,2)(1/2)^2(1/2)^4 = 5/16
P{N = 7} = 5/16
E[N] = 4/8 + 5/4 + 30/16 + 35/16 = 93/16 = 5.8125


74.

(a) 2/3

(b) C(8,5)(2/3)^5(1/3)^3 + C(8,6)(2/3)^6(1/3)^2 + C(8,7)(2/3)^7(1/3) + (2/3)^8

(c) C(5,4)(2/3)^5(1/3)

(d) C(6,4)(2/3)^5(1/3)^2

76.

C(N1 + N2 - k, N1)(1/2)^(N1+N2-k)(1/2) + C(N1 + N2 - k, N2)(1/2)^(N1+N2-k)(1/2)

77.

2 C(2N - k, N)(1/2)^(2N-k)

2 C(2N - k - 1, N - 1)(1/2)^(2N-k-1)(1/2)

79.

(a) P{X = 0} = C(94,10)/C(100,10)

(b) P{X > 2} = 1 - [C(94,10) + C(94,9)C(6,1) + C(94,8)C(6,2)]/C(100,10)

80.

P{rejected | 1 defective} = 3/10

P{rejected | 4 defective} = 1 - C(6,3)/C(10,3) = 5/6

P{4 defective | rejected} = [(5/6)(3/10)] / [(5/6)(3/10) + (3/10)(7/10)] = 75/138

81.

P{rejected} = 1 - (.9)^4

83.

Let X_i be the number of accidents that occur on highway i. Then

E[X_1 + X_2 + X_3] = E[X_1] + E[X_2] + E[X_3] = 1.5

84.

Let X_i equal 1 if box i does not have any balls, and let it equal 0 otherwise. Then

E[Σ_{i=1}^{5} X_i] = Σ_{i=1}^{5} E[X_i] = Σ_{i=1}^{5} P(X_i = 1) = Σ_{i=1}^{5} (1 - p_i)^10

Let Y_i equal 1 if box i has exactly one ball, and let it equal 0 otherwise. Then

E[Σ_{i=1}^{5} Y_i] = Σ_{i=1}^{5} E[Y_i] = Σ_{i=1}^{5} P(Y_i = 1) = Σ_{i=1}^{5} 10 p_i (1 - p_i)^9

where we used that the number of balls that go into box i is binomial with parameters 10 and p_i.
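A small Python sketch (added; the box probabilities p below are placeholder values, since the problem leaves the p_i general) comparing the indicator-based formulas with a simulation.

import random

p = [0.1, 0.15, 0.2, 0.25, 0.3]      # hypothetical box probabilities, summing to 1
n_balls = 10

exp_empty = sum((1 - pi) ** n_balls for pi in p)
exp_single = sum(n_balls * pi * (1 - pi) ** (n_balls - 1) for pi in p)

trials = 20_000
tot_empty = tot_single = 0
for _ in range(trials):
    counts = [0] * len(p)
    for _ in range(n_balls):
        counts[random.choices(range(len(p)), weights=p)[0]] += 1
    tot_empty += counts.count(0)
    tot_single += counts.count(1)

print(exp_empty, tot_empty / trials)       # expected empty boxes: formula vs simulation
print(exp_single, tot_single / trials)     # expected one-ball boxes: formula vs simulation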
85.

Let X_i equal 1 if there is at least one type i coupon in the set of n coupons. Then

E[Σ_{i=1}^{k} X_i] = Σ_{i=1}^{k} E[X_i] = Σ_{i=1}^{k} P(X_i = 1) = Σ_{i=1}^{k} [1 - (1 - p_i)^n]


Theoretical Exercises
1.

Let E_i = {no type i in the first n selections}

P{T > n} = P(∪_{i=1}^{N} E_i)
= Σ_i (1 - P_i)^n - Σ_{i<j} (1 - P_i - P_j)^n + Σ_{i<j<k} (1 - P_i - P_j - P_k)^n - ...

P{T = n} = P{T > n - 1} - P{T > n}


2.

Not true. Suppose P{X = b} = ε > 0 and b_n = b + 1/n. Then lim_{b_n → b} P{X ≤ b_n} = P{X ≤ b} ≠ P{X < b}.


3.

When α > 0,

P{αX + β ≤ x} = P{X ≤ (x - β)/α} = F((x - β)/α)

When α < 0,

P{αX + β ≤ x} = P{X ≥ (x - β)/α} = 1 - P{X < (x - β)/α} = 1 - lim_{h↓0} F((x - β)/α - h)
4.

(a) Using the hint yields

n 1 n ( n 1)( n 2)

n(n 1) n(n 2)
1

n 1

n 1

1
1 1 1 1

n 1 n n 1 ( n 1) 2 n 1 n n 1 n 2

(b) E [ X ] 4

1 1
1 1 / 4
2 2

(n 1)(n 2)
1

n 1

1
1
4

4(1 / 2) 2
n 1 n 1 n 1 n 2

(c) E [ X ( X 1)] 4

1
Hence, E [ X 2 ] E [ X ( X 1)] E [ X ] .
n 1 n 2

5.

Because a_i will be part of the sum in all terms having j ≥ i, it follows that

Σ_{j≥1} (a_1 + ... + a_j) P{N = j} = Σ_{i≥1} a_i (P{N = i} + P{N = i + 1} + ...) = Σ_{i≥1} a_i P{N ≥ i}

Letting a_j = 1, j ≥ 1, gives

E[N] = Σ_{i≥1} P{N ≥ i}

Letting a_j = j, j ≥ 1, gives

E[N(N + 1)/2] = Σ_{i≥1} i P{N ≥ i}

6.

E[c^X] = cp + c^{-1}(1 - p)
Hence, 1 = E[c^X] if
cp + c^{-1}(1 - p) = 1
or, equivalently,
pc^2 - c + 1 - p = 0
or
(pc - 1 + p)(c - 1) = 0
Thus, c = (1 - p)/p.

7.

E[Y] = E[(X - μ)/σ] = (E[X] - μ)/σ = (μ - μ)/σ = 0

Var(Y) = (1/σ)^2 Var(X) = σ^2/σ^2 = 1

8.

Let I equal 1 if X = a and let it equal 0 if X = b. Then

X = b + (a - b)I

yielding the result

Var(X) = (b - a)^2 Var(I) = (b - a)^2 p(1 - p)


9.

1 = Σ_{i=0}^{n} P(X = i) = Σ_{i=0}^{n} C(n, i) p^i (1 - p)^(n-i)

If x and y are positive numbers, then letting p = x/(x + y) gives

Σ_{i=0}^{n} C(n, i) [x/(x + y)]^i [y/(x + y)]^(n-i) = 1

or, upon multiplying both sides by (x + y)^n, that

(x + y)^n = Σ_{i=0}^{n} C(n, i) x^i y^(n-i)

10.

E[1/(X + 1)] = Σ_{i=0}^{n} [1/(i + 1)] n!/[(n - i)! i!] p^i (1 - p)^(n-i)

= Σ_{i=0}^{n} n!/[(n - i)!(i + 1)!] p^i (1 - p)^(n-i)

= [1/((n + 1)p)] Σ_{i=0}^{n} (n + 1)!/[(n - i)!(i + 1)!] p^(i+1) (1 - p)^(n-i)

= [1/((n + 1)p)] Σ_{j=1}^{n+1} C(n + 1, j) p^j (1 - p)^(n+1-j)

= [1/((n + 1)p)][1 - (1 - p)^(n+1)]

11.

For any given arrangement of k successes and n - k failures:

P{arrangement | total of k successes} = P{arrangement}/P{k successes}
= p^k (1 - p)^(n-k) / [C(n, k) p^k (1 - p)^(n-k)] = 1/C(n, k)

12.

Condition on the number of functioning components and then use the results of Example 4c
of Chapter 1:
Prob = Σ_{i=0}^{n} C(n, i) p^i (1 - p)^(n-i) C(i + 1, n - i)/C(n, i)

where C(i + 1, n - i) = 0 if n - i > i + 1. We are using the results of Exercise 11.

13.

Easiest to first take the log and then determine the p that maximizes log P{X = k}:

log P{X = k} = log C(n, k) + k log p + (n - k) log(1 - p)

d/dp log P{X = k} = k/p - (n - k)/(1 - p) = 0 ⇒ p = k/n maximizes

14.

(a) 1 - Σ_{n≥1} αp^n = 1 - αp/(1 - p)

(b) Condition on the number of children. For k > 0,

P{k boys} = Σ_{n≥k} P{k | n children} αp^n = Σ_{n≥k} C(n, k)(1/2)^n αp^n

P{0 boys} = 1 - αp/(1 - p) + Σ_{n≥1} (1/2)^n αp^n

17.

(a) If X is binomial (n, p) then, from Exercise 15,

P{X is even} = [1 + (1 - 2p)^n]/2 = [1 + (1 - 2λ/n)^n]/2 when λ = np
→ (1 + e^{-2λ})/2 as n approaches infinity

(b) P{X is even} = Σ_n e^{-λ} λ^{2n}/(2n)! = e^{-λ}(e^{λ} + e^{-λ})/2

18.

log P{X = k} = -λ + k log λ - log(k!)

d/dλ log P{X = k} = -1 + k/λ = 0 ⇒ λ = k


19.

E[X^n] = Σ_{i≥0} i^n e^{-λ} λ^i/i!
= Σ_{i≥1} i^{n-1} e^{-λ} λ^i/(i - 1)!
= λ Σ_{j≥0} (j + 1)^{n-1} e^{-λ} λ^j/j!
= λ E[(X + 1)^{n-1}]

Hence,

E[X^3] = λ E[(X + 1)^2]
= λ Σ_{i≥0} (i + 1)^2 e^{-λ} λ^i/i!
= λ [Σ_i i^2 e^{-λ} λ^i/i! + 2 Σ_i i e^{-λ} λ^i/i! + Σ_i e^{-λ} λ^i/i!]
= λ(E[X^2] + 2E[X] + 1)
= λ(Var(X) + E^2[X] + 2E[X] + 1)
= λ(λ + λ^2 + 2λ + 1) = λ(λ^2 + 3λ + 1)

20.

Let S denote the number of heads that occur when all n coins are tossed, and note that S has a distribution that is approximately that of a Poisson random variable with mean λ. Then, because X is distributed as the conditional distribution of S given that S > 0,

P{X = 1} = P{S = 1 | S > 0} = P{S = 1}/P{S > 0} = λe^{-λ}/(1 - e^{-λ})

21.

(i) 1/365
(ii) 1/365
(iii) 1. The events, though independent in pairs, are not independent.

22.

(i) Say that trial i is a success if the ith pair selected have the same number. When n is large, trials 1, ..., k are roughly independent.
(ii) Since P{trial i is a success} = 1/(2n - 1), it follows that, when n is large, M_k is approximately Poisson distributed with mean k/(2n - 1). Hence,
P{M_k = 0} ≈ exp[-k/(2n - 1)]
(iii) and (iv) P{T > n} = P{M_n = 0} ≈ exp[-n/(2n - 1)] ≈ e^{-1/2}


23.

(a) P(E_i) = 1 - Σ_{j=0}^{2} C(365, j)(1/365)^j (364/365)^(365-j)

(b) exp{-365 P(E_1)}
24.

(a) There will be a string of k consecutive heads within the first n trials either if there is one within the first n - 1 trials, or if the first such string occurs at trial n; the latter case is equivalent to the conditions of case 2.
(b) Because cases 1 and 2 are mutually exclusive,

P_n = P_{n-1} + (1 - P_{n-k-1})(1 - p)p^k

25.

P{m counted} = Σ_{n≥m} P{m counted | n events} e^{-λ} λ^n/n!

= Σ_{n≥m} C(n, m) p^m (1 - p)^(n-m) e^{-λ} λ^n/n!

= e^{-λ} (λp)^m/m! Σ_{n≥m} [λ(1 - p)]^(n-m)/(n - m)!

= e^{-λ} (λp)^m/m! e^{λ(1-p)}

= e^{-λp} (λp)^m/m!

Intuitively, the Poisson random variable arises as the approximate number of successes in n (large) independent trials each having a small success probability λ/n. Now if each successful trial is counted with probability p, then the number counted is binomial with parameters n (large) and pλ/n (small), which is approximately Poisson with parameter n(pλ/n) = λp.
27.

P{X = n + k | X > n} = P{X = n + k}/P{X > n} = p(1 - p)^(n+k-1)/(1 - p)^n = p(1 - p)^(k-1)

If the first n trials are all failures, then it is as if we are beginning anew at that time.
28.

The events {X > n} and {Y < r} are both equivalent to the event that there are fewer than r successes in the first n trials; hence, they are the same event.

29.

P{X = k + 1}/P{X = k} = [C(Np, k + 1) C(N - Np, n - k - 1)] / [C(Np, k) C(N - Np, n - k)]
= (Np - k)(n - k)/[(k + 1)(N - Np - n + k + 1)]

30.

P{Y = j} = C(j - 1, n - 1)/C(N, n), n ≤ j ≤ N

E[Y] = Σ_{j=n}^{N} j C(j - 1, n - 1)/C(N, n)
= [n/C(N, n)] Σ_{j=n}^{N} C(j, n)
= n C(N + 1, n + 1)/C(N, n)
= n(N + 1)/(n + 1)

31.

Let Y denote the largest of the remaining m chips. By exercise 28


P{Y = j} = C(j - 1, m - 1)/C(m + n, m), m ≤ j ≤ n + m

Now, X = n + m - Y and so

P{X = i} = P{Y = m + n - i} = C(m + n - i - 1, m - 1)/C(m + n, m), i ≤ n

32.

P{X = k} = [(k - 1)/n] Π_{i=0}^{k-2} (n - i)/n, k > 1

34.

E[X] = Σ_{k=0}^{n} k C(n, k)/(2^n - 1) = n 2^{n-1}/(2^n - 1)

E[X^2] = Σ_{k=0}^{n} k^2 C(n, k)/(2^n - 1) = n(n + 1) 2^{n-2}/(2^n - 1)

Var(X) = E[X^2] - (E[X])^2 = n(n + 1) 2^{n-2}/(2^n - 1) - n^2 2^{2n-2}/(2^n - 1)^2

E[Y] ≈ (n + 1)/2, E[Y^2] ≈ (1/n) ∫_0^n x^2 dx = n^2/3, Var(Y) ≈ n^2/3 - n^2/4 = n^2/12

35.

(a) P{X > i} = (1/2)(2/3)(3/4)···(i/(i + 1)) = 1/(i + 1)

(b) P{X < ∞} = lim_{i→∞} P{X ≤ i} = lim_{i→∞} [1 - 1/(i + 1)] = 1

(c) E[X] = Σ_i i P{X = i} = Σ_i i [P{X > i - 1} - P{X > i}] = Σ_i 1/(i + 1) = ∞

36.

(a) This follows because {X + Y = z_k} = ∪_{(i,j)∈A_k} {X = x_i, Y = y_j}, and the events {X = x_i, Y = y_j}, (i, j) ∈ A_k, are mutually exclusive.

(b) E[X + Y] = Σ_k z_k P{X + Y = z_k}
= Σ_k z_k Σ_{(i,j)∈A_k} P{X = x_i, Y = y_j}
= Σ_k Σ_{(i,j)∈A_k} z_k P{X = x_i, Y = y_j}
= Σ_k Σ_{(i,j)∈A_k} (x_i + y_j) P{X = x_i, Y = y_j}

(c) This follows from (b) because every pair i, j is in exactly one of the sets A_k.

(d) This follows because {X = x_i} = ∪_j {X = x_i, Y = y_j}

(e) From the preceding,

E[X + Y] = Σ_{i,j} (x_i + y_j) P{X = x_i, Y = y_j}
= Σ_{i,j} x_i P{X = x_i, Y = y_j} + Σ_{i,j} y_j P{X = x_i, Y = y_j}
= Σ_i x_i P{X = x_i} + Σ_j y_j P{Y = y_j}
= E[X] + E[Y]
