
Part 1

(7.3)

4. We’ll define two events:


E = {an orange ball is chosen}
F = {a ball is picked from the second box}

There’s a 1 in 2 chance of selecting either box:


P(F) = P(F̄) = 1/2
Picking an orange ball from the box, given it is the second box:
P(E|F) = 5/11
Picking an orange ball from the first box:
P(E|F̄) = 3/7

P(E) = P(E|F)*P(F) + P(E|F̄)*P(F̄)

Using Bayes' Theorem:


The probability that she picked a ball from the second box, given that the ball is orange:
P(F|E) = ( P(E|F)*P(F) ) / P(E)
= (5/11)*(1/2) / ( (5/11)*(1/2) + (3/7)*(1/2) )
= 35/68 ≈ .5147
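This calculation can be double-checked exactly with Python's fractions module (a quick sketch, not part of the assigned solution):

```python
from fractions import Fraction

# Probabilities from the problem statement
p_F = Fraction(1, 2)             # P(F): the ball came from the second box
p_E_given_F = Fraction(5, 11)    # P(E|F): orange, given the second box
p_E_given_notF = Fraction(3, 7)  # P(E|F̄): orange, given the first box

# Law of total probability: P(E)
p_E = p_E_given_F * p_F + p_E_given_notF * (1 - p_F)

# Bayes' Theorem: P(F|E)
p_F_given_E = p_E_given_F * p_F / p_E
print(p_F_given_E)  # 35/68
```

Using exact fractions avoids any rounding until the final decimal conversion.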

6. A = {the soccer player tests positive in a drug test}


B = {the soccer player is using steroids}
5% of players take steroids:
P(B) = .05
P(B̄) = .95
98% of players taking steroids test positive in a drug test:
P(A|B) = .98
12% of players not taking steroids also test positive (a false positive):
P(A|B̄) = .12

P(A) = P(B)*P(A|B) + P(B̄)*P(A|B̄) = .05*.98 + .95*.12 = .163

The probability that a player who tests positive is actually taking steroids:


P(B|A) = P(B)*P(A|B) / P(A)
= .05*.98 / .163 ≈ .3006
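The same arithmetic can be sketched in a few lines of Python as a sanity check:

```python
# Base rate and test characteristics from the problem
p_steroids = 0.05             # P(B)
p_pos_given_steroids = 0.98   # P(A|B), true positive rate
p_pos_given_clean = 0.12      # P(A|B̄), false positive rate

# Law of total probability: P(A), the chance of a positive test
p_pos = p_steroids * p_pos_given_steroids + (1 - p_steroids) * p_pos_given_clean

# Bayes' Theorem: P(B|A), steroids given a positive test
p_steroids_given_pos = p_steroids * p_pos_given_steroids / p_pos
print(round(p_pos, 3))                 # 0.163
print(round(p_steroids_given_pos, 4))  # 0.3006
```

Note how the low base rate (5%) drags the posterior down to about 30% even with a 98% true-positive rate.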

16. a. Bayes' Theorem for this problem is:


P(car|late) = P(car)*P(late|car) / ( P(car)*P(late|car) + P(bus)*P(late|bus) +
P(bicycle)*P(late|bicycle) )

Since all 3 modes of transportation are equally likely to be chosen,


P(car|late) = (1/3)(.5) / ( (1/3)(.5) + (1/3)(.2) + (1/3)(.05) ) = .5/.75
The probability that Ramesh drove to work in a car, given that he was late that day, is
2/3, or about .667.

b. The same formula applies, but the three modes of transportation are no longer equally
likely; each now has its own prior probability:

With the new prior probabilities, the probability that Ramesh drove to work in a car is:
P(car|late) = (.3*.5) / ( .3*.5 + .1*.2 + .6*.05 ) = .15/.20 = 3/4, or .75

Part 2
(7.4)

6. There is 1 chance in C(50,6) of picking the winning numbers. The total prize money is 10 million
dollars, so the expected value of a $1 ticket is
( 10000000 / C(50,6) ) - 1 ≈ -.37
Basically, every time you buy a ticket, you can expect to lose about 37 cents.
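A short check of the arithmetic, using math.comb for C(50,6):

```python
from math import comb

# Expected value of a $1 lottery ticket: win $10,000,000 with
# probability 1/C(50,6), and the $1 cost is paid either way.
p_win = 1 / comb(50, 6)  # comb(50, 6) = 15,890,700
expected_value = 10_000_000 * p_win - 1
print(round(expected_value, 2))  # -0.37
```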

8. On a single (fair) die, each face has an equal chance of landing face up, or 1/6.
For a single die, the expected outcome is:
1*(1/6) + 2*(1/6) + 3*(1/6) + 4*(1/6) + 5*(1/6) + 6*(1/6) = 21/6 = 3.5
For three (independent and fair) dice, the expected outcome is 3.5*3 = 10.5
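The single-die expectation and the linearity-of-expectation step can be verified exactly:

```python
from fractions import Fraction

# Expected value of one fair die: each face 1..6 with probability 1/6
one_die = sum(Fraction(face, 6) for face in range(1, 7))
print(one_die)         # 7/2
print(float(one_die))  # 3.5

# By linearity of expectation, three independent dice sum to 3 * 3.5
print(float(3 * one_die))  # 10.5
```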

12. a. The chance of rolling a 6 is 1/6, and the chance of not rolling a 6 is 5/6. If we roll the die n
times and get a 6 on the last roll, then we rolled the die n−1 times before without a 6
landing face up.
Thus, the probability is (5/6)^(n−1) * (1/6)

b. To determine the expected number of times we roll the die, we sum, over every possible
number of rolls n from 1 to infinity, the value n times the probability that the first 6 appears
on roll n:

E = ∑ (n=1 to ∞) n*(5/6)^(n−1)*(1/6) = (1/6) ∑ (n=1 to ∞) n*(5/6)^(n−1)

Using the fact that the infinite sum ∑ (n=1 to ∞) n*x^(n−1) = 1/(1−x)^2 when |x| < 1 (which
also guarantees convergence here, since x = 5/6), we can rewrite the sum as

(1/6) * ( 1 / (1 − 5/6)^2 ) = (1/6) * 36 = 6

The expected number of times we roll the die is 6.
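As a sanity check, a simple Monte Carlo simulation should land near 6 (a sketch; the seed and trial count here are arbitrary choices):

```python
import random

random.seed(42)

def rolls_until_six():
    """Roll a fair die until a 6 comes up; return how many rolls it took."""
    count = 0
    while True:
        count += 1
        if random.randint(1, 6) == 6:
            return count

# Monte Carlo estimate of the expected number of rolls
trials = 100_000
average = sum(rolls_until_six() for _ in range(trials)) / trials
print(round(average, 1))  # close to 6
```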

16. X counts the number of heads, Y counts the number of tails

For two random variables X and Y to be independent, P(X=x and Y=y) = P(X=x)*P(Y=y) must
hold for every pair of values x and y.

We can see that P(X=0 and Y=0) = 0. Basically, after two coin flips, it is impossible to have 0
heads and 0 tails total.
Notice that P(X=0) = 1/4 and P(Y=0) = 1/4, so P(X=0)*P(Y=0) is actually 1/16.
This means that P(X=0 and Y=0) is not equal to P(X=0)*P(Y=0).
Therefore, X and Y are dependent.
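The joint and marginal probabilities can be enumerated directly over the four equally likely outcomes of two coin flips:

```python
from itertools import product
from fractions import Fraction

# All outcomes of two fair coin flips, each with probability 1/4
outcomes = list(product("HT", repeat=2))
quarter = Fraction(1, 4)

# X = number of heads, Y = number of tails
p_x0 = sum(quarter for o in outcomes if o.count("H") == 0)  # P(X=0)
p_y0 = sum(quarter for o in outcomes if o.count("T") == 0)  # P(Y=0)
p_both0 = sum(quarter for o in outcomes
              if o.count("H") == 0 and o.count("T") == 0)   # P(X=0 and Y=0)

print(p_x0, p_y0)   # 1/4 1/4
print(p_both0)      # 0
print(p_x0 * p_y0)  # 1/16
```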

24. IA is the indicator random variable of the event A


IA = 1 if A occurs
= 0 if A doesn’t occur

Note that P(IA = 1) = P(A) and P(IA = 0) = P(Ā)


The expectation of this indicator is:
E(IA) = 1*P(A) + 0*P(Ā) = P(A).
Thus, the expectation of IA is equal to the probability of A occurring.
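A small simulation illustrates E(IA) = P(A); the event A used here (a fair die showing more than 4, so P(A) = 1/3) is just a made-up concrete example:

```python
import random

random.seed(0)

# Hypothetical event A: a fair die shows a number greater than 4, so P(A) = 1/3
def indicator_A():
    """Indicator random variable: 1 if A occurs, 0 otherwise."""
    return 1 if random.randint(1, 6) > 4 else 0

# The sample mean of the indicator estimates E(IA) = P(A)
trials = 100_000
average = sum(indicator_A() for _ in range(trials)) / trials
print(round(average, 2))  # close to 1/3
```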

28. The variance of the number of successes in n Bernoulli trials is n*p*q. Here n = 10, p = 1/6
and q = 5/6 . Therefore the variance is 50/36 = 25/18.
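The n*p*q formula evaluates exactly with fractions:

```python
from fractions import Fraction

# Variance of the number of successes in n Bernoulli trials is n*p*q
n = 10
p = Fraction(1, 6)  # probability of success on each trial
q = 1 - p           # probability of failure
variance = n * p * q
print(variance)  # 25/18
```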
