
Homework 2 Solutions.

Problem 1. Prof. Pollak is flying from LA to Paris with two plane changes, in New York and London.

The probability of losing a piece of luggage is the same, p, in LA, NY, and London. Having arrived in

Paris, Prof. Pollak discovers that his suitcase is lost. Calculate the conditional probabilities that the

suitcase was lost in LA, NY, and London, respectively.

Solution. There are four outcomes in the sample space of this experiment carried out by Prof. Pollak:

NL: Suitcase was not lost

LD: Suitcase was lost in London

NY: Suitcase was lost in New York

LA: Suitcase was lost in LA

They can be described by a sequential tree in Fig. 1.

[Figure 1: Sequential tree description for Problem 1. The leaf probabilities are P({LA}) = p, P({NY}) = (1 − p)p, P({LD}) = (1 − p)^2·p, and P({NL}) = (1 − p)^3; the first three leaves together form the event {suitcase was lost}.]

We can see that

P({suitcase was lost}) = 1 − P({suitcase was not lost}) = 1 − P({NL}) = 1 − (1 − p)^3 = p^3 − 3p^2 + 3p.

Since the event {suitcase was lost in LA} is a subset of the event {suitcase was lost}, the intersection

of these two events is {suitcase was lost in LA}. The same reasoning applies to the events {suitcase

was lost in NY} and {suitcase was lost in London}. Using the definition of conditional probability, we

therefore have that the conditional probabilities of having lost the suitcase in LA, NY, and London, are:

P({lost in LA | suitcase was lost}) = P({LA}) / P({suitcase was lost}) = p / (1 − (1 − p)^3) = 1 / (p^2 − 3p + 3);

P({lost in NY | suitcase was lost}) = P({NY}) / P({suitcase was lost}) = (1 − p)p / (1 − (1 − p)^3) = (1 − p) / (p^2 − 3p + 3);

P({lost in London | suitcase was lost}) = P({LD}) / P({suitcase was lost}) = (1 − p)^2·p / (1 − (1 − p)^3) = (1 − p)^2 / (p^2 − 3p + 3).

Problem 2. To encourage Elmer's promising tennis career, his father offers him a prize if he wins two tennis sets in a row in a three-set series to be played with his father and the club champion alternately: father-champion-father or champion-father-champion, according to Elmer's choice. The champion is a better player than Elmer's father. Which series should Elmer choose? (Assume that the results of the three tennis sets are independent.)

Solution. Suppose Elmer has probabilities f and c to win against his father and the champion, respectively. Since the champion is a stronger player, f > c. There are 2^3 = 8 outcomes in the sample space of all possible results of a three-set series:

Ω = {WWW, WWL, WLW, WLL, LWW, LWL, LLW, LLL}.

The event {win two sets in a row} = {WWW, WWL, LWW}. If Elmer chooses to play the father-champion-father series, the probability to win the prize is:

P({win two sets in a row}) = P({WWW}) + P({WWL}) + P({LWW}) = fcf + fc(1 − f) + (1 − f)cf = fc(2 − f).

Similarly, if he chooses to play champion-father-champion, the probability to win will be cf(2 − c), which is larger than fc(2 − f) because c < f. Therefore he must choose the champion-father-champion series.

Problem 3. The face EGH of the tetrahedron FEGH is painted in three colors: red, green, and blue. The face EFH is painted red, the face HFG is painted green, and the face GFE is painted blue. Define the following events:

Ar = {a face picked at random has red on it}
Ag = {a face picked at random has green on it}
Ab = {a face picked at random has blue on it}

Are Ar, Ag, Ab pairwise independent? Are they independent?

Solution. Since each of the three colors appears on two out of the four equally likely faces, we have: P(Ar) = P(Ag) = P(Ab) = 2/4 = 0.5.

To check for pairwise independence, consider pairwise intersections of the three events. The probability that a randomly picked face will have any two colors on it is the same as the probability that this face is EGH, because EGH has all three colors and it is the only face with more than one color. Therefore,

P(Ar ∩ Ag) = P({EGH}) = 1/4 = P(Ar)P(Ag),
P(Ar ∩ Ab) = P({EGH}) = 1/4 = P(Ar)P(Ab),
P(Ab ∩ Ag) = P({EGH}) = 1/4 = P(Ab)P(Ag).

Hence these events are pairwise independent. However, they are not independent because

P(Ar ∩ Ag ∩ Ab) = P({EGH}) = 1/4 ≠ P(Ar)P(Ag)P(Ab).

Problem 4. A box contains two fair coins and one biased coin. For the biased coin, the probability that any flip will result in a head is 1/3. Al draws two coins from the box at random, flips each of them once, observes one head and one tail, and returns the coins to the box. Bo then draws one coin from the box at random and flips it. The result is a tail. Determine the probability that neither Al nor Bo removed the biased coin from the box.

Solution. The two experiments performed by Al and Bo are independent. We can therefore calculate the probability that the biased coin is not removed by Al and the probability that it is not removed by Bo, and then multiply the two probabilities. Both individual probabilities are computed using Bayes' rule.

For Al's experiment, we need the conditional probability that both coins he picked were fair, given that he observed one head and one tail:

P({both coins are fair | H and T}) = P({H and T | both coins are fair})·P({both coins are fair}) / P({H and T}) = (1/2 · 1/3) / (1/2) = 1/3.   (1)

The explanation for these numbers is:

• There are three equally likely ways of choosing two coins out of a set of three. The probability that both Al's coins are fair is the probability that the biased coin is not chosen, which is 1/3, because each coin is equally likely not to be chosen.

• Regardless of which two coins Al picked, his probability to observe one H and one T is 1/2. If the coins are fair, then the probability that the first coin results in H and the second one in T is 1/4, and the probability that the first one results in T and the second one in H is also 1/4, producing a total probability of 1/4 + 1/4 = 1/2 to get one H and one T. If one of the coins is biased, the probability that it will result in H and the other coin will result in T is 1/3 · 1/2 = 1/6, and the probability that the biased coin will result in T and the other in H is 2/3 · 1/2 = 1/3. This also gives a total probability of 1/6 + 1/3 = 1/2 of observing one H and one T. Therefore, P({H and T | both coins are fair}) = P({H and T}) = 1/2.

Bo's probability is computed similarly: we need the conditional probability that his coin is fair, given that he flipped a tail.

P({fair | T}) = P({T | fair})P({fair}) / P({T})
= P({T | fair})P({fair}) / (P({T | fair})P({fair}) + P({T | biased})P({biased}))
= (1/2 · 2/3) / (1/2 · 2/3 + 2/3 · 1/3) = 3/5.   (2)

Multiplying the two probabilities computed in Eqs. (1) and (2), we get that the conditional probability that the biased coin was not removed, given the observations: 1/3 · 3/5 = 1/5.

Problem 5. Die A has five olive faces and one lavender face; die B has three faces of each of these colors. A fair coin is flipped once. If it falls heads, the game continues by throwing die A alone; if it falls tails, die B alone is used to continue the game. However awful their face colors may be, it is known that both dice are fair.

(a) Determine the probability that the n-th throw of the die results in olive.
(b) Determine the probability that both the n-th and (n + 1)-st throws of the die result in olive.
(c) If olive readings result from all the first n throws, determine the conditional probability of an olive outcome on the (n + 1)-st throw. Interpret your result for large values of n.

Solution. Let H = {coin falls heads}, T = {coin falls tails}, and On = {n-th throw of the die is olive}.

(a) By the total probability theorem, we have:

P(On) = P(On | H)P(H) + P(On | T)P(T).   (3)

Given that H happens, die A is being thrown and P(On | H) = 5/6. Otherwise T must occur and P(On | T) = 1/2, because die B is being thrown. Therefore it follows from Eq. (3) that

P(On) = 5/6 · 1/2 + 1/2 · 1/2 = 2/3.

(b) This is another application of the total probability theorem:

P(On ∩ On+1) = P(On ∩ On+1 | H)P(H) + P(On ∩ On+1 | T)P(T).   (4)

Given the result of the coin flip, a die is chosen and the following throws of the die are conditionally independent. This conditional independence means that

P(On ∩ On+1 | H) = P(On | H)P(On+1 | H) = (5/6)^2,
P(On ∩ On+1 | T) = P(On | T)P(On+1 | T) = (1/2)^2.

From Eq. (4), we have:

P(On ∩ On+1) = (5/6)^2 · 1/2 + (1/2)^2 · 1/2 = 17/36.

Note that it would be wrong to compute this probability by squaring the result from Part (a): while getting an olive on the n-th throw and an olive on the (n + 1)-st throw are two events that are conditionally independent, conditioned on our knowledge of which die we are throwing, these events are not unconditionally independent.

(c) We need to calculate the following conditional probability:

P(On+1 | O1 ∩ · · · ∩ On) = P(O1 ∩ · · · ∩ On+1) / P(O1 ∩ · · · ∩ On).   (5)

By the total probability theorem and conditional independence, for any m > 0 we have:

P(O1 ∩ · · · ∩ Om) = P(O1 ∩ · · · ∩ Om | H)P(H) + P(O1 ∩ · · · ∩ Om | T)P(T) = (5/6)^m · 1/2 + (1/2)^m · 1/2.

Substituting this into the right-hand side of Eq. (5), we get:

P(On+1 | O1 ∩ · · · ∩ On) = ((5/6)^(n+1) + (1/2)^(n+1)) / ((5/6)^n + (1/2)^n).

To interpret this for large values of n, we multiply both the numerator and the denominator by (6/5)^n and let n go to infinity:

lim_{n→∞} P(On+1 | O1 ∩ · · · ∩ On) = lim_{n→∞} (5/6 + 1/2 · (6/10)^n) / (1 + (6/10)^n) = 5/6.

This result makes sense because if we see a long sequence of olives, it is much more likely that we are throwing die A, which would make the probability that the next reading is olive 5/6. Indeed, if we do not know whether we are throwing die A or die B, an olive reading makes it more likely that we are throwing die A, and therefore it becomes more likely that the result of the next throw will also be olive.
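As a quick sanity check on Part (a), the coin-then-die experiment is easy to simulate. This is a rough Monte Carlo sketch, not part of the original solution; the function and variable names are my own.

```python
import random

# Simulate Problem 5: a fair coin picks die A (olive w.p. 5/6)
# or die B (olive w.p. 1/2); every throw then uses that same die.
def olive_throw(rng: random.Random, using_die_a: bool) -> bool:
    p_olive = 5 / 6 if using_die_a else 1 / 2
    return rng.random() < p_olive

rng = random.Random(2024)
trials = 200_000
olives = 0
for _ in range(trials):
    using_die_a = rng.random() < 1 / 2   # the fair coin flip
    if olive_throw(rng, using_die_a):    # the n-th throw (any fixed n)
        olives += 1

estimate = olives / trials
print(estimate)  # should be close to the exact answer 2/3
```

With 200,000 trials the standard error is about 0.001, so the estimate lands very near 2/3 rather than, say, (5/6 + 1/2)/2 computed with the wrong weights.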

Problem 6. Towns A and B can communicate as long as they are connected in the communication network by at least one path which contains only in-service links. Each || represents one communication link, and each link has a probability of 0.5 of being out of service. Link failures are independent. Determine, in an efficient manner, the probability that A and B can communicate.

Solution. As discussed in class, a convenient method of solving such problems is to break down the system into smaller subsystems and apply the rules for parallel and series connections, which are illustrated in Fig. 2. The pi's in the figure are the probabilities of failure of the corresponding links.

[Figure 2: Problem 6. (a) Connection in parallel. (b) Connection in series.]

For the parallel case, the probability of failure of the entire link is the probability that all the links fail, which is:

p = p1 · p2 · · · pn.

For the series case, in order for the whole link to be OK, all components have to be OK, so the probability that the entire link is OK is the probability that all the links are OK:

1 − p = (1 − p1)(1 − p2) · · · (1 − pn),

where p is the probability of failure of the entire link.
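The two rules can be written down as a pair of one-line helpers operating on failure probabilities. This is a sketch with names of my own choosing, using exact rational arithmetic:

```python
from fractions import Fraction
from math import prod

def parallel_failure(failure_probs):
    # A parallel connection fails only if every link fails.
    return prod(failure_probs)

def series_failure(failure_probs):
    # A series connection is OK only if every link is OK.
    return 1 - prod(1 - p for p in failure_probs)

half = Fraction(1, 2)
print(parallel_failure([half, half]))          # 1/4
print(series_failure([half, Fraction(1, 4)]))  # 5/8
```

The two printed values are exactly the equivalent-link failure probabilities used in the reduction steps that follow.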

Based on the two equations above, the original network in this problem can be reduced step by step, from (a) to (f), as illustrated in Fig. 3. Each number in the figure is the probability of failure of the corresponding link.

For example, in order to get from the original network depicted in Fig. 3(a) to the equivalent network of Fig. 3(b), we replace the two parallel connections of links enclosed in dashed boxes with equivalent links. The first one contains two parallel links whose probabilities of failure are 1/2, which means that the probability of failure of the equivalent link is 1/4. The second one contains three parallel links, resulting in the probability of failure (1/2)^3 = 1/8.

To go from Fig. 3(b) to Fig. 3(c), we replace the boxed series connection with an equivalent link. Since the probabilities of failure of the two links are 1/2 and 1/4, the probabilities of success are 1/2 and 3/4, which means that the probability of success of the equivalent link is 1/2 · 3/4 = 3/8, and its probability of failure is 1 − 3/8 = 5/8.

Proceeding to combine the links in a similar fashion (0.5 · 5/8 = 5/16 for the next parallel combination, 1 − (1 − 5/16)(1 − 0.5^3) = 51/128 for the next series combination, and 51/128 · 0.5 = 51/256 for the final parallel combination), we finally get to Fig. 3(f), which gives us the probability of failure of the entire link from A to B: 51/256.

[Figure 3: Reduction process of Problem 6.]

Therefore the probability of success is: 1 − 51/256 = 205/256 ≈ 0.8008.
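The reduction chain of Fig. 3 can be replayed with exact arithmetic. The sketch below simply transcribes the steps described above; the intermediate names are my own.

```python
from fractions import Fraction

half = Fraction(1, 2)

# (a)->(b): collapse the two dashed parallel blocks.
two_parallel = half * half   # failure 1/4
three_parallel = half ** 3   # failure 1/8

# (b)->(c): series of a 1/2 link with the 1/4 block.
series_block = 1 - (1 - half) * (1 - two_parallel)     # failure 5/8

# Parallel with another 1/2 link: failure probabilities multiply.
upper_branch = half * series_block                     # failure 5/16

# Series with the three-link parallel block.
chain = 1 - (1 - upper_branch) * (1 - three_parallel)  # failure 51/128

# Final parallel combination with the last 1/2 link.
total_failure = half * chain                           # failure 51/256
print(1 - total_failure)  # 205/256
```

Using Fraction instead of floats keeps every intermediate value identical to the ones in the figure, so any mistranscribed step shows up immediately.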

Problem 7. A hiker leaves the point O, choosing one of the roads OB, OC, OD, OE at random. At each subsequent crossroads he again chooses a road at random. What is the probability of the hiker arriving at the point A?

[Figure: road map for Problem 7, showing the crossroads O, B, C, D, E and the destination A.]

Solution. By the total probability theorem:

P(A) = P(A|B)P(B) + P(A|C)P(C) + P(A|D)P(D) + P(A|E)P(E)
= 1/3 · 1/4 + 1/2 · 1/4 + 1 · 1/4 + 2/5 · 1/4
= 10/120 + 15/120 + 30/120 + 12/120
= 67/120.

(Here, we denote the event that the hiker visits the point B, C, D, E, or A by the same letter as the point itself.)
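The same total-probability sum can be checked with exact fractions; the conditional probabilities below are the ones read off the road map in the solution above, and the dictionary layout is my own.

```python
from fractions import Fraction as F

# P(A | first crossroads), as used in the solution.
p_reach_a = {"B": F(1, 3), "C": F(1, 2), "D": F(1, 1), "E": F(2, 5)}
p_first_road = F(1, 4)  # each of OB, OC, OD, OE is equally likely

p_a = sum(p * p_first_road for p in p_reach_a.values())
print(p_a)  # 67/120
```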
