
# 4 Conditional Probability

Joint Probability:
For two events A & B that intersect in the sample space,
the probability P(A∩B) is called the Joint Probability.

(Venn diagram: events A and B overlapping in the sample space, with intersection A∩B)

It can be shown that
P(A∪B) = P(A) + P(B) − P(A∩B)
So, P(A∩B) = P(A) + P(B) − P(A∪B) … (Joint Probability)
Note: The probability of the union of A & B never exceeds the sum of
P(A) & P(B); the equality holds for mutually exclusive
(disjoint) events.
ECEN 4503, Probability & Random Signals T. Menhaj

• Conditional Probability:
• Given an event B such that P(B) ≠ 0, form the ratio P(A∩B)/P(B)
for any event A. This ratio, called the conditional probability of A
assuming B (given B), is denoted by P(A|B):

P(A|B) = P(A∩B)/P(B);  no meaning if P(B) = 0.

Motivation:
In a manufacturing process, 15% of the parts carry
special labels, and 30% of the labeled parts are functionally defective. Only
10% of parts without such labels are defective. The probability
of a functionally defective part depends on our knowledge of the
presence or absence of a label. If a part has a label, the probability
of it being functionally defective is 0.30.

Let A = a part is functionally defective and B = a part has the special
label; then we denote the probability of A given that a part has a label, B, by
P(A|B) = 0.3, and similarly
P(A|B̄) = 0.1.
Interpretation 1:
The conditional probability makes intuitive sense in terms of
relative frequencies:
n – trials of some experiment, nA – no. of times event A occurs,
nB – no. of times event B occurs, nAB – no. of times both events A & B
occur.
Given that event B has occurred, the relative frequency of A is
nAB / nB.
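As a sketch of this relative-frequency interpretation, a small simulation (assuming the numbers of the earlier manufacturing example: 15% labeled parts, 30% of labeled and 10% of unlabeled parts defective) shows nAB/nB settling near P(A|B) = 0.3:

```python
import random

# Monte Carlo sketch of the relative-frequency interpretation.
# Assumed numbers (from the manufacturing example): P(B) = 0.15,
# P(A|B) = 0.30, P(A|B̄) = 0.10, where B = labeled, A = defective.
random.seed(1)
n = 200_000      # total trials
n_B = 0          # trials in which B occurs
n_AB = 0         # trials in which both A and B occur

for _ in range(n):
    labeled = random.random() < 0.15
    p_defective = 0.30 if labeled else 0.10
    defective = random.random() < p_defective
    if labeled:
        n_B += 1
        if defective:
            n_AB += 1

# relative frequency of A among the B-trials approximates P(A|B) = 0.3
print(n_AB / n_B)
```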


• In other words, this means that an experiment is repeated
n times, and in each case the occurrence of the events A
and B is observed. Now, because our interest is in those
outcomes for which B occurs, we disregard the trials in which
B does not occur. This leads to a smaller
collection of trials. In that collection the proportion of
times that A occurs is n(A∩B)/n(B), since B occurs in
each of them. This ratio equals

[n(A∩B)/n] / [n(B)/n] = P(A∩B)/P(B)

So, P(A|B) is the relative frequency of event A among the trials
that produce an outcome in event B.


Example 16:
Find the probability of drawing two aces in succession from a deck
of 52 cards.
A1 – event that the first card is an ace.
A2 – event that the second card is an ace.
Find P(A1 ∩ A2);
P(A1 ∩ A2) = P(A2|A1) P(A1),
P(A1) = 4/52; P(A2|A1) = 3/51 (51 cards left & 3 aces),
P(A1 ∩ A2) = (4/52)(3/51) = 1/221.
Now consider drawing three cards from a 52-card deck;
Ai – event that the ith card is an ace; i = 1, 2, 3.
Case 1: With replacement
Intuitively we know that the Ai's are independent, so


p(A1 ∩ A2 ∩ A3) = (4/52)³ ≈ 4.55 × 10⁻⁴
Case 2: Without replacement; each card is set aside after it is drawn. We
expect the Ai's are not independent.
p(A1 ∩ A2 ∩ A3) = p(A1) p(A2 ∩ A3 | A1)
= p(A1) p(A2|A1) p(A3 | A1 ∩ A2)
= (4/52) · (3/51) · (2/50) ≈ 1.81 × 10⁻⁴
Thus we have approximately 4.55/1.81 ≈ 2.5 times better chance of
drawing three aces when the cards are replaced than when they are put aside.
This is an intuitively satisfying result, because replacing the aces drawn
raises the chances of an ace on the succeeding draws.
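Both cases can be checked with exact fractions:

```python
from fractions import Fraction

# Case 1 (with replacement): the three draws are independent
p_with = Fraction(4, 52) ** 3                              # (4/52)^3

# Case 2 (without replacement): chain rule P(A1)P(A2|A1)P(A3|A1∩A2)
p_without = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)

print(float(p_with))               # ≈ 4.55e-4
print(float(p_without))            # ≈ 1.81e-4
print(float(p_with / p_without))   # ≈ 2.5, the "better chance" factor
```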

Interpretation 2:
We may understand this definition in the special case in which the
outcomes of a random experiment are equally likely. If there are n total
outcomes, then

P(B) = (no. of outcomes in B)/n = nB/n
P(A∩B) = (no. of outcomes in A∩B)/n = nA∩B/n
P(A∩B)/P(B) = nA∩B/nB

Consequently,


• Given that B occurs, A occurs iff A∩B occurs. Thus the
conditional probability of A given B should be
proportional to P(A∩B), which is to say that
• P(A|B) = α P(A∩B), for some constant α. We need to find
α; use a normalizing factor:
• P(S|B) = 1 = α P(S∩B)
• α P(B) = 1 ⇒ α = 1/P(B)
• Thus,
• P(A|B) = P(A∩B)/P(B)


P(A) = P(A)/1 = P(A∩S)/P(S) = P(A|S)

The probability of A given B is nothing but the probability of A
assuming the entire sample space is B; is that reasonable?
Yes, because the event B has already occurred and we
know that the occurrence of any outcome outside B is impossible.
Example 17: A history of 300 product samples is classified
on the basis of the presence of two rare materials.
Let B = the event that consists of all samples in which


material 2 is present, and A = the event that consists of all
samples in which material 1 is present.
|                        | Material 1 absent (N) | Material 1 present (Y) | Total |
|------------------------|-----------------------|------------------------|-------|
| Material 2 absent (N)  | 230                   | 40                     | 270   |
| Material 2 present (Y) | 10                    | 20                     | 30    |
| Total                  | 240                   | 60                     | 300   |

Find P{mat. 2 present | mat. 1 present}:

P(B|A) = P(A∩B)/P(A) = nA∩B/nA = 20/60 = 1/3
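The same conditional probability can be read off the table programmatically:

```python
from fractions import Fraction

# Counts from the 300-sample history; keys are (material 2, material 1)
# with "Y" = present, "N" = absent.
counts = {("N", "N"): 230, ("N", "Y"): 40,
          ("Y", "N"): 10,  ("Y", "Y"): 20}
n = sum(counts.values())                          # 300 samples

n_A  = counts[("N", "Y")] + counts[("Y", "Y")]    # material 1 present: 60
n_AB = counts[("Y", "Y")]                         # both present: 20

p_B_given_A = Fraction(n_AB, n_A)                 # P(B|A) = n_A∩B / n_A
print(p_B_given_A)                                # 1/3
```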

Example 18: Back to the mortality experiment:
Find the probability that a person dies between 60 & 70,
assuming that the person is alive at 60.
Let A = {60 < t ≤ 70}, B = {t > 60}.
Known: P(B) = 0.317, P(A) = 0.154.
Since A∩B = A, we will have

P(A|B) = P(A)/P(B) = 0.154/0.317 ≈ 0.486

Note: Although only about 15% of all people die between 60 & 70,
48.6% of the persons alive at 60 die between the ages 60 & 70.
Note: In some models, conditional probability can be calculated
directly from the description of the random experiment.

Example 19: A day's production of 1000 transistors contains 100
parts that are not acceptable. Three parts are selected at random, without
replacement; what is the probability that, given the first two are
defective, the third is not defective?
Sol'n: The phrase "at random" says that when the first transistor
is selected all parts are equally likely, and when the second part is
selected all the remaining transistors are equally likely. Let Bi denote
the event that the ith selected transistor is defective, i = 1, 2, and let A
denote the event that the third part is not defective. The probability
needed is expressed as P(A|B1, B2). Now if the first two parts are
defective, then prior to selecting the third part, the batch contains 998
transistors of which 900 are not defective; therefore


p(A | B1, B2) = 900/998 ≈ 0.902
And what is the probability that the first two transistors are defective
and the third is not defective, P(ddn) = ? We have

p(ddn) = (100/1000) · (99/999) · (900/998) ≈ 8.9 × 10⁻³
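A quick arithmetic check of p(ddn) with exact fractions:

```python
from fractions import Fraction

# P(ddn): first two transistors defective, third not defective
p_ddn = Fraction(100, 1000) * Fraction(99, 999) * Fraction(900, 998)
print(float(p_ddn))   # ≈ 8.9e-3
```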
Note: One may use the definition of conditional probability to
provide a general expression for the probability of the intersection
of two events:
P(A∩B) = P(AB) = P(A|B)P(B) = P(B|A)P(A)
Note: We may also have the following multiplication rule:
P(ABC) = P(C|A,B) P(B|A) P(A)
P(A1 A2 … An) = P(An | An−1 … A1) … P(A2|A1) P(A1)

(Figure: two boxes of cards, B1 and B2)

B1 contains 6 red cards and 12 white; B2 contains 10 red and 8
white. Select at random one of the two boxes and pick at
random one of its cards. Find the probability p that the card
selected is white.
The outcomes of this experiment are 36 cards.
B1 = event consisting of 18 cards
B2 = event consisting of 18 cards
(1) P(B1) = P(B2) = 1/2; with a box selected at random, the event
W = {white} consists of 20 outcomes. Our problem is to find its

probability. We cannot do so directly, because the 36 outcomes
are not equally likely. They are, however, conditionally equally
likely subject to the condition that a box is selected. As a model
concept, this means that
(2) P(W|B1) = 12/18; P(W|B2) = 8/18
(1) & (2) are not derived; they are assumptions about the model
based on the hypothesis that the box and cards are selected at
random.
Now we shall derive P(W) deductively, using the above assumptions.
Since P(W|B1) = P(W∩B1)/P(B1) and P(W|B2) = P(W∩B2)/P(B2),
P(W∩B1) = (12/18)·(1/2) = 1/3; P(W∩B2) = (8/18)·(1/2) = 2/9
The events W∩B1 & W∩B2 are mutually exclusive, and their


union equals the event W; hence
P(W) = 1/3 + 2/9 = 5/9
Note: In about 56% of the trials the selected card will be white.
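The computation can be checked with exact fractions:

```python
from fractions import Fraction

# Total probability over the partition {B1, B2}
p_B1 = p_B2 = Fraction(1, 2)         # a box is selected at random
p_W_given_B1 = Fraction(12, 18)      # 12 white of 18 cards in B1
p_W_given_B2 = Fraction(8, 18)       # 8 white of 18 cards in B2

p_W = p_W_given_B1 * p_B1 + p_W_given_B2 * p_B2
print(p_W)   # 5/9
```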
Repeated application of P(A∩B) = P(A|B)P(B) yields
P(A∩B∩C) = P(A | B∩C) P(B∩C)
= P(A | B∩C) P(B|C) P(C)
P(A∩B∩C∩D) = p(A | B∩C∩D) p(B | C∩D) p(C|D) p(D), and so on.
Show that P(A) = P(A|B)P(B) + P(A|B̄)P(B̄).

(Venn diagram: A = (A∩B) ∪ (A∩B̄) in S, where A∩B and A∩B̄ are disjoint)

From axiom 3:

P(A) = P(A∩B) + P(A∩B̄) = P(A|B)P(B) + P(A|B̄)P(B̄)

Now we need to check the fundamental property of conditional
probability, P(A|B), for a fixed B as A ranges over all the events of S.
We need to show that these numbers are indeed probabilities;
that is, they satisfy the three postulates:
1. P(A|B) ≥ 0
2. P(S|B) = 1
3. If A∩C = Φ, then P(A∪C | B) = P(A|B) + P(C|B).
Knowing that A∩B is an event and S∩B = B, the first two axioms
are easily verified.
To prove the third one, we know that if A & C are disjoint, their
subsets A∩B and C∩B are also disjoint. And since

(A∪C) ∩ B = (A∩B) ∪ (C∩B), we may write

P(A∪C | B) = P((A∪C) ∩ B)/P(B) = P(A∩B)/P(B) + P(C∩B)/P(B)
= P(A|B) + P(C|B)
So, one may use conditional probabilities to produce a new experiment,
conditioned on an event B, from a given experiment S. This new
experiment has the same outcomes si & events Ai as the original
experiment S does, but its probabilities equal P(Ai|B).

Note: Mass interpretation of P(A|B):
The event A∩B is the part of A in B, and P(A|B) is the mass in that
region normalized by the factor 1/P(B), as discussed before.


Statistical Independence
Suppose a box contains 10 parts, 3 of which are not good, and
suppose two parts are taken at random from the box with replacement;
this means that the first part is replaced before the second one is
selected. What is the probability that the second part is defective
given that the first one is defective?
Denote: A – the second part is defective; B – the first part is defective.
P(A|B) = ?
Because the first part is replaced prior to selecting the second one,
the box still contains 10 parts of which 3 are defective.
Therefore, the probability of A does not depend on whether


or not the first part was defective. That is,
P(A|B) = P(A) = 3/10
Note: Knowledge that the outcome of the experiment is in
event B does not affect the probability that the outcome
is in event A.
Definition: Two events A & B are independent iff one of the
following statements is true:
1) P(AB) = P(A)P(B)
2) P(A|B) = P(A); P(B) ≠ 0
3) P(B|A) = P(B); P(A) ≠ 0
Consider the last example.
Find the probability that the outcome (defective, defective)
occurs; this is equal to (3/10)² = 0.09.
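As a check by enumeration (labeling the parts 0–9, with parts 0–2 the defective ones, an arbitrary labeling for illustration), all 100 ordered outcomes of the with-replacement experiment are equally likely:

```python
from fractions import Fraction
from itertools import product

parts = range(10)
bad = {0, 1, 2}        # 3 defective parts (which three is arbitrary)

# with replacement: all 100 ordered pairs are equally likely
outcomes = list(product(parts, repeat=2))
n = len(outcomes)

A = [(x, y) for x, y in outcomes if y in bad]    # second part defective
B = [(x, y) for x, y in outcomes if x in bad]    # first part defective
AB = [o for o in A if o in B]

p_A = Fraction(len(A), n)
p_A_given_B = Fraction(len(AB), len(B))
print(p_A, p_A_given_B)            # both 3/10: A and B are independent
print(Fraction(len(AB), n))        # P(defective, defective) = 9/100
```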


Total (unconditional or average) Probability:
We may express the probability of an event in terms of the
known conditional probabilities.
Suppose [A1, …, Am] is a partition of S consisting of m events
([A1, …, Am] is also called exhaustive), as shown in the partition
diagram with an arbitrary event B cutting across the Ai's.

Consider B as an arbitrary event of S; the probability P(B) can be
written as the sum

p(B) = Σ_{i=1}^{m} p(B|Ai) p(Ai)

Why?


Obviously,

B = B ∩ S,  S = ∪_{i=1}^{m} Ai,
B = B ∩ (∪_{i=1}^{m} Ai) = ∪_{i=1}^{m} (B ∩ Ai) = ∪_{i=1}^{m} Ei,  where the Ei's are disjoint,

p(B) = Σ_{i=1}^{m} p(Ei),
p(Ei) = p(B ∩ Ai) = p(B|Ai) p(Ai).

This is called the Total Probability theorem.


Summary: Find P(B) of an event B if its conditional probabilities
are given.
Ai's – conditioning events; possible causes
B – an effect event
Note: Interpretation: This provides us a way to evaluate an
"effect" from its "causes"; it determines the probability of
an "effect" from the probabilities of all its possible "causes",
the Ai's, and the relationship between these possible "causes"
and the "effect", the P(B|Ai)'s.
Now we may show how P(Ai|B) is related to P(Ai);
P(Ai|B) – posterior probability
P(Ai) – prior probability

Bayes’ Theorem:

Known:

p(Ai|B) = p(Ai ∩ B)/p(B);  p(B|Ai) = p(B ∩ Ai)/p(Ai)

p(Ai|B) = p(B|Ai) p(Ai) / p(B) = p(B|Ai) p(Ai) / Σ_{j=1}^{m} p(B|Aj) p(Aj)

This is an important result known as Bayes’ theorem.

Special case:
For any two events A & B, we may write

P(B) = P(B∩A) + P(B∩Ā) = P(B|A)P(A) + P(B|Ā)P(Ā)

P(A|B) = P(B|A)P(A)/P(B),  P(Ā|B) = P(B|Ā)P(Ā)/P(B)
Note: Interpretation
P(Ai|B) – probability of cause Ai knowing that effect B
occurred. So Bayes' rule gives a way to find the posterior
probability by combining the prior probability with the
evidence observed from the current experiment in which the
event B has occurred. It means that, given an "effect", event B,
it determines the probability of a particular "cause", event Ai,
among all its possible "causes", events Ai, i = 1, 2, …, m.
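As a sketch combining the Total Probability theorem with Bayes' rule, using the numbers from the earlier manufacturing example (label = cause B, defectiveness = effect A):

```python
# Assumed numbers from the manufacturing example:
p_B = 0.15                # prior: part has a label (the "cause")
p_A_given_B = 0.30        # defective given labeled
p_A_given_not_B = 0.10    # defective given unlabeled

# Total probability: P(A) = P(A|B)P(B) + P(A|B̄)P(B̄)
p_A = p_A_given_B * p_B + p_A_given_not_B * (1 - p_B)

# Bayes' rule: posterior P(B|A)
p_B_given_A = p_A_given_B * p_B / p_A
print(p_A)            # ≈ 0.13
print(p_B_given_A)    # ≈ 0.346: observing a defect raises P(label)
```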

Statistical Independence:
The three events A1, A2 & A3 are mutually independent iff
p(Aj ∩ Ak) = p(Aj) p(Ak), j ≠ k, j, k = 1, 2, 3, and
p(A1 ∩ A2 ∩ A3) = p(A1) p(A2) p(A3)
Independence of n Events: The events Ai, i = 1, 2, …, n,
are independent iff
p(Aj ∩ Ak) = p(Aj) p(Ak), j ≠ k, j, k = 1, 2, …, n
p(Ai ∩ Aj ∩ Ak) = p(Ai) p(Aj) p(Ak), ∀ distinct i, j, k
⋮
P(∩_{i=1}^{n} Ai) = Π_{i=1}^{n} P(Ai)


Example 21: Consider a fair die and the events A = {1}, B = {odd} and
C = {even}. Find P(A∪B) = P(A) + P(B) − P(A∩B).

P(A) = 1/6, P(B) = 1/2, P(C) = 1/2, P(A∩B) = 1/6
P(A∪B) = 1/6 + 1/2 − 1/6 = 1/2
P(A∪C) = P(A) + P(C) − P(A∩C)
P(A∩C) = 0
P(A∪C) = 1/6 + 1/2 = 2/3
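Example 21 can be verified by enumerating the six equally likely faces (note that `|` in the code below is Python set union, not conditional probability):

```python
from fractions import Fraction

die = {1, 2, 3, 4, 5, 6}            # equally likely faces of a fair die
A, B, C = {1}, {1, 3, 5}, {2, 4, 6}

def p(event):
    return Fraction(len(event & die), len(die))

# note: `|` below is Python set union, not conditional probability
assert p(A | B) == p(A) + p(B) - p(A & B)     # inclusion-exclusion
assert p(A & C) == 0                          # A and C are disjoint
print(p(A | B), p(A | C))                     # 1/2 2/3
```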


Note: Statistical independence of events A & B just means that
knowledge of A gives no information about the occurrence of B,
and vice versa. For example, if A ⊂ S then the occurrence of A
implies that of S; so A & S are not independent in the practical
sense. However, the events A & S are statistically independent,
since P(A∩S) = P(A) = P(A)P(S).
Why? This is so because, although the occurrence of A does imply
that of S, it does not change the probability of S.

Practical and usual intuitive dependence ≡ any sort of relation or
constraint between the events;
Statistical independence ⇏ intuitive independence in general;
Intuitive independence ⇒ statistical independence.


Note: Use the mathematical definition to determine statistical
independence; do not use the physical meaning of the events through
common sense.
Independence & Mutually Exclusive:
Recall:
A & B disjoint ⇒ union probability ⇒ sum of probabilities
A & B independent ⇒ joint probability ⇒ product of probabilities
See, they are quite different.
Disjoint ⇒ no common object; A∩B = Φ
The occurrence of A precludes that of B; this gives some information.
⇒ See what sort of information is obtained:


Assume A, B disjoint & independent:
p(A∩B) = 0; P(A∩B) = P(A)P(B)
⇒ P(A)P(B) = 0 ⇒ P(A) = 0 or P(B) = 0
If A & B are disjoint, then
1. A & B are not independent, or
2. at least one of A or B has zero probability.
Summary:
A & B disjoint: if A occurs, we expect B won't occur.
A & B independent: the occurrence of one does not change the
chance of occurrence of the other one.


Pair-wise Independence:
Events Ai, i = 1, 2, …, n, are pair-wise independent iff
P(Ai ∩ Aj) = P(Ai)P(Aj), for all i ≠ j
Independence ⇒ pair-wise independence
Independence ⇍ pair-wise independence
One may show that P(A1A2A3) = P(A1)P(A2)P(A3) does not imply
P(AiAj) = P(Ai)P(Aj), i ≠ j, i, j = 1, 2, 3, and vice versa.
Example 22: Consider a special four-sided die in which faces f1,
f2, f3 are brown, black, and gray. The fourth face can be
interpreted as f1, f2 and f3 together, because it is a mixed color with
brown, black and gray.
Now, the events B = {a side with black shows up},
Br = {a side with brown shows up},
Gr = {a side with gray shows up}
are not independent, but they are indeed pair-wise independent.
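A minimal enumeration of Example 22, modeling each face by the set of colors it carries (the mixed fourth face carries all three):

```python
from fractions import Fraction

# four equally likely faces; the fourth carries all three colors
faces = [{"brown"}, {"black"}, {"gray"}, {"brown", "black", "gray"}]

def p(*colors):
    """Probability that the face shown carries all the given colors."""
    hits = [f for f in faces if all(c in f for c in colors)]
    return Fraction(len(hits), len(faces))

# pair-wise independent: P(any two colors) = 1/4 = (1/2)(1/2)
assert p("black", "brown") == p("black") * p("brown")
# but not mutually independent: P(all three) = 1/4, not (1/2)^3 = 1/8
assert p("black", "brown", "gray") != p("black") * p("brown") * p("gray")
print(p("black", "brown", "gray"))   # 1/4
```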


Example 23: Independence in pairs does not imply mutual
independence. Consider events A, B & C as in the figure: on a
6 × 6 grid of equally likely outcomes, the red boxes form A, the
green boxes form B, the blue boxes form C, and the three regions
share the common region A∩B∩C.

P(A) = P(B) = P(C) = 1/3 & P(A∩B∩C) = 1/9
P(A∩B) = P(A∩C) = P(B∩C) = 1/9

Note: They are independent in pairs.
However, they are not statistically (mutually) independent, because
they do not satisfy p(A∩B∩C) = P(A)P(B)P(C):
1/9 = P(A∩B∩C) ≠ P(A)P(B)P(C) = 1/27


Note: Four conditions must be specified; pair-wise
independence is not sufficient for the entire set of events to
be mutually independent.
In general, for n events Ai, it is necessary that
P(Ai ∩ Aj ∩ … ∩ Ak) = P(Ai)P(Aj)…P(Ak)
for every set of two or more indices taken from {1, 2, …, n}.
This implies that 2ⁿ − (n + 1) equations of the above form are
required to establish the independence of n events.
Note: If A1, A2 & A3 are independent,
P(A1 ∩ (A2 ∪ A3)) = P(A1) P(A2 ∪ A3)
This is not true if they are only independent in pairs. In
general, if A1, …, An are independent events, then any one
of them is independent of any event formed by unions,
intersections and complements of the others.

• Example 24: There are N boxes. Each box
contains a number; one of them is the golden
number. The boxes are presented to you at
random, one at a time. At any time you see
just one box. You have to decide whether to
take it; once you take a box, there is no more
chance of taking another box. The risk here is
that there might still be an undisplayed box
containing the golden no., so you made the
wrong decision. What is the probability of
correctly choosing the golden no.?

• Modeling:
• Assumption: The golden no. is the biggest one. You
will never see the numbers in the boxes. So we
have a bag containing N numbers, and they are
drawn individually at random. At each draw, a
number appears; is it the biggest one? Order the
draws along a line:
• 1) the first draw is number 1; the ith draw is number i.
• 2) pass over the first q draws; they are rejected;
and record the highest number, M, observed
within this group of q draws.


• 3) Continue drawing more numbers; the first draw yielding
a number bigger than M is chosen to be the best number.
To determine the probability of a correct selection, we
define the following events:

• G = {y is the position of the highest number}
• F = {the highest number in draws 1, …, y−1 is in draws 1, …, q}
• H = {the event of correctly choosing the highest number}
• E = {y > q}
• Then, P(H) = P(EFG)
• = P(EF|G)P(G)


Since the highest number is equally likely to occupy any of the N
positions, p(G) = 1/N for each position y; and given that it is at
position y > q, the probability that the highest number among the
first y − 1 draws actually lies within the first q draws is q/(y − 1).
Summing over the admissible positions,

p(H) = Σ_{y=q+1}^{N} (1/N) · q/(y − 1) = (q/N) Σ_{y=q+1}^{N} 1/(y − 1)

Second Modeling:
Let Y be the draw containing the best number, and let Fq(q′)
denote the event that the highest number among the first q′ draws
occurs within the group of the first q draws; note that
p(Fq(q′)) = 1 for q′ ≤ q.

• To have a correct decision, Y must be bigger than q and the
event Fq(Y − 1) must occur (otherwise some earlier number would
have been chosen); so the event E = {Y > q} = ∪_{n=q+1}^{N} {Y = n},
and the correct decision is

H = ∪_{n=q+1}^{N} {Y = n, Fq(n − 1)} ⇒ p(H) = Σ_{n=q+1}^{N} p(Y = n, Fq(n − 1))
= Σ_{n=q+1}^{N} p(Fq(n − 1) | Y = n) p(Y = n)
= Σ_{n=q+1}^{N} (q/(n − 1)) · (1/N) = (q/N) Σ_{n=q}^{N−1} 1/n
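A sketch simulation of the strategy (reject the first q draws, then take the first number exceeding their maximum; `simulate` and `success_prob_exact` are hypothetical helper names), compared against the exact sum p(H) = (q/N) Σ_{n=q}^{N−1} 1/n:

```python
import random

def success_prob_exact(N, q):
    """Exact p(H) = (q/N) * sum_{n=q}^{N-1} 1/n for the reject-q strategy."""
    return q / N * sum(1.0 / n for n in range(q, N))

def simulate(N, q, trials=40_000, seed=7):
    """Estimate the success probability by playing the strategy."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        draws = rng.sample(range(N), N)   # random presentation order
        threshold = max(draws[:q])        # best among the rejected group
        # take the first later draw that beats the threshold (if any)
        chosen = next((x for x in draws[q:] if x > threshold), None)
        if chosen == N - 1:               # the "golden" (biggest) number
            wins += 1
    return wins / trials

N, q = 100, 37                            # q ≈ N/e
print(success_prob_exact(N, q))           # ≈ 0.371, close to 1/e
print(simulate(N, q))                     # Monte Carlo estimate, similar
```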


• Now, how do we evaluate the above finite series?
• Recall: let f(x) = 1/x; then

Σ_{n=1}^{M} f(n) ≈ ½ [f(1) + f(M)] + ∫_1^M f(x) dx ⇒
Σ_{n=m}^{M} f(n) ≈ ½ [f(m) + f(M)] + ∫_m^M f(x) dx.

Now back to our problem, with m = q and M = N − 1:

1/q + … + 1/(N−1) ≈ ½ (1/q + 1/(N−1)) + ∫_q^{N−1} dx/x = C + ln((N−1)/q),

where C = ½ (1/q + 1/(N−1)) is negligible for large N.

So, p(H) ≈ (q/N) ln((N−1)/q) + qC/N ≈ (q/N) ln((N−1)/q), for big N.

The best choice for q, q*, is obtained from dp(H)/dq = 0 ⇒

(1/N) ln((N−1)/q) − 1/N = 0 ⇒ ln((N−1)/q) = 1 ⇒ q* = (N−1)/e

and the maximum probability of a correct decision becomes

p* ≈ (q*/N) ln((N−1)/q*) = ((N−1)/(Ne)) ln e = (N−1)/(Ne) ≈ e⁻¹, for big N.


• Interpretation: From this logically derived result, we may
state the following rule for a selection situation that is
uncertain (random) in nature:
• "Let the first ≈ 37% (a fraction 1/e) of the boxes pass by,
and then begin to evaluate the boxes."
• Conclusion: we can also empirically estimate the "base of the
natural logarithm", e:

e = lim_{n→∞} (1 + 1/n)ⁿ = 2.718281828…


(Figure: the maximum probability of a correct decision, P*, versus the number of boxes.)

(Figure: empirical estimation of the natural logarithm base, e ≈ 2.718, versus the number of boxes N, for N from 0 to 1000.)


• Back to the lucky key problem demonstrated in the first
lecture:
• There are three boxes. One of them has the key. You
select one of them, and you will have a chance to switch to the
remaining box after one of the two boxes not containing the key
is opened.
• Denote the event that box i has the key by Bi, i = 1, 2, 3, and
assume that you select box 1 and box 2 is opened.
• Fact: the event E that you will get the key by switching is
equivalent to B3 ∩ B̄2 ∩ B̄1, where B̄i means that there is no key in box i.


• So,

p(B3 ∩ B̄2 ∩ B̄1) = p(B3 | B̄2 ∩ B̄1) p(B̄2 ∩ B̄1) = 1 × p(B̄1) = 2/3

• The chance of getting the key by switching gets doubled.
• Generalization: for n boxes the chance of getting the key
by switching becomes:

p(E) = p(Bj | B̄i ∩ B̄m; j ≠ i, m ∈ {1, …, n}) p(B̄i)
= (1/(n−2)) × (1 − 1/n) = (n−1)/(n(n−2)) > 1/n

note: Bi – the box chosen first, Bm – the box opened next
• The chance of finding the key becomes higher by switching.


• Another way to look at it:
• Modeling:
• K = {a key is selected}; Er = {an empty box is revealed};
• Facts: p(K) = 1/3; p(an empty box is selected) = 1 − p(K) = 2/3;
• Need to find P(K|Er) = ?
• Use of Bayes' theorem:

p(K|Er) = p(Er|K) p(K) / p(Er),
p(Er) = p(Er|K) p(K) + p(Er|K̄) p(K̄)  (by the Total Probability theorem)

• The above conditional probabilities depend upon whether a
random selection is made between the unchosen boxes, or
prior knowledge is used so as always to open a box that does
not contain the key.
 1
(
p(Er K ) = 1; p Er K =  2 ) From 2 boxes one is selected randomly
 1 Intentionally the empty box is opened

## P(Er)=1*1/3+1/2*2/3=2/3 : for random selection,

P(Er)=1*1/3+1*2/3=1; use of prior knowledge.
 1 
1 ⋅
 3 1 
 2 = ; for random selection case,
 2 
p(K Er ) =  3 
 1 
 1⋅ 3 1 
 = ; Use of prior knowledge 
 1 3 

• So, the probability of having selected the key,
given that an empty box is observed, is 1/2 with
random selection and 1/3 if we use the prior
knowledge that an empty box is always
opened; in the latter case, observing the
empty box carries no additional information.
The conclusion is: We should switch.
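A minimal simulation of the keep-vs-switch comparison for three boxes, assuming the host always opens an empty, unchosen box (the prior-knowledge case):

```python
import random

def play(switch, rng):
    """One round: host knowingly opens an empty, unchosen box."""
    key = rng.randrange(3)        # box holding the key
    choice = rng.randrange(3)     # your initial selection
    # host opens some box that is neither the key box nor your choice
    opened = next(b for b in range(3) if b != key and b != choice)
    if switch:
        choice = next(b for b in range(3) if b != choice and b != opened)
    return choice == key

rng = random.Random(3)
trials = 100_000
p_keep = sum(play(False, rng) for _ in range(trials)) / trials
p_switch = sum(play(True, rng) for _ in range(trials)) / trials
print(p_keep)     # ≈ 1/3
print(p_switch)   # ≈ 2/3
```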


• Experimental Results: We ran some
computer simulations. The results are given
in the following plots. As one may easily
see, the results show a trend (on average)
consistent with the theoretical results
obtained logically from the theory of
probability (axiomatic approach).


(Figure 1: Probability of finding the key if the chosen box was kept, versus the number of trials, 0 to 500.)

(Figure: Probability of finding the key if the chosen box was switched, versus the number of trials, 0 to 500.)

(Figure 2: Probability of finding the key if the chosen box was randomly selected, versus the number of trials, 0 to 500.)