
Bayes' rule

Marginal Probability: The probability of an event irrespective of the outcomes of other random variables, e.g.,
P(A).

Joint Probability: Probability of two (or more) simultaneous events, e.g., P(A and B) or P(A, B).

Conditional Probability: Probability of one (or more) event given the occurrence of another event, e.g., P(A given B) or P(A | B).

The joint probability can be calculated using the conditional probability, for example:

P(A, B) = P(A | B) * P(B)
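For example (with illustrative numbers, not taken from the problem below): if P(B) = 0.5 and P(A | B) = 0.4, then P(A, B) = 0.4 * 0.5 = 0.20.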

Let A1, A2, ... be mutually exclusive and exhaustive events, i.e., one of them is certain to occur but no two can occur together.

Let B be an event that can happen only together with one of the events A1, A2, .... That is, B cannot happen independently of A1, A2, A3, ....

Then, P(B) = P(B and A1) + P(B and A2) + P(B and A3) + …

P(B) is called the marginal likelihood; this is the total probability of observing the evidence. 

Then, the posterior probabilities P(A1|B), P(A2|B), ... are given by Bayes' rule:

P(A1|B) = P(B|A1) * P(A1) / P(B)

P(A1|B) is called the posterior; this is what we are trying to estimate.

P(B|A1) is called the likelihood; this is the probability of observing the new evidence, given our initial
hypothesis.

P(A1) is called the prior; this is the probability of our hypothesis before observing the new evidence.
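As an illustration of these quantities, the sketch below computes the marginal likelihood and the posteriors in Python. It is only a minimal sketch; the event names and the numbers (priors 0.3/0.7, likelihoods 0.8/0.1) are assumed example values, not taken from the text.

# Minimal sketch: Bayes' rule for mutually exclusive, exhaustive events A1, A2
priors = {"A1": 0.3, "A2": 0.7}        # P(Ai); assumed example values, must sum to 1
likelihoods = {"A1": 0.8, "A2": 0.1}   # P(B | Ai); assumed example values

# Marginal likelihood: P(B) = P(B|A1)*P(A1) + P(B|A2)*P(A2)
p_b = sum(likelihoods[a] * priors[a] for a in priors)

# Posterior: P(Ai | B) = P(B|Ai) * P(Ai) / P(B)
posteriors = {a: likelihoods[a] * priors[a] / p_b for a in priors}

print(p_b)         # 0.8*0.3 + 0.1*0.7 = 0.31
print(posteriors)  # approximately {'A1': 0.774, 'A2': 0.226}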

Calculations using a table

Step 1. Prepare the following three columns:


Column 1—The mutually exclusive events Ai for which posterior probabilities are desired.
Column 2—The prior probabilities P(Ai) for the events.
Column 3—The conditional probabilities P (B ∣ Ai) of the new information B given each event.

Step 2. In column 4, compute the joint probabilities P (Ai ∩ B) for each event and the new information B by
using the multiplication law. These joint probabilities are found by multiplying the prior probabilities in column
2 by the corresponding conditional probabilities in column 3; that is, P (Ai ∩ B) = P(Ai)P (B ∣ Ai).

Step 3. Sum the joint probabilities in column 4. The sum is the probability of the new information, P(B).

Step 4. In column 5, compute the posterior probabilities using the basic relationship of conditional probability, P(Ai | B) = P(Ai ∩ B) / P(B).

Note that the joint probabilities P (Ai ∩ B) are in column 4 and the probability P(B) is the sum of column 4.
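The four steps above can be collected into a short routine. The following is a minimal Python sketch of the tabular procedure; the function name bayes_table and the example inputs in the comment are assumptions made only for illustration.

# Sketch of the tabular calculation (columns 1-5 above)
def bayes_table(priors, conditionals):
    # Column 4: joint probabilities P(Ai and B) = P(Ai) * P(B | Ai)
    joints = {a: priors[a] * conditionals[a] for a in priors}
    # Sum of column 4: the marginal probability P(B)
    p_b = sum(joints.values())
    # Column 5: posterior probabilities P(Ai | B) = P(Ai and B) / P(B)
    posteriors = {a: joints[a] / p_b for a in joints}
    return joints, p_b, posteriors

# Example call with assumed numbers:
# bayes_table({"A1": 0.3, "A2": 0.7}, {"A1": 0.8, "A2": 0.1})

Applied to the drug-approval example below, this routine reproduces the joint column, the total P(A) = 0.86, and the posterior column of the table.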
Example:

A drug manufacturer believes there is a 0.95 chance that the Food and Drug Administration (FDA) will approve
a new drug the company plans to distribute if the results of current testing show that the drug causes no side
effects. The manufacturer further believes there is a 0.50 probability that the FDA will approve the drug if the
test shows that the drug does cause side effects. A physician working for the drug manufacturer believes there is
a 0.20 probability that tests will show that the drug causes side effects. What is the probability that the drug will
be approved by the FDA?

Steps for the solution:

Step-1: Decide on the role for the given situation, the stakeholders associated with that role, and the value one can add if the calculations are done.

Step-2: Decide which method to use for the calculations, based on the information given in the situation and on the required computations.

Step-3: For the given problem, the following information is given:

Probability that the tests show the drug causes side effects = 0.20 (P(SE) = 0.20) and probability that the tests show no side effects = 0.80 (P(NSE) = 0.80).

Probability of approving the drug if the results of the testing show no side effects = 0.95 (P(A|NSE) = 0.95)

Probability of approving the drug if the results of the testing show side effects = 0.50 (P(A|SE) = 0.50)

From the above, one can note that the given information consists of the prior (a priori) probabilities (P(SE) and P(NSE)) and the likelihoods (P(A|SE) and P(A|NSE)).

Hence, we use Bayes' rule to solve the problem and to calculate the required probability.

P(SE|A) = P(SE and A) / P(A)

P(A) = P(A|SE) * P(SE) + P(A|NSE) * P(NSE)

= 0.50*0.20 + 0.95*0.80 = 0.86

This is the required probability that the FDA will approve the drug.

P(SE and A) = P(A|SE) * P(SE) = 0.50*0.20 = 0.10

P(NSE and A) = P(A|NSE) * P(NSE) = 0.95*0.80 = 0.76

P(SE|A) = P(SE and A) / P(A) = 0.10/0.86 = 0.1163

P(NSE|A) = P(NSE and A) / P(A) = 0.76/0.86 = 0.8837
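The arithmetic above can be checked with a few lines of Python; the variable names are chosen here only for readability.

# Check of the drug-approval example: SE = side effects shown, NSE = none, A = FDA approval
p_se, p_nse = 0.20, 0.80                    # priors P(SE), P(NSE)
p_a_given_se, p_a_given_nse = 0.50, 0.95    # likelihoods P(A|SE), P(A|NSE)

p_a = p_a_given_se * p_se + p_a_given_nse * p_nse   # total probability of approval
print(p_a)                            # approximately 0.86
print(p_a_given_se * p_se / p_a)      # P(SE|A)  approximately 0.1163
print(p_a_given_nse * p_nse / p_a)    # P(NSE|A) approximately 0.8837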

Calculations in table format

Events            Apriori   Likelihood   Joint   Posterior
Side effects      0.20      0.50         0.10    0.116279
No side effects   0.80      0.95         0.76    0.883721
Total (= P(A))                           0.86
General table

Events   Apriori   Likelihood   Joint                           Posterior
A1       P(A1)     P(B|A1)      P(A1 and B) = P(A1)*P(B|A1)     P(A1|B) = P(A1 and B)/P(B)
A2       P(A2)     P(B|A2)      P(A2 and B) = P(A2)*P(B|A2)     P(A2|B) = P(A2 and B)/P(B)
A3       P(A3)     P(B|A3)      P(A3 and B) = P(A3)*P(B|A3)     P(A3|B) = P(A3 and B)/P(B)
                                P(B) = Total of this column
