
Chapter 4

Introduction to Probability
 Experiments, Counting Rules, and Assigning Probabilities
 Events and Their Probability
 Some Basic Relationships of Probability
 Conditional Probability
 Decision Trees

Slide 1
Uncertainties

Managers often base their decisions on an analysis of uncertainties such as the following:

 What are the chances that sales will decrease if we increase prices?
 What is the likelihood a new assembly method will increase productivity?
 What are the odds that a new investment will be profitable?

Slide 2
Probability

 Probability is a numerical measure of the likelihood that an event will occur.
 Probability values are always assigned on a scale from 0 to 1.
 A probability near zero indicates an event is quite unlikely to occur.
 A probability near one indicates an event is almost certain to occur.

Slide 3
Probability as a Numerical Measure
of the Likelihood of Occurrence

Probability scale (increasing likelihood of occurrence):

 0: the event is very unlikely to occur.
 .5: the occurrence of the event is just as likely as it is unlikely.
 1: the event is almost certain to occur.
Slide 4
Statistical Experiments

 In statistics, the notion of an experiment differs somewhat from that of an experiment in the physical sciences.
 In statistical experiments, probability determines outcomes.
 Even though the experiment is repeated in exactly the same way, an entirely different outcome may occur.
 For this reason, statistical experiments are sometimes called random experiments.

Slide 5
An Experiment and Its Sample Space

 An experiment is any process that generates well-defined outcomes.
 The sample space for an experiment is the set of all experimental outcomes.
 An experimental outcome is also called a sample point.

Slide 6
An Experiment and Its Sample Space

Experiment              Experiment Outcomes
Toss a coin             Head, tail
Inspect a part          Defective, non-defective
Conduct a sales call    Purchase, no purchase
Roll a die              1, 2, 3, 4, 5, 6
Play a football game    Win, lose, tie

Slide 7
An Experiment

• Two-stage process

Slide 8
An Experiment and Its Sample Space
 Example: Bradley Investments
Bradley has invested in two stocks, Markley Oil and Collins Mining. Bradley has determined that the possible outcomes of these investments three months from now are as follows.

Investment Gain or Loss in 3 Months (in $000)

Markley Oil      Collins Mining
10               8
5                -2
0
-20

Slide 9
A Counting Rule for
Multiple-Step Experiments
 If an experiment consists of a sequence of k steps
in which there are n1 possible results for the first step,
n2 possible results for the second step, and so on,
then the total number of experimental outcomes is
given by (n1)(n2) . . . (nk).
 A helpful graphical representation of a multiple-step
experiment is a tree diagram.

Slide 10
A Counting Rule for
Multiple-Step Experiments
 Example: Bradley Investments
Bradley Investments can be viewed as a two-step
experiment. It involves two stocks, each with a set of
experimental outcomes.

Markley Oil: n1 = 4
Collins Mining: n2 = 2
Total Number of
Experimental Outcomes: n1n2 = (4)(2) = 8
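As a quick check of the counting rule, the eight outcomes can be enumerated directly; a minimal Python sketch using the gains from the example:

```python
from itertools import product

# Possible 3-month results for each stock (in $000), from the example
markley_oil = [10, 5, 0, -20]   # n1 = 4 possible results
collins_mining = [8, -2]        # n2 = 2 possible results

# Counting rule: total outcomes = (n1)(n2) = (4)(2) = 8
outcomes = list(product(markley_oil, collins_mining))
print(len(outcomes))  # 8
```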

Slide 11
Tree Diagram
 Example: Bradley Investments

Markley Oil (Stage 1)   Collins Mining (Stage 2)   Experimental Outcomes
Gain 10                 Gain 8                     (10, 8)    Gain $18,000
Gain 10                 Lose 2                     (10, -2)   Gain $8,000
Gain 5                  Gain 8                     (5, 8)     Gain $13,000
Gain 5                  Lose 2                     (5, -2)    Gain $3,000
Even                    Gain 8                     (0, 8)     Gain $8,000
Even                    Lose 2                     (0, -2)    Lose $2,000
Lose 20                 Gain 8                     (-20, 8)   Lose $12,000
Lose 20                 Lose 2                     (-20, -2)  Lose $22,000

Slide 12
Assigning Probabilities

 Basic Requirements for Assigning Probabilities

1. The probability assigned to each experimental outcome must be between 0 and 1, inclusive.

0 ≤ P(Ei) ≤ 1 for all i

where:
Ei is the ith experimental outcome
and P(Ei) is its probability

Slide 13
Assigning Probabilities

 Basic Requirements for Assigning Probabilities

2. The sum of the probabilities for all experimental outcomes must equal 1.

P(E1) + P(E2) + . . . + P(En) = 1

where:
n is the number of experimental outcomes

Slide 14
Assigning Probabilities

 Classical Method
Assigning probabilities based on the assumption of equally likely outcomes

 Relative Frequency Method
Assigning probabilities based on experimentation or historical data

 Subjective Method
Assigning probabilities based on judgment

Slide 15
Classical Method
 Example: Rolling a Die
If an experiment has n possible outcomes, the classical method would assign a probability of 1/n to each outcome.

Experiment: Rolling a die
Sample Space: S = {1, 2, 3, 4, 5, 6}
Probabilities: Each sample point has a 1/6 chance of occurring
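Under the classical method the die example reduces to a single division; a minimal sketch:

```python
from fractions import Fraction

# Classical method: n equally likely outcomes, each assigned probability 1/n
sample_space = [1, 2, 3, 4, 5, 6]
p = Fraction(1, len(sample_space))
print(p)  # 1/6
```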

Slide 16
Relative Frequency Method
 Example: Lucas Tool Rental
Lucas Tool Rental would like to assign probabilities to the number of car polishers it rents each day. Office records show the following frequencies of daily rentals for the last 40 days.

Number of Polishers Rented   Number of Days
0                            4
1                            6
2                            18
3                            10
4                            2

Slide 17
Relative Frequency Method
 Example: Lucas Tool Rental
Each probability assignment is given by dividing the frequency (number of days) by the total frequency (total number of days).

Number of Polishers Rented   Number of Days   Probability
0                            4                .10  (= 4/40)
1                            6                .15
2                            18               .45
3                            10               .25
4                            2                .05
Total                        40               1.00
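The same assignment can be computed by dividing each frequency by the 40-day total; a sketch in Python:

```python
# Daily rental frequencies for the last 40 days (from the table above)
freq = {0: 4, 1: 6, 2: 18, 3: 10, 4: 2}
total = sum(freq.values())  # 40 days

# Relative frequency method: probability = frequency / total frequency
prob = {k: n / total for k, n in freq.items()}
print(prob)  # {0: 0.1, 1: 0.15, 2: 0.45, 3: 0.25, 4: 0.05}
```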
Slide 18
Subjective Method
 When economic conditions and a company’s circumstances change rapidly, it might be inappropriate to assign probabilities based solely on historical data.
 We can use any data available as well as our experience and intuition, but ultimately a probability value should express our degree of belief that the experimental outcome will occur.
 The best probability estimates often are obtained by combining the estimates from the classical or relative frequency approach with the subjective estimate.

Slide 19
Subjective Method
 Example: Bradley Investments
An analyst made the following probability estimates.

Exper. Outcome   Net Gain or Loss   Probability
(10, 8)          $18,000 Gain       .20
(10, -2)         $8,000 Gain        .08
(5, 8)           $13,000 Gain       .16
(5, -2)          $3,000 Gain        .26
(0, 8)           $8,000 Gain        .10
(0, -2)          $2,000 Loss        .12
(-20, 8)         $12,000 Loss       .02
(-20, -2)        $22,000 Loss       .06

Slide 20
Events and Their Probabilities

 An event is a collection of sample points.
 The probability of any event is equal to the sum of the probabilities of the sample points in the event.
 If we can identify all the sample points of an experiment and assign a probability to each, we can compute the probability of an event.

Slide 21
Events and Their Probabilities
 Example: Bradley Investments

Event M = Markley Oil Profitable
M = {(10, 8), (10, -2), (5, 8), (5, -2)}
P(M) = P(10, 8) + P(10, -2) + P(5, 8) + P(5, -2)
     = .20 + .08 + .16 + .26
     = .70

Slide 22
Events and Their Probabilities
 Example: Bradley Investments

Event C = Collins Mining Profitable
C = {(10, 8), (5, 8), (0, 8), (-20, 8)}
P(C) = P(10, 8) + P(5, 8) + P(0, 8) + P(-20, 8)
     = .20 + .16 + .10 + .02
     = .48
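Both event probabilities follow from summing sample-point probabilities; a sketch using the analyst's estimates:

```python
# Analyst's probability estimates for each (Markley, Collins) sample point
p = {(10, 8): .20, (10, -2): .08, (5, 8): .16, (5, -2): .26,
     (0, 8): .10, (0, -2): .12, (-20, 8): .02, (-20, -2): .06}

# An event's probability is the sum of its sample-point probabilities
p_m = sum(prob for pt, prob in p.items() if pt[0] > 0)  # Markley Oil profitable
p_c = sum(prob for pt, prob in p.items() if pt[1] > 0)  # Collins Mining profitable
print(round(p_m, 2), round(p_c, 2))  # 0.7 0.48
```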

Slide 23
Some Basic Relationships of Probability
There are some basic probability relationships that can be used to compute the probability of an event without knowledge of all the sample point probabilities.

 Complement of an Event
 Union of Two Events
 Intersection of Two Events
 Mutually Exclusive Events

Slide 24
Organizing & Visualizing Events
• Contingency Tables -- For All Days in 2013

         Jan.   Not Jan.   Total
Wed.     4      48         52
Not Wed. 27     286        313
Total    31     334        365

• Decision Trees: the sample space (All Days in 2013) is split first into Jan. / Not Jan. and then by day of week, giving the four sample-space counts 4, 27, 48, and 286 (365 total).

Slide 25
Definition: Simple Probability

• Simple Probability refers to the probability of a simple event.
• ex. P(Jan.) = 31 / 365
• ex. P(Wed.) = 52 / 365

         Jan.   Not Jan.   Total
Wed.     4      48         52
Not Wed. 27     286        313
Total    31     334        365

Slide 26


Definition: Joint Probability
• Joint Probability refers to the probability of an occurrence of two or more events (joint event).
• ex. P(Jan. and Wed.) = 4 / 365
• ex. P(Not Jan. and Not Wed.) = 286 / 365

         Jan.   Not Jan.   Total
Wed.     4      48         52
Not Wed. 27     286        313
Total    31     334        365


Slide 27
Mutually Exclusive Events

• Mutually exclusive events: events that cannot occur simultaneously.

Example: Randomly choosing a day from 2013
A = day in January; B = day in February
• Events A and B are mutually exclusive

Slide 28
Collectively Exhaustive Events

• Collectively exhaustive events
  • One of the events must occur
  • The set of events covers the entire sample space

Example: Randomly choose a day from 2013
A = Weekday; B = Weekend; C = January; D = Spring

• Events A, B, C and D are collectively exhaustive (but not mutually exclusive: a weekday can be in January or in Spring)
• Events A and B are collectively exhaustive and also mutually exclusive

Slide 29
Computing Joint and
Marginal Probabilities

• The probability of a joint event, A and B:

P(A and B) = (number of outcomes satisfying A and B) / (total number of outcomes)

• Computing a marginal (or simple) probability:

P(A) = P(A and B1) + P(A and B2) + . . . + P(A and Bk)

• Where B1, B2, …, Bk are k mutually exclusive and collectively exhaustive events
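For the 2013 table, a marginal probability is just a sum of joint probabilities; a minimal sketch:

```python
# Joint probabilities from the 2013 contingency table
p_wed_and_jan = 4 / 365
p_wed_and_not_jan = 48 / 365

# Marginal probability: P(Wed.) = P(Wed. and Jan.) + P(Wed. and Not Jan.)
p_wed = p_wed_and_jan + p_wed_and_not_jan
print(round(p_wed * 365))  # 52
```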

Slide 30
Joint Probability Example

P(Jan. and Wed.) = 4 / 365

         Jan.   Not Jan.   Total
Wed.     4      48         52
Not Wed. 27     286        313
Total    31     334        365

Slide 31
Marginal Probability Example

P(Wed.) = 52 / 365 = (4 + 48) / 365

         Jan.   Not Jan.   Total
Wed.     4      48         52
Not Wed. 27     286        313
Total    31     334        365

Slide 32
Marginal & Joint Probabilities In A
Contingency Table

Event      B1              B2              Total
A1         P(A1 and B1)    P(A1 and B2)    P(A1)
A2         P(A2 and B1)    P(A2 and B2)    P(A2)
Total      P(B1)           P(B2)           1

Joint probabilities appear in the body of the table; marginal (simple) probabilities appear in the Total row and column.

Slide 33
General Addition Rule

General Addition Rule:
P(A or B) = P(A) + P(B) - P(A and B)

If A and B are mutually exclusive, then P(A and B) = 0, so the rule can be simplified:
P(A or B) = P(A) + P(B)
(for mutually exclusive events A and B)
Slide 34
General Addition Rule Example

P(Jan. or Wed.) = P(Jan.) + P(Wed.) - P(Jan. and Wed.)
                = 31/365 + 52/365 - 4/365 = 79/365

Don't count the four Wednesdays in January twice!

         Jan.   Not Jan.   Total
Wed.     4      48         52
Not Wed. 27     286        313
Total    31     334        365
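The addition-rule arithmetic above can be verified directly from the table counts; a sketch:

```python
# Counts from the 2013 contingency table
n_jan, n_wed, n_jan_and_wed, n_total = 31, 52, 4, 365

# General addition rule: P(A or B) = P(A) + P(B) - P(A and B)
p_jan_or_wed = (n_jan + n_wed - n_jan_and_wed) / n_total
print(round(p_jan_or_wed * 365))  # 79, i.e. P(Jan. or Wed.) = 79/365
```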

Slide 35
Computing Conditional Probabilities

• A conditional probability is the probability of one event, given that another event has occurred:

P(A | B) = P(A and B) / P(B)    (the conditional probability of A given that B has occurred)

P(B | A) = P(A and B) / P(A)    (the conditional probability of B given that A has occurred)

Where P(A and B) = joint probability of A and B
P(A) = marginal or simple probability of A
P(B) = marginal or simple probability of B

Slide 36
Conditional Probability Example

 Of the cars on a used car lot, 70% have air conditioning (AC) and 40% have a GPS. 20% of the cars have both.

• What is the probability that a car has a GPS, given that it has AC?

i.e., we want to find P(GPS | AC)

Slide 37
Conditional Probability Example
(continued)
 Of the cars on a used car lot, 70% have air conditioning (AC), 40% have a GPS, and 20% of the cars have both.

        GPS   No GPS   Total
AC      0.2   0.5      0.7
No AC   0.2   0.1      0.3
Total   0.4   0.6      1.0

Slide 38
Conditional Probability Example
(continued)
 Given AC, we only consider the top row (70% of the cars). The 20% of all cars that have both AC and a GPS make up 0.2/0.7 ≈ 28.57% of the AC cars, so P(GPS | AC) ≈ 0.2857.

        GPS   No GPS   Total
AC      0.2   0.5      0.7
No AC   0.2   0.1      0.3
Total   0.4   0.6      1.0
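The conditional probability follows directly from the table's joint and marginal values; a minimal sketch:

```python
# Probabilities from the used-car-lot table
p_ac = 0.7            # P(AC), marginal
p_ac_and_gps = 0.2    # P(AC and GPS), joint

# Conditional probability: P(GPS | AC) = P(AC and GPS) / P(AC)
p_gps_given_ac = p_ac_and_gps / p_ac
print(round(p_gps_given_ac, 4))  # 0.2857
```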

Slide 39
Using Decision Trees

Splitting all cars by AC or no AC first, the leaves of the tree give the joint probabilities:

P(AC and GPS) = 0.2
P(AC and GPS') = 0.5
P(AC' and GPS) = 0.2
P(AC' and GPS') = 0.1


Slide 40
Using Decision Trees
(continued)

Splitting all cars by GPS or no GPS first, the leaves of the tree give the joint probabilities:

P(GPS and AC) = 0.2
P(GPS and AC') = 0.2
P(GPS' and AC) = 0.5
P(GPS' and AC') = 0.1


Slide 41
Bayes’ Theorem

 Often we begin probability analysis with initial or prior probabilities.
 Then, from a sample, special report, or a product test we obtain some additional information.
 Given this information, we calculate revised or posterior probabilities.
 Bayes’ theorem provides the means for revising the prior probabilities.

Prior Probabilities → New Information → Application of Bayes’ Theorem → Posterior Probabilities
Slide 42
Bayes’ Theorem

Example:
Consider a manufacturing firm that receives shipments of parts from two different suppliers.
Let A1 denote the event that a part is from supplier 1 and A2 denote the event that a part is from supplier 2.
Currently, 65% of the parts are from supplier 1 and the remaining 35% are from supplier 2.

P(A1) = 0.65   P(A2) = 0.35

Slide 43
Additional information

Let G denote the event that a part is good and B denote the event that a part is bad.
98% of the parts from supplier 1 are good and 95% of the parts from supplier 2 are good.

• P(G/A1) = 0.98   P(B/A1) = 0.02
• P(G/A2) = 0.95   P(B/A2) = 0.05

• Given that the part is bad, what is the probability that it came from supplier 1?

Slide 44
First tree diagram

Each branch of the tree multiplies a prior probability by a conditional probability; the two "bad" branches give
P(A1 and B) = (0.65)(0.02) = 0.0130
P(A2 and B) = (0.35)(0.05) = 0.0175

Slide 45
Slide 46
• Given that the part is bad, what is the probability that it came from supplier 1?

• P(A1/B) = 0.0130/(0.0130 + 0.0175) = 0.4262
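The posterior computed above can be reproduced by normalizing the joint probabilities; a sketch:

```python
# Priors and conditional probabilities of a bad part, from the example
priors = {"A1": 0.65, "A2": 0.35}
p_bad_given = {"A1": 0.02, "A2": 0.05}

# Bayes' theorem: posterior = joint probability / total probability of B
joint = {s: priors[s] * p_bad_given[s] for s in priors}  # 0.0130, 0.0175
p_a1_given_bad = joint["A1"] / sum(joint.values())
print(round(p_a1_given_bad, 4))  # 0.4262
```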

Slide 47
Bayes’ Theorem

 To find the posterior probability that event Ai will occur given that event B has occurred, we apply Bayes’ theorem:

P(Ai/B) = P(Ai)P(B/Ai) / [P(A1)P(B/A1) + P(A2)P(B/A2) + . . . + P(An)P(B/An)]

 Bayes’ theorem is applicable when the events for which we want to compute posterior probabilities are mutually exclusive and their union is the entire sample space.

Slide 48
Using Decision Trees

•Can be used as visual aids to structure and


solve sequential decision problems
•Especially beneficial when the complexity
of the problem grows

Slide 49
Decision Trees
• Three types of “nodes”
• Decision nodes - represented by squares (□)
• Chance nodes - represented by circles (Ο)
• Terminal nodes - represented by triangles (optional)
• Solving the tree involves pruning all but the best decisions at decision nodes,
and finding expected values of all possible states of nature at chance nodes
• Create the tree from left to right
• Solve the tree from right to left

Slide 50
Example Decision Tree

A decision node (square) leads to a chance node (circle), which branches into Event 1, Event 2, and Event 3.

Slide 51
Problem: Jenny Lind

Jenny Lind is a writer of novels. A movie company and a TV network both want exclusive rights to one of her more popular works. If she signs with the network, she will receive a single lump sum, but if she signs with the movie company, the amount she will receive depends on the market response to her movie. What should she do?

Slide 52
Payouts and Probabilities

• Movie company Payouts


• Small box office - $200,000
• Medium box office - $1,000,000
• Large box office - $3,000,000
• TV Network Payout
• Flat rate - $900,000
• Probabilities
• P(Small Box Office) = 0.3
• P(Medium Box Office) = 0.6
• P(Large Box Office) = 0.1

Slide 53
Jenny Lind - Payoff Table

                          States of Nature
Decisions                 Small Box Office   Medium Box Office   Large Box Office
Sign with Movie Company   $200,000           $1,000,000          $3,000,000
Sign with TV Network      $900,000           $900,000            $900,000
Prior Probabilities       0.3                0.6                 0.1
Slide 54
Jenny Lind Decision Tree - Solved

Sign with Movie Co. (ER = $960,000):
  .3  Small Box Office   → $200,000
  .6  Medium Box Office  → $1,000,000
  .1  Large Box Office   → $3,000,000

Sign with TV Network (ER = $900,000):
  .3  Small Box Office   → $900,000
  .6  Medium Box Office  → $900,000
  .1  Large Box Office   → $900,000

The best decision is to sign with the movie company ($960,000 > $900,000).
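Pruning the tree amounts to comparing the two expected returns; a minimal sketch of the chance-node computation:

```python
# Payoffs and prior probabilities from the Jenny Lind payoff table
probs = [0.3, 0.6, 0.1]                   # small, medium, large box office
movie = [200_000, 1_000_000, 3_000_000]   # sign with movie company
tv = [900_000, 900_000, 900_000]          # sign with TV network

# Expected return at each chance node (solve the tree right to left)
er_movie = sum(p * x for p, x in zip(probs, movie))
er_tv = sum(p * x for p, x in zip(probs, tv))
print(round(er_movie), round(er_tv))  # 960000 900000
```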

Slide 55
End of Chapter 4

Slide 56
