SAMPLE SPACE
<def> The set of all possible outcomes of a statistical experiment is called the
sample space and is represented by the symbol S.
EVENTS
Definitions:
<def> If each element of a set A1 is also an element of set A2, then A1 is a subset of A2, written A1 ⊆ A2.
<def> If a set A has no elements, then A is called the null set, A = ∅.
<def> An event, A, is a subset of a sample space, i.e. A ⊆ S.
<def> The complement of an event A with respect to S is the subset of S containing all elements of S that are not in A. The complement of A is denoted by A′, Aᶜ, or Ā.
Statistics Ch 2: Probability Page 2-2
ALGEBRA OF SETS:
1. Union: The union of events A and B, A ∪ B, is the event containing all elements that belong to A or B or both: A ∪ B = {x : x ∈ A or x ∈ B}.
2. Intersection: The intersection of events A and B, A ∩ B, is the event containing all elements that are common to A and B: A ∩ B = {x : x ∈ A and x ∈ B}.
8. Distributive Laws: A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C), A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C).
9. De Morgan's Laws: (A ∪ B)′ = A′ ∩ B′, (A ∩ B)′ = A′ ∪ B′.
10. Complement Laws: A ∪ A′ = S, A ∩ A′ = ∅, (A′)′ = A, S′ = ∅, ∅′ = S.
11. If A ⊆ B and B ⊆ A, then A = B.
12. If A ⊆ B and B ⊆ C, then A ⊆ C.
Venn Diagram: (figure of events A and B omitted)
Multiplication Rule: If a multiple-step experiment has n₁ possible results for the first step, n₂ possible results for the second step, and so on, then the total number of experimental outcomes is n₁ × n₂ × ⋯ × nₖ.
Example: a three-step experiment with 2, 4, and 3 possible results per step has 2 × 4 × 3 = 24 outcomes. (tree diagram omitted)
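The 2 × 4 × 3 = 24 count can be checked by enumerating all outcomes; a minimal sketch in Python, with hypothetical labels for the choices at each step:

```python
from itertools import product

# Multiplication rule: a 3-step experiment with 2, 4, and 3
# possible results per step has 2 * 4 * 3 = 24 outcomes.
step1 = ["a", "b"]                  # 2 choices (hypothetical labels)
step2 = ["w", "x", "y", "z"]        # 4 choices
step3 = ["1", "2", "3"]             # 3 choices

outcomes = list(product(step1, step2, step3))
print(len(outcomes))  # 24
```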
Bradley has invested in two stocks, Markley Oil and Collins Mining. Bradley
has determined that the possible outcomes of these investments three months
from now are as follows.
Investment Gain or Loss in 3 Months (in $000)
Markley Oil:     10, 5, 0, -20
Collins Mining:   8, -2
Tree Diagram: (figure omitted)
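The sample points of the Bradley example can be listed directly, assuming each Markley Oil outcome can pair with each Collins Mining outcome:

```python
from itertools import product

# Sample points for the Bradley example: each outcome pairs a
# Markley Oil result with a Collins Mining result (in $000).
markley = [10, 5, 0, -20]
collins = [8, -2]

sample_points = list(product(markley, collins))
print(len(sample_points))   # 4 * 2 = 8 experimental outcomes
print(sample_points[0])     # (10, 8)
```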
Permutation: (排列)
A permutation is an arrangement of all or part of a set of objects.
Example: The number of permutations of 5 distinct objects is 5 × 4 × 3 × 2 × 1 = 5! = 120.
IV. The number of distinct permutations of n things of which n₁ are of one kind, n₂ of a second kind, …, nₖ of a k-th kind is n!/(n₁!n₂!⋯nₖ!).
Example: How many 5-digit numbers can be constructed from the digits 1, 1, 5, 5, 9?
5!/(2!2!) = (5 × 4 × 3 × 2 × 1)/[(2 × 1)(2 × 1)] = 30.
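The multiset-permutation count can be verified both by the formula and by brute-force enumeration:

```python
from itertools import permutations
from math import factorial

# Distinct 5-digit numbers from the digits 1, 1, 5, 5, 9:
# formula n!/(n1! n2! ... nk!) = 5!/(2! * 2! * 1!) = 30.
by_formula = factorial(5) // (factorial(2) * factorial(2) * factorial(1))
by_enumeration = len(set(permutations("11559")))
print(by_formula, by_enumeration)  # 30 30
```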
Partitioning: (分組)
The number of ways of partitioning a set of n objects into k cells with n₁ elements in the first cell, n₂ in the second, and so on, is
C(n; n₁, n₂, …, nₖ) = n!/(n₁!n₂!⋯nₖ!).
Combination: (組合)
The number of combinations of n distinct objects taken r at a time is
Cᵣⁿ = n!/(r!(n − r)!).
Note: A combination is actually a partition with two cells, the one cell containing
the r objects selected and the other cell containing the (n-r) objects that are
left.
Example:
1. How many outcomes are there when we draw 2 cards from a bridge deck?
   C₂⁵² = 52!/(2!50!) = (52 × 51)/(1 × 2) = 1326.
2. How many of these outcomes give a blackjack (one ace and one ten-valued card)?
   C₁¹⁶ × C₁⁴ = 16 × 4 = 64.
3. What is the probability of a blackjack?
   P{blackjack} = 64/1326 ≈ 0.048.
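These combination counts can be reproduced with Python's `math.comb`:

```python
from math import comb

# Two-card draws from a 52-card deck, and the blackjack count:
# 16 ten-valued cards (10, J, Q, K in four suits) and 4 aces.
total = comb(52, 2)                   # C(52,2) = 1326
blackjack = comb(16, 1) * comb(4, 1)  # 16 * 4 = 64
print(total, blackjack, round(blackjack / total, 3))  # 1326 64 0.048
```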
Example: Lottery (小樂透)
The number of combinations of 42 numbers taken 6 at a time is
C₆⁴² = 42!/(6!36!) = (42 × 41 × 40 × 39 × 38 × 37)/(1 × 2 × 3 × 4 × 5 × 6) = 5,245,786.
A special number (特別號) is then drawn from the remaining 36 numbers: C₁³⁶ = 36.
First prize (頭獎), match all 6 numbers: 1
Second prize (二獎), match 5 numbers plus the special number: 6
Third prize (三獎), match 5 numbers without the special number: C₅⁶C₁³⁶ − 6 = 6 × 36 − 6 = 210
Fourth prize (四獎), match 4 numbers: C₄⁶C₂³⁶ = 15 × 630 = 9,450
Consolation prize (普獎), match 3 numbers: C₃⁶C₃³⁶ = 20 × 7,140 = 142,800
Total winning combinations: 1 + 6 + 210 + 9,450 + 142,800 = 152,467, so the probability of winning any prize is 152,467/5,245,786 ≈ 2.9%.
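A sketch verifying the lottery counts, with the prize definitions as inferred from the formulas above:

```python
from math import comb

# Mini Lotto: choose 6 of 42; a special number is drawn from the other 36.
total = comb(42, 6)                         # 5,245,786 combinations
first = 1                                   # match all 6
second = comb(6, 5) * comb(1, 1)            # match 5 + the special: 6
third = comb(6, 5) * comb(36, 1) - second   # match 5, not special: 210
fourth = comb(6, 4) * comb(36, 2)           # match 4: 9,450
fifth = comb(6, 3) * comb(36, 3)            # match 3: 142,800

winning = first + second + third + fourth + fifth
print(total, winning)             # 5245786 152467
print(round(winning / total, 3))  # 0.029
```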
Assigning Probabilities:
• Classical Method
Assigning probabilities based on the assumption of equally likely outcomes.
• Relative Frequency Method
Assigning probabilities based on experimentation or historical data.
• Subjective Method
Assigning probabilities based on the assignor’s judgment.
Basic Requirements:
1. 0 ≤ P(Eᵢ) ≤ 1 for each experimental outcome Eᵢ.
2. P(E₁) + P(E₂) + ⋯ + P(Eₙ) = 1.
Classical Method:
If an experiment has n possible outcomes, this method would assign a probability of 1/n to each outcome.
(Figure omitted: probability scale from 0 to 1, with 1 labeled "certain".)
Subjective Method:
• When economic conditions and a company’s circumstances change rapidly, it might be inappropriate to assign probabilities based solely on historical data.
• We can use any data available as well as our experience and intuition, but
ultimately a probability value should express our degree of belief that the
experimental outcome will occur.
• The best probability estimates often are obtained by combining the estimates
from the classical or relative frequency approach with the subjective estimates.
PROBABILITY OF AN EVENT
• An event is a collection of sample points. (A subset of the sample space.)
• The probability of any event is equal to the sum of the probabilities of the
sample points in the event.
<def> The probability of the event E is equal to the sum of the probabilities of the sample points in the event E. Therefore,
0 ≤ P(E) ≤ 1, P(∅) = 0, P(S) = 1.
<def> If an experiment can result in any one of N different equally likely outcomes, and if exactly n of these outcomes correspond to event A, then the probability of event A is P(A) = n/N.
Example: What is the probability that a 5-card poker hand holds 2 aces and 3 jacks?
The number of ways to hold 2 aces and 3 jacks: n = C₂⁴ × C₃⁴ = 6 × 4 = 24.
P(A) = n/N = 24/2,598,960 ≈ 9 × 10⁻⁶, where N = C₅⁵² = 2,598,960.
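The poker-hand probability can be reproduced with `math.comb`:

```python
from math import comb

# Five-card hand with exactly 2 aces and 3 jacks.
n = comb(4, 2) * comb(4, 3)  # 6 * 4 = 24 favorable hands
N = comb(52, 5)              # 2,598,960 equally likely hands
print(n, N, n / N)           # 24 2598960 and about 9e-06
```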
In general:
P(E₁ ∪ E₂ ∪ ⋯ ∪ Eₙ) = Σᵢ P(Eᵢ) − Σ_{i<j} P(Eᵢ ∩ Eⱼ) + Σ_{i<j<k} P(Eᵢ ∩ Eⱼ ∩ Eₖ) − ⋯ + (−1)ⁿ⁻¹ P(E₁ ∩ E₂ ∩ ⋯ ∩ Eₙ).
If E₁, E₂, …, Eₙ are mutually exclusive, then P(∪ᵢ₌₁ⁿ Eᵢ) = Σᵢ₌₁ⁿ P(Eᵢ).
If A ⊆ B, then P(A) ≤ P(B).
(Venn diagram omitted: sample space S with mutually exclusive events A₁, A₂, A₃, …, Aₙ.)
Independent Events(獨立事件):
• Events A and B are independent if P(A | B) = P(A) or, equivalently, P(B | A) = P(B).
• That is, the knowledge that A (or B) has occurred does not affect the probability that B (or A) occurs. Otherwise, A and B are dependent.
• The multiplication law, P(A ∩ B) = P(A)P(B), can also be used as a test of whether two events are independent.
Multiplicative Rules:
1. P(A ∩ B) = P(B)P(A | B) = P(A)P(B | A).
2. P(E₁ ∩ E₂ ∩ ⋯ ∩ Eₙ) = P(E₁)P(E₂ | E₁)P(E₃ | E₁ ∩ E₂) ⋯ P(Eₙ | E₁ ∩ E₂ ∩ ⋯ ∩ Eₙ₋₁).
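Rule 1 can be illustrated with a hypothetical card-drawing example (not from the text): the probability of drawing two aces in a row, without replacement, is P(E₁)P(E₂ | E₁):

```python
from fractions import Fraction

# Multiplicative rule: P(E1 ∩ E2) = P(E1) P(E2 | E1).
# Hypothetical example: two aces in a row from a 52-card deck,
# drawing without replacement.
p_first_ace = Fraction(4, 52)
p_second_ace_given_first = Fraction(3, 51)
p_both = p_first_ace * p_second_ace_given_first
print(p_both)  # 1/221
```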
Example: S={1,2,3,4},E={1,2},F={1,3},G={1,4}
P(EF)=1/4=P(E)P(F) =>E and F are independent.
P(EG)=1/4=P(E)P(G) =>E and G are independent.
P(FG)=1/4=P(F)P(G) =>F and G are independent.
But, P(EFG)=1/4 ≠ P(E)P(F)P(G)=1/8 =>E, F and G are not jointly independent.
Events E, F and G are pairwise independent but not jointly independent.
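The example can be verified directly by computing event probabilities over the uniform sample space:

```python
from fractions import Fraction

# S = {1,2,3,4} with equally likely outcomes; E={1,2}, F={1,3}, G={1,4}.
S = {1, 2, 3, 4}
E, F, G = {1, 2}, {1, 3}, {1, 4}

def P(event):
    return Fraction(len(event & S), len(S))

# Pairwise independent:
assert P(E & F) == P(E) * P(F) == Fraction(1, 4)
assert P(E & G) == P(E) * P(G) == Fraction(1, 4)
assert P(F & G) == P(F) * P(G) == Fraction(1, 4)
# But not jointly independent:
print(P(E & F & G), P(E) * P(F) * P(G))  # 1/4 1/8
```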
Example: Suppose n men throw their hats in a bag and each randomly selects
a hat. What is the probability that none has his own hat?
Let Eᵢ be the event that man i has his own hat. Then P(Eᵢ) = 1/n,
P(Eᵢ₁ ∩ Eᵢ₂) = 1/(n(n − 1)) = (n − 2)!/n!, …, P(Eᵢ₁ ∩ Eᵢ₂ ∩ ⋯ ∩ Eᵢₖ) = (n − k)!/n!.
The probability that none has his own hat is
1 − P(∪ᵢ₌₁ⁿ Eᵢ) = 1 − [Σᵢ P(Eᵢ) − Σ_{i<j} P(Eᵢ ∩ Eⱼ) + Σ_{i<j<k} P(Eᵢ ∩ Eⱼ ∩ Eₖ) − ⋯ + (−1)ⁿ⁻¹ P(E₁E₂⋯Eₙ)]
= 1 − C₁ⁿ(n − 1)!/n! + C₂ⁿ(n − 2)!/n! − ⋯ + (−1)ᵏ Cₖⁿ(n − k)!/n! + ⋯ + (−1)ⁿ Cₙⁿ(n − n)!/n!
= 1/2! − 1/3! + 1/4! − ⋯ + (−1)ⁿ/n! ≈ e⁻¹.
NOTE: eˣ = 1 + x + x²/2! + x³/3! + ⋯
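The inclusion-exclusion sum can be evaluated numerically to see the convergence to e⁻¹:

```python
from math import exp, factorial

# Probability that none of n men gets his own hat:
# 1 - P(union E_i) = sum_{k=0}^{n} (-1)^k / k!, which tends to e^(-1).
def p_no_match(n):
    return sum((-1) ** k / factorial(k) for k in range(n + 1))

for n in (3, 5, 10):
    print(n, round(p_no_match(n), 6))
print(round(exp(-1), 6))  # 0.367879
```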
BAYES’ RULE:
• Often we begin probability analysis with initial or prior probabilities(事前
機率).
• Then, from a sample, special report, or a product test we obtain some
additional information.
• Given this information, we calculate revised or posterior probabilities (事
後機率).
• Bayes’ theorem provides the means for revising the prior probabilities.
Prior Probabilities → New Information → Application of Bayes’ Theorem → Posterior Probabilities
P(B) = P(B ∩ A₁) + P(B ∩ A₂) = P(A₁)P(B | A₁) + P(A₂)P(B | A₂).
(Venn diagram omitted: sample space S partitioned into A₁, A₂, …, Aₙ, with event B overlapping the partition.)
In general:
P(B) = P(B ∩ A₁) + P(B ∩ A₂) + ⋯ + P(B ∩ Aₙ)
     = Σᵢ₌₁ⁿ P(B ∩ Aᵢ) = Σᵢ₌₁ⁿ P(Aᵢ)P(B | Aᵢ)
     = P(A₁)P(B | A₁) + P(A₂)P(B | A₂) + ⋯ + P(Aₙ)P(B | Aₙ).
Example:
Prior probabilities: P(A1) = .7, P(A2) = .3.
New information (conditional probabilities): P(B | A1) = .2, P(B | A2) = .9.
Then P(B) = (.7)(.2) + (.3)(.9) = .41.
P(Aᵢ | B) = P(Aᵢ)P(B | Aᵢ) / [P(A₁)P(B | A₁) + P(A₂)P(B | A₂) + ⋯ + P(Aₙ)P(B | Aₙ)].
Bayes’ theorem is applicable when the events for which we want to compute
posterior probabilities are mutually exclusive and their union is the entire sample
space.
Example:
P(A₁ | B) = P(A₁)P(B | A₁) / [P(A₁)P(B | A₁) + P(A₂)P(B | A₂)] = (0.7)(0.2)/[(0.7)(0.2) + (0.3)(0.9)] = 0.14/0.41 ≈ 0.34.
P(A₂ | B) = P(A₂)P(B | A₂) / [P(A₁)P(B | A₁) + P(A₂)P(B | A₂)] = (0.3)(0.9)/0.41 = 0.27/0.41 ≈ 0.66.
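The posterior computation for this example can be scripted as:

```python
# Bayes' theorem with the priors and conditionals from the example.
priors = {"A1": 0.7, "A2": 0.3}
cond = {"A1": 0.2, "A2": 0.9}  # P(B | Ai)

p_b = sum(priors[a] * cond[a] for a in priors)       # total probability of B
posterior = {a: priors[a] * cond[a] / p_b for a in priors}
print(round(p_b, 2))                                 # 0.41
print(round(posterior["A1"], 2), round(posterior["A2"], 2))  # 0.34 0.66
```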
Example: In a multiple choice test,
p = the probability that a student knows the answer
1- p = the probability that a student guesses
m = the number of choices in each question
What is the probability that a student knew the answer given that he answered
it correctly?
Let E be the event that he answers the question correctly and F be the event
that he knows the answer.
Then, P(E) = P(F)P(E | F) + P(F′)P(E | F′) = p × 1 + (1 − p) × (1/m).
P(F | E) = P(FE)/P(E) = P(F)P(E | F) / [P(F)P(E | F) + P(F′)P(E | F′)]
         = p / [p × 1 + (1 − p)(1/m)] = mp / (1 + (m − 1)p).
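As a numerical check of the closed form, with hypothetical values p = 1/2 and m = 4:

```python
from fractions import Fraction

# P(knew | correct) = mp / (1 + (m - 1) p), derived above.
def p_knew_given_correct(p, m):
    # P(F|E) = P(F)P(E|F) / [P(F)P(E|F) + P(F')P(E|F')]
    return (p * 1) / (p * 1 + (1 - p) * Fraction(1, m))

# Hypothetical numbers: 4 choices, student knows half the answers.
p, m = Fraction(1, 2), 4
direct = p_knew_given_correct(p, m)
closed_form = m * p / (1 + (m - 1) * p)
print(direct, closed_form)  # 4/5 4/5
```

So with four choices, a correct answer raises the probability the student actually knew it from 1/2 to 4/5.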
Decision Process:
Don't lean your ladder against the wrong wall! (別靠錯梯子!)
Now (現在) → Future (未來): the you of 5 or 10 years from now! (5年或10年後的你!)