Introduction – Probability
Let P(E) denote the probability of event E. The probability of any event lies between 0 and 1: 0 ≤ P(E) ≤ 1.
Laws of Probability:
1.
The probability of an event E lies between 0 and 1 (that is, between 0% and 100%). If the probability is 1, the event is certain to happen.
2.
The probability that an event does not occur is 1 minus the probability that it does occur: P(not E) = 1 - P(E). For example, the probability of the event "not obtaining a nine by throwing two dice" equals 1 minus the probability of obtaining a nine.
3.
P(A or B) = P(A) + P(B), if A, B are mutually exclusive
The probability of either event A or event B happening equals the sum of the probabilities of event A and event B.
4.
P(A and B) = P(A) P(B), if A, B are independent (Two events are independent of each other if the occurrence of one does not affect the probability of the other's occurrence).
The probability of the occurrence of two independent events equals the product of the probabilities of those two events. Consider the following events:
· (a) When two dice are thrown, the outcomes of the two dice are independent of each other.
· (b) The color of a person's eyes is not independent of the color of his/her hair.
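As an illustration of law 2 (this short sketch is not part of the original notes), the probability of "not obtaining a nine with two dice" can be checked by enumerating all 36 equally likely outcomes:

```python
from itertools import product

# All 36 equally likely outcomes of throwing two dice
outcomes = list(product(range(1, 7), repeat=2))

# P(nine): the pairs (3,6), (4,5), (5,4), (6,3) total nine -> 4/36
p_nine = sum(1 for a, b in outcomes if a + b == 9) / len(outcomes)

# Law 2: P(not nine) = 1 - P(nine)
p_not_nine = 1 - p_nine
print(p_nine, p_not_nine)  # 4/36 and 32/36 as decimals
```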
Conditional Probability
Example:
Let A: a card drawn is a King
B: a card drawn is a face card
Then P(A) = 4/52
To determine P(A|B), we would think of drawing a King not from the whole deck, but just from the
group of all face cards, of which there are only 12.
Therefore, P(A|B) = 4/12
The effect of knowing B, then, is to cut down the size of the sample space. We throw away all
those points that have nothing to do with B.
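The "cut down the sample space" idea can be made concrete by building the deck explicitly and restricting it to the face cards; this is an illustrative sketch, not part of the original notes:

```python
# P(King | face card) by restricting the sample space to face cards
ranks = [str(n) for n in range(2, 11)] + ["J", "Q", "K", "A"]
suits = ["clubs", "diamonds", "hearts", "spades"]
deck = [(r, s) for r in ranks for s in suits]          # 52 cards

kings = [c for c in deck if c[0] == "K"]               # event A: 4 cards
faces = [c for c in deck if c[0] in ("J", "Q", "K")]   # event B: 12 cards

p_a = len(kings) / len(deck)                           # P(A) = 4/52
# Conditioning on B: count Kings only among the 12 face cards
p_a_given_b = sum(1 for c in faces if c[0] == "K") / len(faces)  # 4/12
print(p_a, p_a_given_b)
```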
Let us look at a second example. What is the probability of throwing a seven with a pair of dice if
you know that the first die shows 3?
Solution:
1. Draw a sample space of all possible outcomes for a pair of dice.
2. Identify the outcomes corresponding to the following events:
A: both dice total seven
B: first die shows three
3. Then P(A|B) = P(AB)/P(B) = (1/36)/(6/36) = 1/6.
P(AB) = P(A|B)P(B)
Similarly, it can be proved that
P(AB) = P(B|A)P(A)
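The dice example above can be enumerated to confirm both the conditional probability and the multiplication rule P(AB) = P(A|B)P(B); the code below is an illustrative sketch, not part of the original notes:

```python
from itertools import product
from fractions import Fraction

# A: both dice total seven; B: first die shows three
outcomes = list(product(range(1, 7), repeat=2))
a = {o for o in outcomes if sum(o) == 7}
b = {o for o in outcomes if o[0] == 3}

p_b = Fraction(len(b), len(outcomes))        # P(B)  = 6/36
p_ab = Fraction(len(a & b), len(outcomes))   # P(AB) = 1/36, only (3, 4)
p_a_given_b = Fraction(len(a & b), len(b))   # P(A|B) = 1/6

assert p_ab == p_a_given_b * p_b             # multiplication rule
print(p_a_given_b)  # 1/6
```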
Let B1, B2, ..., Bn represent multiple, mutually exclusive and jointly exhaustive events. Then, for any event A,
P(A) = P(A|B1)P(B1) + P(A|B2)P(B2) + ... + P(A|Bn)P(Bn)
Bayesian Statistics
An essential element of Bayesian Statistics is the revision of probabilities in the light of new
information. This can be best illustrated with a few examples.
Example:
Given 3 identical, unmarked boxes:
Box A contains 2 gold coins
Box B contains 1 gold coin and 1 silver coin
Box C contains 2 silver coins
Choose 1 box at random.
What is the probability that a particular box, say box A, was chosen?
Answer: P(A) = 1/3
Pick one coin out of the chosen box without observing the other coin. Assume the coin picked is
gold (event G). What is now the probability that box A was chosen?
Answer: P(A|G) = 2/3
P(A) is the prior probability of having picked box A. However, once we know that the box picked
contained at least one gold coin we can use this information to revise the prior probability. The
revised probability P(A|G) is known as the posterior probability.
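The posterior P(A|G) = 2/3 can also be checked by simulation; the following Monte Carlo sketch (an illustration, not part of the original notes) repeats the box-and-coin experiment many times and keeps only the trials where a gold coin was drawn:

```python
import random

def estimate_posterior(trials=100_000, seed=0):
    """Estimate P(box A | gold coin drawn) by simulation."""
    rng = random.Random(seed)
    boxes = {"A": ["gold", "gold"], "B": ["gold", "silver"], "C": ["silver", "silver"]}
    gold_draws = 0   # trials where the drawn coin was gold
    gold_from_a = 0  # of those, trials where the box was A
    for _ in range(trials):
        box = rng.choice(list(boxes))    # choose a box at random
        coin = rng.choice(boxes[box])    # draw one coin at random
        if coin == "gold":
            gold_draws += 1
            if box == "A":
                gold_from_a += 1
    return gold_from_a / gold_draws

print(estimate_posterior())  # close to 2/3
```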
A systematic way of revising probabilities
Example 1:
The chance of picking a gold coin (after having picked the box at random) is P(G) = 0.50; of
this, 0.33 will involve box A, 0.17 box B. Therefore, if a gold coin is picked, the chance that we
have also picked box A is:
P(A|G) = 0.33/0.50 = 2/3
The Bayes Theorem
P(Si | E) = P(E|Si) P(Si) / [P(E|S1)P(S1) + P(E|S2)P(S2) + ... + P(E|Sn)P(Sn)]
where :
P(Si | E) = probability of state Si given an event E is observed
P(E|Si) = probability of event E occurring given the state is Si.
P(Si) = probability of state Si occurring
n = total number of all possible states
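Bayes' theorem as stated above maps directly onto a few lines of code. This sketch (an illustration, not part of the original notes) computes all posteriors P(Si|E) at once, using the law of total probability for the denominator, and applies it to the three-box example:

```python
def posteriors(priors, likelihoods):
    """Bayes' theorem: P(Si|E) = P(E|Si)P(Si) / sum_j P(E|Sj)P(Sj)."""
    joints = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joints)  # P(E), by the law of total probability
    return [j / total for j in joints]

# Three-box example: priors 1/3 each; P(gold | box) = 1, 1/2, 0 for A, B, C.
post = posteriors([1/3, 1/3, 1/3], [1.0, 0.5, 0.0])
print(post)  # posteriors for A, B, C: 2/3, 1/3, 0
```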
Example 2
An automatic machine requires daily adjustment. If adjusted correctly, it will produce 90%
acceptable parts; if adjusted incorrectly, only 20% of all parts produced are acceptable. From past
experience we know that about 70% of all machine adjustments are done correctly. One day the
machine is adjusted and the first three parts produced are found to be acceptable.
What is the probability that the machine has been adjusted properly?
Solution:
P(3 good | correct setup) = 0.9^3 = 0.729, so P(correct setup and 3 good) = 0.729 x 0.70 = 0.5103
P(3 good | incorrect setup) = 0.2^3 = 0.008, so P(incorrect setup and 3 good) = 0.008 x 0.30 = 0.0024
P(3 good) = 0.5103 + 0.0024 = 0.5127
Therefore, P(Setup correct | 3 Good Parts) = 0.5103/0.5127 = 99.5%
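The machine-adjustment example works out as follows in code (an illustrative sketch, not part of the original notes); the likelihood of three acceptable parts in a row is the per-part probability cubed, since the parts are assumed independent:

```python
p_correct = 0.70              # prior: machine adjusted correctly
p_good_if_correct = 0.90      # P(acceptable part | correct setup)
p_good_if_incorrect = 0.20    # P(acceptable part | incorrect setup)

# Likelihood of three acceptable parts in a row under each state
like_correct = p_good_if_correct ** 3      # 0.729
like_incorrect = p_good_if_incorrect ** 3  # 0.008

# Joint probabilities, then Bayes' theorem
joint_correct = like_correct * p_correct             # 0.5103
joint_incorrect = like_incorrect * (1 - p_correct)   # 0.0024
posterior_correct = joint_correct / (joint_correct + joint_incorrect)
print(round(posterior_correct, 4))  # 0.9953
```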