
PROBABILITY

MODULE-1
CHAPTER-3
SUBJECT: BUSINESS
STATISTICS
COMPILED BY: KAJAL MEHTA
Topics to be covered in the Chapter

• Introduction
• Theories of Probability
• Laws of Probability
• Inverse Probability
• Revision of Probability: Bayes' Rule
Introduction
• Jacob Bernoulli (1654–1705), Abraham de Moivre (1667–1754), the Reverend Thomas Bayes (1702–61) and Joseph Lagrange (1736–1813) developed probability formulae and techniques.

• Probability was applied successfully at the gambling tables and, more relevantly, to social and economic problems.

• The mathematical theory of probability is the basis for statistical applications in both social and decision-making research.
Introduction
• Probability is the chance that something will happen.
• Probabilities are expressed as fractions or decimals between 0 and 1.
• A probability of zero means that something will never happen; a probability of one means that it will always happen.
• In probability theory, an event is one or more of the possible
outcomes of doing something.
Introduction

• The activity that produces such an event is


referred to in probability theory as an experiment.

• Events are said to be mutually exclusive, if one


and only one of them can take place at a time.

• When a list of the possible events that can result


from an experiment includes every possible
outcome, the list is said to be collectively
exhaustive.
INTRODUCTION
• EVENTS
• Elementary or Complementary Events
• Elementary event: rolling a die and getting the number '2'; its probability is denoted P(A).
• Complementary event: not A, written A'; P(A') = 1 - P(A).
• Mutually Exclusive or Independent Events
• Events are said to be mutually exclusive if one and only one of them can take place at a time.
• Events are independent if the occurrence or non-occurrence of one event does not affect the other.
• Collectively Exhaustive Events
• All possible elementary events for an experiment.
• Rolling a die: {1, 2, 3, 4, 5, 6}
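A minimal Python sketch of these definitions, assuming a fair six-sided die: the single outcome '2' is an elementary event, its complement follows P(A') = 1 - P(A), and the set {1, 2, 3, 4, 5, 6} is collectively exhaustive.

```python
from fractions import Fraction

# Sample space for rolling a fair die: a collectively exhaustive set of outcomes.
sample_space = {1, 2, 3, 4, 5, 6}

# Elementary event A: rolling the number '2'.
event_a = {2}
p_a = Fraction(len(event_a), len(sample_space))   # P(A) = 1/6

# Complementary event A': not rolling a '2'.
p_a_complement = 1 - p_a                          # P(A') = 1 - P(A) = 5/6

print(p_a, p_a_complement)                        # 1/6 5/6
```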
Theories or Types of Assigning Probability

• Classical Approach
• Relative Frequency Approach
• Subjective Approach
❖ Classical Approach
▪ Defines probability as the ratio of favorable outcomes to the total number of outcomes.
▪ Also known as a priori probability.
▪ It rests on a number of assumptions, hence it is the most restrictive approach and the least useful in real-life situations.
❖ Relative Frequency Approach
▪ Defines probability as the observed relative frequency of an event in a very large number of trials.
▪ It makes fewer assumptions, but requires that the event can be repeated a large number of times.
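A short sketch contrasting the two approaches for the same event (rolling a '2' with a fair die); the simulation size of 100,000 trials is an arbitrary choice for illustration.

```python
import random

# Classical approach: 1 favorable outcome (a '2') out of 6 equally likely outcomes.
p_classical = 1 / 6

# Relative frequency approach: estimate the same probability from many repeated trials.
trials = 100_000
hits = sum(1 for _ in range(trials) if random.randint(1, 6) == 2)
p_relative = hits / trials

print(f"classical: {p_classical:.4f}  relative frequency: {p_relative:.4f}")
# With a very large number of trials, the relative frequency settles close to 1/6.
```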
❖ Subjective Probability
▪ Deals with specific or unique situations typical of the business or management world.
▪ Based upon some belief or educated guess of the decision maker.
▪ Subjective assessments of probability permit the widest flexibility of the three concepts; also known as personal probability.
PROBABILITY RULES/LAWS

❖ Most managers who use probabilities are concerned with two conditions:
• The case where one event or another will occur.
• The situation where two or more events will both occur.
❖ A single probability means that only one event can take place; it is called a marginal or unconditional probability.
Probabilities Under Conditions of Statistical
Independence
• Occurrence of one event has no effect on the probability of the
occurrence of any other event.
• There are three types of probabilities under statistical independence:
• Marginal
– Probability of the occurrence of a single event: P(A)
• Joint
– P(AB) = P(A) × P(B)
• Conditional
– P(A/B) = P(A) and P(B/A) = P(B)
• Under statistical independence, the assumption is that the events are not related.
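A brief sketch of the three probabilities under independence, using a hypothetical example of two independent fair-coin tosses (the coin example is an assumption, not from the slides).

```python
# Two independent events: A = "first coin lands heads", B = "second coin lands heads".
p_a = 0.5          # marginal probability of A
p_b = 0.5          # marginal probability of B

# Joint probability under independence: P(AB) = P(A) * P(B)
p_ab = p_a * p_b   # 0.25

# Conditional probabilities under independence: knowing B tells us nothing about A.
p_a_given_b = p_a  # P(A/B) = P(A)
p_b_given_a = p_b  # P(B/A) = P(B)

print(p_ab, p_a_given_b, p_b_given_a)   # 0.25 0.5 0.5
```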
Probabilities Under Conditions of Statistical
Dependence
• When the probability of some event is dependent on or
affected by the occurrence of some other event.

• As in the independent case, there are three types of probabilities under statistical dependence:
• Conditional:
– P(A/B) = P(AB)/P(B) and P(B/A) = P(AB)/P(A)
• Joint:
– P(AB) = P(A/B) × P(B) and P(AB) = P(B/A) × P(A)
• Marginal:
– Marginal probabilities under statistical dependence are computed by summing the probabilities of all the joint events in which the simple event occurs.
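A small sketch of these relationships; the joint probabilities below are hypothetical numbers chosen only to illustrate how the marginal and conditional values are derived under dependence.

```python
# Hypothetical joint probabilities for two dependent events A and B.
p_ab          = 0.15   # P(A and B)
p_a_not_b     = 0.25   # P(A and B')
p_not_a_b     = 0.35   # P(A' and B)
p_not_a_not_b = 0.25   # P(A' and B')  (the four joint events sum to 1)

# Marginal probabilities: sum the joint probabilities containing the simple event.
p_a = p_ab + p_a_not_b          # P(A) = 0.40
p_b = p_ab + p_not_a_b          # P(B) = 0.50

# Conditional probabilities under dependence.
p_a_given_b = p_ab / p_b        # P(A/B) = P(AB)/P(B) = 0.30
p_b_given_a = p_ab / p_a        # P(B/A) = P(AB)/P(A) = 0.375

print(p_a, p_b, p_a_given_b, p_b_given_a)
```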
Addition Rule of Probability
• General Law of Addition:
P(A or B) = P(A) + P(B) - P(A and B)
• Special Law of Addition (for mutually exclusive events):
P(A or B) = P(A) + P(B)

Multiplication Rule of Probability
• General Law of Multiplication (both events must occur):
P(A and B) = P(A) × P(B/A) or P(A and B) = P(B) × P(A/B)
• Special Law of Multiplication (for independent events):
P(A and B) = P(A) × P(B), since P(A/B) = P(A) and P(B/A) = P(B)
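A short sketch checking the laws above on a hypothetical draw of one card from a standard 52-card deck (A = heart, B = king); the deck example is an assumption, not taken from the slides.

```python
from fractions import Fraction

# Draw one card from a standard 52-card deck.
# A = "card is a heart", B = "card is a king".
p_a       = Fraction(13, 52)   # P(A)
p_b       = Fraction(4, 52)    # P(B)
p_a_and_b = Fraction(1, 52)    # P(A and B): the king of hearts

# General law of addition: P(A or B) = P(A) + P(B) - P(A and B)
p_a_or_b = p_a + p_b - p_a_and_b          # 16/52 = 4/13

# General law of multiplication: P(A and B) = P(A) * P(B/A)
p_b_given_a = p_a_and_b / p_a             # P(B/A) = 1/13
assert p_a * p_b_given_a == p_a_and_b

# Special law of multiplication (A and B happen to be independent here): P(A and B) = P(A) * P(B)
assert p_a * p_b == p_a_and_b

print(p_a_or_b)                           # 4/13
```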
Revising Prior Estimates of
Probabilities—Bayes’ Theorem
• Bayes' theorem expresses how a subjective degree of belief should rationally change to account for evidence.
• Probabilities can be revised as more (additional) information is gained; the new probability is known as the posterior probability.
• In the Bayesian interpretation, Bayes' theorem is fundamental to Bayesian statistics and has applications in fields including science, engineering, medicine and law.
• In the Bayesian (or epistemological) interpretation, probability measures a degree of belief. Bayes' theorem then links the degree of belief in a proposition before and after accounting for evidence.
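A minimal sketch of revising a prior into a posterior with Bayes' rule, P(A/B) = P(B/A) × P(A) / P(B); the machine-setup scenario and its numbers are hypothetical, chosen only to show the revision step.

```python
# Hypothetical example: A = "machine is correctly set up", B = "first item produced is good".
p_a             = 0.8    # prior probability that the setup is correct
p_b_given_a     = 0.9    # P(B/A): chance of a good item if the setup is correct
p_b_given_not_a = 0.3    # P(B/A'): chance of a good item if the setup is wrong

# Total (marginal) probability of observing a good item.
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)   # 0.78

# Bayes' rule: revise the prior into a posterior after seeing the evidence B.
p_a_given_b = p_b_given_a * p_a / p_b                   # about 0.923

print(round(p_b, 2), round(p_a_given_b, 3))
```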
Note down important
formulas in your book
