
Probability and probability distributions
By: Amare M
amaremu7@gmail.com
Objectives
At the end of the session students should be able to:

 Define and calculate probabilities
 Understand the different properties of probability
 Recognize the various types of probability distributions: the Binomial, Poisson and Normal (Gaussian) probability distributions
Probability
 What comes to your mind when you think of probability?
 Probability is the chance of observing a particular outcome, or the likelihood of an event.
 A way of quantifying uncertainty
 It assumes a “stochastic” or “random” process, i.e. the outcome is not predetermined; there is an element of chance
 Processes like flipping a coin, rolling a die or drawing a card from a deck are probability experiments
 An understanding of probability is fundamental for quantifying the uncertainty that is inherent in the decision-making process
 Probability theory also allows us to draw conclusions about a population of patients based on known information about a sample of patients drawn from that population
 In fact, probability theory is the foundation of statistical inference
 When can we talk about probability?
 When dealing with a process that has an uncertain outcome:
o Sex of a child (male/female)
o Flipping a coin (head/tail)
o A patient taking a certain drug (cure/no cure)
o The fate of a patient (recovery/death)
o etc.
Basic concepts
 Random experiment: a random experiment is one whose result depends on chance, that is, the result cannot be predicted. Tossing coins and throwing dice are examples of random experiments.

 Trial: performing a random experiment is called a trial.

 Outcomes: the results of a random experiment are called its outcomes. When two coins are tossed the possible outcomes are HH, HT, TH, TT.
 Event: an outcome or a combination of outcomes of a random experiment is called an event. For example, tossing a coin is a random experiment and getting a head or a tail is an event.

 Sample space: each conceivable outcome of an experiment is called a sample point. The totality of all sample points is called the sample space and is denoted by S. For example, when a coin is tossed, the sample space is S = {H, T}. H and T are the sample points of the sample space S.
 Equally likely events: two or more events are said to be equally likely if each one of them has an equal chance of occurring. For example, in tossing a coin, the event of getting a head and the event of getting a tail are equally likely events.

 Mutually exclusive events: two or more events are said to be mutually exclusive when the occurrence of any one event excludes the occurrence of the other events. Mutually exclusive events cannot occur simultaneously.
 Complementary events: the event ‘A occurs’ and the event ‘A does not occur’ are called complementary events to each other. The event ‘A does not occur’ is denoted by A′ or Aᶜ. An event and its complement are mutually exclusive.

 For example, in throwing a die, the event of getting an odd number is {1, 3, 5} and the event of getting an even number is {2, 4, 6}. These two events are mutually exclusive and complementary to each other.
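The die example above can be checked by enumeration; a short sketch (Python is used here purely for illustration):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # sample space: one throw of a fair die
odd = {1, 3, 5}          # event A: an odd number turns up
even = {2, 4, 6}         # event A': an even number turns up

def p(event):
    # classical probability: favourable outcomes / all outcomes
    return Fraction(len(event), len(S))

assert odd & even == set()      # A and A' are mutually exclusive
assert p(odd) + p(even) == 1    # complements cover the sample space
print(p(odd))  # 1/2
```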
Definition of probability
 The probability of an event is the proportion of times that event occurs in a very large number of trials
• The probability that something occurs is the proportion of times it occurs when exactly the same experiment is repeated a very large (preferably infinite!) number of times in independent trials; this is the frequentist definition of probability.
- “Independent” means the outcome of one trial of the experiment does not affect any other outcome.
Probability of an event E
 A number between 0 and 1 representing the
proportion of times that event E is expected to
happen when the experiment is done over and over
again under the same conditions

Types of probability
1. Objective probability

 Classical/theoretical probability

 Relative frequency probability

2. Subjective/judgmental probability
Classical probability
 If an event can occur in N mutually exclusive and equally likely ways, and if m of these possess a characteristic E, then the probability of the occurrence of E is m/N

 P(E) = the probability of E = m/N
Example

 If we roll a die, what is the probability of turning up a 4?

 m = 1 (only the outcome 4) and N = 6

 The probability of a 4 turning up is 1/6 ≈ 0.167

 There are 2 possible outcomes {H, T} in the set of all possible trials of tossing a coin

 P(H) = 1/2 = 0.5

 P(T) = 1/2 = 0.5
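The counting rule P(E) = m/N translates directly into code; a minimal sketch:

```python
from fractions import Fraction

def classical_probability(favourable, sample_space):
    """Classical probability P(E) = m/N for equally likely outcomes."""
    return Fraction(len(favourable), len(sample_space))

die = {1, 2, 3, 4, 5, 6}
print(classical_probability({4}, die))     # 1/6

coin = {"H", "T"}
print(classical_probability({"H"}, coin))  # 1/2
```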
 The exact outcome is unknown before conducting the experiment

 All possible outcomes of the experiment are known

 Each outcome is equally likely

 The experiment can be repeated under uniform conditions
Empirical probability/Relative frequency method
 Based on observations obtained from probability experiments
 The proportion of times the event A occurs in a large number of trials repeated under essentially identical conditions
 If a process is repeated a large number of times (n), and an event with the characteristic E occurs m times, then the relative frequency of E is
 P(E) = probability of E = m/n
 This method is used for an experiment where it is not possible to apply the classical approach (usually because the outcomes are not equally likely or the experiment is not repeatable under uniform conditions).
 The probability of an event E is the relative frequency of occurrence of E, or the proportion of times E occurs in a large number of trials of the experiment
Example

 If you toss a coin 100 times and heads turns up 35 times,

P(H) = 35/100 = 0.35

 If we toss a coin 10,000 times and tails turns up 5,687 times,

P(T) = 5,687/10,000 = 0.5687
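The relative-frequency definition is easy to simulate; the seed and trial counts below are arbitrary choices for this sketch:

```python
import random

def empirical_probability(trials, seed=0):
    """Estimate P(H) as m/n over `trials` simulated coin tosses."""
    rng = random.Random(seed)  # seeded so the run is reproducible
    heads = sum(rng.choice("HT") == "H" for _ in range(trials))
    return heads / trials

# The estimate settles near the true value 0.5 as n grows.
print(empirical_probability(100))
print(empirical_probability(10_000))
```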
Subjective probability/Judgmental method
 Personalistic (an opinion or judgment by a decision maker about the likelihood of an event)
o Personal assessment of which treatment is more effective in providing a cure (traditional/modern)

o Personal assessment of which sports team will win a match

o May also use classical and empirical methods to assess the likelihood of an event, but does not rely on the repeatability of any process
 A degree of belief that an outcome will occur

 A best guess based on experience and wise judgment produces good estimates of the chance of a particular outcome occurring

 A poor guess or poor judgment produces unreliable estimates of the chance of a particular outcome occurring
 Make your best guess of the likelihood of the event, based on reliable information and good judgment.

 Inform yourself about the issue or discuss it with someone knowledgeable.

 The probability is a number between 0 and 1 that represents your degree of belief that the event will occur
Types of events in probability
Independent vs. dependent events
 Independent events: events whose outcome does not depend on some previous outcome
o No matter how many times an experiment has been conducted, the probability of occurrence of independent events will be the same, e.g. tossing a coin

 Dependent events: events whose outcome depends on a previous outcome
o That is, the probability of occurrence of a dependent event will be affected by some previous outcome.
For example:

 Suppose a bag has 3 red and 6 green balls.

 Two balls are drawn from the bag one after the other.

 Let A be the event of drawing a red ball in the first draw and B be the event of drawing a green ball in the second draw.

 If the ball drawn in the first draw is not replaced in the bag, then A and B are dependent events, because P(B) decreases or increases according to whether the first draw results in a red or a green ball
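Conditioning on the first draw makes the dependence explicit; a short sketch of the bag example:

```python
from fractions import Fraction

red, green = 3, 6
total = red + green  # 9 balls in the bag

# Without replacement, P(green on 2nd draw) depends on the 1st draw:
p_green_given_red   = Fraction(green, total - 1)      # 6/8 = 3/4
p_green_given_green = Fraction(green - 1, total - 1)  # 5/8

# With replacement the draws are independent:
p_green_independent = Fraction(green, total)          # 6/9 = 2/3

print(p_green_given_red, p_green_given_green, p_green_independent)
```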
Intersection & union of events
Intersection of events

Union of events

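Since events are subsets of the sample space, intersection and union map directly onto set operations; the die events below are illustrative:

```python
S = {1, 2, 3, 4, 5, 6}   # sample space of one die throw
A = {2, 4, 6}            # event: an even number turns up
B = {4, 5, 6}            # event: a number greater than 3 turns up

print(A & B)   # intersection: outcomes in both A and B
print(A | B)   # union: outcomes in A or B (or both)
```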
Properties of probability

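The properties usually summarized at this point follow from the basic axioms of probability; in brief:

```latex
\begin{align*}
& 0 \le P(E) \le 1 \quad \text{for any event } E \\
& P(S) = 1, \qquad P(\varnothing) = 0 \\
& P(A') = 1 - P(A) \\
& P(A \cup B) = P(A) + P(B) \quad \text{if } A \text{ and } B \text{ are mutually exclusive}
\end{align*}
```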
Basic probability rules
1. Addition rule

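In symbols, the general addition rule for any two events A and B is:

```latex
P(A \cup B) = P(A) + P(B) - P(A \cap B)
```

If A and B are mutually exclusive, P(A ∩ B) = 0 and the rule reduces to P(A ∪ B) = P(A) + P(B).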
2. Multiplication rule

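In symbols, the multiplication rule for any two events A and B is:

```latex
P(A \cap B) = P(A)\,P(B \mid A)
```

If A and B are independent, P(B | A) = P(B), so the rule reduces to P(A ∩ B) = P(A) P(B).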
Conditional probability

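The conditional probability of A given that B has occurred is defined, for P(B) > 0, as:

```latex
P(A \mid B) = \frac{P(A \cap B)}{P(B)}
```

In the earlier bag example (3 red, 6 green, no replacement), the probability of a green ball on the second draw given a red ball on the first is 6/8 = 3/4.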
Random variable
 Any quantity or characteristic that is able to assume a number of different values, such that any particular outcome is determined by chance

 The variable X is a random or stochastic variable, since the value that it takes is subject to chance

 Random variables can be discrete or continuous
 A discrete random variable is able to assume only a finite or countable number of outcomes

 A continuous random variable can take on any value in a specified interval

 For categorical variables we obtain the frequency distribution of each variable
Probability distribution
 It is a device used to describe the behavior that a random variable may have, by applying the theory of probability

 The distribution of all possible values (outcomes) of a random variable, along with their respective probabilities
 It is a list of the probabilities associated with the values of the random variable obtained in an experiment

 Therefore, the probability distribution of a random variable is a table, graph or mathematical formula that gives the probabilities with which the random variable takes different values or ranges of values
• It is a function that assigns a probability to each value of a random variable
• A probability distribution defines the relationship between the outcomes and their likelihood of occurrence
• Probabilities and probability distributions are nothing more than extensions of the ideas of relative frequency and histograms, respectively
• Discrete probability distributions

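As a concrete sketch, the probability distribution of a single fair die throw pairs each value of the random variable with its probability:

```python
from fractions import Fraction

# Discrete probability distribution of X = outcome of one fair die
# throw: each possible value of X paired with its probability.
die_distribution = {x: Fraction(1, 6) for x in range(1, 7)}

# The probabilities over all possible values must sum to 1.
assert sum(die_distribution.values()) == 1

print(die_distribution[4])  # 1/6
```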
Thanks!
