
Handout 1

MA 202 - Probability and Statistics

Department of Mathematics

IIT Ropar, Rupnagar

January 14, 2020

1 Probability Models
Probability Model: A probability model consists of a sample space and a probability law.

• Sample space denoted by Ω

• Probability law denoted by P(·): describes our belief about which outcomes are more likely
than others. The probability law satisfies certain axioms.

Sample Space Ω

• Set (collection) of possible outcomes.

• This collection must be

- mutually exclusive: that is, if a head comes up in a coin-tossing experiment then a
tail does not come up, and vice versa (in other words, the experiment has a unique outcome).
- collectively exhaustive: no matter what happens in the experiment, we always
obtain an outcome that has been included in the sample space.

• The sample space could be finite, countably infinite or uncountable.

Discrete Sample Space: If the collection of outcomes is finite or countably infinite then
the sample space is called discrete.

Example 1.1. Consider the tossing of a coin. Then Ω = {H, T }.

Example 1.2. Consider two rolls of a tetrahedral (four-sided) die. The sample space can be
represented by a two-dimensional grid (Figure 1) or by a sequential tree description (Figure 2).

Figure 1: Two-dimensional grid

Figure 2: Sequential description

Remark 1.1. Note that (1, 2) and (2, 1) denote different outcomes of the experiment: (1, 2)
denotes a one on the first roll and a two on the second roll.
Example 1.3. Consider the number of tosses required to get the first head in a coin-tossing
experiment. In this case Ω = {1, 2, . . .}.
Continuous Sample Space

Ω = {(x, y) : 0 ≤ x, y ≤ 1}

Consider throwing a dart at the square target above and viewing the point of impact as the outcome.
The outcome of a single throw is a point in the region denoted by Ω. Since there are infinitely
many such points, the sample space is infinite and, in fact, uncountable.

Definition 1.1 (Event). A subset of the sample space to which a probability is assigned is
called an event. Note that not all subsets of the sample space are necessarily considered
events.
Probability Axioms are a set of rules that every probabilistic model or probability law has
to obey.

1. Non-negativity: P(A) ≥ 0.

2. Normalization: P(Ω) = 1.

3. Additivity: P(A ∪ B) = P(A) + P(B) whenever A ∩ B = ∅.


Occurrence of an event: If the outcome of a random experiment lies inside the set A, we say
that the event A has occurred. Further, if the outcome lies outside A, we say that the event A
has not occurred.

Proposition 1.1. If A1, A2, . . . , An are disjoint sets, using probability axioms show that

P(A1 ∪ A2 ∪ · · · ∪ An) = ∑_{i=1}^{n} P(Ai).

Proposition 1.2. If A ⊆ B, show that P(A) ≤ P(B).

Quick reminder of set theory

Discrete Probability Law: The probability of any event A = {w1, w2, . . . , wn} is given by

P({w1, w2, . . . , wn}) = P({w1}) + P({w2}) + · · · + P({wn})
= P(w1) + P(w2) + · · · + P(wn).

That is, the probability of an event A can be written as the sum of the probabilities of its
individual outcomes.
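As a small illustration, the minimal Python sketch below uses made-up probabilities for a
hypothetical loaded four-sided die (the numbers are illustrative, not from the handout) and
computes an event probability by summing the probabilities of the outcomes it contains.

```python
# Hypothetical discrete probability law on Ω = {1, 2, 3, 4}; the values are illustrative.
P = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

# P(A) for the event A = {2, 4}: sum the probabilities of its individual outcomes.
A = {2, 4}
print(sum(P[w] for w in A))  # 0.6 (up to floating-point rounding)
```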

Probability law (finite sample space). Consider two rolls of a tetrahedral die, and let X and
Y denote the first and second rolls, respectively. Let each possible outcome have probability
1/16.

(a) P(first roll is 4) = P(X = 4) = P({(4, 1), (4, 2), (4, 3), (4, 4)}) = 4/16 = 1/4.

(b) P(X + Y is even) = 8/16 = 1/2.

(c) P(first roll is greater than or equal to 2) = 12/16 = 3/4.
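The answers in (a)–(c) can be verified by enumerating all 16 equally likely outcomes. Below is
a minimal Python sketch (the helper function prob and the lambda events are illustrative, not
part of the handout).

```python
from fractions import Fraction
from itertools import product

# All 16 outcomes (X, Y) of two rolls of a tetrahedral die, each with probability 1/16.
outcomes = list(product(range(1, 5), repeat=2))
p = Fraction(1, 16)

def prob(event):
    """P(event) = sum of the probabilities of the outcomes satisfying the event."""
    return sum(p for outcome in outcomes if event(outcome))

print(prob(lambda xy: xy[0] == 4))                # 1/4
print(prob(lambda xy: (xy[0] + xy[1]) % 2 == 0))  # 1/2
print(prob(lambda xy: xy[0] >= 2))                # 3/4
```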

Discrete Uniform Law: Let all outcomes be equally likely. Then

P(A) = (number of elements in A) / (total number of sample points) = #A / #Ω.

Principle of symmetry. Let Ω be a finite sample space with outcomes {w1, w2, . . . , wn},
such that all of the wi's are physically identical except for the labels we attach to them.
In this case

P(w1) = P(w2) = · · · = P(wn) = p.

Then

P(w1) + P(w2) + · · · + P(wn) = 1 =⇒ np = 1 =⇒ p = 1/n,

and hence, for an event A = {w1, w2, . . . , wr} consisting of r outcomes,

P(A) = P(w1) + P(w2) + · · · + P(wr) = r/n.
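For instance, for a fair six-sided die, symmetry gives p = 1/6 for each face, so the event
A = {2, 4, 6} of rolling an even number has P(A) = #A/#Ω = 3/6 = 1/2.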

Continuous Uniform Law. Probabilistic models with a continuous sample space differ from
their discrete counterparts in that the probabilities of single-element events may not be
sufficient to characterize the probability law.

Example 1. Consider Ω = [0, 1]. Assume the continuous uniform law, i.e. each point has an
equal chance of selection. Suppose

P({x}) = p > 0, 0 ≤ x ≤ 1.

Since there are infinitely many points, we can find a natural number N such that N p > 1;
an event consisting of N distinct points would then have probability greater than 1,
contradicting the axioms. Hence p = 0 is the only consistent choice.

In this example, it makes sense to assign probability b − a to any sub-interval [a, b] of [0, 1],
and to calculate the probability of a more complicated set by evaluating its “length”.
This assignment satisfies the three probability axioms and hence qualifies as a legitimate
probability law.
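For instance, under this law P([0.2, 0.5]) = 0.5 − 0.2 = 0.3, and by additivity
P([0, 0.1] ∪ [0.6, 0.8]) = 0.1 + 0.2 = 0.3.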

Example 1.4. Two numbers are chosen at random from the square [0, 1] × [0, 1]. Let X and Y
represent the x- and y-coordinates of the chosen point, respectively.

Uniform Law (probability = area). We have the following:

(a) P(X + Y ≤ 1) = 1/2.

(b) P((X, Y ) = (0.5, 0.5)) = 0.

(c) P(a line) = P(Y = 0.5, 0 ≤ X ≤ 1) = 0, since a line segment has zero area.

Probability Law on a Countably Infinite Sample Space

Example 1.5. Consider the number of tosses required to get the first head in a coin-flipping
experiment.
The first head could come on the first, second, 100th or millionth toss. So it is appropriate
to choose Ω = {1, 2, . . . , n, . . . }. Suppose it is given that

P(n) = 2⁻ⁿ, n = 1, 2, . . . .

Now

P(an even number of tosses is required) = P({2, 4, 6, . . . }) = P(2) + P(4) + · · · (not yet valid)
= 1/2² + 1/2⁴ + · · · = (1/2²) (1 + 1/2² + 1/2⁴ + · · ·)
= (1/4) · 1/(1 − 1/4) = 1/3.

The countably infinite sum in the first step is not justified by the finite additivity axiom.
Hence we need an additional axiom to make countable additivity a valid step.
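The value 1/3 can also be sanity-checked numerically; the short Python sketch below sums the
first few even-indexed terms of the series (the truncation at n = 60 is arbitrary).

```python
# Partial sum of P(2) + P(4) + P(6) + ... with P(n) = 2**(-n); the series converges to 1/3.
partial_sum = sum(2.0 ** (-n) for n in range(2, 61, 2))
print(partial_sum)  # ≈ 0.3333333333333333
```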

Countable additivity axiom

If A1, A2, · · · are disjoint (pairwise disjoint/mutually disjoint) events, then

P(A1 ∪ A2 ∪ . . . ) = P(A1) + P(A2) + · · ·

or

P(∪_{i=1}^{∞} Ai) = ∑_{i=1}^{∞} P(Ai).

Proposition 1.3. Countable additivity implies finite additivity.

Proof. Take An+1 = An+2 = · · · = ∅, so that ∪_{i=1}^{∞} Ai = ∪_{i=1}^{n} Ai. Using countable
additivity, we have

P(∪_{i=1}^{n} Ai) = P(∪_{i=1}^{∞} Ai) = ∑_{i=1}^{∞} P(Ai)
= P(A1) + P(A2) + · · · + P(An) + P(∅) + P(∅) + · · ·
= P(A1) + P(A2) + · · · + P(An) + 0 + 0 + · · ·
= P(A1) + P(A2) + · · · + P(An).

Proposition 1.4. The discrete uniform law is not applicable to countably infinite sample spaces.

Proof. Suppose Ω = {ω1, ω2, · · · }.

1. If P(ωi) = p > 0 for all i, then by additivity P({ω1, . . . , ωn}) = np > 1 for sufficiently
large n, which contradicts P({ω1, . . . , ωn}) ≤ P(Ω) = 1.

2. Further, if P(ωi) = 0 for all i, then by countable additivity P(Ω) = ∑_{i∈N} P(ωi) = 0,
which contradicts P(Ω) = 1.

More on countable additivity. The concept of countable additivity is more subtle than it
looks. Consider the sample space given by Ω = {(x, y) : 0 ≤ x ≤ 1, 0 ≤ y ≤ 1}. Define
P(A) = area(A) (which is a legitimate probability law). We have

Ω = ∪_{x,y} {(x, y)},

so one might be tempted to write

1 = P(Ω) = P(∪_{x,y} {(x, y)}) = ∑_{x,y} P({(x, y)}) (countable additivity??)
= ∑_{x,y} 0 (since the area of a point is 0)
= 0. (contradiction)

Note that there is no contradiction in the above case when we assign probability 0 to
individual outcomes: countable additivity is not applicable here, because the union is over
an uncountable collection of sets.

Remark 1.2. A zero-probability event is not an event that cannot occur; rather, it occurs very
infrequently, given that the set of possible outcomes is infinite.

Comparison of discrete and continuous sample spaces

Discrete Sample Space:

• Finite or countably infinite elements.

• Probabilities of individual outcomes are sufficient to characterize the probability law.

• The discrete uniform law is applicable only for a finite sample space.

• Probabilities of general events are found by summation.

Continuous Sample Space:

• Uncountably infinite elements.

• Probabilities of individual outcomes are not sufficient.

• The continuous uniform law is applicable.

• Probabilities of general events are found by integration.

