
Probability & Statistics

10.07.2020
Anh Tuan Tran (Ph.D.) & Thinh Tien Nguyen (Ph.D.)
1. Random variables
Random variables

Definition:

Consider an experiment with a sample space Ω. A
function

X: Ω → ℝ

is called a random variable.
Random variables

Example:

Flip a coin three times.


X: number of Heads.

X(T, T, T) = 0.
X(H, T, H) = X(H, H, T) = 2.
X(H, H, H) = 3.
Random variables

Example:

Roll 2 dice.
X: sum of the numbers that appeared.

X(2, 5) = 7.
X(1, 6) = 7.
X(6, 6) = 12.
Random variables

Example:

Measure the resting body temperature.


X: the resting body temperature of a person.

X(Ann) = 37.5 °C.
X(John) = 36.2 °C.
X(Tom) = 37.0 °C.
Discrete random variables
Definition:

A random variable 𝑋 that can take on finitely or


countably infinitely many values is called discrete.

Example:

Flip a coin three times. X: number of Heads.

Roll 2 dice. X: sum of the numbers that appeared.


Continuous random variables
Definition:

A random variable X that can take on all values in an
interval of ℝ is called continuous.

Example:

Measure the resting body temperature. X: the resting body


temperature of a person.

Measure the lifetime of devices. X: the lifetime of a device.


2. Discrete random variables
Mass function

Definition:

Let X be a discrete random variable with values
x₁, x₂, x₃, … The mass function or distribution of X is a
function p defined by

p(xᵢ) ≔ P(X = xᵢ) for all i = 1, 2, 3, …
Mass function

Example:

Roll 2 dice.
X: sum of the numbers that appeared.

p(2) = P(X = 2) = P({(1, 1)}) = 1/36.
p(11) = P(X = 11) = P({(5, 6), (6, 5)}) = 2/36 = 1/18.
Mass function

Properties:

p(xᵢ) ≥ 0 for all i = 1, 2, 3, …

Σᵢ p(xᵢ) = 1.
Mass function

Example:

Flip a coin three times.


X: number of Heads.

X | 0   | 1   | 2   | 3
p | 1/8 | 3/8 | 3/8 | 1/8
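
The table can be checked by brute force. Below is a minimal Python sketch (not part of the original slides) that enumerates the 8 equally likely outcomes of three flips and tabulates the mass function of X.

```python
# Enumerate all outcomes of three coin flips and count Heads.
from itertools import product
from collections import Counter
from fractions import Fraction

outcomes = list(product("HT", repeat=3))              # the 8 equally likely outcomes
counts = Counter(seq.count("H") for seq in outcomes)  # number of Heads per outcome

pmf = {k: Fraction(c, len(outcomes)) for k, c in sorted(counts.items())}
print(pmf)  # {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
```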
Expectation

Definition:

The expectation or mean or expected value of a


discrete random variable X is defined by

E(X) ≔ Σᵢ xᵢ p(xᵢ).
Expectation
Example:
Flip a coin three times.
X: number of Heads.

X | 0   | 1   | 2   | 3
p | 1/8 | 3/8 | 3/8 | 1/8

E(X) = 0 · 1/8 + 1 · 3/8 + 2 · 3/8 + 3 · 1/8 = 1.5.
Expectation

Example:

Roll 2 dice.
X: sum of the numbers that appeared.

E(X) = 2 · 1/36 + 3 · 1/18 + 4 · 1/12 + 5 · 1/9 + 6 · 5/36 + 7 · 1/6
     + 8 · 5/36 + 9 · 1/9 + 10 · 1/12 + 11 · 1/18 + 12 · 1/36 = 7.
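
A quick way to verify this value is to enumerate the 36 equally likely ordered pairs of faces. The following Python sketch (illustrative only, not from the slides) does exactly that.

```python
# Average the sum of the faces over all 36 equally likely rolls of two dice.
from itertools import product
from fractions import Fraction

rolls = list(product(range(1, 7), repeat=2))   # the 36 ordered pairs
expectation = sum(Fraction(a + b, len(rolls)) for a, b in rolls)
print(expectation)  # 7
```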
Expectation

Properties:

Let g: ℝ → ℝ, then
E[g(X)] = Σᵢ g(xᵢ) p(xᵢ).

As a consequence,
E(aX + b) = aE(X) + b.
Expectation

Properties (cont’):

E(X + Y) = E(X) + E(Y).
E(XY) = E(X)E(Y) if X, Y are independent.
Variance

Definition:

Let X be a discrete random variable with values


x₁, x₂, x₃, … The variance of X is

D(X) ≔ E[(X − μ)²] = Σᵢ (xᵢ − μ)² p(xᵢ),

where μ ≔ E(X).
Variance
Example:
Flip a coin three times.
X: number of Heads.

X | 0   | 1   | 2   | 3
p | 1/8 | 3/8 | 3/8 | 1/8

D(X) = E[(X − 1.5)²]
     = (−1.5)² · 1/8 + (−0.5)² · 3/8 + (0.5)² · 3/8 + (1.5)² · 1/8 = 0.75.
Variance

Properties:

D(X) = E(X²) − [E(X)]².
D(X) ≥ 0. Equality occurs only if X is constant.
As a corollary, E(X²) ≥ [E(X)]². Equality occurs only if X is constant.

We define the standard deviation of X as

σ(X) ≔ √D(X).
Variance
Example:
Flip a coin three times. X: number of Heads.

X | 0   | 1   | 2   | 3
p | 1/8 | 3/8 | 3/8 | 1/8

E(X²) = 0² · 1/8 + 1² · 3/8 + 2² · 3/8 + 3² · 1/8 = 3.
D(X) = E(X²) − [E(X)]² = 3 − 1.5² = 0.75.
σ(X) = √0.75 ≈ 0.866025.
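
The same numbers can be reproduced in a few lines. The Python sketch below (illustrative, not from the slides) computes D(X) both from the definition and from the shortcut formula.

```python
# Mass function of the number of Heads in three coin flips.
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

mean = sum(x * p for x, p in pmf.items())                           # E(X) = 1.5
var_definition = sum((x - mean) ** 2 * p for x, p in pmf.items())   # E[(X - mu)^2]
var_shortcut = sum(x ** 2 * p for x, p in pmf.items()) - mean ** 2  # E(X^2) - E(X)^2
print(mean, var_definition, var_shortcut, var_shortcut ** 0.5)
# 1.5 0.75 0.75 0.8660254037844386
```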
Variance

Properties (cont’):

D(aX + b) = a²D(X).
D(X + Y) = D(X) + D(Y) if X, Y are independent.
Binomial distribution

Definition:

Assume n independent trials are performed, each


succeeding with probability p.

Let X be the number of successes in the n trials.

Then X ~ Binom(n, p).
Binomial distribution
Example:

Screws are sold in packages of 10. Due to a


manufacturing error, each screw today is
independently defective with probability 0.1.

Let X be the number of defective screws in each


package.

Then X ~ Binom(10, 0.1).


Binomial distribution
Example:

A midterm test has 20 multiple-choice questions. Each question has 4
choices, exactly one of which is correct. A student took that test,
answering every question at random.

Let X be the number of questions that the student
answers correctly.

Then X ~ Binom(20, 0.25).


Binomial distribution

Proposition:

Let X ~ Binom(n, p). Then

1. P(X = k) = C(n, k) · p^k · (1 − p)^(n−k) for k = 0, 1, …, n,
   where C(n, k) denotes the binomial coefficient.
2. E(X) = np.
3. D(X) = np(1 − p).
Binomial distribution

Example:
Let X be the number of defective screws in each
package.

Then X ~ Binom(10, 0.1).

1. P(X = 2) = C(10, 2) · 0.1² · (1 − 0.1)^(10−2) ≈ 0.19371.
2. E(X) = 10 · 0.1 = 1.
3. D(X) = 10 · 0.1 · (1 − 0.1) = 0.9.
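
If scipy is available, the same quantities can be obtained from scipy.stats.binom, as in the sketch below (values in the comments are approximate).

```python
# Check the screw example X ~ Binom(10, 0.1) with scipy.
from scipy.stats import binom

n, p = 10, 0.1
print(binom.pmf(2, n, p))   # P(X = 2) ≈ 0.19371
print(binom.mean(n, p))     # E(X) = 1.0
print(binom.var(n, p))      # D(X) = 0.9
```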
Poisson distribution
Definition:

Let λ > 0 be a real number.

Let X be a discrete random variable with non-negative


integer values.

Then X ~ Poi(λ) if

P(X = k) = e^(−λ) λ^k / k!

for k = 0, 1, 2, …
Poisson distribution

Proposition:

Let Xₙ ~ Binom(n, pₙ) such that E(Xₙ) = n·pₙ → λ as
n → +∞. Then

P(Xₙ = k) = C(n, k) pₙ^k (1 − pₙ)^(n−k) → e^(−λ) λ^k / k!

for k = 0, 1, 2, …, n.
Poisson distribution

Examples:

1. Number of typos on a page of a book.


2. Number of citizens over 100 years of age in a city.
3. Number of incoming calls per hour in a customer
centre.
4. Number of customers in a post office today.
Poisson distribution

Property:

Let X ~ Poi(λ). Then

E(X) = D(X) = λ.
Poisson distribution
Example:

Assume that there is a chance of 0.25% that a page of


a book has typos. The book has 500 pages in total.
What is the probability that the book has 5 pages with
typos?

X: the number of pages with typos. X ~ Poi(500 · 0.0025) = Poi(1.25).

P(X = 5) = e^(−1.25) · 1.25⁵ / 5! ≈ 0.73%.
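
A small Python sketch (scipy assumed to be installed) confirms the value both by the formula and with scipy.stats.poisson.

```python
# Poisson probability of exactly 5 pages with typos, lambda = 500 * 0.0025.
from math import exp, factorial
from scipy.stats import poisson

lam = 500 * 0.0025                              # expected number of pages with typos
by_hand = exp(-lam) * lam ** 5 / factorial(5)   # Poisson mass at k = 5
print(by_hand)                                  # ≈ 0.00729, i.e. about 0.73%
print(poisson.pmf(5, lam))                      # same value via scipy
```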
Geometric distribution
Definition:

Repeat independent trials, each succeeding with
probability p, until the first success.

Let X be the number of trials until the first success. Then

X ~ Geometric(p).

Example:

Tossing a coin until the first Head.

Taking a test until passing it.
Geometric distribution

Proposition:

Let X ~ Geometric(p). Then

1. P(X = k) = (1 − p)^(k−1) · p for k = 1, 2, 3, …
2. E(X) = 1/p.
3. D(X) = (1 − p)/p².
Geometric distribution
Example:

Roll a die until a 3 appears.

X: the number of rolls until the first 3.

Then X ~ Geometric(1/6).

1. E(X) = 6.
2. D(X) = (5/6) · 36 = 30.
3. The chance that the 7th roll gives the first 3 is
   P(X = 7) = (5/6)⁶ · (1/6) ≈ 0.056.
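
The following Python simulation sketch (illustrative only; the seed and sample size are arbitrary choices) checks these values empirically.

```python
# Simulate rolling a fair die until the first 3 and compare with E(X) = 6, D(X) = 30.
import random

def rolls_until_three(rng: random.Random) -> int:
    """One geometric experiment: count rolls until the first 3."""
    k = 1
    while rng.randint(1, 6) != 3:
        k += 1
    return k

rng = random.Random(0)
samples = [rolls_until_three(rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
print(mean, var)               # close to 6 and 30
print((5 / 6) ** 6 * (1 / 6))  # exact P(X = 7) ≈ 0.056
```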
Cumulative distribution function

Definition:

The cumulative distribution function of a discrete random
variable X is given by

F: ℝ → [0, 1]
such that
F(x) ≔ P(X ≤ x).
Cumulative distribution function
Example:
Flip a coin three times.
X: number of Heads.

X | 0   | 1   | 2   | 3
p | 1/8 | 3/8 | 3/8 | 1/8

F(1) = P(X ≤ 1) = P(X = 0) + P(X = 1) = 1/8 + 3/8 = 1/2.
Cumulative distribution function

Properties:

F is non-decreasing.
lim F(x) = 0 as x → −∞.
lim F(x) = 1 as x → +∞.
P(a < X ≤ b) = P(X ≤ b) − P(X ≤ a) = F(b) − F(a) for a < b.
Cumulative distribution function

Figure: cumulative distribution function of a discrete random variable X.
Cumulative distribution function
Example:

F(x) = 0      if x < 0,
       x/2    if 0 ≤ x < 1,
       2/3    if 1 ≤ x < 2,
       11/12  if 2 ≤ x < 3,
       1      if x ≥ 3.

P(X = 3) = F(3) − F(2) = 1 − 11/12 = 1/12
(the jump of F at x = 3; since F is constant on [2, 3), F(2) equals the left limit of F at 3).
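
The jump can also be read off programmatically. Below is a minimal Python sketch of the piecewise F above; it is only an illustration of this example.

```python
def F(x: float) -> float:
    """The piecewise CDF from the example above."""
    if x < 0:
        return 0.0
    if x < 1:
        return x / 2
    if x < 2:
        return 2 / 3
    if x < 3:
        return 11 / 12
    return 1.0

# P(X = 3) is the jump of F at x = 3; F is constant on [2, 3),
# so the left limit of F at 3 equals F(2) = 11/12.
print(F(3) - F(2))   # 0.0833... = 1/12
```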
3. Continuous random variables
Density function
Example:

Measure the resting body temperature.


X: the resting body temperature of a person.

X(Ann) = 37.5 °C.
X(John) = 36.2 °C.
X(Tom) = 37.0 °C.

P(X = 37) = ?
Density function
Definition:
Let f: ℝ → ℝ be an integrable function. We assume
f(y) ≥ 0 for all y ∈ ℝ and

∫_{−∞}^{+∞} f(y) dy = 1.

Suppose a continuous random variable X has the
cumulative distribution function

F(x) = ∫_{−∞}^{x} f(y) dy.

Then f is the density function of X.
Cumulative distribution function
Properties:

F is non-decreasing.
lim F(x) = 0 as x → −∞.
lim F(x) = 1 as x → +∞.
F is differentiable and F′(x) = f(x).
P(a < X ≤ b) = P(X ≤ b) − P(X ≤ a) = F(b) − F(a) for a < b.
P(X = a) = 0 for any constant a ∈ ℝ.
P(a < X ≤ b) = P(a ≤ X ≤ b) = P(a ≤ X < b) = P(a < X < b).
Density function
Figure: cumulative distribution function of a continuous random variable X.
Density function
Figure: density function of a continuous random variable X; the red
region is P(a ≤ X ≤ b).
Expectation

Definition:

The expectation or mean or expected value of a


continuous random variable X is defined by

E(X) ≔ ∫_{−∞}^{+∞} x f(x) dx,

where f is the density function of X.


Expectation

Properties:

Let g: ℝ → ℝ, then

E[g(X)] = ∫_{−∞}^{+∞} g(x) f(x) dx.

As a consequence,
E(aX + b) = aE(X) + b.
Expectation

Properties (cont’):

E(X + Y) = E(X) + E(Y).
E(XY) = E(X)E(Y) if X, Y are independent.
Variance

Definition:

The variance of a
continuous random variable X is defined by

D(X) ≔ E[(X − μ)²] = ∫_{−∞}^{+∞} (x − μ)² f(x) dx,

where f is the density function of X and μ ≔ E(X).


Variance

Properties:

D(X) = E(X²) − [E(X)]².
D(X) ≥ 0. Equality occurs only if X is constant.
As a corollary, E(X²) ≥ [E(X)]². Equality occurs only if X is constant.

We define the standard deviation of X as

σ(X) ≔ √D(X).
Variance

Properties (cont’):

D(aX + b) = a²D(X).
D(X + Y) = D(X) + D(Y) if X, Y are independent.
Uniform distribution

Definition:

Fix real numbers a < b. X ~ U(a, b) if the density
function is given by

f(x) = 1/(b − a) if x ∈ [a, b],
f(x) = 0 otherwise.
Uniform distribution
Uniform distribution
Proposition:

Let a < c < d < b be real numbers and X ~ U(a, b). Then

P(c ≤ X ≤ d) = ∫_c^d 1/(b − a) dx = (d − c)/(b − a).

μ ≔ E(X) = ∫_a^b x/(b − a) dx = (a + b)/2.

D(X) = ∫_a^b (x − μ)²/(b − a) dx = (b − a)²/12.
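
If scipy is available, these formulas can be checked numerically. Note that scipy.stats.uniform is parameterized by loc = a and scale = b − a; the values a = 2, b = 10, c = 3, d = 7 below are arbitrary choices for illustration.

```python
# Check the U(a, b) formulas against scipy's uniform distribution.
from scipy.stats import uniform

a, b = 2.0, 10.0
U = uniform(loc=a, scale=b - a)          # scipy's U(a, b)

c, d = 3.0, 7.0
print(U.cdf(d) - U.cdf(c), (d - c) / (b - a))   # 0.5  0.5
print(U.mean(), (a + b) / 2)                    # 6.0  6.0
print(U.var(), (b - a) ** 2 / 12)               # 5.333...  5.333...
```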
Normal distribution

Definition:

Let μ ∈  and σ > 0. X~N μ, σ2 if the density function


is given by

1 x−μ 2

f x = e 2σ2
σ 2π
for all x ∈ .
Normal distribution
Normal distribution

Theorem:

Let X ~ N(μ, σ²). Then

P(a ≤ X ≤ b) = P((a − μ)/σ ≤ Z ≤ (b − μ)/σ).

Here Z ~ N(0, 1).
Normal distribution

Example:

Assume that the grade of an exam X follows a normal
probability distribution model N(μ = 6, σ = 1.5), and that
a grade below 5 fails. What percentage of students didn't pass the exam?

P(X < 5) = P(Z < (5 − 6)/1.5) ≈ P(Z < −0.67) = F(−0.67) ≈ 0.2514,

so about 25% of students didn't pass.
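
A short Python sketch (scipy assumed) reproduces this example. With the exact z = −2/3 the probability is ≈ 0.2525; the table value 0.2514 comes from rounding z to −0.67.

```python
# Exam example: X ~ N(6, 1.5^2), passing grade 5.
from scipy.stats import norm

mu, sigma = 6.0, 1.5
z = (5 - mu) / sigma                      # ≈ -0.67 after standardizing
print(norm.cdf(z))                        # ≈ 0.2525 (0.2514 with z rounded to -0.67)
print(norm.cdf(5, loc=mu, scale=sigma))   # same probability without standardizing
```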
Normal distribution

Proposition:

Let X ~ N(μ, σ²). Then

E(X) = ∫_{−∞}^{+∞} x/(σ√(2π)) · e^(−(x − μ)²/(2σ²)) dx = μ.

D(X) = ∫_{−∞}^{+∞} (x − μ)²/(σ√(2π)) · e^(−(x − μ)²/(2σ²)) dx = σ².
Normal distribution

Property:

Let
X ~ Poi(λ).

Then if λ is large enough,

X is approximately N(λ, λ).
Chi-square distribution
Definition:

Let n ∈ . X~χ2 n if

X = X12 + X22 + ⋯ + Xn2 .

In which, Xi ~N(0,1) for all i = 1,2, … , n and are


independent.

n is called the freedom degree.


Chi-square distribution
Density function:

X ~ χ²(n). Then for x ≥ 0,

f(x) = 1/(2^(n/2) Γ(n/2)) · x^(n/2 − 1) · e^(−x/2),

where

Γ(x) ≔ ∫_0^{+∞} y^(x−1) e^(−y) dy.

For x < 0, f(x) = 0.
Chi-square distribution
Chi-square distribution

Proposition:

Let X ~ χ²(n). Then the values of X are non-negative and

E(X) = n.

D(X) = 2n.
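
The sketch below (scipy and numpy assumed available; n = 5 is an arbitrary choice) checks E(X) = n and D(X) = 2n, and also verifies the definition by Monte Carlo.

```python
# Chi-square mean/variance check plus a Monte Carlo check of the definition.
import numpy as np
from scipy.stats import chi2

n = 5
print(chi2.mean(n), chi2.var(n))      # 5.0  10.0, matching E(X) = n, D(X) = 2n

rng = np.random.default_rng(0)
samples = (rng.standard_normal((100_000, n)) ** 2).sum(axis=1)  # sums of n squared N(0,1)
print(samples.mean(), samples.var())  # close to 5 and 10
```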
Chi-square distribution

Property:

Let
X ~ χ²(n).

Then if n is large enough,

(X − n)/√(2n) is approximately N(0, 1).
Student’s t distribution

Definition:

Let Y ~ N(0, 1) and Z ~ χ²(n) be independent. Then

X ≔ Y / √(Z/n)

follows T(n), the Student's t distribution with n degrees of freedom.
Student’s t distribution

Density function:

X ~ T(n). Then

f(x) = Γ((n + 1)/2) / (√(nπ) · Γ(n/2)) · (1 + x²/n)^(−(n + 1)/2).
Student’s t distribution
Student’s t distribution
Proposition:

Let X ~ T(n). Then

E(X) = 0 (for n > 1).

D(X) = n/(n − 2) for n > 2,
D(X) = ∞ for 1 < n ≤ 2.

Otherwise, D(X) is undefined.


Student’s t distribution

Property:

Let
X ~ T(n).

Then for n ≥ 30,

X is approximately N(0, 1).
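
A quick numerical comparison (scipy assumed) illustrates how close T(30) is to N(0, 1); the evaluation points below are arbitrary.

```python
# Compare the T(30) and N(0, 1) CDFs at a few points.
from scipy.stats import norm, t

for x in (1.0, 1.645, 1.96, 2.5):
    print(x, t.cdf(x, df=30), norm.cdf(x))
# The two CDFs agree to roughly two decimal places.
```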
