2 Limit Theorems
Introduction
Chebyshev’s Theorem
Law of Large Numbers
Central Limit Theorem
3 Sampling Distributions
Introduction
Definition
Let X and Y be random variables. If both X and Y are discrete, then the joint probability mass function (joint pmf) of X and Y is:
f(x, y) = P(X = x, Y = y)
Definition
If both X and Y are continuous, then f(x, y) is called the joint probability density function (joint pdf) of X and Y iff:
P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f(x, y) dy dx
Suppose that there are 8 red balls, 10 yellow balls, and 20 blue balls in
a bucket. A total of 10 balls are randomly selected from this bucket.
Let X = number of red balls and Y = number of blue balls. Find the
joint probability function of the bivariate random variable (X , Y ).
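The draws here follow a multivariate hypergeometric distribution. A quick numerical sketch (the function name is hypothetical; Python standard library only):

```python
from math import comb

# Joint pmf of X (red) and Y (blue) when drawing 10 balls without
# replacement from 8 red, 10 yellow, 20 blue (38 balls in total).
def joint_pmf(x, y):
    # The remaining 10 - x - y balls drawn must be yellow (at most 10 available).
    k = 10 - x - y
    if x < 0 or y < 0 or x > 8 or y > 20 or k < 0 or k > 10:
        return 0.0
    return comb(8, x) * comb(20, y) * comb(10, k) / comb(38, 10)

# The probabilities over all admissible (x, y) pairs sum to 1.
total = sum(joint_pmf(x, y) for x in range(9) for y in range(11))
```

The guard clause encodes the range of (X, Y): x ≤ 8, y ≤ 20, and x + y ≤ 10 with at most 10 yellow balls making up the difference.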
Theorem
If X and Y are two random variables with joint probability function f(x, y), then:
▶ f(x, y) ≥ 0 for all x and y
▶ If X and Y are discrete, then Σ_{x,y} f(x, y) = 1, where the sum is over all values (x, y) that are assigned nonzero probabilities.
▶ If X and Y are continuous, then ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.
Suppose that we are given the joint probability function (pdf or pmf) of (X, Y). We can then obtain the probability distribution of either component through the marginals.
Definition
The marginal pmf or pdf of X, f_X(x), is defined by:
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy if X, Y are continuous
f_X(x) = Σ_y f(x, y) if X, Y are discrete
Let
f(x, y) = C(x + y) for x, y ≥ 0 and x + y < 1, and f(x, y) = 0 otherwise.
Find:
1) Show the range of (X, Y), R_XY, in the x-y plane
2) Find the constant C
3) The marginal pdfs f_X(x) and f_Y(y)
4) Find P(Y < 2X²)
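As a sanity check on part 2), a midpoint Riemann sum over the triangular support (a numerical sketch, not part of the original notes) recovers the normalizing constant:

```python
# The joint pdf C(x + y) on the triangle x, y >= 0, x + y < 1 must
# integrate to 1; a midpoint Riemann sum estimates the integral of
# (x + y) over the triangle, and C is its reciprocal.
n = 1000
h = 1.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * h          # midpoint in x
    for j in range(n):
        y = (j + 0.5) * h      # midpoint in y
        if x + y < 1:
            total += (x + y) * h * h

C = 1.0 / total                 # analytically, the integral is 1/3, so C = 3
```

With C = 3, integrating 3(x + y) over 0 ≤ y < 1 − x gives the marginal f_X(x) = (3/2)(1 − x²) for 0 ≤ x < 1, and by symmetry f_Y(y) = (3/2)(1 − y²).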
Definition
Let f(x, y) be the joint probability function of (X, Y). The expected value of the product XY is:
E(XY) = Σ_{x,y} x y f(x, y) if X, Y are discrete
E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y f(x, y) dx dy if X, Y are continuous
If X and Y are independent, then E[XY] = E[X] E[Y].
Definition
Let X and Y be jointly distributed with pmf or pdf f(x, y). Then, the conditional expectation of X given Y = y is:
E(X | Y = y) = Σ_x x f(x | y) if X, Y are discrete
E(X | Y = y) = ∫_{−∞}^{∞} x f(x | y) dx if X, Y are continuous
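For the discrete case, a minimal illustration with a small joint pmf (the table below is invented for illustration, not taken from the notes):

```python
# A hypothetical joint pmf f(x, y) on {0, 1} x {0, 1}, stored as a dict.
# E(X | Y = y) = sum over x of x * f(x | y), with f(x | y) = f(x, y) / f_Y(y).
f = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def cond_exp_X(y):
    fY = sum(p for (x, yy), p in f.items() if yy == y)   # marginal f_Y(y)
    return sum(x * p / fY for (x, yy), p in f.items() if yy == y)
```

Here E(X | Y = 0) = 0.3 / 0.4 = 0.75 and E(X | Y = 1) = 0.4 / 0.6 = 2/3: conditioning reweights the x-values by the conditional pmf.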
Let
f(x, y) = (x + y)/2 for x > 0, y > 0, 3x + y < 3, and f(x, y) = 0 otherwise.
Find:
▶ E[X]
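A numerical sketch of this example, assuming the support is the triangle x > 0, y > 0, 3x + y < 3 (the region under which the density integrates to 1):

```python
# Midpoint Riemann sum over the triangle with vertices (0,0), (1,0), (0,3):
# checks that (x + y)/2 integrates to 1 there, and estimates E[X],
# which works out analytically to 5/16.
nx, ny = 600, 600
hx, hy = 1.0 / nx, 3.0 / ny
mass = ex = 0.0
for i in range(nx):
    x = (i + 0.5) * hx
    for j in range(ny):
        y = (j + 0.5) * hy
        if 3 * x + y < 3:
            w = (x + y) / 2 * hx * hy   # density times cell area
            mass += w
            ex += x * w
```

The inner integral ∫_0^{3−3x} (x + y)/2 dy = (1/2)[3x(1 − x) + (9/2)(1 − x)²] gives the marginal of X; integrating x times it over 0 < x < 1 yields E[X] = 5/16.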
Definition
The covariance between two random variables X and Y is defined by:
Cov(X, Y) = E[(X − µ_X)(Y − µ_Y)] = E(XY) − E(X) E(Y)
▶ If U = a1 X + b1 and V = a2 Y + b2, then Cov(U, V) = a1 a2 Cov(X, Y), and
ρ_UV = ρ_XY if a1 a2 > 0, and ρ_UV = −ρ_XY otherwise
▶ Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)
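The sign-flip and variance properties can be checked empirically on simulated data (a sketch using Python's standard library; the particular construction of X and Y through a shared component Z is arbitrary, chosen only so they correlate):

```python
import random

random.seed(0)

# Simulate correlated X and Y: each is a shared N(0,1) component Z
# plus independent N(0,1) noise, so Cov(X, Y) = 1 and Var = 2 each.
n = 50_000
xs, ys = [], []
for _ in range(n):
    z = random.gauss(0, 1)
    xs.append(z + random.gauss(0, 1))
    ys.append(z + random.gauss(0, 1))

def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

def corr(u, v):
    return cov(u, v) / (cov(u, u) * cov(v, v)) ** 0.5

rho_xy = corr(xs, ys)

# U = 2X + 1 and V = -3Y + 5: a1*a2 < 0, so rho_UV = -rho_XY exactly.
rho_uv = corr([2 * x + 1 for x in xs], [-3 * y + 5 for y in ys])

# Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y) with a=2, b=-3;
# this identity holds exactly for sample moments too.
w = [2 * x - 3 * y for x, y in zip(xs, ys)]
lhs = cov(w, w)
rhs = 4 * cov(xs, xs) + 9 * cov(ys, ys) - 12 * cov(xs, ys)
```

The affine invariance is exact at the sample level as well, which is why the simulated ρ_UV equals −ρ_XY to floating-point precision rather than merely approximately.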
Theorem
If X1, . . . , Xk are random variables with E(Xi) = µi, then
E(X1 + X2 + . . . + Xk) = µ1 + µ2 + . . . + µk.
The theorem gives a lower bound for the probability that X falls between two points that:
▶ are on opposite sides of the mean
▶ are equidistant from the mean
Why is this result important?
▶ We do not need to know the distribution of the underlying population
▶ We only need to know its mean and variance.
The theorem was developed by the Russian mathematician Pafnuty Chebyshev (1821–1894).
Theorem
Let the random variable X have mean µ and standard deviation σ. Then for any constant K > 0:
P(|X − µ| < Kσ) ≥ 1 − 1/K²
Equivalently, for any ϵ > 0:
P(|X − µ| ≥ ϵ) ≤ E[(X − µ)²] / ϵ² = Var(X) / ϵ²
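A quick empirical check of the bound, using an exponential(1) population (µ = σ = 1; the choice of population is only for illustration, since the theorem needs no distributional assumption):

```python
import random

random.seed(1)

# Chebyshev: P(|X - mu| < K*sigma) >= 1 - 1/K^2 for any distribution.
# For exponential(1), mu = sigma = 1; with K = 2 the bound is 0.75,
# while the exact probability is P(X < 3) = 1 - e^{-3}, about 0.95.
n = 100_000
K = 2.0
mu = sigma = 1.0
hits = sum(1 for _ in range(n)
           if abs(random.expovariate(1.0) - mu) < K * sigma)
frac = hits / n
```

The simulated fraction comfortably exceeds 1 − 1/K² = 0.75, illustrating that Chebyshev's bound is valid but often far from tight.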
We can now use Chebyshev’s theorem to prove the weak law of large
numbers.
The law says that if the sample size n is large, the sample mean rarely deviates far from the mean of the distribution of X.
Theorem
Let X1, . . . , Xn be a set of pairwise independent random variables with E(Xi) = µ and Var(Xi) = σ². Then for any c > 0,
P(µ − c ≤ X̄ ≤ µ + c) ≥ 1 − σ² / (nc²)
and as n → ∞, the probability approaches 1. Equivalently, with Sn = X1 + . . . + Xn,
P(|Sn/n − µ| < ϵ) → 1
as n → ∞.
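A simulation sketch of the law using fair-die rolls (µ = 3.5; the population is chosen only for illustration):

```python
import random

random.seed(2)

# Weak law of large numbers: the sample mean of n fair-die rolls
# concentrates around mu = 3.5 as n grows.
def sample_mean(n):
    return sum(random.randint(1, 6) for _ in range(n)) / n

means = {n: sample_mean(n) for n in (10, 1_000, 100_000)}
```

Chebyshev's bound quantifies the convergence: with σ² = 35/12, P(|X̄ − 3.5| ≥ c) ≤ 35/(12nc²), which shrinks like 1/n for fixed c.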
Theorem
If X1, . . . , Xn is a random sample from an infinite population with mean µ, variance σ², and moment-generating function MX(t), then the limiting distribution of
Zn = (X̄ − µ) / (σ/√n)
as n → ∞ is the standard normal distribution.
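A simulation sketch: standardized means of uniform(0, 1) samples (µ = 1/2, σ² = 1/12; the population choice is arbitrary) behave like a standard normal even at moderate n:

```python
import random

random.seed(3)

# CLT: Z_n = (Xbar - mu) / (sigma / sqrt(n)) for uniform(0, 1) samples.
n, reps = 50, 20_000
mu, sigma = 0.5, (1 / 12) ** 0.5
zs = []
for _ in range(reps):
    xbar = sum(random.random() for _ in range(n)) / n
    zs.append((xbar - mu) / (sigma / n ** 0.5))

# For a standard normal, P(Z <= 0) = 0.5 and Var(Z) = 1.
frac_below_zero = sum(z <= 0 for z in zs) / reps
mean_z = sum(zs) / reps
var_z = sum(z * z for z in zs) / reps - mean_z ** 2
```

The empirical median sits near 0 and the empirical variance near 1, as the limiting N(0, 1) distribution predicts.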
Definition
A sample is a set of observable random variables X1 , . . . , Xn . The number
n is called the sample size.
Definition
A random sample of size n from a population is a set of n independent
and identically distributed (iid) observable random variables X1 , . . . , Xn .
Definition
A function T of observable random variables X1 , . . . , Xn that does not
depend on any unknown parameters is called a statistic.
Definition
The probability distribution of a sample statistic is called the sampling
distribution.
Theorem
Let X1, . . . , Xn be a random sample of size n from a population with mean µ and variance σ². Then, E(X̄) = µ and Var(X̄) = σ²/n.
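These two identities can be checked by simulation (the exponential population with rate 2 is chosen only for illustration, so that µ = 1/2 and σ² = 1/4):

```python
import random

random.seed(4)

# Sampling distribution of the mean: for samples of size n = 25 from
# an exponential(rate=2) population (mu = 0.5, sigma^2 = 0.25),
# E(Xbar) = 0.5 and Var(Xbar) = 0.25 / 25 = 0.01.
n, reps = 25, 50_000
xbars = [sum(random.expovariate(2.0) for _ in range(n)) / n
         for _ in range(reps)]
m = sum(xbars) / reps
v = sum((x - m) ** 2 for x in xbars) / reps
```

The empirical mean of the sample means matches µ, and their empirical variance matches σ²/n, even though the population itself is far from normal.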