EEN 330 Topic 3
Semester 2021-2022
Several standard models describe discrete random variables: the binomial, negative binomial, geometric, hypergeometric and Poisson distributions.
Factorials
The factorial of a positive integer is the product of all positive integers from 1 to that integer. By definition, the factorial of 0 is 1. The factorial of N, written N!, can be thought of as the number of ordered arrangements that can be made from N distinct objects: there are N choices for position 1, N − 1 choices for position 2, and so on until there is only 1 choice for position N. 0! is 1 because there is exactly one way to arrange no objects: the empty arrangement.
If there are N distinct objects to be placed in M different positions, where M is less than or equal to N, then N − M objects are left over, so the number of ordered arrangements that can be made is

$$_N P_M = \frac{N!}{(N-M)!}$$
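The permutation count can be checked with a short Python sketch; the helper name `permutations` is ours, and Python's standard library provides the same count as `math.perm`:

```python
import math

def permutations(n, m):
    """Ordered arrangements of m out of n distinct objects: n!/(n-m)!"""
    return math.factorial(n) // math.factorial(n - m)

# 5 objects in 3 positions: 5 * 4 * 3 = 60 arrangements
print(permutations(5, 3))   # 60
print(math.perm(5, 3))      # same value from the standard library (Python 3.8+)
```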
The Binomial Distribution
If there is a fixed number of independent trials of an experiment with two possible outcomes, each outcome having a fixed probability in every trial, the number of desired outcomes is said to follow a binomial distribution. A variable X following a binomial distribution with N trials and probability p of the desired outcome is written $X \sim B(N, p)$.
If there are N trials, the probability that exactly R of them give the targeted outcome, each with probability p, is:

$$P(X = R) = \binom{N}{R} p^R (1-p)^{N-R}$$
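The formula can be evaluated directly with Python's `math.comb`; a minimal sketch, with the function name `binom_pmf` chosen here for illustration:

```python
import math

def binom_pmf(r, n, p):
    """P(X = r) for X ~ B(n, p): C(n, r) * p^r * (1-p)^(n-r)."""
    return math.comb(n, r) * p**r * (1 - p)**(n - r)

# X ~ B(10, 0.3): probability of exactly 4 desired outcomes
prob = binom_pmf(4, 10, 0.3)

# The probabilities over R = 0..N sum to 1
total = sum(binom_pmf(r, 10, 0.3) for r in range(11))
print(prob, total)
```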
The expectation of the binomial random variable is

$$E[X] = \sum_{R=0}^{N} R \binom{N}{R} p^R (1-p)^{N-R} = \sum_{R=0}^{N} R\,\frac{N!}{R!\,(N-R)!}\, p^R (1-p)^{N-R}$$

$$= \sum_{R=1}^{N} \frac{N!}{(R-1)!\,(N-R)!}\, p^R (1-p)^{N-R}$$

$$= Np \sum_{R=1}^{N} \frac{(N-1)!}{(R-1)!\,(N-R)!}\, p^{R-1} (1-p)^{N-R} = Np$$

The last sum is the total probability of a binomial distribution with N − 1 trials, so it equals 1 and $E[X] = Np$.
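The result $E[X] = Np$ can be confirmed numerically; a quick Python check, with illustrative names:

```python
import math

def binom_pmf(r, n, p):
    """P(X = r) for X ~ B(n, p)."""
    return math.comb(n, r) * p**r * (1 - p)**(n - r)

n, p = 12, 0.25
mean = sum(r * binom_pmf(r, n, p) for r in range(n + 1))
print(mean)  # matches N*p = 3.0 up to floating-point error
```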
The calculation of the variance is complicated by the factorial. A squared term cannot be extracted directly, but

$$R(R-1) = R^2 - R$$

so finding the expectation of $R(R-1)$ and adding the expectation of $R$ gives the expectation of $R^2$.
$$E[R(R-1)] = \sum_{R=0}^{N} R(R-1) \binom{N}{R} p^R (1-p)^{N-R} = \sum_{R=2}^{N} \frac{N!}{(R-2)!\,(N-R)!}\, p^R (1-p)^{N-R}$$

$$= N(N-1) \sum_{R=2}^{N} \frac{(N-2)!}{(R-2)!\,(N-R)!}\, p^R (1-p)^{N-R}$$

Let $u = R - 2$:

$$= N(N-1) \sum_{u=0}^{N-2} \frac{(N-2)!}{u!\,(N-2-u)!}\, p^{u+2} (1-p)^{N-2-u}$$

$$= N(N-1)p^2 \sum_{u=0}^{N-2} \frac{(N-2)!}{u!\,(N-2-u)!}\, p^{u} (1-p)^{N-2-u} = N(N-1)p^2$$

Hence $E[R^2] = N(N-1)p^2 + Np$, and the variance is

$$Var(X) = N(N-1)p^2 + Np - (Np)^2 = Np(1-p)$$
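Both $E[R(R-1)] = N(N-1)p^2$ and the binomial variance $Np(1-p)$ can be checked numerically; a minimal Python sketch with illustrative names:

```python
import math

def binom_pmf(r, n, p):
    return math.comb(n, r) * p**r * (1 - p)**(n - r)

n, p = 12, 0.25
mean = sum(r * binom_pmf(r, n, p) for r in range(n + 1))
e_rr1 = sum(r * (r - 1) * binom_pmf(r, n, p) for r in range(n + 1))
var = e_rr1 + mean - mean**2  # E[R^2] - E[R]^2, with E[R^2] = E[R(R-1)] + E[R]
print(e_rr1)  # N(N-1)p^2 = 8.25
print(var)    # Np(1-p) = 2.25
```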
The Negative Binomial Distribution
The negative binomial distribution tracks the number of trials needed to reach a specified number of desired outcomes. It is like the binomial distribution, except that the number of desired outcomes is fixed and the number of trials is variable. If X is the trial on which the r-th desired outcome occurs, then the first x − 1 trials must contain exactly r − 1 desired outcomes and trial x itself must be a desired outcome, so

$$P(X = x) = \binom{x-1}{r-1} p^r (1-p)^{x-r}, \qquad x = r, r+1, \ldots$$
To check that the probabilities sum to 1, let $u = x - r$:

$$\sum_{u=0}^{\infty} \binom{u+r-1}{r-1} p^r (1-p)^u = \sum_{u=0}^{\infty} \frac{(u+r-1)!}{(r-1)!\,u!}\, p^r (1-p)^u$$

$$= p^r \sum_{u=0}^{\infty} \frac{r(r+1)(r+2)\cdots(u+r-1)}{u!}\, (1-p)^u$$

$$= p^r \sum_{u=0}^{\infty} \frac{(-1)^u(-r)(-r-1)(-r-2)\cdots(-r-u+1)}{u!}\, (1-p)^u$$

By the generalized binomial theorem,

$$= p^r \big(1-(1-p)\big)^{-r} = p^r p^{-r} = 1$$
A similar manipulation, summing $x(x+1)P(X = x)$ and reducing the sum to the total probability of a negative binomial distribution with $r + 2$ desired outcomes, gives the expectation of $X(X+1)$ for the negative binomial distribution:

$$E[X(X+1)] = \frac{r(r+1)}{p^2}$$

This helps us find the expectation of X squared, using $E[X] = r/p$ (obtained the same way):

$$E(X^2) = \frac{r(r+1)}{p^2} - \frac{r}{p} = \frac{r^2 + r - rp}{p^2}$$

$$Var(X) = \frac{r^2 + r - rp}{p^2} - \left(\frac{r}{p}\right)^2 = \frac{r - rp}{p^2} = \frac{r(1-p)}{p^2}$$
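These results can be verified by truncating the infinite sums; a Python sketch, with the function name `negbin_pmf` chosen for illustration:

```python
import math

def negbin_pmf(x, r, p):
    """P(X = x): the r-th desired outcome occurs on trial x (x >= r)."""
    return math.comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)

r, p = 3, 0.4
xs = range(r, 200)  # the tail beyond x = 200 is negligible here
total = sum(negbin_pmf(x, r, p) for x in xs)
mean = sum(x * negbin_pmf(x, r, p) for x in xs)
var = sum(x * x * negbin_pmf(x, r, p) for x in xs) - mean**2
print(total)  # ~1
print(mean)   # r/p = 7.5
print(var)    # r(1-p)/p^2 = 11.25
```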
In the binomial, negative binomial and geometric distributions, the events are assumed to occur independently of each other and with fixed probability. If this condition is to hold when picking objects from a finite set, the object taken in a given trial must be returned to the set before the next trial. If it is not, each trial will depend on the outcomes of the previous trials, as the picking alters the composition of the set and the proportions of the outcomes in the set change.
The Hypergeometric Distribution
Sampling without replacement gives the hypergeometric distribution. It can be used to find the probability of selecting s items, of which r are the desired items, from a pool of n objects of which m are of the desired type:

$$P(X = r) = \frac{\binom{m}{r}\binom{n-m}{s-r}}{\binom{n}{s}}$$
That these probabilities sum to 1 follows from expanding $(1+x)^n$ two ways. Since the total power of x in each term is fixed, the exponents of the factors that are multiplied together must sum to that fixed power.

$$(1+x)^n = (1+x)^m (1+x)^{n-m}$$

$$\sum_{i=0}^{n} \binom{n}{i} x^i = \left[\sum_{r=0}^{m} \binom{m}{r} x^r\right]\left[\sum_{v=0}^{n-m} \binom{n-m}{v} x^v\right] = \sum_{s=0}^{n} \left(\sum_{r=0}^{s} \binom{m}{r}\binom{n-m}{s-r}\right) x^s$$
Comparing the coefficients of $x^s$ (the terms with $i = s$):

$$\binom{n}{s} = \sum_{r=0}^{s} \binom{m}{r}\binom{n-m}{s-r}$$

so the hypergeometric probabilities sum to 1.
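This identity (Vandermonde's identity) is easy to check numerically for particular values using `math.comb`:

```python
import math

# Vandermonde's identity: C(n, s) = sum over r of C(m, r) * C(n-m, s-r)
n, m, s = 20, 8, 5
lhs = math.comb(n, s)
rhs = sum(math.comb(m, r) * math.comb(n - m, s - r) for r in range(s + 1))
print(lhs, rhs)  # 15504 15504
```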
The mean of the hypergeometric distribution is

$$E[X] = \frac{ms}{n}$$

and the variance is

$$Var(X) = \frac{ms}{n}\cdot\frac{(n-s)(n-m)}{n(n-1)}$$
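A numerical check of the mean and variance, summing over the pmf directly; the helper name `hypergeom_pmf` is illustrative:

```python
import math

def hypergeom_pmf(r, n, m, s):
    """P of r desired items in a sample of s, from n objects of which m are desired."""
    return math.comb(m, r) * math.comb(n - m, s - r) / math.comb(n, s)

n, m, s = 50, 20, 10
mean = sum(r * hypergeom_pmf(r, n, m, s) for r in range(s + 1))
var = sum(r * r * hypergeom_pmf(r, n, m, s) for r in range(s + 1)) - mean**2
print(mean)  # m*s/n = 4.0
print(var)   # (m*s/n) * (n-s)*(n-m) / (n*(n-1)), about 1.9592
```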
The Poisson Distribution
The Poisson distribution arises when events occur independently at a constant average rate $\lambda$ and cannot occur together. The probability of observing r events is

$$P(X = r) = \frac{\lambda^r}{r!} e^{-\lambda}$$
So the probabilities sum to 1:

$$\sum_{r=0}^{\infty} \frac{\lambda^r}{r!} e^{-\lambda} = e^{\lambda} e^{-\lambda} = 1$$
The expectation is

$$E[X] = \sum_{r=0}^{\infty} r\,\frac{\lambda^r}{r!} e^{-\lambda} = \lambda \sum_{r=1}^{\infty} \frac{\lambda^{r-1}}{(r-1)!} e^{-\lambda}$$

If $u = r - 1$, then

$$\lambda \sum_{r=1}^{\infty} \frac{\lambda^{r-1}}{(r-1)!} e^{-\lambda} = \lambda \sum_{u=0}^{\infty} \frac{\lambda^u}{u!} e^{-\lambda} = \lambda e^{\lambda} e^{-\lambda} = \lambda$$
$$\sum_{r=0}^{\infty} r(r-1)\frac{\lambda^r}{r!} e^{-\lambda} = 0 + 0 + \sum_{r=2}^{\infty} \frac{\lambda^r}{(r-2)!} e^{-\lambda} = \lambda^2 \sum_{r=2}^{\infty} \frac{\lambda^{r-2}}{(r-2)!} e^{-\lambda} = \lambda^2$$
The expectation of $r(r-1)$ is $\lambda^2$, so the expectation of $r^2$ is $\lambda^2 + \lambda$. The variance is

$$Var(X) = \lambda^2 + \lambda - \lambda^2 = \lambda$$
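The Poisson results can be checked the same way, truncating the infinite sums; a minimal sketch:

```python
import math

def poisson_pmf(r, lam):
    """P(X = r) = lam^r * e^(-lam) / r!"""
    return lam**r * math.exp(-lam) / math.factorial(r)

lam = 3.5
xs = range(80)  # the tail beyond r = 80 is negligible for this rate
total = sum(poisson_pmf(r, lam) for r in xs)
mean = sum(r * poisson_pmf(r, lam) for r in xs)
var = sum(r * r * poisson_pmf(r, lam) for r in xs) - mean**2
print(total)  # ~1
print(mean)   # lam = 3.5
print(var)    # lam = 3.5
```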