
Worksheet Title

Semester 2021-2022

Course name: Random signals and noise
Instructor's name: Dr. Montasir Qasymeh
Worksheet created by: Mohammad Tahmid Hassan, ID: 1076344
Discrete probability models

There are several models for describing discrete random variables, including the binomial, negative binomial, geometric, hypergeometric and Poisson distributions.

Factorials

The factorial of a positive integer is the product of all positive integers from 1 to that integer. By definition, the factorial of 0 is 1. The factorial of N, written N!, can be thought of as the number of ordered arrangements that can be made from N distinct objects: there are N choices for position 1, N-1 choices for position 2, and so on until there is only 1 choice for position N. 0! is 1 because there is only one way to have no objects, the empty arrangement.
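The definition above can be checked directly in Python; `factorial` is an illustrative helper name, not something from the worksheet.

```python
import math

# The factorial of N is the product 1 * 2 * ... * N, with 0! = 1
# (the single empty arrangement).
def factorial(n):
    result = 1
    for k in range(1, n + 1):
        result *= k
    return result

assert factorial(0) == 1
assert factorial(5) == 120
# Cross-check against the standard library.
assert all(factorial(n) == math.factorial(n) for n in range(10))
```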

Permutations and combinations

If there are N different objects to be placed in M different positions, where M is less than or equal to N, there will be N-M items left over, so the number of ordered arrangements is

$${}^{N}P_{M} = \frac{N!}{(N-M)!}$$

A combination is a set of distinct objects, irrespective of the arrangement. Since the M chosen items can themselves be arranged in M! ways, the number of combinations is

$${}^{N}C_{M} = \binom{N}{M} = \frac{N!}{M!\,(N-M)!}$$
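A minimal sketch of these counting formulas; `n_perm` and `n_comb` are my own helper names, cross-checked against the standard library.

```python
import math

# P(N, M) = N! / (N-M)! counts ordered arrangements of M objects
# chosen from N distinct objects.
def n_perm(n, m):
    return math.factorial(n) // math.factorial(n - m)

# C(N, M) = P(N, M) / M! removes the M! orderings of each chosen set.
def n_comb(n, m):
    return n_perm(n, m) // math.factorial(m)

assert n_perm(5, 2) == 20           # 5 * 4 ordered pairs
assert n_comb(5, 2) == 10           # 20 / 2! unordered pairs
assert n_comb(5, 2) == math.comb(5, 2)
```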
The binomial distribution

If there is a fixed number of independent trials of an experiment with two possible outcomes whose probabilities are fixed in each trial, the trials are said to follow a binomial distribution. The notation for a variable X following a binomial distribution with N trials and probability p of the desired outcome is $X \sim B(N, p)$.

If there are N trials, the probability that R trials give the targeted outcome, each with probability p, is:

$$P(X = R) = \binom{N}{R} p^R (1-p)^{N-R}$$
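A hedged sketch of the pmf above in Python; `binom_pmf` is an illustrative helper name. Summing it over all outcomes should give 1.

```python
import math

# Binomial pmf: P(X = r) for X ~ B(n, p).
def binom_pmf(r, n, p):
    return math.comb(n, r) * p**r * (1 - p)**(n - r)

# The probabilities over all outcomes 0..n sum to 1.
total = sum(binom_pmf(r, 10, 0.3) for r in range(11))
assert abs(total - 1.0) < 1e-12
```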
The expectation of the binomial random variable is

$$E(X) = \sum_{R=0}^{N} R \binom{N}{R} p^R (1-p)^{N-R} = \sum_{R=0}^{N} R\,\frac{N!}{R!\,(N-R)!}\, p^R (1-p)^{N-R}$$

$$= \sum_{R=0}^{N} \frac{N!}{(R-1)!\,(N-R)!}\, p^R (1-p)^{N-R}$$

$$= Np \sum_{R=0}^{N} \frac{(N-1)!}{(R-1)!\,(N-R)!}\, p^{R-1} (1-p)^{N-R}$$

As the term at R=0 contributes nothing,


$$= Np \sum_{R=1}^{N} \frac{(N-1)!}{(R-1)!\,(N-R)!}\, p^{R-1} (1-p)^{N-R}$$

Use the substitution u=R-1


$$= Np \sum_{u=0}^{N-1} \frac{(N-1)!}{u!\,(N-1-u)!}\, p^{u} (1-p)^{N-1-u} = Np \cdot 1 = Np$$
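The result $E(X) = Np$ can be verified numerically by summing the pmf directly; the parameter values below are arbitrary choices of mine.

```python
import math

# Numerical check of E[X] = N*p for X ~ B(N, p).
N, p = 12, 0.4
mean = sum(r * math.comb(N, r) * p**r * (1 - p)**(N - r)
           for r in range(N + 1))
assert abs(mean - N * p) < 1e-12
```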

What is the variance of the binomial distribution?

The calculation is complicated by the factorials: a squared term cannot be extracted directly. However, since $N(N-1) = N^2 - N$, we can find the expectation of $R(R-1) = R^2 - R$ and add the expectation of $R$ to get the expectation of $R^2$.
$$E(R(R-1)) = \sum_{R=0}^{N} R(R-1) \binom{N}{R} p^R (1-p)^{N-R} = \sum_{R=0}^{N} \frac{N!}{(R-2)!\,(N-R)!}\, p^R (1-p)^{N-R}$$

$$= N(N-1) \sum_{R=0}^{N} \frac{(N-2)!}{(R-2)!\,(N-R)!}\, p^R (1-p)^{N-R}$$

The terms at R=0 and R=1 contribute nothing, so we get


$$= N(N-1) \sum_{R=2}^{N} \frac{(N-2)!}{(R-2)!\,(N-R)!}\, p^R (1-p)^{N-R}$$

Let 𝑢 = 𝑅 − 2
$$= N(N-1) \sum_{u=0}^{N-2} \frac{(N-2)!}{u!\,(N-2-u)!}\, p^{u+2} (1-p)^{N-2-u}$$

$$= N(N-1)\,p^2 \sum_{u=0}^{N-2} \frac{(N-2)!}{u!\,(N-2-u)!}\, p^{u} (1-p)^{N-2-u} = N(N-1)\,p^2$$

So, the expectation of $R^2$ is $N(N-1)p^2 + Np = N^2 p^2 - Np^2 + Np$, and

$$\mathrm{Variance} = \sigma^2 = E(X^2) - (E(X))^2 = N^2 p^2 - Np^2 + Np - (Np)^2 = Np - Np^2 = Np(1-p)$$
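The variance formula can be checked the same way, computing $E(X^2) - (E(X))^2$ from the pmf; the parameter values are arbitrary.

```python
import math

# Check Var(X) = N*p*(1-p) via E[X^2] - (E[X])^2.
N, p = 12, 0.4
pmf = [math.comb(N, r) * p**r * (1 - p)**(N - r) for r in range(N + 1)]
ex = sum(r * q for r, q in enumerate(pmf))
ex2 = sum(r * r * q for r, q in enumerate(pmf))
assert abs((ex2 - ex**2) - N * p * (1 - p)) < 1e-12
```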


Negative binomial distribution

This distribution tracks the number of trials needed to reach a specified number of desired outcomes. It is like the binomial distribution, except that the number of desired outcomes is fixed and the number of trials is variable. This means the final trial must itself be a desired outcome, with the remaining r-1 desired outcomes occurring somewhere among the first x-1 trials.

So the probability mass function is


$$P(X = x) = \binom{x-1}{r-1} p^r (1-p)^{x-r}$$

Here, x is any integer equal to or greater than r. Summing over all x:

$$\sum_{x=r}^{\infty} \binom{x-1}{r-1} p^r (1-p)^{x-r} = \sum_{x=r}^{\infty} \frac{(x-1)!}{(r-1)!\,(x-r)!}\, p^r (1-p)^{x-r}$$

Let u = x-r
$$\sum_{u=0}^{\infty} \binom{u+r-1}{r-1} p^r (1-p)^{u} = \sum_{u=0}^{\infty} \frac{(u+r-1)!}{(r-1)!\,u!}\, p^r (1-p)^{u}$$

$$= p^r \sum_{u=0}^{\infty} \frac{r(r+1)(r+2)\cdots(u+r-1)}{u!}\,(1-p)^u$$

$$= p^r \sum_{u=0}^{\infty} (-1)^u\,\frac{(-r)(-r-1)(-r-2)\cdots(-r-u+1)}{u!}\,(1-p)^u$$

The last sum is the binomial series for $(1-(1-p))^{-r}$, so

$$p^r \left(1-(1-p)\right)^{-r} = p^r p^{-r} = 1$$
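The total-probability result above can be checked numerically by truncating the infinite sum at a large cutoff; `nbinom_pmf` is an illustrative helper name and the parameters are arbitrary.

```python
import math

# Negative binomial pmf: probability the r-th desired outcome
# occurs on trial x (x >= r).
def nbinom_pmf(x, r, p):
    return math.comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)

r, p = 3, 0.35
# Truncate the infinite sum; the tail decays geometrically.
total = sum(nbinom_pmf(x, r, p) for x in range(r, 400))
assert abs(total - 1.0) < 1e-9
```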

What is the expectation of the negative binomial distribution?


$$E(X) = \sum_{x=r}^{\infty} x \binom{x-1}{r-1} p^r (1-p)^{x-r} = \sum_{x=r}^{\infty} \frac{x\,(x-1)!}{(r-1)!\,(x-r)!}\, p^r (1-p)^{x-r}$$

$$= \sum_{x=r}^{\infty} \frac{x!}{(r-1)!\,(x-r)!}\, p^r (1-p)^{x-r}$$

$$= \sum_{x=r}^{\infty} \frac{r}{r} \cdot \frac{x!}{(r-1)!\,(x-r)!}\, p^r (1-p)^{x-r}$$

$$= \sum_{x=r}^{\infty} r \cdot \frac{x!}{r!\,(x-r)!}\, p^r (1-p)^{x-r}$$

$$= \sum_{x=r}^{\infty} r \cdot \frac{x!}{r!\,(x-r)!}\, p^r \cdot \frac{p}{p}\,(1-p)^{x-r}$$

$$= \frac{r}{p} \sum_{x=r}^{\infty} \binom{x}{r}\, p^{r+1} (1-p)^{x-r}$$

The remaining sum is the total probability of a negative binomial distribution with $r+1$ desired outcomes (with trial count $y = x+1$), so it equals 1 and

$$E(X) = \frac{r}{p}$$
$$E(X(X+1)) = \sum_{x=r}^{\infty} x(x+1) \binom{x-1}{r-1} p^r (1-p)^{x-r} = \sum_{x=r}^{\infty} \frac{(x+1)\,x\,(x-1)!}{(r-1)!\,(x-r)!}\, p^r (1-p)^{x-r}$$

$$= \sum_{x=r}^{\infty} \frac{(x+1)!}{(r-1)!\,(x-r)!}\, p^r (1-p)^{x-r}$$

$$= \sum_{x=r}^{\infty} \frac{r(r+1)}{r(r+1)} \cdot \frac{(x+1)!}{(r-1)!\,(x-r)!}\, p^r (1-p)^{x-r}$$

$$= \sum_{x=r}^{\infty} r(r+1) \cdot \frac{(x+1)!}{(r+1)!\,(x-r)!}\, p^r (1-p)^{x-r}$$

$$= \sum_{x=r}^{\infty} r(r+1) \cdot \frac{(x+1)!}{(r+1)!\,(x-r)!}\, p^r \cdot \frac{p^2}{p^2}\,(1-p)^{x-r}$$

$$= \frac{r(r+1)}{p^2} \sum_{x=r}^{\infty} \binom{x+1}{r+1}\, p^{r+2} (1-p)^{x-r}$$

The sum is the total probability of a negative binomial distribution with $r+2$ desired outcomes, so

$$E(X(X+1)) = \frac{r(r+1)}{p^2}$$
The expectation of $x(x+1)$ has now been found for the negative binomial distribution. This gives the expectation of $x^2$, since $E(X^2) = E(X(X+1)) - E(X)$.

$$E(X^2) = \frac{r(r+1)}{p^2} - \frac{r}{p} = \frac{r^2 + r - rp}{p^2}$$

$$\mathrm{Var}(X) = \frac{r^2 + r - rp}{p^2} - \left(\frac{r}{p}\right)^2 = \frac{r - rp}{p^2} = \frac{r(1-p)}{p^2}$$
In the binomial, negative binomial and geometric distributions, the events are assumed to occur independently of each other and with fixed probability. If this condition is applied to picking objects from a finite set, the object taken in a given trial must be returned to the set before the next trial. If it is not, each trial depends on the outcomes of the earlier trials, because the picking alters the composition of the set and therefore the proportions of the outcomes in it.

Sampling without replacement is described by the hypergeometric distribution. It gives the probability of selecting s items, of which r are the desired items, from a pool of n objects of which m are of the desired type.

Let the number of desired items selected be the variable X.

The number of ways to choose r items out of m items is

$$\frac{m!}{r!\,(m-r)!} = \binom{m}{r}$$

The number of ways to choose s-r items from the remaining n-m items is

$$\frac{(n-m)!}{(s-r)!\,((n-m)-(s-r))!} = \binom{n-m}{s-r}$$

Finally, the number of ways of selecting s items out of n items is

$$\frac{n!}{s!\,(n-s)!} = \binom{n}{s}$$

Then,

$$P(X = r) = \frac{\binom{m}{r}\binom{n-m}{s-r}}{\binom{n}{s}}$$
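A sketch of the hypergeometric pmf above; `hypergeom_pmf` is an illustrative helper name and the pool sizes are arbitrary.

```python
import math

# Hypergeometric pmf: r desired items in a sample of s drawn without
# replacement from n items of which m are desired.
# math.comb(a, b) returns 0 when b > a, which handles impossible cases.
def hypergeom_pmf(r, n, m, s):
    return math.comb(m, r) * math.comb(n - m, s - r) / math.comb(n, s)

n, m, s = 20, 7, 5
total = sum(hypergeom_pmf(r, n, m, s) for r in range(s + 1))
assert abs(total - 1.0) < 1e-12
```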

The proof that this is a valid distribution is as follows:

The binomial $(1+x)^n$ can be expanded as

$$(1+x)^n = 1 + \binom{n}{1}x + \binom{n}{2}x^2 + \cdots + \binom{n}{n}x^n = \sum_{i=0}^{n} \binom{n}{i} x^i$$

When two polynomials are multiplied together, we have

$$\left[\sum_{i=0}^{k} a_i x^i\right]\left[\sum_{j=0}^{l} b_j x^j\right] = \sum_{t=0}^{k+l} \left(\sum_{u=0}^{t} a_u b_{t-u}\right) x^t$$

since the coefficient of $x^t$ collects exactly those products whose exponents sum to $t$.
$$(1+x)^n = (1+x)^m (1+x)^{n-m}$$

$$\sum_{i=0}^{n} \binom{n}{i} x^i = \left[\sum_{r=0}^{m} \binom{m}{r} x^r\right]\left[\sum_{v=0}^{n-m} \binom{n-m}{v} x^v\right] = \sum_{s=0}^{n} \left(\sum_{r=0}^{s} \binom{m}{r}\binom{n-m}{s-r}\right) x^s$$

Matching the coefficients at $i = s$ gives Vandermonde's identity:

$$\binom{n}{s} = \sum_{r=0}^{s} \binom{m}{r}\binom{n-m}{s-r}$$
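Vandermonde's identity can be verified exactly with integer arithmetic; the values of n and m below are arbitrary.

```python
import math

# Direct check of C(n, s) = sum_r C(m, r) * C(n-m, s-r) for all s.
# math.comb returns 0 when the lower index exceeds the upper one.
n, m = 15, 6
for s in range(n + 1):
    rhs = sum(math.comb(m, r) * math.comb(n - m, s - r) for r in range(s + 1))
    assert rhs == math.comb(n, s)
```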

The expectation of the hypergeometric distribution follows, using $r\binom{m}{r} = m\binom{m-1}{r-1}$ and $\binom{n}{s} = \frac{n}{s}\binom{n-1}{s-1}$:

$$E(X) = \sum_{r=0}^{s} \frac{r\binom{m}{r}\binom{n-m}{s-r}}{\binom{n}{s}} = \sum_{r=1}^{s} \frac{r\binom{m}{r}\binom{n-m}{s-r}}{\binom{n}{s}} = \sum_{r=1}^{s} \frac{m\binom{m-1}{r-1}\binom{n-m}{s-r}}{\frac{n}{s}\binom{n-1}{s-1}} = \frac{ms}{n}\sum_{r=1}^{s} \frac{\binom{m-1}{r-1}\binom{n-m}{s-r}}{\binom{n-1}{s-1}} = \frac{ms}{n}$$

where the final sum equals 1 by Vandermonde's identity.
$$E(X(X-1)) = \sum_{r=0}^{s} \frac{r(r-1)\binom{m}{r}\binom{n-m}{s-r}}{\binom{n}{s}} = \sum_{r=2}^{s} \frac{r(r-1)\binom{m}{r}\binom{n-m}{s-r}}{\binom{n}{s}}$$

$$= \frac{m(m-1)s(s-1)}{n(n-1)} \sum_{r=2}^{s} \frac{\binom{m-2}{r-2}\binom{n-m}{s-r}}{\binom{n-2}{s-2}} = \frac{m(m-1)s(s-1)}{n(n-1)}$$
$$E(X^2) = \frac{m(m-1)s(s-1)}{n(n-1)} + \frac{ms}{n} = \frac{ms}{n}\left(\frac{(m-1)(s-1)}{n-1} + 1\right) = \frac{ms}{n}\left(\frac{ms - s - m + 1 + n - 1}{n-1}\right) = \frac{ms}{n}\left(\frac{ms - s - m + n}{n-1}\right)$$

$$\mathrm{Var}(X) = \frac{ms}{n}\left(\frac{ms - s - m + n}{n-1}\right) - \frac{ms}{n}\cdot\frac{ms}{n} = \frac{ms}{n}\left(\frac{ms - s - m + n}{n-1} - \frac{ms}{n}\right)$$

$$= \frac{ms}{n}\left(\frac{msn - sn - mn + n^2}{n^2-n} - \frac{msn - ms}{n^2-n}\right) = \frac{ms}{n}\left(\frac{ms - sn - mn + n^2}{n^2-n}\right)$$

$$= \frac{ms}{n}\left(\frac{s(m-n) - n(m-n)}{n^2-n}\right) = \frac{ms}{n}\left(\frac{(s-n)(m-n)}{n^2-n}\right) = \frac{ms}{n}\left(\frac{(n-s)(n-m)}{n^2-n}\right)$$
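Both hypergeometric moments can be checked against the pmf numerically; the pool sizes are arbitrary choices of mine.

```python
import math

# Check E[X] = ms/n and Var(X) = (ms/n) * (n-s)(n-m) / (n^2 - n).
n, m, s = 20, 7, 5
pmf = [math.comb(m, r) * math.comb(n - m, s - r) / math.comb(n, s)
       for r in range(s + 1)]
mean = sum(r * q for r, q in enumerate(pmf))
var = sum(r * r * q for r, q in enumerate(pmf)) - mean**2
assert abs(mean - m * s / n) < 1e-12
assert abs(var - (m * s / n) * (n - s) * (n - m) / (n * n - n)) < 1e-12
```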

Next, the Poisson distribution will be discussed.


$$P(X = R) = \binom{N}{R} p^R (1-p)^{N-R}$$

is the formula for the probability of a binomial distribution. If the number of trials is very large while the expectation is fixed at a value $\lambda$, then $p = \frac{\lambda}{N}$ and

$$\binom{N}{R}\left(\frac{\lambda}{N}\right)^R\left(1-\frac{\lambda}{N}\right)^{N-R} = \frac{N!}{R!\,(N-R)!}\cdot\frac{\lambda^R}{N^R}\left(1-\frac{\lambda}{N}\right)^{N-R}$$
As N approaches infinity,

$$\frac{N!}{(N-R)!\,N^R}\cdot\frac{\lambda^R}{R!}\left(1-\frac{\lambda}{N}\right)^{N-R} = \frac{N(N-1)\cdots(N-(R-1))}{N^R}\cdot\frac{\lambda^R}{R!}\left(1-\frac{\lambda}{N}\right)^{N-R} \to 1\cdot\frac{\lambda^R}{R!}\,e^{-\lambda}$$

This is the Poisson distribution. It arises when events occur independently at a constant average rate and no two events occur simultaneously.
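The limit above can be illustrated numerically: for large N with Np fixed at $\lambda$, the binomial probability approaches the Poisson probability. The values of $\lambda$, R, and N below are arbitrary.

```python
import math

# As N grows with N*p = lam fixed, the binomial pmf approaches
# the Poisson pmf lam^r / r! * exp(-lam).
lam, r = 2.5, 3
poisson = lam**r / math.factorial(r) * math.exp(-lam)

N = 10**6
p = lam / N
binom = math.comb(N, r) * p**r * (1 - p)**(N - r)
assert abs(binom - poisson) < 1e-5
```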

To prove that the distribution is valid, note that



$$\sum_{r=0}^{\infty} \frac{x^r}{r!} = e^x$$

So

$$\sum_{r=0}^{\infty} \frac{\lambda^r}{r!}\,e^{-\lambda} = e^{\lambda}e^{-\lambda} = 1$$

The expectation of the Poisson distribution is


$$E(X) = \sum_{r=0}^{\infty} r\,\frac{\lambda^r}{r!}\,e^{-\lambda} = 0 + \sum_{r=1}^{\infty} r\,\frac{\lambda^r}{r!}\,e^{-\lambda} = \sum_{r=1}^{\infty} \frac{\lambda^r}{(r-1)!}\,e^{-\lambda} = \lambda\sum_{r=1}^{\infty} \frac{\lambda^{r-1}}{(r-1)!}\,e^{-\lambda}$$

If $u = r - 1$, then

$$\lambda\sum_{r=1}^{\infty} \frac{\lambda^{r-1}}{(r-1)!}\,e^{-\lambda} = \lambda\sum_{u=0}^{\infty} \frac{\lambda^{u}}{u!}\,e^{-\lambda} = \lambda e^{\lambda}e^{-\lambda} = \lambda$$
$$E(X(X-1)) = \sum_{r=0}^{\infty} r(r-1)\,\frac{\lambda^r}{r!}\,e^{-\lambda} = 0 + 0 + \sum_{r=2}^{\infty} \frac{\lambda^r}{(r-2)!}\,e^{-\lambda} = \lambda^2\sum_{r=2}^{\infty} \frac{\lambda^{r-2}}{(r-2)!}\,e^{-\lambda}$$

Let $x = r - 2$ so that

$$\lambda^2\sum_{r=2}^{\infty} \frac{\lambda^{r-2}}{(r-2)!}\,e^{-\lambda} = \lambda^2\sum_{x=0}^{\infty} \frac{\lambda^{x}}{x!}\,e^{-\lambda} = \lambda^2$$

This is the expectation of $r(r-1)$. The expectation of $r^2$ is therefore $\lambda^2 + \lambda$, and the variance is

$$\lambda^2 + \lambda - \lambda^2 = \lambda$$

