
Discrete Probability Distribution

SHUVAJIT PAUL
CAMELLIA INSTITUTE OF ENGINEERING & TECHNOLOGY
DEPT-ELECTRONICS & COMMUNICATION ENGINEERING
UNIVERSITY ROLL NO – 27100322006, SEM – 3RD SEM
PAPER NAME & CODE – Probability and Statistics : BS-M301
EMAIL ID :- shuvajitpaul6@gmail.com

A discrete probability distribution is a type of probability distribution that shows all possible values of a discrete random variable
along with the associated probabilities. In other words, a discrete probability distribution gives the likelihood of occurrence of
each possible value of a discrete random variable.

Geometric distributions, binomial distributions, and Bernoulli distributions are some commonly used discrete probability
distributions. This article sheds light on the definition of a discrete probability distribution, its formulas, types, and various
associated examples.

What is Discrete Probability Distribution?

A discrete probability distribution and a continuous probability distribution are two types of probability distributions that define
discrete and continuous random variables respectively. A probability distribution can be defined as a function that describes all
possible values of a random variable as well as the associated probabilities.

Discrete Probability Distribution Definition

A discrete probability distribution can be defined as a probability distribution giving the probability that a discrete random
variable will take a specified value. Such a distribution represents data with a countable number of outcomes, which may be finite
(as for a die roll) or countably infinite (as for the Poisson and geometric distributions discussed below).
There are two conditions that a discrete probability distribution must satisfy. These are given as follows:

• 0 ≤ P(X = x) ≤ 1. This implies that the probability of a discrete random variable, X, taking on an exact value, x, lies
between 0 and 1.

• ∑P(X = x) =1. The sum of all probabilities must be equal to 1.

Discrete Probability Distribution Example

Suppose a fair die is rolled and the discrete probability distribution has to be created. The possible outcomes are {1, 2, 3, 4, 5,
6}, so the total number of outcomes is 6. Each number has an equal chance of turning up, which means that the probability
of getting any one number is 1/6. Using this data, the discrete probability distribution table for a die roll can be given as
follows:

x          1     2     3     4     5     6
P(X = x)   1/6   1/6   1/6   1/6   1/6   1/6
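
To make the two conditions concrete, here is a minimal Python sketch (the probabilities are the ones from the table above) that checks both of them for the die-roll distribution:

from fractions import Fraction

# PMF of a fair die roll, taken from the table above
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# Condition 1: every probability lies between 0 and 1
assert all(0 <= p <= 1 for p in pmf.values())

# Condition 2: the probabilities sum to 1
assert sum(pmf.values()) == 1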


Discrete Probability Distribution Formula

A discrete random variable is used to model a discrete probability distribution. There are two main functions associated with
such a random variable. These are the probability mass function (pmf) and the probability distribution function or cumulative
distribution function (CDF).

Discrete Probability Distribution PMF

The probability mass function can be defined as a function that gives the probability of a discrete random variable, X, being
exactly equal to some value, x. This function is required when creating a discrete probability distribution. The formula is given
as follows:

f(x) = P(X = x)

Discrete Probability Distribution CDF

The cumulative distribution function gives the probability that a discrete random variable will be less than or equal to a
particular value. The value of the CDF can be calculated from the discrete probability distribution. Its formula is given as
follows:

F(x) = P(X ≤ x)
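
As a rough illustration of how the PMF and CDF relate, the following Python sketch (using the fair-die PMF above as an assumed example) computes F(x) by summing the PMF:

from fractions import Fraction

# PMF of a fair die: f(x) = P(X = x)
def pmf(x):
    return Fraction(1, 6) if 1 <= x <= 6 else Fraction(0)

# CDF: F(x) = P(X <= x), obtained by summing the PMF up to x
def cdf(x):
    return sum(pmf(k) for k in range(1, x + 1))

print(pmf(3))  # 1/6
print(cdf(3))  # 1/2, i.e. P(X <= 3) = 3/6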

Discrete Probability Distribution Mean

The mean of a discrete probability distribution gives the weighted average of all possible values of the discrete random variable.
It is also known as the expected value. The formula for the mean of a discrete random variable is given as follows:

E[X] = ∑x P(X = x)

Discrete Probability Distribution Variance

The discrete probability distribution variance gives the dispersion of the distribution about the mean. It can be defined as
the average of the squared differences of the distribution from the mean, μ. The formula is given below:

Var[X] = ∑(x - μ)² P(X = x)
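
A short Python sketch of both definitions, again using the fair-die distribution as an assumed example:

# Mean and variance of a fair die roll, computed from the formulas above
pmf = {x: 1 / 6 for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())               # E[X] = sum of x * P(X = x)
var = sum((x - mean) ** 2 * p for x, p in pmf.items())  # Var[X] = sum of (x - mean)^2 * P(X = x)

print(mean)  # 3.5
print(var)   # about 2.9167 (= 35/12)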

Discrete Probability Distribution Types

A discrete probability distribution is used in Monte Carlo simulation to find the probabilities of different outcomes. There are
many discrete probability distributions, each suited to different scenarios; the most commonly used ones are discussed below,
with the binomial and Poisson distributions being the most frequently applied of the list.
1. Bernoulli Distribution
This distribution is generated when we perform an experiment once and it has only two possible outcomes – success and failure.
The trials of this type are called Bernoulli trials, which form the basis for many of the distributions discussed below. Let p be the
probability of success and 1 – p the probability of failure.
The PMF is given as

P(X = x) = p^x (1 – p)^(1 – x), for x ∈ {0, 1}

One example of this would be flipping a coin once: p is the probability of getting a head and 1 – p is the probability of getting a
tail. Note that success and failure are subjective and are defined by us depending on the context.
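
A minimal Python sketch of the Bernoulli PMF, with a fair coin (p = 0.5) as an assumed example:

def bernoulli_pmf(x, p):
    # P(X = x) = p^x * (1 - p)^(1 - x), for x in {0, 1}
    return p ** x * (1 - p) ** (1 - x)

# Fair coin: success = head, with p = 0.5
print(bernoulli_pmf(1, 0.5))  # 0.5 (head)
print(bernoulli_pmf(0, 0.5))  # 0.5 (tail)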
2. Binomial Distribution
This distribution arises when each trial has only two possible outcomes. Let p denote the probability that a trial is a success,
which implies 1 – p is the probability that it is a failure. Performing the experiment n times independently and counting the
number of successes gives us the binomial distribution.
The most common example given for Binomial distribution is that of flipping a coin n number of times and calculating the
probabilities of getting a particular number of heads. More real-world examples include the number of successful sales calls for
a company or whether a drug works for a disease or not.
The PMF is given as

P(X = x) = C(n, x) p^x (1 – p)^(n – x), x = 0, 1, ..., n

where p is the probability of success, n is the number of trials, x is the number of times we obtain a success, and
C(n, x) = n! / (x!(n – x)!) is the binomial coefficient.
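
The formula translates directly into Python with math.comb; the scenario below (3 heads in 10 tosses of a fair coin) is an assumed example:

from math import comb

def binomial_pmf(x, n, p):
    # P(X = x) = C(n, x) * p^x * (1 - p)^(n - x)
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

# Probability of exactly 3 heads in 10 tosses of a fair coin
print(binomial_pmf(3, 10, 0.5))  # about 0.117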
3. Hypergeometric Distribution
Consider the event of drawing a red marble from a box of marbles of different colours. Drawing a red marble is a success and
not drawing one is a failure. However, each time a marble is drawn it is not returned to the box, which affects the probability of
drawing a red marble in the next trial. The hypergeometric distribution models the probability of x successes over n trials
where each trial is conducted without replacement. This is unlike the binomial distribution, where the probability remains
constant through the trials.

The PMF is given as

P(X = x) = C(k, x) C(N – k, n – x) / C(N, n)

where k is the number of possible successes in the population, x is the desired number of successes, N is the size of the
population and n is the number of trials (draws).
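
A small Python sketch of this PMF; the box contents (20 marbles, 5 of them red) and the number of draws are assumed values chosen for illustration:

from math import comb

def hypergeom_pmf(x, N, k, n):
    # P(X = x) = C(k, x) * C(N - k, n - x) / C(N, n)
    return comb(k, x) * comb(N - k, n - x) / comb(N, n)

# Box of 20 marbles, 5 of them red; draw 4 without replacement, want exactly 2 red
print(hypergeom_pmf(2, N=20, k=5, n=4))  # about 0.217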

4. Negative Binomial Distribution


Sometimes we want to check how many Bernoulli trials we need to make in order to get a particular outcome. The desired
outcome is specified in advance and we continue the experiment until it is achieved. Consider the example of rolling a
die. Our desired outcome, defined as a success, is rolling a 4, and we want the probability of getting this outcome for the third
time. This is interpreted as the number of failures (numbers other than 4) that occur before we see the third success.
The PMF is given as

P(K = k) = C(k + r – 1, k) p^r (1 – p)^k

where p is the probability of success, k is the number of failures observed and r is the desired number of successes after which
the experiment is stopped.
As in the binomial distribution, the probability of success remains constant across the trials and each trial is independent of the others.
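
A short Python sketch of this PMF, applied to the die-rolling example (p = 1/6, third success desired); the value k = 5 is an assumed example:

from math import comb

def neg_binomial_pmf(k, r, p):
    # P(K = k) = C(k + r - 1, k) * p^r * (1 - p)^k
    return comb(k + r - 1, k) * p ** r * (1 - p) ** k

# Probability of exactly 5 "failures" (non-fours) before the third 4 is rolled
print(neg_binomial_pmf(k=5, r=3, p=1 / 6))  # about 0.039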
5. Geometric Distribution
This is a special case of the negative binomial distribution where the desired number of successes is 1. It measures the number
of failures we get before the first success. Using the same example as in the previous section, we would like to know the number
of failures we see before we get the first 4 when rolling the die.

The PMF is given as

P(K = k) = p (1 – p)^k

where p is the probability of success and k is the number of failures. Here, r = 1.
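
The r = 1 case is a one-liner in Python; the value k = 2 below is an assumed example:

def geometric_pmf(k, p):
    # P(K = k) = p * (1 - p)^k, the probability of k failures before the first success
    return p * (1 - p) ** k

# Probability of seeing exactly 2 non-fours before the first 4 when rolling a die
print(geometric_pmf(2, 1 / 6))  # about 0.116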

6. Poisson Distribution
This distribution describes the events that occur in a fixed interval of time or space. An example might make this clear.
Consider the number of calls received by a customer care centre per hour. We can estimate the average number of
calls per hour, but we cannot determine the exact number of calls or the exact times at which they arrive. Each occurrence of an
event is independent of the other occurrences.
The PMF is given as

P(X = x) = λ^x e^(–λ) / x!

where λ is the average number of times the event occurs in the given interval, x is the desired number of occurrences and e is
Euler's number.
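
A minimal Python sketch of the Poisson PMF; the call-centre rate of 10 calls per hour is an assumed figure:

from math import exp, factorial

def poisson_pmf(x, lam):
    # P(X = x) = lam^x * e^(-lam) / x!
    return lam ** x * exp(-lam) / factorial(x)

# If the centre averages 10 calls per hour, probability of exactly 7 calls in an hour
print(poisson_pmf(7, 10))  # about 0.090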

7. Multinomial Distribution
In the distributions above, there are only two possible outcomes per trial – success and failure. The multinomial distribution, in
contrast, describes random variables with many possible outcomes per trial. It is closely related to the categorical distribution
(its single-trial case), since each possible outcome is treated as a separate category. Consider the scenario of playing a game n
times. The multinomial distribution helps us determine the combined probability that player 1 wins x1 times, player 2 wins
x2 times, ..., and player k wins xk times.
The PMF is given as

P(X1 = x1, ..., Xk = xk) = n! / (x1! x2! ... xk!) × p1^x1 p2^x2 ... pk^xk

where n is the number of trials, p1, ..., pk denote the probabilities of the outcomes x1, ..., xk respectively, and
x1 + x2 + ... + xk = n.
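
A small Python sketch of this PMF; the game of 10 rounds among three players with win probabilities 0.5, 0.3 and 0.2 is an assumed example:

from math import factorial, prod

def multinomial_pmf(counts, probs):
    # P(X1 = x1, ..., Xk = xk) = n! / (x1! ... xk!) * p1^x1 * ... * pk^xk
    n = sum(counts)
    coeff = factorial(n) / prod(factorial(x) for x in counts)
    return coeff * prod(p ** x for x, p in zip(counts, probs))

# Probability that in 10 games the three players win 5, 3 and 2 times respectively
print(multinomial_pmf([5, 3, 2], [0.5, 0.3, 0.2]))  # about 0.085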

How To Find Discrete Probability Distribution?

A discrete probability distribution can be represented either in the form of a table or with the help of a graph. To find a discrete
probability distribution the probability mass function is required. In other words, to construct a discrete probability distribution,
all the values of the discrete random variable and the probabilities associated with them are required. Suppose a fair coin is
tossed twice. Say, the discrete probability distribution has to be determined for the number of heads that are observed. The steps
are as follows:

• Step 1: Determine the sample space of the experiment. When a fair coin is tossed twice the sample space is {HH, HT,
TH, TT}. Here, H denotes a head and T represents a tail. Thus, the total number of outcomes is 4.

• Step 2: Define a discrete random variable, X. For the example let X be the number of heads observed.

• Step 3: Identify the possible values that the variable can assume. There are 3 possible values of X. These are 0 (no
head is observed), 1 (exactly one head is observed), and 2 (the coin lands on heads twice).
• Step 4: Calculate the probability associated with each outcome. In the given example, the probability can be calculated
by the formula, number of favorable outcomes / total number of possible outcomes.

• Step 5: To get the discrete probability distribution, represent the probabilities and the corresponding outcomes in
tabular form or in graphical form. This is expressed as follows:

x          0 {TT}        1 {HT, TH}    2 {HH}
P(X = x)   1/4 = 0.25    2/4 = 0.5     1/4 = 0.25

A histogram can be used to represent the discrete probability distribution for this example.
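
The five steps above can also be carried out programmatically; the following Python sketch enumerates the sample space of two coin tosses and tabulates P(X = x):

from itertools import product
from collections import Counter
from fractions import Fraction

# Step 1: sample space of two tosses of a fair coin
sample_space = list(product("HT", repeat=2))   # ('H','H'), ('H','T'), ('T','H'), ('T','T')

# Steps 2-4: X = number of heads, P(X = x) = favourable outcomes / total outcomes
counts = Counter(outcome.count("H") for outcome in sample_space)
distribution = {x: Fraction(c, len(sample_space)) for x, c in sorted(counts.items())}

# Step 5: the tabulated distribution
print(distribution)  # {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}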

❖ Example 1: Suppose a pair of fair dice are rolled. Let X be the random variable representing the sum of the dice.
Construct a discrete probability distribution for the same.

➢ Solution: The sample space for rolling 2 dice consists of all ordered pairs (a, b), where a and b each take values from 1 to 6.

Thus, the total number of outcomes is 6 × 6 = 36.

The possible values of X range from 2 to 12. X = 2 means that the sum of the dice is 2, which can happen only when (1, 1) is
obtained.

Hence, P(X = 2) = 1 / 36.

Using a similar process, the discrete probability distribution can be represented as follows:

x          2      3      4      5      6      7      8      9      10     11     12
P(X = x)   1/36   2/36   3/36   4/36   5/36   6/36   5/36   4/36   3/36   2/36   1/36

The discrete probability distribution can also be represented graphically as a histogram of these probabilities against x.
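
The distribution in Example 1 can be generated with a short Python sketch that enumerates the 36 outcomes:

from itertools import product
from collections import Counter
from fractions import Fraction

# Sample space: all 36 ordered pairs (a, b) with a, b in {1, ..., 6}
rolls = list(product(range(1, 7), repeat=2))

counts = Counter(a + b for a, b in rolls)
distribution = {s: Fraction(c, len(rolls)) for s, c in sorted(counts.items())}

for s, p in distribution.items():
    print(s, p)  # 2 1/36, 3 1/18 (= 2/36), ..., 7 1/6 (= 6/36), ..., 12 1/36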

❖ Example 2: Given a discrete probability distribution, find the value of k.

x          1     2     3     4
P(X = x)   0.2   0.5   k     0.1

➢ Solution: For a discrete probability distribution, ∑P(X = x) =1.


By using this we get,
0.2 + 0.5 + k + 0.1 = 1
k + 0.8 = 1
k = 1 - 0.8 = 0.2
➢ Answer: k = 0.2

❖ Example 3: Find the expected value of the given discrete probability distribution.

x          0      1      2      3      4
P(X = x)   0.44   0.36   0.15   0.04   0.01

➢ Solution: The formula for the expected value of a discrete probability distribution is
E[X] = ∑x P(X = x)
E[X] = (0)(0.44) +(1)(0.36) + (2)(0.15) + (3)(0.04) + (4)(0.01)
= 0.82
➢ Answer: E[X] = 0.82
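
As a quick sanity check, the same computation in Python:

# Expected value of the distribution in Example 3
pmf = {0: 0.44, 1: 0.36, 2: 0.15, 3: 0.04, 4: 0.01}
expected = sum(x * p for x, p in pmf.items())
print(round(expected, 2))  # 0.82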
