
Discrete Random Variable Distributions
For a discrete random variable X we have a function p, called the probability mass function (PMF), with $p(x_i) = P(X = x_i)$. Every probability mass function must satisfy
$p(x_i) \ge 0 \ \text{for all } i, \qquad \sum_{i} p(x_i) = 1.$

The cumulative distribution function (CDF) of a discrete random variable is defined as
$F(x) = P(X \le x) = \sum_{x_i \le x} p(x_i).$

The mean (expected value), variance, and standard deviation of X are
$E[X] = \sum_{i} x_i\, p(x_i), \qquad \mathrm{Var}(X) = E[X^{2}] - (E[X])^{2}, \qquad \sigma_X = \sqrt{\mathrm{Var}(X)}.$
Continuous Random Variable Distributions
For a continuous random variable X we have a function f, called the probability density function (pdf). Every probability density function must satisfy
$f(x) \ge 0 \ \text{for all } x, \qquad \int_{-\infty}^{\infty} f(x)\, dx = 1,$
and for any interval, $P(a \le X \le b) = \int_{a}^{b} f(x)\, dx.$

The cumulative distribution function (CDF) of a continuous random variable is defined as
$F(x) = P(X \le x) = \int_{-\infty}^{x} f(t)\, dt, \qquad \text{so that } f(x) = \dfrac{dF(x)}{dx}.$
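A tiny numerical sketch (mine, using an assumed example density f(x) = 2x on [0, 1]) of the relation above: integrating the pdf reproduces the CDF, and the total area under the pdf is 1.

```python
import numpy as np

# Assumed example density for illustration: f(x) = 2x on [0, 1], zero elsewhere.
x = np.linspace(0.0, 1.0, 1001)
f = 2.0 * x

# F(x) = integral of f up to x, computed here with the trapezoidal rule.
F = np.concatenate(([0.0], np.cumsum((f[1:] + f[:-1]) / 2.0 * np.diff(x))))

print(F[-1])             # total probability, should be 1.0
print(F[500], 0.5 ** 2)  # F(0.5) should equal x^2 evaluated at 0.5
```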

Discrete Random Variable Example
Example: An urn contains 7 balls, numbered from 1 to 7. Two balls are selected randomly from the urn. Let the random variable X be the difference of the numbers on the selected balls (that is, the largest number minus the smallest). Find the probability distribution (PMF) of X, its CDF, and its mean, variance, and standard deviation,

(a) if the balls are selected from the urn with replacement.

Step I: List the sample space of the experiment (all possible pairs of draws).

Step II: Find the random variable X (the difference between the highest and lowest values drawn).

Step III: Find the PMF (the probability distribution of the discrete random variable).

 Find the CDF: do it yourself.
 Mean value (expected value) of the discrete random variable X.

 Variance of the discrete random variable X.

 Standard deviation of the discrete random variable X.
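A small Python sketch (not part of the original slides) for part (a): it enumerates the 49 equally likely ordered draws with replacement and computes the PMF, CDF, mean, variance, and standard deviation of X.

```python
from itertools import product
from fractions import Fraction

balls = range(1, 8)                          # balls numbered 1 to 7
outcomes = list(product(balls, repeat=2))    # ordered draws with replacement: 49 pairs

# PMF of X = largest number minus smallest number
pmf = {}
for a, b in outcomes:
    x = abs(a - b)
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, len(outcomes))

# CDF as the running sum of the PMF
cdf, running = {}, Fraction(0)
for x in sorted(pmf):
    running += pmf[x]
    cdf[x] = running

# Mean, variance, and standard deviation
mean = sum(x * p for x, p in pmf.items())
var = sum(x ** 2 * p for x, p in pmf.items()) - mean ** 2
std = float(var) ** 0.5

print({x: str(p) for x, p in pmf.items()})   # PMF, e.g. P(X = 0) = 7/49
print({x: str(c) for x, c in cdf.items()})   # CDF
print(mean, var, round(std, 4))
```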

Continuous Random Variable Example (worked by Method I and Method II).
Special Distributions:
Continuous Probability Distributions
 UNIFORM DISTRIBUTION

PDF: $f(x) = \dfrac{1}{b-a}$ for $a \le x \le b$ and $f(x) = 0$ otherwise.
CDF: $F(x) = 0$ for $x < a$, $F(x) = \dfrac{x-a}{b-a}$ for $a \le x \le b$, and $F(x) = 1$ for $x > b$.
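As an illustrative check (not from the slides), the uniform pdf and CDF above can be evaluated with scipy.stats.uniform; note that SciPy parameterizes the distribution by loc = a and scale = b - a, and the endpoints used here are arbitrary.

```python
from scipy.stats import uniform

a, b = 2.0, 5.0                     # assumed interval endpoints
U = uniform(loc=a, scale=b - a)     # SciPy's uniform: loc = a, scale = b - a

x = 3.5
print(U.pdf(x), 1.0 / (b - a))      # both 0.333...
print(U.cdf(x), (x - a) / (b - a))  # both 0.5
```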

 EXPONENTIAL DISTRIBUTION
PDF: $f(x) = \lambda e^{-\lambda x}$, $x \ge 0$.
CDF: $F(x) = 1 - e^{-\lambda x}$, $x \ge 0$.

Application: Exponential distributions find wide applicability in queuing theory, reliability theory, and communication theory.
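A short sketch (mine, with an assumed rate λ) using scipy.stats.expon; it evaluates the pdf and CDF above and checks the memoryless property P(X > s + t | X > s) = P(X > t), one reason the exponential law appears in queuing and reliability models.

```python
from scipy.stats import expon

lam = 2.0                        # assumed rate (events per unit time)
X = expon(scale=1.0 / lam)       # SciPy parameterizes the exponential by scale = 1/lambda

x = 0.7
print(X.pdf(x), X.cdf(x))        # lam*exp(-lam*x) and 1 - exp(-lam*x)

# Memoryless property: P(X > s + t | X > s) equals P(X > t)
s, t = 0.5, 1.2
print(abs(X.sf(s + t) / X.sf(s) - X.sf(t)) < 1e-12)   # True (sf = 1 - cdf)
```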

 Normal (Gaussian) Distribution

The pdf of a normal (Gaussian) random variable with mean $\mu$ and variance $\sigma^{2}$ is
$f(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^{2}/(2\sigma^{2})}, \qquad -\infty < x < \infty.$

Application: A random variable X formed as the sum of a number of independent components, each with a general distribution, tends to a normal distribution as the number of components becomes very large (the central limit theorem).
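A small numerical illustration (mine, not from the slides) of this statement: sums of many independent, non-Gaussian components, here uniform on (0, 1), behave approximately like a normal random variable.

```python
import numpy as np

rng = np.random.default_rng(0)
n_components, n_samples = 50, 100_000

# X = sum of 50 independent Uniform(0, 1) components, repeated many times
X = rng.uniform(0.0, 1.0, size=(n_samples, n_components)).sum(axis=1)

# Compare empirical moments with the normal limit N(n/2, n/12)
print(X.mean(), n_components / 2)      # ~25
print(X.var(), n_components / 12)      # ~4.17

# Fraction of samples within one standard deviation (~0.6827 for a normal law)
z = (X - X.mean()) / X.std()
print(np.mean(np.abs(z) < 1.0))
```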
 ERLANG DISTRIBUTION
The distribution of the waiting time for n successive independent events of a Poisson arrival process is given by the Erlang distribution with n degrees of freedom, defined for positive integral values of n by
$f(t) = \dfrac{\lambda^{n} t^{\,n-1} e^{-\lambda t}}{(n-1)!}, \qquad t \ge 0.$
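To illustrate the waiting-time interpretation, here is a sketch (not from the slides; the rate and n are assumed) that sums n exponential interarrival times of a Poisson arrival process and compares the empirical CDF with scipy.stats.erlang.

```python
import numpy as np
from scipy.stats import erlang

rng = np.random.default_rng(1)
lam, n = 3.0, 4                    # assumed arrival rate and number of events

# Waiting time for the n-th arrival = sum of n independent exponential gaps
waits = rng.exponential(scale=1.0 / lam, size=(200_000, n)).sum(axis=1)

t = 1.0
print(np.mean(waits <= t))                    # empirical P(waiting time <= t)
print(erlang.cdf(t, a=n, scale=1.0 / lam))    # Erlang CDF with n degrees of freedom
```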

 GAMMA DISTRIBUTION
The gamma distribution generalizes the Erlang distribution to arbitrary (not necessarily integer) positive values of the shape parameter a:
$f(x) = \dfrac{\lambda^{a} x^{a-1} e^{-\lambda x}}{\Gamma(a)}, \qquad x \ge 0.$
LAPLACE DISTRIBUTION
The Laplace density, also called a double-exponential density, is defined by
$f(x) = \dfrac{\lambda}{2}\, e^{-\lambda |x - \mu|}, \qquad -\infty < x < \infty.$
Applications: To model navigation and pilot errors, speech, and image processing.

WEIBULL DISTRIBUTION

Rayleigh Distribution
If X and Y are independent zero-mean Gaussian random variables with the same variance $\sigma^{2}$, the envelope $Z = \sqrt{X^{2} + Y^{2}}$ is Rayleigh distributed with pdf
$f(z) = \dfrac{z}{\sigma^{2}}\, e^{-z^{2}/(2\sigma^{2})}, \qquad z \ge 0.$

Application: Wide application in modeling of fading channels in communications.

Cauchy Distribution

Application: A Cauchy distribution is a good indicator of how sensitive statistical tests are to heavy-tailed departures from normality.

BETA DISTRIBUTION

Rice/Rician Distribution

In the Rayleigh distribution discussed earlier, the random variables X and Y must be zero mean with the same variance. In many communication problems involving fading channels, X and Y will not be zero mean; they will have nonzero mean values. If the random variable Z is given by
$Z = \sqrt{X^{2} + Y^{2}},$
then Z has a Rice (Rician) distribution. A short simulation contrasting the Rayleigh and Rician envelopes follows the application notes below.

Applications:
 To model the paths scattered signals take to a receiver. Specifically, this distribution
models line-of-sight scatter transmissions between two stations in view of each other
that have an unobstructed path between them.

 Line-of-sight scatter includes FM radio waves, microwaves, MRI images in the presence
of noise, and satellite transmissions.

 The distribution also models Rician fading, which is a way to show how signal cancellations affect radio propagation.
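A brief simulation sketch (not from the original material) contrasting the two envelope models described above: zero-mean Gaussian components X and Y give a Rayleigh envelope, while components with nonzero means give a Rician envelope. The means and variance chosen here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, N = 1.0, 200_000
mx, my = 2.0, 1.0               # assumed nonzero means for the Rician case

# Rayleigh envelope: X and Y are zero mean with the same variance
z_rayleigh = np.hypot(rng.normal(0.0, sigma, N), rng.normal(0.0, sigma, N))

# Rician envelope: X and Y have nonzero means (e.g., a line-of-sight component)
z_rice = np.hypot(rng.normal(mx, sigma, N), rng.normal(my, sigma, N))

print(z_rayleigh.mean(), z_rice.mean())   # the Rician envelope is shifted upward
```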
Discrete Probability Distributions
 Bernoulli Distribution
A single event (trial) is Bernoulli distributed if the probability of success is p and the probability of failure is q = 1 - p, so that
$P(X = 1) = p, \qquad P(X = 0) = q = 1 - p.$

 BINOMIAL DISTRIBUTION

The probability b(k; n, p) that the event A occurs exactly k times in n Bernoulli trials is
$b(k; n, p) = \binom{n}{k} p^{k} q^{\,n-k}, \qquad k = 0, 1, \ldots, n.$
This is called the binomial distribution, and b(k; n, p) is its probability mass function.
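A sketch (not from the slides) that checks the formula for b(k; n, p) against a direct simulation of n Bernoulli trials; n, p, and k are arbitrary choices.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(3)
n, p, k = 10, 0.3, 4                  # assumed trials, success probability, target count

# b(k; n, p) = C(n, k) p^k q^(n-k)
b = comb(n, k) * p ** k * (1 - p) ** (n - k)

# Monte Carlo: count successes in n Bernoulli trials, repeated many times
successes = (rng.random(size=(200_000, n)) < p).sum(axis=1)
print(b, np.mean(successes == k))     # the two values should agree closely
```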

Poisson Distribution
A Poisson random variable X with parameter $\lambda$ has the PMF
$P(X = k) = e^{-\lambda}\, \dfrac{\lambda^{k}}{k!}, \qquad k = 0, 1, 2, \ldots$
In the Poisson counting process, the count has unit jumps at random times and not at fixed discrete times; thus it is discrete in state but continuous in time.
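An illustrative sketch (mine, with an assumed rate) of the remark above: arrival times generated from exponential interarrival gaps are continuous, yet the number of arrivals falling in a fixed window is a discrete count with the Poisson PMF.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(4)
lam, T, runs = 2.5, 1.0, 50_000      # assumed arrival rate, window length, repetitions

counts = np.empty(runs, dtype=int)
for i in range(runs):
    t, k = 0.0, 0
    while True:
        t += rng.exponential(1.0 / lam)   # continuous interarrival time
        if t > T:
            break
        k += 1
    counts[i] = k                         # discrete count of arrivals in [0, T]

for k in range(4):
    print(k, np.mean(counts == k), poisson.pmf(k, lam * T))
```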

Applications:

GEOMETRIC DISTRIBUTION

A sequence of independent Bernoulli trials with p as the probability of success can be modeled in two different ways. The distribution of the number of successes k is binomial. The waiting time until the first success occurs is geometrically distributed. If the first success is to occur at the k-th trial, then we must have (k - 1) failures preceding the first success, so
$P(X = k) = q^{\,k-1} p = (1 - p)^{k-1} p, \qquad k = 1, 2, 3, \ldots$
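A quick sketch (not part of the original) confirming the reasoning above: the simulated trial index of the first success follows P(X = k) = (1 - p)^(k-1) p; the value of p is assumed.

```python
import numpy as np

rng = np.random.default_rng(5)
p = 0.25                                    # assumed success probability

def first_success(rng, p):
    """Return the trial number of the first success in a Bernoulli sequence."""
    k = 1
    while rng.random() >= p:                # failure with probability 1 - p
        k += 1
    return k

waits = np.array([first_success(rng, p) for _ in range(100_000)])

for k in range(1, 5):
    print(k, np.mean(waits == k), (1 - p) ** (k - 1) * p)
```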

NEGATIVE BINOMIAL DISTRIBUTION
Let us now define the following events to find this distribution: A = {exactly k - 1 successes in the first n - 1 trials} and B = {a success at the n-th trial}. Since the trials are independent, the probability that the k-th success occurs at the n-th trial is
$P(X = n) = \binom{n-1}{k-1} p^{k} q^{\,n-k}, \qquad n = k, k+1, \ldots$

HYPERGEOMETRIC DISTRIBUTION
 The binomial distribution is obtained while sampling with replacement. Let us
select a sample of n microprocessor chips from a bin of N chips. It is known
that K of these chips are defective.

 If we sample with replacement then the probability of any defective chip is K/N.

 The probability that k of them will be defective in a sample size of n is given by the
binomial distribution since each drawing of a chip constitutes a Bernoulli trial.

 On the other hand, if we sample without replacement, then the probability for the first chip is K/N, the probability for the second chip is (K - 1)/(N - 1), and for each succeeding sample the probability changes.

 Hence the drawings of the chips do not constitute Bernoulli trials, and the binomial distribution does not hold. We therefore need an alternate expression for the probability of finding k defective chips in a sample of n chips drawn without replacement:
$P(X = k) = \dfrac{\binom{K}{k}\,\binom{N-K}{n-k}}{\binom{N}{n}}.$
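A brief comparison sketch (not from the slides; the values of N, K, and n are assumed) showing that sampling without replacement follows the hypergeometric PMF, while the binomial PMF with p = K/N is only an approximation.

```python
from math import comb
from scipy.stats import binom, hypergeom

N, K, n = 50, 10, 5        # assumed bin size, number of defective chips, sample size

for k in range(n + 1):
    h = comb(K, k) * comb(N - K, n - k) / comb(N, n)   # hypergeometric PMF
    s = hypergeom.pmf(k, N, K, n)                      # same value via SciPy (M=N, n=K, N=n)
    b = binom.pmf(k, n, K / N)                         # binomial with p = K/N
    print(k, round(h, 4), round(s, 4), round(b, 4))
```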


