
ECT305 Analog and Digital Communication

MODULE-2 , PART-1
RANDOM VARIABLES & RANDOM PROCESSES

Dr. Susan Dominic


Assistant Professor
Department of Electronics and Communication Engineering
Rajagiri School of Engineering and Technology

RAJAGIRI SCHOOL OF ENGINEERING AND TECHNOLOGY, KOCHI


• Random implies something that is not predictable with certainty

• Actual physical phenomena are understood using mathematical models that can be
  • Deterministic: there is no uncertainty regarding the time-dependent behavior at any point in time. Ex: LTI systems
  • Probabilistic: most problems cannot be modeled using a deterministic model, since multiple unknown factors are involved
Random Experiment:
Any experiment whose outcome cannot be predicted with certainty. Ex: flipping a coin, throwing a die, drawing a card from a deck of cards, the noise voltage in a channel at a given time

Sample Space (𝑺):

Set of all possible outcomes of a random experiment. The individual outcomes are denoted as 𝑠𝑖

A sample space can be discrete (a finite or countably infinite number of outcomes) or non-discrete (a continuum of possibilities)
Events:
Subsets of the sample space to which a probability can be assigned
Random Variable (RV):
A RV is a mapping from the sample space 𝑆 to the set of real numbers
The RV is a function whose domain is the sample space and whose range is some set of real numbers

Ex: Die-throwing experiment where the number of dots on top is to be counted
• Set of possible outcomes / sample space: S = {1, 2, 3, 4, 5, 6}
• To each outcome 𝑠𝑖 ∈ S we assign the number X(𝑠𝑖) = 10i. Thus X(𝑠1) = 10, …, X(𝑠6) = 60
• Here X is the RV; its domain is the set S = {𝑠1, 𝑠2, …, 𝑠6} and its range is the set of numbers {10, 20, 30, 40, 50, 60}
• Since the elements of the sample space (the outcomes of the experiment) have probabilities associated with them, the RV is said to take on its various possible values with certain probabilities
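A minimal Python sketch of this mapping (the names X and pmf are illustrative, not from the slides): the RV is literally a function on the sample space, and each value inherits the probability of the outcome it comes from.

from fractions import Fraction

# Sample space of the die-throwing experiment
S = [1, 2, 3, 4, 5, 6]

# The RV X maps each outcome s_i to the real number 10*i
def X(s):
    return 10 * s

# All six outcomes are equally likely, so P(X = x) follows directly
pmf = {X(s): Fraction(1, len(S)) for s in S}
print(pmf)   # {10: 1/6, 20: 1/6, ..., 60: 1/6}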
Types of RVs

• Discrete RV: A RV X is called a discrete RV if X can take only a finite (or countably infinite) number of values
Ex: die throwing

• Continuous RV: A RV X is called a continuous RV if it can take any value on the real line within a bounded or unbounded interval, i.e., it can take a continuous range of values
Ex: the noise voltage at a particular time can take any value from negative infinity to positive infinity

Notation used: random variable ‘X’; numerical value taken ‘𝑥’

The probability mass function (PMF) of a RV X is defined as

P_X(x) = P(X = x)
Example: Consider a coin tossed two times. Let X be the RV that denotes the number of observed heads. Find the range of X and its probability mass function.

S = {HH, HT, TH, TT}
R_X = {0, 1, 2}

Probability mass function: P_X(x) = P(X = x)
P_X(0) = 1/4 = 0.25
P_X(1) = 2/4 = 0.5
P_X(2) = 1/4 = 0.25

[Figure: stem plot of the PMF P_X(x), with stems of height 0.25, 0.5, 0.25 at x = 0, 1, 2]
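A quick way to check such a PMF is to enumerate the sample space directly; a small sketch in Python:

from itertools import product
from collections import Counter

# Sample space of two coin tosses: HH, HT, TH, TT
outcomes = ["".join(p) for p in product("HT", repeat=2)]

# X maps each outcome to the number of observed heads
counts = Counter(o.count("H") for o in outcomes)

pmf = {x: c / len(outcomes) for x, c in sorted(counts.items())}
print(pmf)   # {0: 0.25, 1: 0.5, 2: 0.25}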
Distribution Function or Cumulative Distribution Function (CDF) F_X(x) of a RV X

The distribution function F_X(x) of a RV X is given by

F_X(x) = P(X ≤ x)

It is the probability that the RV X takes a value less than or equal to the real number 𝑥
Q 2.1) In a game a six-faced die is thrown. If 1 or 2 comes, the player gets Rs.30; if 3 or 4 comes, the player gets Rs.10; if 5 comes, he loses Rs.30; and in the event of 6, he loses Rs.100. Plot the CDF and PMF of the gain or loss.
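A sketch of how Q 2.1 could be set up numerically (matplotlib is assumed to be available; the mapping follows the problem statement):

import numpy as np
import matplotlib.pyplot as plt

# Gain for each die face: 1,2 -> +30; 3,4 -> +10; 5 -> -30; 6 -> -100
gain = {1: 30, 2: 30, 3: 10, 4: 10, 5: -30, 6: -100}

values = np.array(sorted(set(gain.values())))            # [-100, -30, 10, 30]
pmf = np.array([sum(g == v for g in gain.values()) / 6 for v in values])
cdf = np.cumsum(pmf)                                     # F_X(x) = P(X <= x)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.stem(values, pmf)                                    # PMF: stems at each gain value
ax1.set(title="PMF", xlabel="gain x (Rs)", ylabel="P_X(x)")
ax2.step(values, cdf, where="post")                      # CDF: staircase, jumps at each value
ax2.set(title="CDF", xlabel="gain x (Rs)", ylabel="F_X(x)")
plt.show()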
Properties of the distribution function:

1. 0 ≤ F_X(x) ≤ 1

2. F_X(x1) ≤ F_X(x2) if x1 ≤ x2 (monotonicity: F_X(x) is a monotone non-decreasing function of x)

3. F_X(−∞) = 0

4. F_X(+∞) = 1
For continuous RVs, we can define the probability density function (pdf)

f_X(x) = dF_X(x)/dx,   equivalently   F_X(x) = ∫_{−∞}^{x} f_X(ξ) dξ

where ξ is a dummy variable

The probability of a RV taking a value between x1 and x2 can be obtained by integrating the pdf between x1 and x2:

P(x1 ≤ X ≤ x2) = F_X(x2) − F_X(x1) = ∫_{x1}^{x2} f_X(x) dx
Properties of the pdf

1. Non-negativity:

f_X(x) ≥ 0 ∀ x

2. Normalization:

∫_{−∞}^{∞} f_X(x) dx = 1

The total area under the graph of the pdf is equal to unity
Multiple RVs

• More than one RV can be defined on the same sample space S

• Consider two RVs X and Y. Their joint pdf f_{X,Y}(x, y) is defined by

P(x1 ≤ X ≤ x2, y1 ≤ Y ≤ y2) = ∫_{x1}^{x2} ∫_{y1}^{y2} f_{X,Y}(x, y) dy dx

Normalization property:

∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) dy dx = 1

This can be extended to multiple RVs
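As a sanity check, the normalization property can be verified numerically; a sketch assuming an independent bivariate Gaussian as the example density:

import numpy as np

# Example joint pdf: two independent standard Gaussians,
# f_{X,Y}(x, y) = exp(-(x^2 + y^2)/2) / (2*pi)
def f_xy(x, y):
    return np.exp(-(x**2 + y**2) / 2) / (2 * np.pi)

x = np.linspace(-8, 8, 801)
y = np.linspace(-8, 8, 801)
X, Y = np.meshgrid(x, y)

# Double integral via the trapezoidal rule: integrate over y, then over x
total = np.trapz(np.trapz(f_xy(X, Y), y, axis=0), x)
print(total)   # ~1.0, up to discretization error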


Statistical Averages of RVs

• A continuous RV can be fully described by its pdf

• But sometimes the pdf provides more detail than required
• In such cases, statistical averages can usually be considered adequate for the statistical characterization of a RV

1) Expected value (or mean value, average value, first moment) of a RV X:

μ_X = E[X] = ∫_{−∞}^{∞} x f_X(x) dx

E[·] denotes the expectation or averaging operator
2) nth moment of the pdf of a RV X:

E[X^n] = ∫_{−∞}^{∞} x^n f_X(x) dx

The first two moments are the most important for communication systems analysis

n = 1 gives the mean value or expected value

3) n = 2 gives the mean square value of a RV X:

E[X²] = ∫_{−∞}^{∞} x² f_X(x) dx
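As a small illustration (a sketch using X ~ Uniform(0, 1) as the example pdf), the first two moments can be computed by numerical integration and compared with the closed-form values 1/2 and 1/3:

import numpy as np

x = np.linspace(0.0, 1.0, 100_001)
f = np.ones_like(x)                  # pdf of X ~ Uniform(0, 1) on its support

mean = np.trapz(x * f, x)            # E[X]   -> 0.5
mean_square = np.trapz(x**2 * f, x)  # E[X^2] -> 1/3
print(mean, mean_square)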


4) Variance of a RV X:

σ_X² = var(X) = E[(X − μ_X)²] = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx

5) The square root of the variance, σ_X, is called the standard deviation of the RV X

Variance is a measure of the “randomness” of a RV. Higher variance implies that the RV can take a wider range of values (more random)

Relation between variance and mean square value:

Variance = mean square value − square of mean
σ_X² = E[X²] − μ_X²
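The relation follows in one line from the linearity of expectation; a short derivation:

\sigma_X^2 = E\big[(X - \mu_X)^2\big]
           = E[X^2] - 2\mu_X E[X] + \mu_X^2
           = E[X^2] - 2\mu_X^2 + \mu_X^2
           = E[X^2] - \mu_X^2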

6) Covariance of two RVs X and Y:

cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] = E[XY] − μ_X μ_Y

Two random variables X and Y are said to be uncorrelated if and only if their covariance is zero, or equivalently if the correlation coefficient, defined as

ρ_{XY} = cov(X, Y) / (σ_X σ_Y),

is zero

7) Correlation of two RVs X and Y, defined by

R_{XY} = E[XY]

Note: Two RVs are said to be independent if their joint pdf factors as f_{X,Y}(x, y) = f_X(x) f_Y(y)

• For independent RVs, the joint pdf is equal to the product of the individual pdfs
• For independent RVs, the covariance is zero and hence they are uncorrelated (the converse is not true in general: uncorrelated RVs need not be independent, except in special cases such as jointly Gaussian RVs)
If E[XY] = 0, then X and Y are said to be orthogonal.
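A sketch illustrating why the converse fails (assumed example: X uniform on [−1, 1] and Y = X², which are fully dependent yet uncorrelated):

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 1_000_000)
y = x**2                      # Y is completely determined by X (dependent)

# Sample covariance: E[XY] - E[X]E[Y]; here E[XY] = E[X^3] = 0 by symmetry
cov = np.mean(x * y) - np.mean(x) * np.mean(y)
print(cov)                    # ~0: uncorrelated despite full dependence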
Gaussian RV

• Most commonly used distribution in the analysis of communication systems
• Its pdf, with mean μ and variance σ², is

f_X(x) = (1 / √(2πσ²)) exp(−(x − μ)² / (2σ²))
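A sketch comparing a histogram of Gaussian samples against this pdf (μ = 0 and σ = 1 assumed; matplotlib assumed available):

import numpy as np
import matplotlib.pyplot as plt

mu, sigma = 0.0, 1.0
rng = np.random.default_rng(1)
samples = rng.normal(mu, sigma, 100_000)     # draw Gaussian samples

x = np.linspace(-4, 4, 400)
pdf = np.exp(-(x - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

plt.hist(samples, bins=100, density=True, alpha=0.5, label="histogram")
plt.plot(x, pdf, label="Gaussian pdf")
plt.xlabel("x"); plt.ylabel("f_X(x)"); plt.legend(); plt.show()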


Random Processes (RP) or Stochastic Processes

“A stochastic process is a set of RVs indexed in time”

A random process (also called a stochastic process or a random signal) can be viewed in two different but related ways

1. “View a RP as a collection of time functions or signals corresponding to the various outcomes of a random experiment”
• Corresponding to each outcome 𝑠𝑖 ∈ S of the random experiment, there exists a signal X(t, 𝑠𝑖)
• This is similar to the definition of a RV, where a real number is assigned to each outcome 𝑠𝑖 (a RV is nothing but an assignment of real numbers to the outcomes of a random experiment)
• For each 𝑠𝑖 there exists a time function X(t, 𝑠𝑖), which is called a sample function or a realization of the random process
• The collection of all sample functions together is called an ensemble
• At each time instant 𝑡𝑘, for the different 𝑠𝑖, the numbers X(𝑡𝑘, 𝑠𝑖) constitute a RV denoted by X(𝑡𝑘)
• This is the bridge between the concept of a RV and a RP
• At any time, the value of a RP constitutes a RV

2. We may view the random signal at 𝑡1, 𝑡2, … or any 𝑡 ∈ ℝ as a collection of RVs {X(𝑡1), X(𝑡2), …} or in general {X(t), 𝑡 ∈ ℝ}

• Hence a RP is represented as a collection of RVs indexed by time

▪ If the index set is a set of real numbers, the RP is a continuous-time RP

▪ If the index set is the set of all integers, the RP is a discrete-time RP

▪ Note: In a RP, the waveforms or sample functions themselves are not random but deterministic; the randomness is associated not with the waveform but with the uncertainty as to which waveform will occur at a given trial

Difference between a RV and a RP


• For a RV, the outcome of a random experiment is mapped into a real number
• For a RP, the outcome of a random experiment is mapped into a signal or a waveform
that is a function of time
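A sketch of the ensemble view in Python (assumed example: a random-phase sinusoid; each row of the array is one sample function X(t, s_i), and each column, a fixed time t_k across realizations, is one RV X(t_k)):

import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 1000)               # common time axis
n_realizations = 500

# Each outcome s_i is a random phase; the corresponding sample function
# is the waveform X(t, s_i) = cos(2*pi*5*t + theta_i)
theta = rng.uniform(0, 2 * np.pi, size=(n_realizations, 1))
ensemble = np.cos(2 * np.pi * 5 * t + theta)  # shape: (realizations, time)

# Fixing a time instant t_k gives a RV: one number per realization
k = 100
X_tk = ensemble[:, k]
print(X_tk.mean(), X_tk.var())                # statistics of the RV X(t_k)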
Statistical Description of a RP

A RP is a collection of an infinite number of RVs, and the complete information about several RVs is provided by their joint pdf

Let X(𝑡1), X(𝑡2), …, X(𝑡𝑘) be the RVs obtained by sampling X(t) at times 𝑡1, 𝑡2, …, 𝑡𝑘 respectively

These k samples can be fully characterized by the kth-order joint CDF

F_{X(t1),X(t2),…,X(tk)}(x1, x2, …, xk) = P(X(t1) ≤ x1, X(t2) ≤ x2, …, X(tk) ≤ xk)

or the kth-order joint pdf

f_{X(t1),X(t2),…,X(tk)}(x1, x2, …, xk) = ∂^k F_{X(t1),X(t2),…,X(tk)}(x1, x2, …, xk) / (∂x1 ∂x2 … ∂xk)

Suppose we shift all the sampling times by a fixed amount τ and obtain a new set of RVs X(t1+τ), X(t2+τ), …, X(tk+τ); we then have the new joint CDF F_{X(t1+τ),X(t2+τ),…,X(tk+τ)}(x1, x2, …, xk)
Mean or expectation of a RP X(t) (first-order moment)

We define the mean of the process X(t) as the expectation of the RV obtained by sampling the process at time t:

μ_X(t) = E[X(t)] = ∫_{−∞}^{∞} x f_{X(t)}(x) dx

where f_{X(t)}(x) is the first-order pdf of X(t) observed at time t

If we take a different instant of time 𝑡1, we can define a different RV X(𝑡1) and correspondingly a different mean μ_X(𝑡1), and so on
Autocorrelation function of a RP

The autocorrelation function of the stochastic process X(t) is the expectation of the product of the two RVs X(𝑡1) and X(𝑡2) obtained by sampling the process X(t) at times 𝑡1 and 𝑡2:

R_XX(t1, t2) = E[X(t1) X(t2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 f_{X(t1),X(t2)}(x1, x2) dx1 dx2

where f_{X(t1),X(t2)}(x1, x2) is the joint pdf of X(t) sampled at times 𝑡1 and 𝑡2, and R_XX(t1, t2) is the second-order moment derived from the second-order pdf

• The “auto” in autocorrelation refers to the correlation of the process with itself

If we let 𝑡1 = t and 𝑡2 − 𝑡1 = τ, the definition of the autocorrelation function can be reformulated as

R_XX(t, t + τ) = E[X(t) X(t + τ)]

Here τ denotes a time shift

Autocovariance function of a RP (second-order moment)

C_XX(t1, t2) = E[(X(t1) − μ1)(X(t2) − μ2)],   where E[X(t1)] = μ1 and E[X(t2)] = μ2

            = E[X(t1)X(t2) − μ2 X(t1) − μ1 X(t2) + μ1 μ2]

            = R_XX(t1, t2) − μ1 μ2
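These ensemble averages can be estimated directly from realizations; a sketch, reusing the random-phase sinusoid from the earlier ensemble example:

import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 1000)
theta = rng.uniform(0, 2 * np.pi, size=(10_000, 1))
ensemble = np.cos(2 * np.pi * 5 * t + theta)

# Ensemble mean at each time t: average down the realization axis
mu = ensemble.mean(axis=0)                    # ~0 for every t

# Autocorrelation R_XX(t1, t2) for two chosen sampling instants
i1, i2 = 100, 250
R = np.mean(ensemble[:, i1] * ensemble[:, i2])
print(mu[i1], R)  # R ~ 0.5*cos(2*pi*5*(t[i2] - t[i1])) for this process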
Stationary Random Processes

In many RPs, the statistics that we calculate do not change with time, i.e., the statistical behavior is time-invariant even though the process is random. These are called stationary processes

A process is nth-order stationary if the joint distribution of any set of n time samples is independent of the placement of the time origin:

F_{X(t1),X(t2),…,X(tn)}(x1, x2, …, xn) = F_{X(t1+τ),X(t2+τ),…,X(tn+τ)}(x1, x2, …, xn)

which holds for all time shifts τ and any choice of 𝑡1, 𝑡2, …, 𝑡𝑛
Strict-sense stationarity

A process that is nth-order stationary for every integer n > 0 is said to be strictly stationary or strict-sense stationary

First-order stationary process

F_{X(t)}(x) = F_{X(t+τ)}(x) for all τ

This holds for the pdf also: f_{X(t)}(x) = f_{X(t+τ)}(x)

Thus the mean μ_X(t) = μ_X for all t,

i.e., the mean is a constant independent of time

Second-order stationary process

F_{X(t1),X(t2)}(x1, x2) = F_{X(t1+τ),X(t2+τ)}(x1, x2) for all τ

If τ = −𝑡1 (applying a shift in origin),

F_{X(t1),X(t2)}(x1, x2) = F_{X(0),X(t2−t1)}(x1, x2)

i.e., the second-order joint CDF (and hence the joint pdf) depends only on the time difference 𝑡2 − 𝑡1

• This implies that second-order statistics like the autocorrelation and autocovariance depend solely on the time difference

• For strict-sense stationarity, the invariance in distribution should be satisfied for all n > 0, and this is very difficult to check, if not impossible

A relaxed notion of stationarity, called weak-sense stationarity or wide-sense stationarity, is hence introduced
Wide-sense stationarity (WSS)

A weakly stationary or wide-sense stationary (WSS) RP requires only statistics that can be calculated from the first-order and second-order density functions

A RP X(t) is said to be a weakly stationary or wide-sense stationary process if it satisfies two conditions:

1. The mean of the process X(t) is constant for all time t:

E[X(t)] = μ_X for all t

2. The autocorrelation function of the process X(t) depends solely on the difference between any two times at which the process is sampled:

R_XX(t1, t2) = R_XX(t2 − t1)

If 𝑡1 = t and 𝑡2 − 𝑡1 = τ, the autocorrelation function of a WSS RP becomes

R_XX(t, t + τ) = R_XX(τ)
Since for a WSS RP

R_XX(t1, t2) = R_XX(t2 − t1)

the autocovariance becomes

C_XX(t1, t2) = R_XX(t1, t2) − E[X(t1)] E[X(t2)]
             = R_XX(t2 − t1) − μ_X²

• Hence the autocovariance function of a WSS RP X(t) depends only on the time difference 𝑡2 − 𝑡1
• If the mean and autocorrelation function of a WSS RP are given, we can determine the autocovariance
These two conditions can be used to check the weak stationarity of a RP, but they are not enough to check strict stationarity

“Every strict-sense stationary process is WSS, but the reverse is not true”
Example: Consider the random-phase sinusoid X(t) = A cos(2πf_c t + Θ), where the phase Θ is uniformly distributed over [0, 2π) with pdf f_Θ(θ) = 1/(2π)

R_XX(t, t + τ) = E[X(t) X(t + τ)] = (A²/2) cos(2πf_c τ)

is only a function of the time shift τ

The mean of the RP is zero (how? average A cos(2πf_c t + θ) over the uniform phase θ) and is independent of time.

Hence the mean is a constant and the autocorrelation is solely a function of the time difference
The RP is WSS
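A sketch verifying the two WSS conditions numerically for this process (A = 1 and f_c = 5 Hz assumed):

import numpy as np

rng = np.random.default_rng(4)
fc = 5.0
t = np.linspace(0.0, 1.0, 1000)
theta = rng.uniform(0, 2 * np.pi, size=(10_000, 1))
X = np.cos(2 * np.pi * fc * t + theta)        # ensemble of sample functions

# Condition 1: ensemble mean is ~0 at every time instant
print(np.abs(X.mean(axis=0)).max())           # small for all t

# Condition 2: R_XX(t, t + tau) depends only on tau, not on t
i_tau = 40                                    # fixed lag of 40 samples
for i_t in (0, 200, 400):                     # different absolute times t
    R = np.mean(X[:, i_t] * X[:, i_t + i_tau])
    print(R)   # ~0.5*cos(2*pi*fc*tau), the same for every t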

You might also like