MODULE-2, PART-1
RANDOM VARIABLES & RANDOM PROCESSES
• Actual physical phenomena are understood using mathematical models that can be:
• Deterministic : there is no uncertainty regarding its time-dependent behavior at any
point in time. Ex: LTI systems
• Random/Stochastic : there is uncertainty regarding its behavior before the experiment
is performed. Ex: tossing a coin, throwing a die, drawing a card from a deck of cards,
noise voltage in a channel at a given time
Sample space S :
The set of all possible outcomes (possibilities) of the experiment, each outcome
denoted as 𝑠𝑖
Events :
Subsets of sample space to which a probability can be assigned
Random Variable (RV):
A RV is a mapping from the sample space 𝑆 to the set of real numbers
The RV is a function whose domain is the sample space and range is some set of real numbers
Ex: Die throwing experiment where the number of dots on top is to be counted
• Set of possible outcomes/sample space = S = {1, 2, 3, 4, 5, 6}
• To each outcome 𝑠𝑖 ∈ S we assign the number X(𝑠𝑖) = 10i. Thus X(𝑠1) = 10, …, X(𝑠6) = 60
• Here ‘X’ is the RV and the domain is the set S= {𝑠1 , 𝑠2 ,…., 𝑠6 } and its range is the set of
numbers {10, 20, 30, 40, 50, 60}
• Since the elements of the sample space(outcomes of the experiment) have
probabilities associated with them, it is said that the RV takes on various possible
values with certain probability
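The die example can be written out as a short sketch; fair-die probabilities (1/6 each) are assumed, as in the notes:

```python
# The die-throwing RV: domain S = {s_1, ..., s_6} (faces 1..6), range {10, ..., 60}
sample_space = [1, 2, 3, 4, 5, 6]

def X(s):
    """RV from the example: maps outcome s_i to the real number 10*i."""
    return 10 * s

# A fair die gives each outcome probability 1/6, so X takes each of its
# values {10, 20, ..., 60} with probability 1/6 as well
pmf = {X(s): 1 / 6 for s in sample_space}
print(sorted(pmf))  # [10, 20, 30, 40, 50, 60]
```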
Types of RVs
• Discrete RV : A RV ‘X’ is called a discrete RV if X can take only a finite (or countably infinite) number of values
Ex: die throwing
• Continuous RV : A RV ‘X’ is called a continuous RV if it can take any value in the real line
within a bounded or unbounded interval , i.e., it can take a continuous range of values
Ex: noise voltage at a particular time can take any value from negative infinity to positive
infinity
Cumulative Distribution Function (CDF) of a RV X :
FX(𝑥) = P(X ≤ 𝑥)
It is the probability that the RV X takes a value less than or equal to the real number 𝑥
Q 2.1) In a game a six-faced die is thrown. If 1 or 2 comes, the player gets Rs.30, if 3 or
4 comes, the player gets Rs.10, if 5 comes, he loses Rs.30 and in the event of 6, he
loses Rs.100. Plot the CDF and PMF of gain or loss.
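A short numerical sketch of Q 2.1 (gains taken from the problem statement): the gain G maps faces {1, 2} → +30, {3, 4} → +10, {5} → −30, {6} → −100, each face having probability 1/6.

```python
from fractions import Fraction

# Gain (in Rs.) for each die face, from the problem statement
gain = {1: 30, 2: 30, 3: 10, 4: 10, 5: -30, 6: -100}

p_face = Fraction(1, 6)  # fair die

# PMF of the gain G: add up the face probabilities mapping to each gain value
pmf = {}
for face, g in gain.items():
    pmf[g] = pmf.get(g, Fraction(0)) + p_face

# CDF F(g) = P(G <= g): cumulative sum over the sorted support
cdf = {}
running = Fraction(0)
for g in sorted(pmf):
    running += pmf[g]
    cdf[g] = running

# Support -100, -30, 10, 30 with probabilities 1/6, 1/6, 1/3, 1/3;
# the CDF is a staircase with steps 1/6, 1/3, 2/3, 1
print(dict(sorted(pmf.items())))
print(dict(sorted(cdf.items())))
```

Plotting these two dictionaries as stem/step plots gives the required PMF and CDF.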
Properties of distribution function :
1. 0 ≤ FX(𝑥) ≤ 1
2. FX(𝑥) is a non-decreasing function of 𝑥
3. FX(−∞) = 0
4. FX(+∞) = 1
For continuous RVs, we can define the Probability density function (pdf)
1. Non-negativity :
𝑓𝑋(𝑥) ≥ 0 ∀ 𝑥
2. Normalization :
The total area under the graph of the pdf is equal to unity, i.e., ∫−∞+∞ 𝑓𝑋(𝑥) d𝑥 = 1
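Both pdf properties can be checked numerically. As an assumed illustration (the notes do not name a specific density here), take the standard Gaussian pdf:

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """Gaussian pdf, used here as an assumed example density."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Non-negativity: f_X(x) >= 0 for every x
assert all(gaussian_pdf(x) >= 0 for x in range(-10, 11))

# Normalization: total area under the pdf is 1
# (trapezoidal rule over [-10, 10]; the tails beyond are negligible)
xs = [-10 + 0.01 * i for i in range(2001)]
area = sum((gaussian_pdf(a) + gaussian_pdf(b)) / 2 * 0.01 for a, b in zip(xs, xs[1:]))
print(round(area, 4))  # 1.0
```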
Multiple RVs
𝑃(𝑥1 ≤ 𝑋 ≤ 𝑥2, 𝑦1 ≤ 𝑌 ≤ 𝑦2) = ∫𝑥1𝑥2 ∫𝑦1𝑦2 𝑓𝑋,𝑌(𝑥, 𝑦) d𝑦 d𝑥
Normalization property :
The total volume under the joint pdf is unity, i.e., ∫−∞+∞ ∫−∞+∞ 𝑓𝑋,𝑌(𝑥, 𝑦) d𝑦 d𝑥 = 1
The first two moments are the most important for communication systems analysis
Variance is a measure of the “randomness” of a RV. Higher variance implies that the
RV can take a wider range of values ( more random)
Relation between variance and mean square value : 𝜎𝑋² = 𝐸[𝑋²] − (𝐸[𝑋])²
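The relation between variance and mean square value, Var(X) = E[X²] − (E[X])², can be verified on any finite PMF; the die RV from the earlier example serves:

```python
# PMF of the die RV X(s_i) = 10*i from the earlier example (fair die)
pmf = {v: 1 / 6 for v in (10, 20, 30, 40, 50, 60)}

mean = sum(v * p for v, p in pmf.items())                    # E[X]
mean_square = sum(v ** 2 * p for v, p in pmf.items())        # E[X^2]
variance = sum((v - mean) ** 2 * p for v, p in pmf.items())  # E[(X - E[X])^2]

# Relation between variance and mean square value:
# Var(X) = E[X^2] - (E[X])^2
assert abs(variance - (mean_square - mean ** 2)) < 1e-9
print(mean, variance)  # mean 35.0, variance ~291.67
```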
For independent RVs, covariance is zero and hence they are uncorrelated. The converse is not true in general: uncorrelated RVs need not be independent (jointly Gaussian RVs are the notable exception)
• For independent RVs, joint pdf is equal to product of individual pdfs
If 𝐸[𝑋𝑌] = 0, then X and Y are said to be orthogonal.
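The one-way nature of the independence/uncorrelatedness implication can be seen in a small simulation. The counterexample below is an assumption of mine (a standard one): take X symmetric about zero and Y = X². Y is fully determined by X, yet their covariance is zero.

```python
import random

random.seed(0)
# X symmetric about 0; Y = X^2 is a deterministic function of X (so dependent)
xs = [random.gauss(0, 1) for _ in range(200_000)]
ys = [x ** 2 for x in xs]

mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

# Cov(X, Y) = E[X^3] - E[X]E[X^2] = 0 for symmetric X: uncorrelated but dependent
print(round(cov, 1))  # 0.0
```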
Gaussian RV
A RV X is Gaussian if its pdf is 𝑓𝑋(𝑥) = (1/√(2𝜋𝜎²)) exp(−(𝑥 − 𝜇)²/(2𝜎²)), with mean 𝜇 and variance 𝜎²
Random Processes (RP)
• Randomness is not associated with the waveform itself but with the uncertainty as to
which sample function (waveform) of the ensemble will occur
Let X(𝑡1 ), X(𝑡2 ),….. X(𝑡𝑘 ) be the RVs obtained by sampling X(t) at times 𝑡1 , 𝑡2 ,…. 𝑡𝑘 resp.
For the k samples, they can be fully characterized by the joint pdf ( 𝑘 𝑡ℎ order ) or 𝑘 𝑡ℎ
order CDF
𝐹X(𝑡1),X(𝑡2),…,X(𝑡𝑘)(𝑥1, 𝑥2, …, 𝑥𝑘) = 𝑃(X(𝑡1) ≤ 𝑥1, X(𝑡2) ≤ 𝑥2, …, X(𝑡𝑘) ≤ 𝑥𝑘)
The joint pdf is obtained by differentiation:
𝑓X(𝑡1),X(𝑡2),…,X(𝑡𝑘)(𝑥1, 𝑥2, …, 𝑥𝑘) = 𝜕𝑘 𝐹X(𝑡1),X(𝑡2),…,X(𝑡𝑘)(𝑥1, 𝑥2, …, 𝑥𝑘) / (𝜕𝑥1 𝜕𝑥2 … 𝜕𝑥𝑘)
Suppose we shift all the sampling times by a fixed amount 𝜏 and obtain a new set
of RVs , we have the new joint CDF
Mean or expectation of a RP X(t) (first order moment)
We define the mean of the process X(t) as the expectation of the RV obtained by
sampling the process at time t:
𝜇𝑋(𝑡) = 𝐸[X(𝑡)] = ∫−∞+∞ 𝑥 𝑓X(𝑡)(𝑥) d𝑥
Autocorrelation function of the stochastic process X(t) is the expectation of the product
of two RVs X(𝑡1 ) and X(𝑡2 ) obtained by sampling the process X(t) at times 𝑡1 and 𝑡2
𝑅𝑋𝑋(𝑡1, 𝑡2) = 𝐸[X(𝑡1)X(𝑡2)] = ∫−∞+∞ ∫−∞+∞ 𝑥1 𝑥2 𝑓X(𝑡1),X(𝑡2)(𝑥1, 𝑥2) d𝑥1 d𝑥2
where 𝑓X(𝑡1 ),X(𝑡2 ) 𝑥1 , 𝑥2 is the joint pdf of X(t) sampled at times 𝑡1 and 𝑡2 and
𝑅𝑋𝑋 𝑡1 , 𝑡2 is the second order moment derived from second order pdf
• The “auto” in autocorrelation refers to the correlation of the process with itself
If we let 𝑡1 = t and 𝑡2 − 𝑡1 = 𝜏, the definition of the autocorrelation function can be
reformulated as
𝑅𝑋(𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)]
Autocovariance function, with 𝜇1 = 𝐸[𝑋(𝑡1)] and 𝜇2 = 𝐸[𝑋(𝑡2)]:
𝐶𝑋𝑋(𝑡1, 𝑡2) = 𝐸[(𝑋(𝑡1) − 𝜇1)(𝑋(𝑡2) − 𝜇2)]
= 𝐸[𝑋(𝑡1)𝑋(𝑡2) − 𝜇2𝑋(𝑡1) − 𝜇1𝑋(𝑡2) + 𝜇1𝜇2]
= 𝑅𝑋𝑋(𝑡1, 𝑡2) − 𝜇1𝜇2
Stationary Random processes
In many RPs, the statistics that we calculate do not change with time, i.e., the behavior is
time-invariant even though the process is random. These are called Stationary
processes
A process is 𝑛𝑡ℎ order stationary if the joint distribution of any set of n time
samples is independent of the placement of the time origin, i.e.,
𝐹X(𝑡1),…,X(𝑡𝑛)(𝑥1, …, 𝑥𝑛) = 𝐹X(𝑡1+𝜏),…,X(𝑡𝑛+𝜏)(𝑥1, …, 𝑥𝑛)
holds for all values of τ (time shifts) and any choice of 𝑡1, 𝑡2, …, 𝑡𝑛
Strict-sense stationarity
A process that is 𝑛𝑡ℎ order stationary for every integer n>0 is said to be strictly
stationary or strict sense stationary
Thus the mean of a strictly stationary process is a constant (the first-order CDF is
independent of time), and the second order joint CDF (also joint pdf) depends only on the time difference 𝑡2 − 𝑡1
• Implies that the second order statistics like autocorrelation and autocovariance depend
solely on the time difference
• For strict sense stationarity, the invariance in distribution should be satisfied for all n >
0 and this is very difficult to check if not impossible
A Wide-Sense Stationary (WSS) process satisfies two conditions:
1. The mean of the process X(t) is a constant: 𝐸[𝑋(𝑡)] = 𝜇𝑋 for all t
2. The autocorrelation function of the process X(t) depends solely on the difference
between any two times at which the process is sampled
𝑅𝑋𝑋(𝑡1, 𝑡2) = 𝑅𝑋𝑋(𝑡2 − 𝑡1)
𝐶𝑋𝑋(𝑡1, 𝑡2) = 𝑅𝑋𝑋(𝑡1, 𝑡2) − 𝐸[𝑋(𝑡1)]𝐸[𝑋(𝑡2)]
= 𝑅𝑋𝑋(𝑡2 − 𝑡1) − 𝜇𝑋²
• Hence autocovariance function of a WSS RP X(t) depends only on the time difference
𝑡2 − 𝑡1
• If mean and autocorrelation function of a WSS RP are given, we can determine the
autocovariance
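Determining the autocovariance from a given mean and autocorrelation is a one-line computation; the constant mean and the exponential autocorrelation below are assumed illustrative forms, not taken from the notes:

```python
import math

# Assumed WSS statistics (illustrative forms):
mu_X = 2.0                                   # constant mean

def R(tau):
    """Autocorrelation of the assumed WSS process, a function of tau only."""
    return mu_X ** 2 + 3.0 * math.exp(-abs(tau))

def C(tau):
    """Autocovariance of a WSS process: C(tau) = R(tau) - mu_X^2."""
    return R(tau) - mu_X ** 2

print(C(0.0))  # 3.0 -- C(0) is the variance of the process
```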
The two conditions can be used to check the weak stationarity of a RP, but are not enough to check
strict stationarity
“ Every strict sense stationary process is WSS but the reverse is not true “
Example
A process whose random parameter is uniformly distributed with pdf 1/2𝜋; evaluating
the mean 𝐸[𝑋(𝑡)] and the autocorrelation 𝑅𝑋𝑋(𝑡, 𝑡 + 𝜏):
Hence mean is a constant and autocorrelation is solely a function of the time difference
The RP is WSS
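The conclusion can be illustrated by simulation. The example above is consistent with a sinusoid whose phase is uniform over [0, 2𝜋] (pdf 1/2𝜋); treating that as an assumption, the empirical ensemble mean is ≈ 0 at every t and the empirical autocorrelation depends only on 𝜏, as a WSS process requires:

```python
import math
import random

random.seed(1)
# Assumed process: X(t) = cos(2*pi*f*t + Theta), Theta ~ Uniform[0, 2*pi]
f = 1.0
thetas = [random.uniform(0, 2 * math.pi) for _ in range(100_000)]

def X(t, theta):
    return math.cos(2 * math.pi * f * t + theta)

def mean_at(t):
    """Ensemble mean E[X(t)], estimated over the random phases."""
    return sum(X(t, th) for th in thetas) / len(thetas)

def autocorr(t, tau):
    """Ensemble autocorrelation E[X(t) X(t + tau)]."""
    return sum(X(t, th) * X(t + tau, th) for th in thetas) / len(thetas)

# Mean is ~0 regardless of t; R depends only on tau
# (theory for this process: R(tau) = cos(2*pi*f*tau) / 2)
print(round(mean_at(0.3), 1), round(mean_at(1.7), 1))                 # both approx 0
print(round(autocorr(0.0, 0.1), 2), round(autocorr(1.3, 0.1), 2))    # approx equal
```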