
ECT 305

ANALOG AND DIGITAL COMMUNICATION

Module-2

Review of Random Variables and Random Processes

Prepared By
Sithara Jeyaraj
Assistant Professor
Dept. of ECE, MACE



Syllabus
• Review of random variables – both discrete and continuous.
• CDF and PDF, statistical averages (only definitions, computations and significance).
• Entropy and differential entropy.
• Differential entropy of a Gaussian RV.
• Conditional entropy and mutual information.
• Stochastic processes and stationarity.
• Conditions for WSS and SSS.
• Autocorrelation and power spectral density.
• LTI systems with WSS processes as input.



Basics of Probability



Conditional Probability



Total Probability Theorem



Bayes’ Theorem



Statistical Independence



Random Variable
• A random variable is a mapping from the sample space S to the set of real numbers.
• It is an assignment of real numbers to the outcomes of a random experiment.
• Example:
• Tossing two coins: we can define a random variable X as the number of heads.
• Now the sample space is {HH, HT, TH, TT} and X can take the values 0, 1, 2 (see the sketch below).
• A random variable is denoted by capital letters X, Y, etc.
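As an illustration (a minimal Python sketch, not part of the original slides), the mapping from outcomes to values of X and the resulting probabilities can be tabulated as follows:

    from itertools import product
    from collections import Counter

    # Enumerate the sample space of two fair coin tosses: HH, HT, TH, TT
    outcomes = [''.join(t) for t in product('HT', repeat=2)]

    # X maps each outcome to its number of heads
    counts = Counter(outcome.count('H') for outcome in outcomes)

    # Each outcome is equally likely, so P(X = x) = (#outcomes with x heads) / 4
    pmf = {x: n / len(outcomes) for x, n in sorted(counts.items())}
    print(pmf)   # {0: 0.25, 1: 0.5, 2: 0.25}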





Cumulative Distribution Function (CDF)



Properties of CDF



Types of random variable based on CDF
• Discrete random variable
• Has a CDF with a staircase shape
• Continuous random variable
• A random variable whose CDF is a continuous function of x
• Mixed random variable
• A random variable which is neither discrete nor continuous



Probability Density Function (PDF)



Probability Density Function (PDF)
• The PDF of a continuous rv X is a continuous function of x.
• The PDF of a mixed rv involves impulses, but it need not be zero between consecutive
impulses.
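For reference, the standard relations between the PDF and the CDF are:

    f_X(x) = \frac{d}{dx} F_X(x), \qquad F_X(x) = \int_{-\infty}^{x} f_X(u)\, du, \qquad \int_{-\infty}^{\infty} f_X(x)\, dx = 1

For a discrete or mixed rv, the PDF contains impulses at the points where the CDF jumps:

    f_X(x) = \sum_k P(X = x_k)\, \delta(x - x_k) + f_c(x),

where f_c(x) denotes the continuous part (zero for a purely discrete rv).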



Question
• Consider a random experiment of tossing three fair coins. Let a random variable X be defined as
the number of heads in each outcome. Plot the CDF of X (a computational sketch follows below).
• In a game, a six-faced die is thrown. If 1 or 2 comes up, the player gets Rs. 30; if 3 or 4, the player gets
Rs. 10; if 5 comes up, he loses Rs. 30; and in the event of 6, he loses Rs. 100. Plot the CDF and PDF
of the gain or loss.
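A minimal Python sketch for the first part (an illustration of the computation only; the plot itself is still to be drawn by hand):

    from itertools import product

    # All 8 equally likely outcomes of tossing three fair coins
    outcomes = list(product('HT', repeat=3))

    # PMF of X = number of heads: P(X = x) for x = 0, 1, 2, 3
    pmf = {x: sum(o.count('H') == x for o in outcomes) / len(outcomes) for x in range(4)}

    # The CDF is a staircase: F_X(x) jumps by pmf[x] at each value x
    cdf, total = {}, 0.0
    for x in sorted(pmf):
        total += pmf[x]
        cdf[x] = total

    print(pmf)   # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
    print(cdf)   # {0: 0.125, 1: 0.5, 2: 0.875, 3: 1.0}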



Types of Random Variable
• Continuous Random Variable
• Uniform Random Variable
• Gaussian Random Variable
• Discrete Random Variable
• Bernoulli Random Variable
• Binomial Random Variable



Uniform Random Variable



Gaussian or Normal Random variable



Bernoulli Random Variable
• A discrete rv X is said to be a Bernoulli random variable provided it takes the values 1 and 0 with
probabilities p and (1 - p) respectively (summarized below).
• This random variable is useful in modelling a binary data generator,
• and in modelling the error pattern in the received binary data when the channel introduces random
errors.
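In compact form (standard results, with p = P(X = 1)):

    P(X = 1) = p, \qquad P(X = 0) = 1 - p
    E[X] = p, \qquad \mathrm{Var}(X) = p(1 - p)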



Binomial Random Variable



Mean of a Random Variable



Variance of a Random variable



Properties of Mean and Variance





Functions of Two Random Variables
• The marginal distribution functions and density functions can be obtained from the joint
distribution and density functions respectively, as given below.
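The standard relations, assuming a joint CDF F_{XY}(x, y) and a joint PDF f_{XY}(x, y), are:

    F_X(x) = F_{XY}(x, \infty), \qquad F_Y(y) = F_{XY}(\infty, y)
    f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx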



Joint Probability for Discrete random variables



Conditional CDFs and Conditional PDF



Conditional Mean and Variance



Independence, Uncorrelatedness and Orthogonality



Random Process



Note



Example
• Thermal noise in electronic circuits:
• This type of noise is due to the random movement of electrons as a result of thermal agitation,
• so the resulting current can only be described statistically.
• Reflection of radio waves from different layers of the ionosphere:
• Due to the randomness of the reflections, the received signal can be modelled as a random signal.
• Characterization of an information source:
• An information source, such as a speech source, generates time-varying signals whose contents are not
known in advance.
• Otherwise there would be no need to transmit them.
• Random processes provide a natural way to model information sources.



Statistical Average



Autocorrelation Function



Auto-Covariance



Cross Correlation



Independent, Uncorrelated and Orthogonal Processes



Stationarity



Question



Power Spectral Density of a Random Process



Random Process and Linear System

X(t) → h(t) → Y(t)
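A compact summary of the standard input-output relations for a WSS input (notation m_X, R_X, S_X and H(f) follows the usual textbook convention): when X(t) is WSS with mean m_X, autocorrelation R_X(τ) and PSD S_X(f), and the LTI system has impulse response h(t) and frequency response H(f),

    m_Y = m_X \int_{-\infty}^{\infty} h(t)\, dt = m_X H(0)
    R_Y(\tau) = h(\tau) * h(-\tau) * R_X(\tau)
    S_Y(f) = |H(f)|^{2} S_X(f)

so the output Y(t) is also WSS.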





Power Spectra in LTI Systems



Information Theory

• Goal: to maximize the amount of information that can be transmitted over an imperfect communication channel.

Information source
• It can be viewed as an object which produces an event, the outcome of which is selected at
random according to a probability distribution.
• A discrete information source is a source which has only a finite set of symbols as possible
outputs.
• The set of source symbols is called the source alphabet, and the elements of the set are called
symbols or letters.
• A source with memory is one for which the current symbol depends on the previous symbols.
• A memoryless source is one for which each symbol produced is independent of the previous
symbols.
• A discrete memoryless source can be characterized by the list of symbols, the probability
assignment to these symbols, and the rate at which the source generates these symbols.



Information Content
• The amount of information contained in an event is closely related to its uncertainty.
• Messages about events with a high probability of occurrence convey relatively little
information;
• i.e. if the event is certain, it conveys zero information.
• Thus a mathematical measure of information should be a function of the probability of the outcome
and should satisfy:
• Information should be proportional to the uncertainty of an outcome.
• Information contained in independent outcomes should add.

E.g.: tossing a coin and transmitting the outcome across a channel.
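A measure satisfying both requirements is the logarithmic one (the standard definition): if a symbol x_k occurs with probability p_k, its information content is

    I(x_k) = \log_2 \frac{1}{p_k} = -\log_2 p_k \quad \text{bits}

For the fair-coin example, p = 1/2 for each outcome, so each transmitted outcome carries I = log_2 2 = 1 bit.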



Information Content of a symbol



Entropy or Average Information
• In a practical communication system, we usually transmit long sequences of symbols from an
information source.
• Thus we are more interested in the average information that a source produces than in the
information content of a single symbol.
• In quantifying this, we take note of the fact that the flow of
information in a system can fluctuate widely because of the randomness involved in the selection of the
symbols.
• Thus we need to talk about the average information content of the symbols in a long message.
• For a quantitative representation of the average information per symbol, we make the following
assumptions:
• The source is stationary, so that the probabilities remain constant with time.
• The successive symbols are statistically independent and come from the source at an average rate of r
symbols per second.
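Under these assumptions, the standard definitions are: for a source emitting M symbols with probabilities p_1, ..., p_M,

    H(X) = \sum_{k=1}^{M} p_k \log_2 \frac{1}{p_k} = -\sum_{k=1}^{M} p_k \log_2 p_k \quad \text{bits/symbol}
    R = r\, H(X) \quad \text{bits/second (information rate)}

with 0 \le H(X) \le \log_2 M, the maximum being attained when all symbols are equally likely.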





Information Rate



Discrete Memoryless Channel



Conditional and Joint Entropies



Mutual Information



Differential Entropy



Differential entropy of a Gaussian RV



Mutual Information

