
PROBABILITY THEORY AND STOCHASTIC PROCESS (PTSP-8C304)

Unit VI: LINEAR SYSTEMS WITH RANDOM INPUTS

Faculty: V. RAJENDRA CHARY (VRC), Assistant Professor, ECE, SNIST
Class: II B.Tech, I Semester, ECE
SYLLABUS
UNITS
1. PROBABILITY THEORY
2. RANDOM VARIABLES
3. RANDOM VECTORS
….. Mid 1
4. RANDOM PROCESSES
5. STATIONARITY AND CORRELATION THEORY
6. LINEAR SYSTEMS WITH RANDOM INPUTS
….. Mid 2

RAJENDRA CHARY 2
UNIT IV RANDOM PROCESSES
• Definition: The concept. Probabilistic Structure. Classification. Formal Definition. Description: Joint Distribution. Analytical Description using Random Variables. Average Values: Mean, Auto-correlation, Auto-covariance and Auto-correlation Coefficient. Two or More Random Processes: Cross-correlation Function, Cross-covariance Function, Cross-correlation Coefficient.
• Applications: Calculation of the Coding Efficiency of Shannon-Fano Coding.
UNIT V STATIONARITY AND CORRELATION THEORY
• Strict-sense Stationarity. Wide-sense Stationarity (WSS). Auto-correlation Function of a Real WSS Random Process and its Properties. Cross-correlation Function and its Properties. Power Spectral Density Function of a WSS Random Process and its Properties. Wiener-Khinchine Theorem. Power and Bandwidth Calculations. Cross-power Spectral Density Function and its Properties. Time Averaging and Ergodicity: Time Averages – Interpretation. Mean and Variance. Ergodicity. General Definition. Mean-ergodic. Correlation-ergodic.

• Applications: Removal of noise using correlation; probability of error in Digital Communications.
UNIT VI LINEAR SYSTEMS WITH RANDOM INPUTS
• Value of System Random Signal Response of Linear Systems: System Response – Convolution. Mean and Mean-squared Response. Autocorrelation Function of Response. Cross-Correlation Functions of Input and Output. Spectral Characteristics of System Response. Power Density Spectrum of Response. Cross-Power Density Spectra of Input and Output. Band Pass, Band-Limited and Narrowband Processes. Properties. Thermal Noise. Shot Noise.
• Information Theory: Entropy, Joint Entropy, Conditional Entropy and Mutual Information.

• Applications – Modulation, SNR calculations.
REFERENCE TEXTBOOKS FOR UNIT-VI

Outline Of The Unit
• Introduction, Value of System Random Signal Response of Linear
Systems: System Response – Convolution. Mean (1)
• Mean-squared Response. Autocorrelation Function of Response,
Cross-Correlation Functions of Input and Output (1)
• Spectral Characteristics of System Response: Power Density
Spectrum of Response, Cross-Power Density Spectrums of Input
and Output, problems (2)

Outline Of The Unit
• Band Pass, Band-Limited and Narrowband Processes, Properties
(1)
• Thermal Noise, Shot noise.(1)
• Entropy, Joint Entropy, Conditional Entropy and Mutual
Information, problems (4)

Introduction
System:
• A meaningful interconnection of physical devices/components/subsystems is called a System.
• A system can also be defined as an entity that acts on an input signal and transforms it into an output signal. It establishes a cause-and-effect relation between two or more signals.
Introduction(contd.)
Linear System:
• A system which obeys the principles of Superposition and Homogeneity is called a Linear system.
• Superposition principle: A system which produces an output y1(t) for an input x1(t) and an output y2(t) for an input x2(t) must produce the output y1(t) + y2(t) for the input x1(t) + x2(t).
• Homogeneity principle: A system which produces an output y(t) for an input x(t) must produce the output ay(t) for the input ax(t), for any constant a.
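The two principles above can be checked numerically. A minimal sketch (the two systems below are hypothetical examples, not from the slides): a pure scaling system passes both tests, while a squaring system fails them.

```python
import numpy as np

def linear_sys(x):
    return 2.0 * x          # scaling by 2: obeys superposition and homogeneity

def square_sys(x):
    return x ** 2           # squaring: violates both principles

def is_linear(system, x1, x2, a=3.0):
    # superposition: response to x1+x2 equals sum of individual responses
    superposition = np.allclose(system(x1 + x2), system(x1) + system(x2))
    # homogeneity: scaling the input by a scales the output by a
    homogeneity = np.allclose(system(a * x1), a * system(x1))
    return superposition and homogeneity

x1 = np.array([1.0, -2.0, 0.5])
x2 = np.array([0.0, 1.0, 4.0])
```

The same two checks, applied over many test inputs, are the standard numerical way to spot a nonlinearity in an unknown system.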
Introduction(contd.)
Time-Invariant System:
• A system is said to be Time-Invariant if its input-output characteristics do not change with time.
• The condition for time-invariance is:
y(n, k) = y(n - k), where y(n, k) = T[x(n - k)] and y(n) = T[x(n)]
Introduction(contd.)
Linear Time-Invariant (LTI) System:
• A system is said to be an LTI system if it satisfies both the Linearity and the Time-Invariance properties.
Introduction(contd.)
• Causal System: A system is said to be Causal (or non-anticipative) if the output of the system at any time instant t (or n) depends only on the present and past values of the input, but not on future inputs.
• Stable System: A system is said to be Stable if every bounded input produces a bounded output.
Introduction(contd.)
• Transfer function of an LTI system: The Fourier transform of the impulse response of an LTI system is called the transfer function.
• It is given by:
H(ω) = F[h(t)]
h(t) = F⁻¹[H(ω)]
Value of System Random Signal Response of Linear Systems
• System response – convolution
• Mean value of the output response
• Mean square value of the output response
• Autocorrelation function of the output response
• Cross-correlation function of the input and output
System Response – Convolution
• Let a Random Process X(t) be applied to a continuous LTI system whose impulse response is h(t):

X(t) → [ h(t) ] → Y(t)

• The output response Y(t) is also a Random Process.
• The output response is obtained from the convolution integral:
Y(t) = h(t) * X(t) = ∫ h(τ) X(t - τ) dτ
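The convolution integral can be approximated on a discrete time grid. A sketch (the impulse response h(t) = e^(-2t) u(t) and the unit-mean input process are hypothetical choices, not from the slides); after the transient, the output mean should settle near the input mean times the area under h, here 1 × 1/2.

```python
import numpy as np

# Discrete approximation of Y(t) = h(t) * X(t)
dt = 0.01
t = np.arange(0.0, 10.0, dt)
h = np.exp(-2.0 * t)                    # hypothetical impulse response, t >= 0
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=0.5, size=t.size)  # one realization of X(t), mean 1

# np.convolve forms the discrete sum; the dt factor approximates the integral
y = np.convolve(x, h)[: t.size] * dt

# After the transient, the sample mean of Y should be near 1 * integral(h) = 0.5
steady_mean = y[t.size // 2 :].mean()
```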

Mean Value of the Response
Consider a WSS Random Process X(t) applied to an LTI system with impulse response h(t), and let Y(t) be the output.
The mean value of the response is
E[Y(t)] = E[∫ h(τ) X(t - τ) dτ] = ∫ h(τ) E[X(t - τ)] dτ
Mean Value of the Response (contd.)
But E[X(t - τ)] = X̄ = constant, as X(t) is a Wide Sense Stationary Process.
Hence,
E[Y(t)] = X̄ ∫ h(τ) dτ

• This expression indicates that the mean value of Y(t) equals the mean value of X(t) times the area under the impulse response, if X(t) is WSS.
Mean Value of the Response (contd.)
• Let H(ω) be the Fourier Transform of h(t):
H(ω) = ∫ h(t) e^(-jωt) dt
• At ω = 0,
H(0) = ∫ h(t) dt
Mean Value of the Response (contd.)
• H(0) is called the zero-frequency response of the system. Therefore,
Ȳ = E[Y(t)] = X̄ H(0), which is a constant.

• Thus the mean value of the response Y(t) to a Wide Sense Stationary Process X(t) is equal to the product of the mean value of the input process and the zero-frequency response of the system.
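A quick numerical check of this result (a sketch; the impulse response h(t) = 3e^(-2t) u(t) and input mean X̄ = 2 are hypothetical): H(0) is the area under h(t), analytically 3/2, so the output mean should be 2 × 3/2 = 3.

```python
import numpy as np

dt = 1e-4
t = np.arange(0.0, 20.0, dt)
h = 3.0 * np.exp(-2.0 * t)       # hypothetical impulse response

H0 = np.sum(h) * dt              # Riemann approximation of H(0) = integral of h
mean_X = 2.0                     # assumed input mean
mean_Y = mean_X * H0             # mean of the response: X_bar * H(0)
```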
Mean Square Value of the Response
• The mean square value of the response Y(t) is given by:
E[Y²(t)] = E[∫ h(τ1) X(t - τ1) dτ1 ∫ h(τ2) X(t - τ2) dτ2]
= ∫∫ h(τ1) h(τ2) E[X(t - τ1) X(t - τ2)] dτ1 dτ2
Mean Square Value of the Response (contd.)
where τ1 and τ2 are shifts in the time intervals.

• If the input X(t) is a WSS random process, then E[X(t - τ1) X(t - τ2)] = RXX(τ1 - τ2), and
E[Y²(t)] = ∫∫ h(τ1) h(τ2) RXX(τ1 - τ2) dτ1 dτ2
Mean Square Value of the Response (contd.)
• The above expression is independent of time t.

• The mean square value of the response gives the output power.
Variance of the Response
• The variance of the response is given by:
σY² = m2 - m1² = E[Y²] - (E[Y])²
where m1 and m2 are the first and second moments of Y(t).
Auto-Correlation Function of the Response
• The auto-correlation function of the response Y(t) is given by:
RYY(t, t + τ) = E[Y(t) Y(t + τ)]
Auto-Correlation Function of the Response (contd.)
We know that,
RYY(t1, t2) = E[Y(t1) Y(t2)] = ∫∫ h(τ1) h(τ2) E[X(t1 - τ1) X(t2 - τ2)] dτ1 dτ2

Assume that X(t) is a Wide Sense Stationary Process, so that this expectation depends only on the time difference.

Also let the time difference τ = t1 – t2.
Auto-Correlation Function of the Response (contd.)
• On substitution, we get
RYY(τ) = ∫∫ h(τ1) h(τ2) RXX(τ - τ1 + τ2) dτ1 dτ2

• From the above expression, we observe that the response Y(t) is a Wide Sense Stationary Process whenever X(t) is Wide Sense Stationary.
Auto-Correlation Function of the Response (contd.)
• If RXX(τ) is the autocorrelation of X(t), then
RYY(τ) = RXX(τ) * h(τ) * h(-τ)

• That is, RYY(τ) is the twofold convolution of the input autocorrelation with the system's impulse response.
Cross-correlation Function of the Input and Output
• If the input X(t) is a Wide Sense Stationary Process, then the cross-correlation function of the input X(t) and output Y(t) is given by:
RXY(t, t + τ) = E[X(t) Y(t + τ)]
Cross-correlation Function of the Input and Output (contd.)
RXY(τ) = RXX(τ) * h(τ)

It is the convolution integral of RXX(τ) and h(τ).

Similarly,
RYX(τ) = RXX(τ) * h(-τ)
and
RYY(τ) = RXY(τ) * h(-τ) = RYX(τ) * h(τ)
Cross-correlation Function of the Input and Output (contd.)
• These expressions show the relationship between the autocorrelation functions and the cross-correlation functions.
Spectral Characteristics of System Response

• Power Density Spectrum of Response

• Cross-Power Density Spectrum of Input and Output
Power Density Spectrum of Response
• Consider a random process X(t) applied to an LTI system having a transfer function H(ω). The output response is Y(t).

X(t), SXX(ω) → [ LTI System, H(ω) ] → Y(t), SYY(ω)

• If the power density spectrum of the input process is SXX(ω), then the power density spectrum of the output response, SYY(ω), is given by:
SYY(ω) = |H(ω)|² SXX(ω)
Power Density Spectrum of Response (contd.)
Proof:
We know that,
RYY(τ) = RXX(τ) * h(τ) * h(-τ)

Applying the Fourier transform on both sides, we get

FT{RYY(τ)} = FT{RXX(τ) * h(τ) * h(-τ)}
Power Density Spectrum of Response (contd.)
From the properties of the Fourier transform, convolution in the time domain corresponds to multiplication in the frequency domain:

SYY(ω) = SXX(ω) · H(ω) · H*(ω)

SYY(ω) = |H(ω)|² SXX(ω)

where SXX(ω) and SYY(ω) are the power density spectra of the random processes X(t) and Y(t) respectively.
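The relation SYY(ω) = |H(ω)|² SXX(ω) can be exercised directly on a frequency grid. A sketch, using the hypothetical H(ω) = 1/(2 + jω) and SXX(ω) = 16/(4 + ω²) (the PSD corresponding to RXX(τ) = 4e^(-2|τ|)); the product should equal 16/(4 + ω²)².

```python
import numpy as np

w = np.linspace(-50.0, 50.0, 2001)     # frequency grid, rad/s
H = 1.0 / (2.0 + 1j * w)               # hypothetical transfer function
Sxx = 16.0 / (4.0 + w**2)              # hypothetical input PSD

Syy = np.abs(H) ** 2 * Sxx             # output PSD: |H|^2 * Sxx = 16/(4+w^2)^2
```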
Average Power of the Response
• We know that the average power of the input is given by:
PXX = (1/2π) ∫ SXX(ω) dω
• Similarly, the average power of the response is given by:
PYY = (1/2π) ∫ SYY(ω) dω
But SYY(ω) = |H(ω)|² SXX(ω). Therefore,
PYY = (1/2π) ∫ |H(ω)|² SXX(ω) dω
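The average-power integral is easy to evaluate numerically. A sketch with the hypothetical output spectrum SYY(ω) = 16/(4 + ω²)²; analytically ∫ SYY dω = π, so PYY should come out near 1/2.

```python
import numpy as np

# P_YY = (1/2*pi) * integral of S_YY(w), approximated by a Riemann sum
w = np.linspace(-500.0, 500.0, 200001)
dw = w[1] - w[0]
Syy = 16.0 / (4.0 + w**2) ** 2

Pyy = np.sum(Syy) * dw / (2.0 * np.pi)
```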
Cross-Power Density Spectrum of Input and Output
• The cross-power density spectra of the input and output are given by:
SXY(ω) = SXX(ω) H(ω)
and SYX(ω) = SXX(ω) H*(ω)
Proof:
We know that,
RXY(τ) = RXX(τ) * h(τ)

Applying the Fourier transform on both sides,
Cross-Power Density Spectrum of Input and Output (contd.)
we get
SXY(ω) = SXX(ω) H(ω)
Similarly, SYX(ω) = SXX(ω) H*(ω)

Analogous to the average power, the cross average power is obtained by integrating the cross-power density spectrum:
PXY = (1/2π) ∫ SXY(ω) dω
Types of Random Processes
Based on frequency (spectral) properties, random processes are classified into various types, viz.:
• Lowpass Random Process
• Bandpass Random Process
• Bandlimited Random Process
• Narrowband Random Process
Lowpass Random Process

• A random process N(t) is said to be a Lowpass Random Process if its power spectral density SNN(ω) has significant components within a band of frequencies around ω = 0.

e.g. baseband signals such as speech, image and video.
Bandpass Random Process
• A random process N(t) is said to be a Bandpass Random Process if its power spectral density SNN(ω) has significant components within a bandwidth W that does not include ω = 0.
• In practice, the spectrum may also have a small amount of power at ω = 0. The spectral components outside the band W are very small and can be neglected.
e.g. (i) modulated signals with carrier frequency ωo and bandwidth W;
(ii) noise transmitted over a communication channel.
Bandlimited Random Process
• A random process N(t) is said to be a Bandlimited Random Process if its power spectral density SNN(ω) is zero outside a frequency band of width W that does not include ω = 0, as shown in the figure below.
Narrowband Random Process
• A bandlimited random process N(t) is said to be a Narrowband Random Process if its bandwidth W is very small compared with the band-centre frequency, i.e. W << ωo, where W is the bandwidth and ωo is the frequency at which the power spectrum is maximum.
Narrowband Random Process (contd.)
• A narrowband process can be modelled as a cosine function with angular frequency ωo and slowly varying amplitude and phase.
• It can be expressed as:
N(t) = A(t) cos[ωot + θ(t)]
where A(t) is a random process representing the slowly varying amplitude and θ(t) is a random process representing the slowly varying phase.
Representation of Narrowband Process
• For any arbitrary WSS random process N(t), the narrowband process can be represented by:
N(t) = X(t) cos ωot - Y(t) sin ωot
where X(t) and Y(t) are the in-phase and quadrature-phase components of N(t). They can be expressed as:
X(t) = A(t) cos[θ(t)]
Y(t) = A(t) sin[θ(t)]
Representation of Narrowband Process (contd.)
• The relationship between the processes A(t) and θ(t) is given by:
A(t) = √(X²(t) + Y²(t))
θ(t) = tan⁻¹[Y(t)/X(t)]
Properties of Bandlimited Random Process

Let N(t) be any bandlimited WSS random process with zero mean and PSD SNN(ω). If the random process is represented by
N(t) = X(t) cos ωot - Y(t) sin ωot,
then some of its properties are:
1) If N(t) is WSS, then X(t) and Y(t) are jointly WSS.
2) If N(t) has zero mean (i.e. E[N(t)] = 0), then E[X(t)] = E[Y(t)] = 0.
3) The mean square values of the processes are equal, i.e.
E[N²(t)] = E[X²(t)] = E[Y²(t)]
Properties of Bandlimited Random Process (contd.)
4) Both processes X(t) and Y(t) have the same autocorrelation function, i.e.
RXX(τ) = RYY(τ)
5) The cross-correlation functions of X(t) and Y(t) satisfy
RYX(τ) = -RXY(τ), RXY(τ) = -RXY(-τ)
If the processes are orthogonal, then
RXY(τ) = RYX(τ) = 0
Properties of Bandlimited Random Process (contd.)
6) Both X(t) and Y(t) have the same power spectral density:
SXX(ω) = SYY(ω) = SNN(ω - ωo) + SNN(ω + ωo), for -ωo ≤ ω ≤ ωo
= 0, otherwise
7) The cross-power density spectra satisfy
SXY(ω) = SYX(-ω)
Properties of Bandlimited Random Process (contd.)
8) If N(t) is a Gaussian random process, then X(t) and Y(t) are jointly Gaussian.
9) The relationship between the autocorrelation and the power spectrum SNN(ω) is the Wiener-Khinchine pair:
RNN(τ) = (1/2π) ∫ SNN(ω) e^(jωτ) dω
Properties of Bandlimited Random Process (contd.)
10) If N(t) is a zero-mean Gaussian process and its PSD SNN(ω) is symmetric about ±ωo, then X(t) and Y(t) are statistically independent.
11) SXX(ω) = Lp[SNN(ω - ωo) + SNN(ω + ωo)]
12) SXY(ω) = j Lp[SNN(ω - ωo) - SNN(ω + ωo)]
where Lp[·] denotes the lowpass part of the bracketed spectrum.
Problem
• A stationary random process X(t) with zero mean and autocorrelation RXX(τ) = e^(-2|τ|) is applied to a system with transfer function H(ω) = 1/(2 + jω). Find the mean and PSD of its output. (R1 eg. 8.33)
Problem
• The power density spectrum of a random process is given. Find whether it is a valid spectral density or not. If the process is transmitted through a system as shown in the figure, find the output power density spectrum. (R1 eg. 8.12)
Problem
• A random process X(t) whose mean is 2 and autocorrelation function RXX(τ) = 4e^(-2|τ|) is applied to a system whose transfer function is 1/(2 + jω). Find the mean value, power density spectrum, autocorrelation and average power of the output Y(t). (R1 eg. 8.4)
Problem
• A random noise X(t) having power spectrum SXX(ω) = 3/(49 + ω²) is applied to a network for which h(t) = t² e^(-7t) u(t). The network response is denoted by Y(t).
(a) What is the average power in X(t)?
(b) Find the power spectrum of Y(t).
(c) Find the average power in Y(t). (R1 eg. 8.18)
Problem
• A random process Y(t) is obtained from the system shown in the figure. Find the output spectral density in terms of the input spectral density, where A0 and ω0 are constants, X(t) is the input random process and SXX(ω) is its power spectral density. (R1 eg. 8.14)
Problems
• A WSS random process X(t) with mean value X̄ = 2 is applied to an LTI system with impulse response h(τ) = 3e^(-2τ) u(τ). Find the value of Ȳ.
• A WSS random process X(t) with input mean value 3 is applied to an LTI system with impulse response h(t) = 5t e^(-2t) u(t). Find the mean of the output.
• A WSS random process X(t) with input mean value 2 is applied to a system with impulse response h(t) = 3t² e^(-8t) u(t). Find the mean of the output.
Basic Block Diagram of a Communication Channel
Information Theory
• Information theory is a branch of Probability Theory which can be applied to the study of Communication Systems.
• Information theory deals with the following concepts:
1) The measure of information: the evaluation of the rate at which the source generates information.
2) The information capacity of the channel: the maximum rate at which reliable information transmission is possible over a given channel with an arbitrarily small error.
Information Theory (contd.)
3) Coding: a scheme for efficient utilization of the information capacity of the channel.
Rate of Information
• The rate of information R is defined as the number of bits of information per second.
• It is given by:
R = rH bits/sec
where H = average number of bits of information per message (i.e. the entropy)
r = number of messages generated by the source per second
Entropy
• Entropy is defined as the average information gained per symbol or message.
• It is measured in bits/symbol or bits/message.
• It is given by:
H = Σ Pi log2(1/Pi)
Entropy (contd.)
Proof:
Let there be n different messages m1, m2, …, mn with respective probabilities P1, P2, …, Pn.
Assume that in a long interval, L messages have been generated, with L very large (L >> n). Then:
number of occurrences of message m1 = P1L
amount of information in message m1 = log2(1/P1)
Entropy (contd.)
Thus,
the total amount of information in all m1 messages = P1L log2(1/P1)
Similarly,
the total amount of information in all m2 messages = P2L log2(1/P2)
⋮
the total amount of information in all mn messages = PnL log2(1/Pn)
Entropy (contd.)
The total information in all L messages is:
Itotal = P1L log2(1/P1) + P2L log2(1/P2) + … + PnL log2(1/Pn)
The average information per message, or entropy, is:
H = Itotal/L
= {P1L log2(1/P1) + P2L log2(1/P2) + … + PnL log2(1/Pn)}/L
= P1 log2(1/P1) + P2 log2(1/P2) + … + Pn log2(1/Pn)
Entropy (contd.)

H = Σ Pi log2(1/Pi) bits/symbol
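The formula transcribes directly into code. A sketch; the checks rely on the standard fact that an equally likely source of M symbols attains the maximum entropy log2 M.

```python
import math

# H = sum of P_i * log2(1/P_i), skipping zero-probability symbols
# (their contribution vanishes in the limit P*log(1/P) -> 0)
def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)
```

For example, a source of four equally likely symbols has H = log2 4 = 2 bits/symbol, and a certain event (P = 1) has H = 0.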

Entropy (contd.)
• The Source Entropy is given by:
H[X] = - Σ P(xi) log2 P(xi)
• The Receiver Entropy is given by:
H[Y] = - Σ P(yj) log2 P(yj)
Entropy (contd.)
• In continuous information theory, the (differential) entropy is given by:
Source Entropy: H[X] = - ∫ f(x) log2 f(x) dx
Receiver Entropy: H[Y] = - ∫ f(y) log2 f(y) dy
Problems
• A discrete memoryless source has an alphabet of 8 letters xi, i = 1, 2, …, 8, with probabilities 0.15, 0.17, 0.25, 0.15, 0.10, 0.08, 0.05, 0.05. Determine the entropy of the source.
• Find the entropy of the uniform density function.
Problems
• An analog signal band-limited to B Hz is sampled at the Nyquist rate, and the samples are quantized into 4 levels. The quantization levels Q1, Q2, Q3, Q4 occur with probabilities P1 = P4 = 1/8 and P2 = P3 = 3/8. Find the entropy and the rate of information.
• Find the entropy of the Gaussian density function with zero mean.
Joint Entropy
• Let X be the source and Y be the receiver of a communication system. Then the Joint Entropy is given by:

H[XY] = - Σ Σ P(xi yj) log2 P(xi yj)
Joint Entropy (contd.)
Proof:
Let X be the source and Y be the receiver. Each message of X may occur jointly with any message of Y, i.e. any x of X may be transmitted as any y of Y.
• This can be represented using the joint probability matrix [P(xi yj)], an n × m array whose (i, j)th entry is P(xi yj).
Joint Entropy (contd.)
P(x1) = P(x1 y1) + P(x1 y2) + … + P(x1 ym)
P(y1) = P(x1 y1) + P(x2 y1) + … + P(xn y1)
In general, these two expressions can be written as:
P(xi) = Σj P(xi yj)
P(yj) = Σi P(xi yj)
Joint Entropy (contd.)
• The entropy of the source X is given by:
H[X] = -[P(x1) log2 P(x1) + P(x2) log2 P(x2) + … + P(xn) log2 P(xn)]
= - Σ P(xi) log2 P(xi)
• Similarly, the entropy of the receiver Y is given by:
H[Y] = - Σ P(yj) log2 P(yj)
Joint Entropy (contd.)

H[XY] = - Σ Σ P(xi yj) log2 P(xi yj)

• Here, H[X] and H[Y] are called marginal entropies and H[XY] is called the joint entropy.
Joint Entropy (contd.)
• In continuous information theory, the Joint Entropy is given by:

H[XY] = - ∫∫ fX,Y(x, y) log2 fX,Y(x, y) dx dy
Conditional Entropy
• The conditional entropies are given by:
H[X/Y] = H[XY] - H[Y]
H[Y/X] = H[XY] - H[X]
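These identities can be verified numerically from any joint probability matrix. A sketch with a small hypothetical P(xi yj) (the numbers below are made up and sum to 1):

```python
import math

P = [[0.25, 0.00],
     [0.10, 0.30],
     [0.05, 0.30]]                       # hypothetical joint probabilities P(xi yj)

def H(probs):
    # entropy of a probability list, in bits; zero entries contribute nothing
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

Px = [sum(row) for row in P]             # marginals P(xi) = sum over j
Py = [sum(col) for col in zip(*P)]       # marginals P(yj) = sum over i
Hxy = H([p for row in P for p in row])   # joint entropy H[XY]

H_x_given_y = Hxy - H(Py)                # H[X/Y]
H_y_given_x = Hxy - H(Px)                # H[Y/X]
```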
Properties of Entropy
1) When the probability of occurrence of an event is 1, the entropy becomes zero. Hence entropy is a measure of uncertainty.
Proof:
H = -P log2(P)
When P = 1,
H = -1 · log2(1) = 0
Properties of Entropy (contd.)
2) Entropy is maximum when all the source symbols are equally likely to occur.
Proof:
Consider two symbols m1 and m2 with probabilities P and 1 - P.
We know that,
H = P log2(1/P) + (1 - P) log2(1/(1 - P))
Properties of Entropy (contd.)
H = -[P log P + (1 - P) log(1 - P)]
For a maximum, dH/dP = 0:
dH/dP = -[log P + (1/ln 2) - log(1 - P) - (1/ln 2)] = -log[P/(1 - P)] = 0
⟹ P/(1 - P) = 1
⟹ P = 1 - P
⟹ 2P = 1
⟹ P = 1/2
Properties of Entropy (contd.)
• The variation of H with P shows a broad maximum centred at P = 1 - P = 1/2, where H = 1 bit/symbol, decreasing monotonically to zero on both sides as P tends to 1 or to 0.
• Similarly, for M messages (M-ary), the entropy is maximum when all the messages are equally likely. In this case the maximum entropy is log2 M bits/message.
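The broad maximum at P = 1/2 is easy to confirm numerically. A sketch of the binary entropy function evaluated over a grid of probabilities:

```python
import math

# Binary entropy H(P) = -P*log2(P) - (1-P)*log2(1-P), in bits/symbol
def Hb(p):
    if p in (0.0, 1.0):
        return 0.0          # limiting value: a certain event carries no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

grid = [i / 1000 for i in range(1001)]   # P from 0 to 1 in steps of 0.001
p_max = max(grid, key=Hb)                # location of the maximum
```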
Properties of Entropy (contd.)
3) H[XY] = H[X] + H[Y/X] = H[Y] + H[X/Y]
Proof:
Consider the joint entropy
H[XY] = - Σ Σ P(xi yj) log2 P(xi yj)
We know that P(xi yj) = P(xi) P(yj/xi). Substituting,
H[XY] = - Σ Σ P(xi yj) [log2 P(xi) + log2 P(yj/xi)]
= - Σ P(xi) log2 P(xi) - Σ Σ P(xi yj) log2 P(yj/xi)
= H[X] + H[Y/X]
By symmetry, H[XY] = H[Y] + H[X/Y].
Problems
• Calculate the entropy of the English alphabet, assuming that each of its 26 characters occurs with equal probability.
• For a signal x(t) having a uniform density function in the range [0, 4], find H[X]. If the same signal is amplified by a factor of 8, determine the new H[X].
Problem
• A transmitter has an alphabet of five letters; the joint probability matrix is given below. Determine the marginal, conditional and joint entropies of the channel.

                        Y (Receiver)
                      F     G     H     I
               A    0.25    0     0     0
X              B    0.10  0.30    0     0
(Transmitter)  C     0    0.05  0.10    0
               D     0     0    0.05  0.10
               E     0     0    0.05    0
Mutual Information or Transinformation
• Mutual Information is the average information gained by the channel due to the uncertainty present in the channel.
• It is given by:
I[XY] = H[X] - H[X/Y]
(or) I[XY] = H[Y] - H[Y/X]
(or) I[XY] = H[X] + H[Y] - H[XY]
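A numerical sanity check that the three expressions agree (a sketch; the joint matrix below is a hypothetical binary symmetric channel, not taken from the slides):

```python
import math

P = [[0.4, 0.1],
     [0.1, 0.4]]                         # hypothetical joint probabilities P(xi yj)

def H(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

Px = [sum(row) for row in P]             # P(xi)
Py = [sum(col) for col in zip(*P)]       # P(yj)
Hx, Hy = H(Px), H(Py)
Hxy = H([p for row in P for p in row])

I1 = Hx - (Hxy - Hy)                     # H[X] - H[X/Y]
I2 = Hy - (Hxy - Hx)                     # H[Y] - H[Y/X]
I3 = Hx + Hy - Hxy                       # H[X] + H[Y] - H[XY]
```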
Mutual Information (contd.)
Proof:
• Uncertainty present due to the transmitted symbol = -log P(xi)
• Uncertainty remaining about the transmitted symbol once the received symbol is known = -log P(xi/yj)
• The information gained equals the difference between the initial and final uncertainties,
i.e. I(xi, yj) = -log P(xi) - [-log P(xi/yj)] = log[P(xi/yj)/P(xi)]
Mutual Information (contd.)
Averaging over all symbol pairs,
I[XY] = Σ Σ P(xi yj) log2[P(xi/yj)/P(xi)]
= - Σ Σ P(xi yj) log2 P(xi) + Σ Σ P(xi yj) log2 P(xi/yj)
= H[X] - H[X/Y]
Since H[X/Y] = H[XY] - H[Y], it also follows that
I[XY] = H[X] + H[Y] - H[XY] = H[Y] - H[Y/X]
Mutual Information
• If the source and receiver are independent, the joint entropy H[XY] becomes H[X] + H[Y], so the mutual information shared between the source and receiver is zero.
• Equivalently, when source and receiver are independent, H[X/Y] = H[X] and H[Y/X] = H[Y]; therefore I[XY] = 0.
Problem
• A transmitter has an alphabet of 4 letters [x1, x2, x3, x4] and the receiver has an alphabet of 3 letters [y1, y2, y3]. The joint probability matrix is given. Calculate all the entropies and the mutual information. (R3 eg. 10.4.3, 10.5.1)


Problem
• A discrete source transmits messages x1, x2, x3 with probabilities 0.3, 0.4 and 0.3. The source is connected to a channel as shown in the figure. Calculate all the entropies and the mutual information. (R3 eg. 10.4.2, 10.5.1)
Problems
• An event has six possible outcomes with probabilities p1 = 1/2, p2 = 1/4, p3 = 1/8, p4 = 1/16, p5 = 1/32, p6 = 1/32. Find the entropy of the system. Also find the rate of information if there are 16 outcomes per second. (R3 eg. 10.3.1)
• A signal band-limited to 5 kHz is sampled at the Nyquist rate. The samples are quantized into 8 levels of a PCM system with probabilities 0.25, 0.2, 0.2, 0.1, 0.1, 0.05, 0.05 and 0.05. Calculate the entropy and the rate of information. (R3 eg. 10.3.2)


Problem
• The noise characteristics of a channel are given by the conditional probability matrix P(yj/xi):

         y1    y2    y3
   x1   0.6   0.2   0.2
   x2   0.2   0.6   0.2
   x3   0.2   0.2   0.6

P(x1) = 1/8, P(x2) = 1/8, P(x3) = 6/8. Find the mutual information.


Noise
• Noise may be defined as an undesirable or unwanted electrical signal which accompanies a desired signal.
• Noise is an unintentional fluctuation that tends to disturb the transmission and reproduction of transmitted signals.


Types of Noise
• External noise: noise whose sources are external to the receiver.
e.g. atmospheric noise, industrial noise, and extraterrestrial noise such as solar and cosmic noise.
• Internal noise: noise created within a device or system, generated by any of the active or passive devices found in the system.
e.g. shot noise, transit-time noise, flicker noise, thermal noise.
Thermal (Resistor or Johnson) Noise
• In any conducting material, free electrons move randomly; the noise produced by this motion is called thermal noise.
• Each free electron inside a conducting medium is in motion due to temperature. As the temperature increases, the random motion of the electrons increases, and hence the noise increases.
• At 0 K (absolute zero) there is no electron motion, and hence the noise is zero.
• The thermal noise amplitude depends mainly on resistance: more noise is generated along a highly resistive path.
Thermal (Resistor or Johnson) Noise (contd.)
• Thermal noise power is proportional to the temperature in kelvin and to the bandwidth of the system:

Pn ∝ TB
where T = temperature in kelvin
B = bandwidth in Hz


Thermal (Resistor or Johnson) Noise (contd.)
• The noise power is given by
Pn = kTB
where k = Boltzmann's constant = 1.38 × 10⁻²³ J/K
T = temperature in kelvin
B = bandwidth in Hz
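A back-of-the-envelope sketch of Pn = kTB for a hypothetical receiver (the temperature and bandwidth below are assumed values, not from the slides):

```python
k = 1.38e-23     # Boltzmann's constant, J/K
T = 290.0        # assumed room temperature, kelvin
B = 1.0e6        # assumed bandwidth, Hz

Pn = k * T * B   # thermal noise power in watts
```

At room temperature this works out to a few femtowatts per megahertz of bandwidth, which is why thermal noise matters mainly for very weak received signals.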


Thermal (Resistor or Johnson) Noise (contd.)
• The power density spectrum of the noise voltage contributing to the thermal noise is given by:
Sv(ω) = 2kTR · α²/(α² + ω²)
where R = resistance and α = average number of collisions per second per electron.


Thermal (Resistor or Johnson) Noise (contd.)
• The plot of the power density spectrum of thermal noise is as shown in the figure.


Thermal (Resistor or Johnson) Noise (contd.)
• Since the number of free electrons in a conductor is very large, the value of α is of the order of 10¹⁴, so the spectrum may be considered flat up to frequencies of about 10¹³ Hz. Hence, for all practical communication systems, the power spectral density is considered independent of frequency:
Sv(ω) ≈ 2kTR
• The thermal noise amplitude distribution may be approximated by a Gaussian distribution with zero mean.


Thermal (Resistor or Johnson) Noise (contd.)
• The power density spectrum of the noise current is
SI(ω) = Sv(ω)/R² = 2kTR/R² = 2kTG
where G = 1/R is the conductance.


Shot Noise
• In 1918, while conducting experiments on vacuum tubes, Schottky observed fluctuations in the anode current at the output terminal of a vacuum tube. These fluctuations were named shot noise.
• Shot noise fluctuations are spread evenly over all frequencies.
• It appears as a randomly varying noise current superimposed on the output current.


Shot Noise (contd.)
• Shot noise follows a Poisson distribution, since there is randomness in the number of particles arriving at the output.
• Examples of shot noise sources are the emission of electrons from the cathode of a vacuum tube, and the flow and recombination of electrons and holes in semiconductors.
• In BJTs, shot noise is mainly due to the random drift of current carriers across the junctions, and also due to the random generation and recombination of electron-hole pairs.


Shot Noise (contd.)
• Due to shot noise, the external current at the output is composed of a large number of random, independent current pulses, as shown in the figure. Shot noise is therefore characterized in terms of its mean square variation about the mean value.


Shot Noise (contd.)
• The plot of the shot noise power spectrum is shown in the figure.


Shot Noise (contd.)
• In shot noise, the number of fluctuations during the observed time is very large, so the process may be considered statistically independent and WSS.
• Therefore the shot noise distribution may be approximated by a Gaussian distribution with zero mean value.
• Shot noise is more pronounced at very high frequencies than at lower frequencies.
• The time taken by an electron to travel from cathode to anode is called the transit time, τ.
Shot Noise (contd.)
• The power density spectrum is independent of frequency up to about 1/τ Hz. Beyond this frequency, the power density varies with frequency.
• The most convenient way of dealing with shot noise is to find a value or formula for an equivalent input noise resistance Req.


Shot Noise (contd.)
• The power density spectrum of shot noise is:
SN(ω) = qe ID
(or) SN(ω) = 4kT Req
• The shot noise voltage is:
vn(t) = in(t)/gm
where gm = transconductance of the device, qe = charge of an electron, ID = DC current, and Req = equivalent input noise resistance.


White Noise
• A WSS noise process N(t) is called white noise if its power spectral density is constant at all frequencies, i.e. the PSD of white noise is independent of frequency.
• White noise can be expressed as:
SNN(ω) = No/2
where No is a constant with units of watts per hertz.
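A sketch of white noise passing through a filter, combining SNN(ω) = No/2 with the output-PSD relation SYY(ω) = |H(ω)|² SNN(ω) from the earlier slides. The filter h(t) = e^(-t) u(t), with |H(ω)|² = 1/(1 + ω²), is a hypothetical choice; the output average power integrates analytically to No/4.

```python
import numpy as np

No = 2.0                                  # assumed white-noise level, S_NN = No/2
w = np.linspace(-1000.0, 1000.0, 400001)  # frequency grid, rad/s
dw = w[1] - w[0]

Syy = (No / 2.0) / (1.0 + w**2)           # |H(w)|^2 * No/2 for h(t) = e^{-t} u(t)
Pyy = np.sum(Syy) * dw / (2.0 * np.pi)    # output average power, analytically No/4
```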


White Noise (contd.)
• By the central limit theorem, the probability density function of white noise can be assumed Gaussian with zero mean; hence white noise is also called White Gaussian noise.
• The bandwidth of white noise is infinite, and so is its average power; therefore ideal white noise is not physically realizable.
• If white noise is restricted to a finite frequency range (i.e. band-limited), it is called coloured noise, which is physically realizable.


White Noise (contd.)

• (Figure: picture taken from the Mallikarjun textbook, pg. no. 512.)


Problem
• White noise with PSD No/2 is transmitted through a linear system as shown in the figure. Find the output spectral density and the average power. (R1 eg. 8.19)


Problem
• Find the mean square value of the output response for a system having h(τ) = e^(-τ) u(τ) and an input of white noise with PSD No/2. (R1 eg. 8.5)


Applications
• Modulation: the process in which one of the characteristics of a carrier signal (amplitude, frequency or phase) is varied in accordance with the instantaneous value of the message signal.
• SNR calculations: SNR (signal-to-noise ratio) is defined as the ratio of signal power to noise power, often expressed in decibels.


Lecture Plan
Lecture no.   Date & period no.     Lecture no.   Date & period no.
1             17/01/2023, P-1       7             25/01/2023, P-2
2             17/01/2023, P-2       8             27/01/2023, P-1
3             18/01/2023, P-2       9             27/01/2023, P-2
4             19/01/2023, P-3       10            27/01/2023, P-4
5             20/01/2023, P-1       11            28/01/2023, P-6
6             24/01/2023, P-1


Previous Year Problems (A20 Reg, A18 Supply, March 2022)
• a) Describe the different noise sources.
b) Find the power density spectrum of the output of a series RC circuit when it is subjected to white noise.
• Find the cross-correlation function of the response and input of a linear system.


Previous Year Problems (A18 Reg, A18 Supply, July 2021)
• a) List the properties of a band-limited random process.
b) Find the mean square value of the output response for a system having h(t) = e^(-t) u(t) and an input of white noise with PSD No/2.
• Write notes on the following:
i) Thermal noise  ii) Narrowband noise.


Previous Year Problems (A18 Reg, A18 Supply, Dec 2019)
• Define thermal noise.
• Relate the input and output power density spectra of a linear system.
• a) Derive the autocorrelation function of the response of an LTI system.
b) A product device has response Y(t) = X(t) A0 cos ω0t, where A0 and ω0 are constants. Find the PSD of Y(t) in terms of the PSD of the input X(t).
• Define bandpass, band-limited and narrowband processes.


Previous Year Problems (A17 Reg, Nov 2018)
• Differentiate white noise and coloured noise.
• a) Describe the different types of noise with examples.
b) Find the power density spectrum and average power of the response of a series RL electrical network when subjected to white noise as input.
• a) Write short notes on: the autocorrelation function and cross-correlation function of the response of a linear system.
b) Band-limited process and its properties.


Previous Year Problems (A14 Reg, Dec 2015)
• Derive an expression for the autocorrelation function of the response in a linear system.
• Write short notes on the following:
a) Autocorrelation function and cross-correlation function of the response of a linear system.
b) Narrowband process and its properties.


Complex Numbers
• If Z = x + iy, then |Z| = √(x² + y²)


Properties of the Fourier Transform
• Convolution property:
F{x(t) * h(t)} = X(ω) H(ω)
• Frequency shifting / multiplication by an exponential:
F{x(t) e^(jω0t)} = X(ω - ω0)

For more properties, refer to A. Anand Kumar or any other standard Signals & Systems textbook.


Definite and Indefinite Integrals, Fourier Transforms of Standard Functions
• Refer to the Appendix of Unit 5.
• Refer to A. Anand Kumar or any other standard Signals and Systems textbook for the Fourier transform, the inverse Fourier transform and the Fourier transforms of standard functions.


Conclusion of the Unit
In this unit we studied:
• Value of System Random Signal Response of Linear Systems: system response – convolution, mean and mean-squared response, autocorrelation function of the response, cross-correlation functions of input and output.
• Spectral Characteristics of System Response: power density spectrum of the response, cross-power density spectra of input and output.


Conclusion of the Unit (contd.)
• Classification of random processes: bandpass, band-limited and narrowband processes, and their properties.
• Information Theory: entropy, joint entropy, conditional entropy and mutual information.
• Thermal noise, shot noise, white noise.
• Applications and problems.


Thank You for Your Attention!
Any Questions?
