PROCESS(PTSP-8C304)
UNIT IV RANDOM PROCESSES
• Definition: The concept. Probabilistic Structure. Classification.
Formal Definition. Description: Joint Distribution. Analytical
Description using Random Variables. Average Values: Mean. Auto-
correlation, Auto-covariance and Auto-correlation Coefficient. Two
or More Random Processes: Cross-correlation Function. Cross-
covariance Function. Cross-correlation Coefficient.
• Applications: Calculation of the coding efficiency of Shannon-Fano Coding.
UNIT V STATIONARITY AND CORRELATION
THEORY
• Strict-sense Stationarity. Wide-sense Stationarity (WSS). Auto-correlation Function of a Real WSS Random Process and its Properties. Cross-correlation Function and its Properties. Power Spectral Density Function of a WSS Random Process and its Properties. Wiener-Khinchine Theorem. Power and Bandwidth Calculations. Cross-power Spectral Density Function and its Properties. Time Averaging and Ergodicity: Time Averages – Interpretation. Mean and Variance. Ergodicity. General Definition. Mean-ergodic. Correlation-ergodic.
Outline Of The Unit
• Introduction, Random Signal Response of Linear Systems: System Response – Convolution, Mean Value of System Response (1)
• Mean-squared Response. Autocorrelation Function of Response,
Cross-Correlation Functions of Input and Output (1)
• Spectral Characteristics of System Response: Power Density
Spectrum of Response, Cross-Power Density Spectrums of Input
and Output, problems (2)
Outline Of The Unit
• Band Pass, Band-Limited and Narrowband Processes, Properties
(1)
• Thermal Noise, Shot Noise (1)
• Entropy, Joint Entropy, Conditional Entropy and Mutual
Information, problems (4)
Introduction
System:
• A meaningful interconnection of physical devices/components/sub-systems is called a System.
• A system can also be defined as an entity that acts on an input signal and transforms it into an output signal. It expresses a cause-and-effect relation between two or more signals.
Introduction(contd.)
Linear System:
• A system which obeys the principles of Superposition and Homogeneity is called a Linear system.
• Superposition principle: A system which produces an output y1(t) for an input x1(t) and an output y2(t) for an input x2(t) must produce the output y1(t) + y2(t) for the input x1(t) + x2(t).
• Homogeneity principle: A system which produces an output y(t) for an input x(t) must produce the output ay(t) for the input ax(t).
Introduction(contd.)
Time Invariant System:
• A system is said to be a Time Invariant System if its input-output characteristics do not change with time.
• The condition for time-invariance is:
y(n, k) = y(n − k), where y(n, k) = T[x(n − k)]
Introduction(contd.)
Linear Time-Invariant(LTI) System:
• A system is said to be an LTI system if it satisfies both the Linearity and Time Invariance properties.
Introduction(contd.)
• Causal System: A system is said to be Causal or non-anticipative if the output of the system at any time instant t (or n) depends only on the present and past values of the input but not on future inputs.
• Stable System: A system is said to be Stable if every bounded input produces a bounded output.
Introduction(contd.)
• Transfer function of an LTI system: The Fourier transform of the impulse response of an LTI system is called the transfer function.
• It is given by:
H(ω) = F[h(t)]
h(t) = F⁻¹[H(ω)]
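For illustration (an assumed example, not part of the original slides), the short Python sketch below evaluates the transfer function numerically for an assumed impulse response h(t) = e^(−2t)u(t), whose analytic transform is H(ω) = 1/(2 + jω):

# Sketch (assumed example): transfer function of h(t) = exp(-2t)u(t)
import numpy as np

t = np.linspace(0.0, 20.0, 20001)          # time grid long enough for h(t) to decay
dt = t[1] - t[0]
h = np.exp(-2.0 * t)                       # assumed impulse response for t >= 0

def H(w):
    # numerical Fourier transform: H(w) = integral of h(t) exp(-j w t) dt
    return np.sum(h * np.exp(-1j * w * t)) * dt

print(H(0.0))                              # ~0.5, the zero-frequency response H(0)
print(H(3.0), 1.0 / (2.0 + 3.0j))          # numerical vs analytic value at w = 3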
Random Signal Response Of Linear Systems
• System response-convolution
• Mean value of output response
• Mean Square value of output response
• Autocorrelation function of output response
Cross-correlation function of input and output
System Response-Convolution
• Let a Random Process X(t) be applied to a continuous LTI system
whose impulse response is h(t) as shown below:
X(t) → h(t) → Y(t)
• The output response, Y(t), is also a Random Process.
• The output response is obtained from the convolution integral:
Y(t) = h(t) ∗ X(t) = ∫ h(τ) X(t − τ) dτ
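A minimal simulation sketch of this convolution (the impulse response and parameters below are assumptions chosen for illustration, not taken from the slides):

# Sketch: one sample function of X(t) passed through an assumed h(t) = exp(-2t)u(t)
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01
t = np.arange(0.0, 10.0, dt)
X = rng.normal(0.0, 1.0, size=t.size)      # sample function of the input process
h = np.exp(-2.0 * t)                       # assumed impulse response, t >= 0

Y = np.convolve(X, h)[: t.size] * dt       # Y(t) ≈ ∫ h(τ) X(t − τ) dτ
print(Y[:5])                               # first few samples of the output process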
Mean Value of the Response
Consider a WSS Random Process X(t) applied to an LTI system with impulse response h(t), and let Y(t) be the output.
Mean value of the response = E[Y(t)] = E[∫ h(τ) X(t − τ) dτ] = ∫ h(τ) E[X(t − τ)] dτ
Mean Value of the Response(contd.)
But E[X(t − τ)] = μX = constant, as X(t) is a Wide Sense Stationary Process.
Hence,
E[Y(t)] = μX ∫ h(τ) dτ
Mean Value of the Response(contd.)
• Let H(ω) be the Fourier Transform of h(t); it is given by
H(ω) = ∫ h(t) e^(−jωt) dt
• At ω = 0,
H(0) = ∫ h(t) dt
Mean Value of the Response(contd.)
• H(0) is called the zero-frequency response of the system.
Hence, E[Y(t)] = μY = μX H(0), which is a constant.
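A hedged numerical check of E[Y(t)] = μX·H(0) under assumed values (μX = 2 and h(t) = e^(−2t)u(t), so that μY should be about 1.0):

# Sketch (assumed numbers): verify E[Y] ≈ mu_X * H(0) by simulation
import numpy as np

rng = np.random.default_rng(1)
dt = 0.01
t = np.arange(0.0, 500.0, dt)
mu_X = 2.0
X = mu_X + rng.normal(0.0, 1.0, size=t.size)   # WSS-like input with mean 2

tau = np.arange(0.0, 10.0, dt)
h = np.exp(-2.0 * tau)                         # assumed impulse response
H0 = np.sum(h) * dt                            # H(0) = ∫ h(t) dt ≈ 0.5

Y = np.convolve(X, h)[: t.size] * dt
print(Y[tau.size:].mean(), mu_X * H0)          # both ≈ 1.0 (ignoring the start-up transient)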
Mean Square Value of the Response
• The mean-square value of the response is given by:
E[Y²(t)] = E[∫∫ h(τ1) h(τ2) X(t − τ1) X(t − τ2) dτ1 dτ2] = ∫∫ h(τ1) h(τ2) RXX(τ1 − τ2) dτ1 dτ2
Variance Of The Response
• The variance of the response is given by:
σY² = m2 − m1² = E[Y²(t)] − {E[Y(t)]}²
Auto-Correlation Function of the Response
• The Auto-Correlation function of the response Y(t) is given by:
RYY(t, t+τ) = E[Y(t) Y(t+τ)]
Auto-Correlation Function of the
Response(contd.)
We know that
Y(t) = ∫ h(τ1) X(t − τ1) dτ1 and Y(t+τ) = ∫ h(τ2) X(t + τ − τ2) dτ2
Auto-Correlation Function of the
Response(contd.)
• On substitution, we get
RYY(t, t+τ) = ∫∫ h(τ1) h(τ2) E[X(t − τ1) X(t + τ − τ2)] dτ1 dτ2
Autocorrelation Function Of the
Response(contd.)
• If RXX(τ) is the autocorrelation of X(t), then
RYY(τ) = ∫∫ h(τ1) h(τ2) RXX(τ + τ1 − τ2) dτ1 dτ2 = RXX(τ) ∗ h(τ) ∗ h(−τ)
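A rough discrete-time check of this result (assumed filter; for unit-variance white input the discrete analogue of RXX(τ) is a unit impulse, so RYY should reduce to h correlated with itself):

# Sketch (discrete analogue, assumed filter): for white input, R_YY[k] ≈ Σ_m h[m] h[m+k]
import numpy as np

rng = np.random.default_rng(2)
N = 200_000
X = rng.normal(0.0, 1.0, size=N)             # unit-variance white input
h = np.exp(-0.1 * np.arange(100))            # assumed discrete impulse response

Y = np.convolve(X, h, mode="same")

theory = np.correlate(h, h, mode="full")     # Σ_m h[m] h[m+k]
k0 = h.size - 1                              # index of lag 0 in 'theory'

for k in (0, 5, 20):
    est = np.mean(Y[: N - k] * Y[k:]) if k else np.mean(Y * Y)
    print(k, est, theory[k0 + k])            # simulated vs theoretical R_YY[k]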
Cross-correlation Function Of The Input And
Output
• If the input X(t) is a Wide Sense Stationary Process, then the
Cross-Correlation function of the input X(t) and output Y(t)
is given by:
RXY(t, t+τ) = E[X(t) Y(t+τ)]
Cross-correlation Function Of The Input And
Output(contd.)
RXY(τ) = E[X(t) ∫ h(α) X(t + τ − α) dα] = ∫ h(α) RXX(τ − α) dα = RXX(τ) ∗ h(τ)
and, similarly,
RYX(τ) = RXX(τ) ∗ h(−τ)
Cross-correlation Function Of The Input And
Output(contd.)
• These expressions show the relationship between the
autocorrelation functions and the cross correlation functions.
Spectral Characteristics of System Response
Power Density Spectrum Of Response
• Consider a random process X(t) applied to an LTI system having a transfer function H(ω). The output response is Y(t).
X(t), SXX(ω) → [ LTI System, H(ω) ] → Y(t), SYY(ω)
• If the power density spectrum of the input process is SXX(ω), then the power density spectrum of the output response, SYY(ω), is given by:
SYY(ω) = |H(ω)|² SXX(ω)
Power Density Spectrum Of Response
Proof:
We know that the autocorrelation of the response is RYY(τ) = RXX(τ) ∗ h(τ) ∗ h(−τ), and that SYY(ω) is the Fourier transform of RYY(τ) (Wiener-Khinchine theorem).
Power Density Spectrum Of Response
From the properties of the Fourier transform, convolution in the time domain corresponds to multiplication in the frequency domain. Hence
SYY(ω) = SXX(ω) H(ω) H(−ω) = SXX(ω) H(ω) H*(ω) = |H(ω)|² SXX(ω)
since H(−ω) = H*(ω) for a real impulse response h(t).
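A numerical sketch of SYY(ω) = |H(ω)|² SXX(ω), using an assumed digital filter in place of H(ω) and Welch spectral estimates from scipy (the printed numbers carry some estimation noise):

# Sketch (assumed filter): estimate S_XX and S_YY and check S_YY/S_XX ≈ |H|^2
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
fs = 100.0
X = rng.normal(0.0, 1.0, size=2**18)               # white input

b, a = signal.butter(2, 0.2)                        # assumed filter standing in for H(ω)
Y = signal.lfilter(b, a, X)

f, Sxx = signal.welch(X, fs=fs, nperseg=4096)       # Welch estimates of the spectra
_, Syy = signal.welch(Y, fs=fs, nperseg=4096)
_, H = signal.freqz(b, a, worN=f, fs=fs)            # H(ω) on the same frequency grid

for i in (10, 50, 200):
    print(f[i], Syy[i] / Sxx[i], abs(H[i]) ** 2)    # ratio of spectra ≈ |H(ω)|²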
Average Power of the Response
• We know that the average power of the input is given by:
PXX = E[X²(t)] = (1/2π) ∫ SXX(ω) dω
• Similarly, the average power of the response is:
PYY = E[Y²(t)] = (1/2π) ∫ SYY(ω) dω = (1/2π) ∫ |H(ω)|² SXX(ω) dω
Cross Power Density Spectrum Of Input &
Output
• The cross power density spectrum of input and output is given by:
SXY(ω)=SXX(ω)H(ω)
and SYX(ω)=SXX(ω)H*(ω)
Proof:
We know that RXY(τ) = RXX(τ) ∗ h(τ) and RYX(τ) = RXX(τ) ∗ h(−τ).
Taking the Fourier transform on both sides and using the convolution property,
Cross Power Density Spectrum Of Input &
Output(contd.)
We get,
SXY(ω)=SXX(ω)H(ω)
Similarly, SYX(ω)=SXX(ω)H*(ω)
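Similarly, a sketch of SXY(ω) = H(ω) SXX(ω) using a cross-spectral-density estimate; the same assumed filter is used and only magnitudes are compared, since the sign of the estimated phase depends on the estimator's conjugation convention:

# Sketch (assumed filter): the cross spectral density estimate should track H(ω)·S_XX(ω)
import numpy as np
from scipy import signal

rng = np.random.default_rng(4)
fs = 100.0
X = rng.normal(0.0, 1.0, size=2**18)
b, a = signal.butter(2, 0.2)                        # assumed filter
Y = signal.lfilter(b, a, X)

f, Sxy = signal.csd(X, Y, fs=fs, nperseg=4096)      # cross power spectral density estimate
_, Sxx = signal.welch(X, fs=fs, nperseg=4096)
_, H = signal.freqz(b, a, worN=f, fs=fs)

for i in (10, 50, 200):
    print(f[i], abs(Sxy[i] / Sxx[i]), abs(H[i]))    # |S_XY/S_XX| ≈ |H(ω)|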
Types of Random Processes
Based on their frequency (spectral) properties, Random Processes are classified into the following types:
• Lowpass Random Process
• Bandpass Random Process
• Bandlimited Random Process
• Narrowband Random Process
Lowpass Random Process
• A random process X(t) is said to be a Lowpass Random Process if its power spectral density SXX(ω) has significant components clustered around ω = 0 and is negligible elsewhere.
Bandpass Random Process
• A random process X(t) is said to be a Bandpass Random Process if its power spectral density SXX(ω) has significant components concentrated in a band of width W centred about some frequency ω0 ≠ 0.
Bandlimited Random Process
• A random process N(t) is said to be a Bandlimited Random Process if its power spectral density SNN(ω) is zero outside a frequency band of width W that does not include ω = 0.
Narrowband Random Process
• A bandlimited random process N(t) is said to be a Narrowband Random Process if its bandwidth W is very small compared to the band-centre frequency, i.e. W << ω0, where W is the bandwidth and ω0 is the frequency at which the power spectrum is maximum.
Narrowband Random Process(contd.)
• The narrowband process can be modelled as a cosine function with
angular frequency ωo and slowly varying amplitude and phase.
• It can be expressed as:
N(t) = A(t) cos[ω0 t + θ(t)]
where A(t) is a random process representing the slowly varying amplitude and θ(t) is a random process representing the slowly varying phase.
Representation Of Narrowband Process
• Any arbitrary WSS narrowband random process N(t) can be represented as:
N(t) = X(t) cos(ω0 t) − Y(t) sin(ω0 t)
where X(t) and Y(t) are the in-phase and quadrature-phase components of N(t). They can be expressed as:
X(t) = A(t) cos[θ(t)]
Y(t) = A(t) sin[θ(t)]
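A small sketch (assumed sampling rate, centre frequency and lowpass bandwidth) that builds such a narrowband process from slowly varying components and confirms that the quadrature form equals the envelope-and-phase form:

# Sketch (assumed parameters): N(t) = X(t)cos(w0 t) - Y(t)sin(w0 t) from lowpass components
import numpy as np
from scipy import signal

rng = np.random.default_rng(5)
fs = 1000.0
t = np.arange(0.0, 2.0, 1.0 / fs)
w0 = 2 * np.pi * 100.0                              # band-centre frequency (100 Hz)

b, a = signal.butter(4, 5.0, fs=fs)                 # 5 Hz lowpass → slowly varying components
Xc = signal.lfilter(b, a, rng.normal(size=t.size))  # in-phase component X(t)
Yc = signal.lfilter(b, a, rng.normal(size=t.size))  # quadrature component Y(t)

N = Xc * np.cos(w0 * t) - Yc * np.sin(w0 * t)       # narrowband process N(t)

A = np.sqrt(Xc**2 + Yc**2)                          # envelope A(t)
theta = np.arctan2(Yc, Xc)                          # phase θ(t)
print(np.allclose(N, A * np.cos(w0 * t + theta)))   # True: both representations agree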
Representation Of Narrowband
Process(contd.)
• The relationship between the processes A(t), θ(t) and the components X(t), Y(t) is given by:
A(t) = √[X²(t) + Y²(t)]
θ(t) = tan⁻¹[Y(t)/X(t)]
Properties Of Bandlimited Random Process
Let N(t) be any bandlimited WSS random process with zero mean value and a PSD SNN(ω). If the random process is represented by
N(t) = X(t) cos(ω0 t) − Y(t) sin(ω0 t)
then some of its properties are:
1) If N(t) is WSS, then X(t) and Y(t) are jointly WSS.
2) If N(t) has zero mean (i.e. E[N(t)] = 0), then E[X(t)] = E[Y(t)] = 0.
3) The mean square values of the processes are equal, i.e.
E[N²(t)] = E[X²(t)] = E[Y²(t)]
Properties Of Bandlimited Random
Process(contd.)
4) Both processes X(t) and Y(t) have the same autocorrelation function, i.e.
RXX(τ) = RYY(τ)
5) The cross-correlation functions of X(t) and Y(t) satisfy
RYX(τ) = −RXY(τ), RXY(τ) = −RXY(−τ)
If the processes are orthogonal, then
RXY(τ) = RYX(τ) = 0
Properties Of Bandlimited Random
Process(contd.)
6) Both X(t) and Y(t) have the same power spectral density:
SXX(ω) = SYY(ω) = Lp[SNN(ω − ω0) + SNN(ω + ω0)] for −ω0 ≤ ω ≤ ω0, and 0 otherwise,
where Lp[·] denotes the lowpass part of the bracketed spectrum.
7) The cross-power density spectrums satisfy
SXY(ω) = SYX(−ω)
Properties Of Bandlimited Random
Process(contd.)
8)If N(t) is a Gaussian random process, then X(t) and Y(t) are jointly
Gaussian.
9) The relationship between the autocorrelation RNN(τ) and the power spectrum SNN(ω) is given by the Wiener-Khinchine relation:
RNN(τ) = (1/2π) ∫ SNN(ω) e^(jωτ) dω
Properties Of Bandlimited Random
Process(contd.)
10)If N(t) is a zero mean Gaussian and its PSD SNN(ω) is symmetric
about ±ωo, then X(t) and Y(t) are statistically independent.
11) SXX(ω)=Lp[SNN(ω- ωo)+SNN(ω+ ωo)]
12) SXY(ω)=jLp[SNN(ω- ωo) - SNN(ω+ ωo)]
Problem
• A stationary random process X(t) with zero mean and autocorrelation RXX(τ) = e^(−2|τ|) is applied to a system with transfer function H(ω) = 1/(2 + jω). Find the mean and PSD of its output. (R1 eg. 8.33)
Problem
• The power density spectrum of a random process is given in the reference (R1 eg. 8.12). Find whether it is a valid spectral density or not. If the process is transmitted through the system shown in the figure, find the output power density spectrum.
Problem
• A random process X(t) whose mean is 2 and autocorrelation function is RXX(τ) = 4e^(−2|τ|) is applied to a system whose transfer function is H(ω) = 1/(2 + jω). Find the mean value, power density spectrum, autocorrelation and average power of the output Y(t). (R1 eg. 8.4)
Problem
• A random noise X(t) having power spectrum SXX(ω) = 3/(49 + ω²) is applied to a network for which h(t) = t²e^(−7t)u(t). The network response is denoted by Y(t).
(a) What is the average power in X(t)?
(b) Find the power spectrum of Y(t).
(c) Find the average power in Y(t). (R1 eg. 8.18)
Problem
• A random process Y(t) is obtained from the system as shown in
figure. Find the output spectral density in terms of input spectral
density, where A0 and ω0 are constants, X(t) is the input random
process and its power spectral density is SXX(ω) (R1 eg8.14)
Problems
• A WSS random process X(t) with mean value μX = 2 is applied to an LTI system with impulse response h(τ) = 3e^(−2τ)u(τ). Find the mean value μY of the output.
• A WSS random process X(t) with input mean 3 is applied to an LTI system with impulse response h(t) = 5te^(−2t)u(t). Find the mean of the output.
• A WSS random process X(t) having an input mean of 2 is applied to a system with impulse response h(t) = 3t²e^(−8t)u(t). Find the mean of the output.
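A numerical sketch for these three problems, using the result μY = μX·H(0) = μX ∫ h(t) dt from earlier in the unit (scipy is used only for the integral; the printed values can be verified by hand):

# Sketch: mu_Y = mu_X * H(0) = mu_X * ∫ h(t) dt for the impulse responses above
import numpy as np
from scipy.integrate import quad

cases = [
    (2.0, lambda t: 3.0 * np.exp(-2.0 * t)),          # h(τ) = 3e^(−2τ)u(τ),   μX = 2
    (3.0, lambda t: 5.0 * t * np.exp(-2.0 * t)),      # h(t) = 5te^(−2t)u(t),  μX = 3
    (2.0, lambda t: 3.0 * t**2 * np.exp(-8.0 * t)),   # h(t) = 3t²e^(−8t)u(t), μX = 2
]

for mu_X, h in cases:
    H0, _ = quad(h, 0.0, np.inf)                      # H(0) = ∫_0^∞ h(t) dt
    print(mu_X * H0)                                  # μY ≈ 3.0, 3.75 and 0.0234 respectively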
Basic Block Diagram Of A Communication
Channel
Information Theory
• Information theory is a branch of Probability Theory, which can
be applied to the study of Communication Systems.
• Information theory deals with the following concepts:
1) The measure of information: the evaluation of the rate at which the source generates information.
2) The information capacity of the channel: it determines the maximum rate at which reliable information transmission is possible over a given channel with an arbitrarily small error.
Information Theory(contd.)
3)Coding: It is a scheme for efficient utilization of the information
capacity of the channel.
Rate of Information
• The rate of information, R, is defined as the number of bits of information per second.
• It is given by:
R = rH bits/sec
where H = average number of bits of information per message (i.e. the entropy)
and r = number of messages generated by the source per second.
Entropy
• Entropy is defined as the average information per symbol or per message.
• It is measured in bits/symbol or bits/message.
• It is given by:
H = Σi Pi log2(1/Pi) = −Σi Pi log2 Pi
Entropy(contd.)
Proof:
Let there be n different messages m1,m2,….mn with their respective
probabilities P1,P2,…..Pn.
Let us assume that in a long interval, L messages have been
generated.
Let L be very large such that L >> n. Then
the number of m1 messages = P1L
and the amount of information in one m1 message = log2(1/P1).
Entropy(contd.)
Thus,
the total amount of information in all m1 messages = P1L log2(1/P1)
Similarly,
the total amount of information in all m2 messages = P2L log2(1/P2)
⋮
the total amount of information in all mn messages = PnL log2(1/Pn)
Entropy(contd.)
The total information in all L messages is:
Itotal=P1L log2 (1/P1)+P2L log2 (1/P2)+………+PnL log2 (1/Pn)
The average information per message or entropy is given by:
H=Itotal/L
={P1Llog2(1/P1)+P2Llog2(1/P2)+…+PnLlog2(1/Pn)}/L
= P1log2 (1/P1)+P2log2 (1/P2)+………+Pnlog2 (1/Pn)
Entropy(contd.)
Hence, H = Σi Pi log2(1/Pi) = −Σi Pi log2 Pi bits/message.
Entropy(contd.)
• The Source Entropy is given by:
H[X] = −Σi P(xi) log2 P(xi)
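A small Python sketch of this formula for an assumed source distribution (the probabilities below are illustrative only):

# Sketch: H[X] = -Σ P(xi) log2 P(xi) for an assumed source distribution
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                 # convention: 0·log(0) = 0
    return -np.sum(p * np.log2(p))               # bits per symbol

print(entropy([0.5, 0.25, 0.125, 0.125]))        # 1.75 bits/symbol
print(entropy([0.25, 0.25, 0.25, 0.25]))         # 2.0 bits/symbol = log2(4), the maximum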
Entropy(contd.)
• In continuous information theory, the (differential) entropy is given by:
H[X] = −∫ fX(x) log2 fX(x) dx
Problems
• A discrete memoryless source has an alphabet of 8 letters xi, i = 1, 2, …, 8, with probabilities 0.15, 0.17, 0.25, 0.15, 0.10, 0.08, 0.05, 0.05. Determine the entropy of the source.
• Find the entropy of Uniform density function.
Problems
• An analog signal band-limited to B Hz is sampled at the Nyquist rate and the samples are quantized into 4 levels. The quantization levels Q1, Q2, Q3, Q4 occur with probabilities P1 = P4 = 1/8 and P2 = P3 = 3/8. Find the entropy and the rate of information.
• Find the entropy of Gaussian density function with zero mean.
Joint Entropy
• Let X be the source and Y be the receiver of a communication system. Then the Joint Entropy is given by:
H[XY] = −Σi Σj P(xi yj) log2 P(xi yj)
Joint Entropy(contd.)
Proof:
Let X be the source and Y be the receiver. Each message of X may occur jointly with any message of Y, i.e. any x of X may be received as any y of Y.
• This can be represented using the joint probability matrix:
P[XY] = [ P(x1 y1)  P(x1 y2)  …  P(x1 ym)
          P(x2 y1)  P(x2 y2)  …  P(x2 ym)
          …
          P(xn y1)  P(xn y2)  …  P(xn ym) ]
Joint Entropy(contd.)
P(x1)=P(x1 y1)+P(x1 y2)+………+P(x1 ym)
P(y1)=P(x1 y1)+P(x2 y1)+………+P(xn y1)
In general, these two expressions can be written as:
P(xi) = Σj P(xi yj) and P(yj) = Σi P(xi yj)
Joint Entropy(contd.)
• The Entropy of the Source X is given by:
H[X] = −[P(x1) log2 P(x1) + P(x2) log2 P(x2) + … + P(xn) log2 P(xn)]
     = −Σi P(xi) log2 P(xi)
• Similarly, the Entropy of the Receiver Y is H[Y] = −Σj P(yj) log2 P(yj).
Joint Entropy(contd.)
• Here, H[X] and H[Y] are called marginal entropies and H[XY] is
called Joint entropy.
Joint Entropy(contd.)
• In continuous information theory, the Joint Entropy is given by:
H[XY] = −∫∫ fXY(x, y) log2 fXY(x, y) dx dy
Conditional Entropy
• The Conditional Entropy is given by:
H[X/Y]=H[XY]-H[Y]
H[Y/X]=H[XY]-H[X]
Properties Of Entropy
1) When the probability of occurrence of an event is 1, the entropy becomes zero. Hence entropy is a measure of uncertainty.
Proof:
H = −P log2 P
When P = 1,
H = −1 · log2(1) = 0
Properties Of Entropy(contd.)
2)Entropy is maximum when all the source symbols are equally
likely to occur.
Proof:
Consider two symbols m1 and m2 with the probabilities P and 1-P
We know that
H = P log2(1/P) + (1 − P) log2(1/(1 − P))
Properties Of Entropy(contd.)
= -[PlogP + (1-P)log(1-P)]
For a maximum, dH/dP = 0:
−{[P·(1/P) + 1·log P] + [(1 − P)·(1/(1 − P))·(−1) + log(1 − P)·(−1)]} = 0
log[P/(1 − P)] = 0
P/(1 − P) = 1
P = 1 − P
2P = 1
P = 1/2
Properties Of Entropy(contd.)
• The variation of H with P shows a broad maximum centred at P = 1 − P = 1/2, where H = 1 bit/symbol, and H decreases monotonically to zero on both sides as P tends to 1 or to 0.
• Similarly, for M messages (M-ary), the entropy is maximum when all the messages are equally likely. In this case the maximum entropy is log2 M bits/message.
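A quick numerical confirmation of the maximum of the binary entropy (a sketch; the grid of probabilities is arbitrary):

# Sketch: the binary entropy H(P) = -P log2 P - (1-P) log2(1-P) peaks at P = 1/2
import numpy as np

P = np.linspace(0.001, 0.999, 999)
H = -P * np.log2(P) - (1 - P) * np.log2(1 - P)

print(P[np.argmax(H)], H.max())                  # ≈ 0.5 and ≈ 1 bit/symbol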
Properties Of Entropy(contd.)
3)H[XY]=H[X]+H[Y/X]
=H[Y]+H[X/Y]
Proof:
Consider the joint probability
H[XY] = −Σi Σj P(xi yj) log2 P(xi yj)
Using P(xi yj) = P(xi) P(yj/xi),
H[XY] = −Σi Σj P(xi yj) log2 P(xi) − Σi Σj P(xi yj) log2 P(yj/xi)
Since Σj P(xi yj) = P(xi), the first term equals −Σi P(xi) log2 P(xi) = H[X], and the second term is H[Y/X].
Hence H[XY] = H[X] + H[Y/X]. Interchanging the roles of X and Y gives H[XY] = H[Y] + H[X/Y].
Problems
• Calculate the entropy of alphabets in English language assuming
that each of the 26 characters in alphabet occur with equal
probability.
• For a signal x(t) which has a uniform density function in the range [0, 4], find H[X]. If the same signal is amplified by a factor of 8, determine H[X].
Problem
• A transmitter has an alphabet consisting of letters whose joint probability matrix with the receiver alphabet is given below. Determine the marginal, conditional and joint entropies of the channel.
                          Y (Receiver)
                          F       G       H       I
X (Transmitter)    A      0.25    0       0       0
                   B      0.10    0.30    0       0
                   C      0       0.05    0.10    0
                   D      0       0       0.05    0.10
                   E      0       0       0.05    0
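A numerical sketch for this problem: the joint probability matrix above is entered directly and the entropies are computed from their definitions (using H[X/Y] = H[XY] − H[Y] for the conditional entropies):

# Sketch: marginal, conditional and joint entropies from the joint probability matrix above
import numpy as np

P = np.array([
    [0.25, 0.00, 0.00, 0.00],   # A
    [0.10, 0.30, 0.00, 0.00],   # B
    [0.00, 0.05, 0.10, 0.00],   # C
    [0.00, 0.00, 0.05, 0.10],   # D
    [0.00, 0.00, 0.05, 0.00],   # E
])

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

Px, Py = P.sum(axis=1), P.sum(axis=0)                # marginals P(x) and P(y)
Hx, Hy, Hxy = H(Px), H(Py), H(P.ravel())
print("H[X] =", Hx, " H[Y] =", Hy, " H[XY] =", Hxy)
print("H[X/Y] =", Hxy - Hy, " H[Y/X] =", Hxy - Hx)   # from H[X/Y] = H[XY] − H[Y]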
Mutual Information or Transinformation
• Mutual Information is the average information transferred over the channel; it is the reduction in the uncertainty about the transmitted symbol obtained by observing the received symbol.
• It is given by:
I[XY]=H[X]-H[X/Y]
(or) I[XY]=H[Y]-H[Y/X]
(or) I[XY]=H[X]+H[Y]-H[XY]
Mutual Information
Proof:
• Uncertainty about the transmitted symbol before reception = −log P(xi)
• Uncertainty about the transmitted symbol after the received symbol yj is known = −log P(xi/yj)
• The information gained is the difference between the initial and the final uncertainty,
i.e. I(xi, yj) = −log P(xi) − [−log P(xi/yj)] = log [P(xi/yj)/P(xi)]
Mutual Information(contd.)
Averaging over all symbol pairs,
I[XY] = Σi Σj P(xi yj) log2 [P(xi/yj)/P(xi)] = H[X] − H[X/Y]
Thermal Noise
• The available thermal noise power is given by:
Pn = kTB
where k = Boltzmann's constant, T = temperature in kelvin, and B = bandwidth in Hz.
• For more properties, refer to A. Anand Kumar or any other standard signals & systems textbook.