Digital Communication Unit-III: Stochastic Process
Ashok N Shinde
ashok.shinde0349@gmail.com
International Institute of Information Technology, Hinjawadi, Pune
Signals
Deterministic: can be reproduced exactly with repeated measurements.
s(t) = A cos(2πf_c t + θ)
Definition:
A stochastic process is a set of random variables indexed in time.
Mathematically, the relationship between probability theory and stochastic processes is as follows:
Sample point → Sample function
Sample space → Ensemble
Random variable → Random process
For each sample point s, the process is a function of time:
X(s, t), −T ≤ t ≤ T
The sample function is denoted as:
x_j(t) = X(t, s_j), −T ≤ t ≤ T
Stochastic Process
Each sample point in S is associated with a sample function x(t).
X(t, s) is a random process: an ensemble of all time functions together with a probability rule.
X(t_k, s_j) is a realization, or sample function, of the random process:
{x_1(t_k), x_2(t_k), ..., x_n(t_k)} = {X(t_k, s_1), X(t_k, s_2), ..., X(t_k, s_n)}
The probability rule assigns a probability to any meaningful event associated with an observation. An observation is a sample function of the random process.
Stationary Process:
If a process, divided into a number of time intervals, exhibits the same statistical properties in each interval, it is called stationary.
It arises from a stable phenomenon that has evolved into a steady-state mode of behavior.
Non-Stationary Process:
If a process, divided into a number of time intervals, exhibits different statistical properties across the intervals, it is called non-stationary.
It arises from an unstable phenomenon.
Mean:
The mean of a real-valued stochastic process X(t) is the expectation of the random variable obtained by sampling the process at some time t:
µ_X(t) = E[X(t)] = ∫_{−∞}^{+∞} x f_{X(t)}(x) dx
where f_{X(t)}(x) is the first-order probability density function of the process X(t).
µ_X(t) = µ_X for a weakly stationary process.
Correlation:
The autocorrelation function of the stochastic process X(t) is the expectation of the product of the two random variables X(t_1) and X(t_2):
M_XX(t_1, t_2) = E[X(t_1)X(t_2)] = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} x_1 x_2 f_{X(t_1),X(t_2)}(x_1, x_2) dx_1 dx_2
where f_{X(t_1),X(t_2)}(x_1, x_2) is the joint probability density function of the process X(t) sampled at times t_1 and t_2. M_XX(t_1, t_2) is a second-order moment. If it depends only on the time difference t_2 − t_1, the process X(t) satisfies the second condition of weak stationarity and the autocorrelation reduces to
M_XX(t_1, t_2) = E[X(t_1)X(t_2)] = R_XX(t_2 − t_1)
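The following NumPy sketch (an illustration, not part of the slides) estimates µ_X(t) and M_XX(t_1, t_2) by averaging across an ensemble of realizations. The random-amplitude cosine used as the process is an arbitrary assumption, chosen so that the autocorrelation visibly depends on t_1 and t_2 separately.

import numpy as np

# Illustrative sketch: ensemble estimates of the mean and autocorrelation of
# X(t) = B cos(2*pi*f*t), where B is a zero-mean, unit-variance Gaussian
# amplitude (an assumed toy process).  Here M_XX(t1, t2) =
# cos(2*pi*f*t1) * cos(2*pi*f*t2), which depends on t1 and t2 separately,
# so this particular process is not weakly stationary.
rng = np.random.default_rng(0)

n_realizations = 20_000
t = np.linspace(0.0, 1.0, 200)                        # common time grid
f = 3.0

B = rng.standard_normal((n_realizations, 1))          # E[B] = 0, E[B^2] = 1
ensemble = B * np.cos(2.0 * np.pi * f * t)            # one sample function per row

mu_hat = ensemble.mean(axis=0)                        # estimate of mu_X(t) = E[X(t)]

i1, i2 = 25, 140
M_hat = np.mean(ensemble[:, i1] * ensemble[:, i2])    # estimate of M_XX(t1, t2)
M_theory = np.cos(2 * np.pi * f * t[i1]) * np.cos(2 * np.pi * f * t[i2])

print("max |mu_hat(t)|   :", np.abs(mu_hat).max())    # ~ 0
print("M_XX estimate     :", M_hat)
print("M_XX (theoretical):", M_theory)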
Properties:
where τ denotes a time shift; that is, t = t_2 and τ = t_1 − t_2:
R_XX(0) = E[X²(t)]  (mean-square value)
R_XX(τ) = R_XX(−τ), also R_XX(τ) = E[X(t − τ)X(t)]  (symmetry)
|R_XX(τ)| ≤ R_XX(0)  (bound)
ρ_XX(τ) = R_XX(τ) / R_XX(0)  (normalization to [−1, 1])
Covariance:
The autocovariance function of a weakly stationary process X(t) is defined by
C_XX(t_1, t_2) = E[(X(t_1) − µ_X)(X(t_2) − µ_X)] = R_XX(t_2 − t_1) − µ_X²
The autocovariance function of a weakly stationary process X(t) depends only on the time difference (t_2 − t_1).
The mean and autocorrelation function provide only a weak (partial) description of the distribution of the stochastic process X(t).
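A quick numerical check (illustrative assumptions: the process is a DC offset plus a random-phase sinusoid) that the autocovariance equals the autocorrelation minus the squared mean:

import numpy as np

# Illustrative check of C_XX(t1, t2) = R_XX(t2 - t1) - mu_X^2 for a weakly
# stationary process with a nonzero mean (assumed toy process: a DC offset
# plus a random-phase sinusoid).
rng = np.random.default_rng(1)

mu_x, A, fc = 2.0, 1.0, 5.0
t = np.linspace(0.0, 1.0, 200)
theta = rng.uniform(0.0, 2.0 * np.pi, size=(20_000, 1))
X = mu_x + A * np.cos(2.0 * np.pi * fc * t + theta)

i1, i2 = 30, 80
R_hat = np.mean(X[:, i1] * X[:, i2])                    # autocorrelation E[X(t1)X(t2)]
C_hat = np.mean((X[:, i1] - mu_x) * (X[:, i2] - mu_x))  # autocovariance

print("R_XX - mu_X^2:", R_hat - mu_x ** 2)
print("C_XX         :", C_hat)                          # agrees up to Monte Carlo error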
Examples
Show that the random process X(t) = A cos(ω_c t + Θ) is a wide-sense stationary process, where Θ is a random variable uniformly distributed in the range (0, 2π).
Answer:
The ensemble consists of sinusoids of constant amplitude A and constant frequency ω_c, but the phase Θ is random.
The phase is equally likely to take any value in the range (0, 2π); that is, Θ is a random variable uniformly distributed over (0, 2π):
f_Θ(θ) = 1/(2π), 0 ≤ θ ≤ 2π
       = 0, elsewhere
Answer:
Because cos(ω_c t + Θ) is a function of the random variable Θ, the mean of the random process X(t) is
E[X(t)] = A E[cos(ω_c t + Θ)]
E[cos(ω_c t + Θ)] = ∫_0^{2π} cos(ω_c t + θ) f_Θ(θ) dθ
                  = (1/2π) ∫_0^{2π} cos(ω_c t + θ) dθ
                  = 0
Hence E[X(t)] = 0.
Thus the ensemble mean of the sample-function amplitude at any time instant t is zero.
The autocorrelation function R_XX(t_1, t_2) for this process can also be determined:
R_XX(t_1, t_2) = E[X(t_1) X(t_2)] = A² E[cos(ω_c t_1 + Θ) cos(ω_c t_2 + Θ)]
Answer (continued):
R_XX(t_1, t_2) = A² E[cos(ω_c t_1 + Θ) cos(ω_c t_2 + Θ)]
              = (A²/2) E[cos(ω_c(t_2 − t_1)) + cos(ω_c(t_2 + t_1) + 2Θ)]
The term cos(ω_c(t_2 − t_1)) does not contain the random variable Θ, while the expectation of cos(ω_c(t_2 + t_1) + 2Θ) over the uniform phase is zero. Hence,
R_XX(t_1, t_2) = (A²/2) cos(ω_c(t_2 − t_1))
The mean is constant and the autocorrelation depends only on the time difference t_2 − t_1, so X(t) is wide-sense stationary.
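A short Monte Carlo sketch (illustrative, not part of the original example) confirming the two results just derived for the random-phase sinusoid:

import numpy as np

# Illustrative Monte Carlo check: for X(t) = A cos(w_c t + Theta) with
# Theta ~ Uniform(0, 2*pi), the ensemble mean is 0 and
# R_XX(t1, t2) = (A^2 / 2) cos(w_c (t2 - t1)).
rng = np.random.default_rng(42)

A = 2.0
wc = 2.0 * np.pi * 10.0
theta = rng.uniform(0.0, 2.0 * np.pi, 200_000)

t1, t2 = 0.013, 0.057
x1 = A * np.cos(wc * t1 + theta)                       # X(t1) over the ensemble
x2 = A * np.cos(wc * t2 + theta)                       # X(t2) over the ensemble

print("mean at t1          :", x1.mean())              # ~ 0
print("R_XX estimate       :", np.mean(x1 * x2))
print("(A^2/2)cos(wc*tau)  :", 0.5 * A ** 2 * np.cos(wc * (t2 - t1)))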
Ensemble Average
It is difficult to generate a number of realizations of a random process, so time averages are used instead:
Mean ⇒ µ_X(T) = (1/2T) ∫_{−T}^{+T} x(t) dt
Autocorrelation ⇒ R_XX(τ, T) = (1/2T) ∫_{−T}^{+T} x(t) x(t + τ) dt
Ergodic Process
Ergodicity: A random process is called ergodic if
it is ergodic in the mean:
lim_{T→∞} µ_X(T) = µ_X
lim_{T→∞} var[µ_X(T)] = 0
it is ergodic in autocorrelation:
lim_{T→∞} R_XX(τ, T) = R_XX(τ)
lim_{T→∞} var[R_XX(τ, T)] = 0
where µ_X and R_XX(τ) are the ensemble averages of the same random process.
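A minimal sketch of these time averages in discrete time, using the random-phase sinusoid from the earlier example (the sampling rate and record length are arbitrary assumptions); for this ergodic process the time averages computed from one long realization approach the ensemble values:

import numpy as np

# Illustrative sketch: time averages over a single long realization of the
# random-phase sinusoid approach the ensemble averages (ergodicity).
rng = np.random.default_rng(7)

A, fc = 2.0, 5.0
fs = 1000.0                                    # assumed sampling rate (Hz)
t = np.arange(0.0, 200.0, 1.0 / fs)            # one long realization, 2T = 200 s
x = A * np.cos(2.0 * np.pi * fc * t + rng.uniform(0.0, 2.0 * np.pi))

mu_time = x.mean()                             # (1/2T) * integral of x(t) dt

lag = int(0.01 * fs)                           # tau = 10 ms
R_time = np.mean(x[:-lag] * x[lag:])           # (1/2T) * integral of x(t) x(t+tau) dt

tau = lag / fs
print("time-average mean      :", mu_time)                                # ~ 0 = mu_X
print("time-average R_XX(tau) :", R_time)
print("ensemble R_XX(tau)     :", 0.5 * A ** 2 * np.cos(2 * np.pi * fc * tau))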
"If the input to a stable linear time-invariant filter is a weakly stationary process, then the output of the filter is also a weakly stationary process."
The mean-square value of the output process Y(t) is obtained by putting τ = 0; we have R_YY(0) = E[Y²(t)]:
E[Y²(t)] = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} h(τ_1) h(τ_2) R_XX(τ_1 − τ_2) dτ_1 dτ_2
which, of course, is a constant.
Power Spectral Density of a Weakly Stationary Process
The impulse response of a linear time-invariant filter is equal to the inverse Fourier transform of the frequency response of the filter. Using this and substituting τ = τ_1 − τ_2,
E[Y²(t)] = ∫_{−∞}^{+∞} H(f) [∫_{−∞}^{+∞} h(τ_2) exp(j2πfτ_2) dτ_2] [∫_{−∞}^{+∞} R_XX(τ) exp(−j2πfτ) dτ] df
         = ∫_{−∞}^{+∞} |H(f)|² [∫_{−∞}^{+∞} R_XX(τ) exp(−j2πfτ) dτ] df
Defining the power spectral density of the weakly stationary process X(t) as
S_XX(f) = ∫_{−∞}^{+∞} R_XX(τ) exp(−j2πfτ) dτ
the mean-square value of the output becomes
E[Y²(t)] = ∫_{−∞}^{+∞} |H(f)|² S_XX(f) df
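A discrete-time sketch of the last relation (illustrative assumptions: the input is white Gaussian noise, for which the discrete-time PSD is flat and equal to the input variance, and h is an arbitrary FIR filter):

import numpy as np

# Illustrative discrete-time version of E[Y^2] = integral |H(f)|^2 S_XX(f) df.
# Assumptions: white Gaussian input with variance sigma2 (flat PSD); h is an
# arbitrary stable FIR filter.
rng = np.random.default_rng(3)

sigma2 = 1.5
x = rng.normal(0.0, np.sqrt(sigma2), size=2_000_000)   # weakly stationary white input

h = np.array([0.2, 0.5, 0.2, -0.1])                    # stable LTI (FIR) filter
y = np.convolve(x, h, mode="valid")                    # output process Y

H = np.fft.fft(h, 4096)                                # samples of the frequency response
msq_freq = sigma2 * np.mean(np.abs(H) ** 2)            # "integral" of |H(f)|^2 S_XX(f)

print("E[Y^2], time average     :", np.mean(y ** 2))
print("E[Y^2], frequency domain :", msq_freq)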
Properties of PSD
4 Nonnegativeness of the Power Spectral Density
The power spectral density of a stationary process X(t) is always nonnegative:
S_XX(f) ≥ 0
5 Symmetry
The power spectral density of a real-valued weakly stationary process is an even function of frequency:
S_XX(f) = S_XX(−f)
6 Normalization
The power spectral density, appropriately normalized, has the properties associated with a probability density function in probability theory:
P_XX(f) = S_XX(f) / ∫_{−∞}^{+∞} S_XX(f) df
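A small sketch checking these three properties on a periodogram estimate of S_XX(f); the AR(1)-type process used to generate the data is an arbitrary assumption:

import numpy as np

# Illustrative check of the three PSD properties on a periodogram estimate.
# The data-generating process x[k] = 0.8 x[k-1] + w[k] is an assumed example.
rng = np.random.default_rng(5)

n = 1 << 16
w = rng.standard_normal(n)
x = np.empty(n)
x[0] = w[0]
for k in range(1, n):
    x[k] = 0.8 * x[k - 1] + w[k]

X = np.fft.fft(x - x.mean())
S = (np.abs(X) ** 2) / n                       # periodogram estimate of S_XX(f)

print("nonnegative:", bool(S.min() >= 0.0))                         # property 4
print("even in f  :", float(np.max(np.abs(S[1:] - S[1:][::-1]))))   # property 5, ~0
p = S / S.sum()                                # normalized like a probability density
print("sums to 1  :", float(p.sum()))                               # property 6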
The Gaussian Process
Consider the random variable obtained by weighting the process X(t) with a function g(t) and integrating:
Y = ∫_0^T g(t) X(t) dt
We refer to Y as a linear functional of X(t).
1 If the weighting function g(t) is such that the mean-square value of the random variable Y is finite,
2 and if the random variable Y is a Gaussian-distributed random variable for every g(t) in this class of functions,
then the process X(t) is said to be a Gaussian process.
A process X(t) is said to be a Gaussian process if every linear functional of X(t) is a Gaussian random variable.
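A sketch of this definition in discrete time (illustrative assumptions: the integral is approximated by a Riemann sum, X(t) is modeled by independent Gaussian samples, and g(t) = exp(−t) is an arbitrary weighting function); the skewness and excess kurtosis of Y come out near zero, as expected for a Gaussian random variable:

import numpy as np

# Illustrative sketch: approximate the linear functional
# Y = integral_0^T g(t) X(t) dt by a Riemann sum for an assumed Gaussian
# process (independent Gaussian samples) and an assumed weighting g(t).
rng = np.random.default_rng(11)

T, n = 1.0, 200
t = np.linspace(0.0, T, n, endpoint=False)
dt = T / n
g = np.exp(-t)                                 # weighting function g(t)

n_trials = 20_000
X = rng.standard_normal((n_trials, n))         # one Gaussian-process realization per row
Y = (X @ g) * dt                               # Y ~ sum_k g(t_k) X(t_k) dt, one per trial

# For a Gaussian Y, skewness and excess kurtosis are both (approximately) zero.
m, s = Y.mean(), Y.std()
z = (Y - m) / s
print("skewness        (~0):", np.mean(z ** 3))
print("excess kurtosis (~0):", np.mean(z ** 4) - 3.0)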
Properties of the Gaussian Process
A random variable Y has a Gaussian distribution if its probability density function has the form
f_Y(y) = (1 / √(2πσ²)) exp(−(y − µ)² / (2σ²))
where µ is the mean and σ² is the variance of the random variable Y.
White Noise
The adjective 'white' is used in the sense that white light contains equal amounts of all frequencies within the visible band of electromagnetic radiation.
White noise, denoted by W(t), is a stationary process whose PSD S_W(f) has a constant value across all frequencies.
The PSD of white noise is
S_WW(f) = N_0/2, for all f
Its autocorrelation function is
R_WW(τ) = (N_0/2) δ(τ)
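A discrete-time sketch (a band-limited approximation of white noise; the values of N_0 and the sampling rate are arbitrary assumptions) showing the flat PSD level N_0/2 and the autocorrelation concentrated at lag 0:

import numpy as np

# Illustrative discrete-time (band-limited) approximation of white noise.
# Assumptions: N0 and the sampling rate fs are arbitrary; samples then have
# variance sigma^2 = (N0/2) * fs, a flat two-sided PSD of height N0/2, and an
# autocorrelation concentrated at lag 0.
rng = np.random.default_rng(2)

N0, fs, n = 2e-3, 10_000.0, 1 << 18
sigma = np.sqrt(N0 / 2 * fs)
w = rng.normal(0.0, sigma, n)

for lag in (0, 1, 5, 50):                      # R_WW: ~ sigma^2 at lag 0, ~ 0 elsewhere
    print(f"R_WW at lag {lag:>2}:", np.mean(w[: n - lag] * w[lag:]))

S = (np.abs(np.fft.fft(w)) ** 2) / (n * fs)    # periodogram (two-sided PSD estimate)
print("mean PSD level :", S.mean(), "  expected N0/2 =", N0 / 2)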