A random experiment has the following properties:
• On any trial of the experiment, the outcome is unpredictable.
• For a large number of trials of the experiment, the outcomes exhibit statistical regularity. That is, a definite average pattern of outcomes is observed if the experiment is repeated a large number of times.

Relative Frequency
• Suppose that in n trials of the experiment, event A occurs n_A times. We may then assign the ratio n_A/n to the event A. This ratio is called the relative frequency of the event A; it satisfies

  0 \le \frac{n_A}{n} \le 1

• The experiment exhibits statistical regularity if, for any sequence of n trials, the relative frequency n_A/n converges to a limit as n becomes large. We define this limit to be the probability of event A:

  P[A] = \lim_{n \to \infty} \frac{n_A}{n}
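The convergence of the relative frequency toward P[A] can be illustrated with a short simulation. This is only a sketch: the biased-coin experiment and the value p = 0.3 are assumptions chosen for illustration.

```python
import random

def relative_frequency(n, p=0.3, seed=1):
    """Run n trials of a random experiment (a biased coin, assumed here for
    illustration) and return the relative frequency n_A / n of event A."""
    rng = random.Random(seed)
    n_A = sum(1 for _ in range(n) if rng.random() < p)  # count occurrences of A
    return n_A / n

# Statistical regularity: the ratio settles near P[A] = p as n grows.
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```

The printed ratios wander for small n but cluster ever more tightly around 0.3, which is exactly the limiting behavior the definition above formalizes.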
Notre Dame University-Louaize - EEN 443 Communication Systems II - Spring 2021
Axioms of Probability
• With each possible outcome of the experiment, we associate a point called the sample point, which we denote by s_k.
• A probability system consists of the triple:
  1. A sample space S of elementary events (outcomes).
  2. A class H of events that are subsets of S.
  3. A probability measure P[A] assigned to each event A in the class H, which has the following properties:
     (i) P[S] = 1
     (ii) 0 ≤ P[A] ≤ 1
     (iii) If A ∪ B is the union of two mutually exclusive events in the class H, then

       P[A \cup B] = P[A] + P[B]

Random Variables
• We use the expression random variable to describe the process of assigning a number to the outcome of a random experiment.
• If the outcome of the experiment is s, we denote the random variable as X(s) or just X.
• We denote a particular outcome of a random experiment by x; that is, X(s_k) = x.
• Random variables may be either discrete or continuous.
The probability mass function of the Bernoulli random variable with parameter p is

  P[X = x] =
    \begin{cases}
      1 - p, & x = 0 \\
      p, & x = 1 \\
      0, & \text{otherwise}
    \end{cases}
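A minimal sketch of this probability mass function in code; the parameter value p = 0.25 is just an example.

```python
def bernoulli_pmf(x, p):
    """P[X = x] for a Bernoulli random variable with parameter p:
    1 - p at x = 0, p at x = 1, and 0 everywhere else."""
    if x == 0:
        return 1 - p
    if x == 1:
        return p
    return 0.0

p = 0.25
# The two point masses sum to one; any other x carries zero probability.
print(bernoulli_pmf(0, p), bernoulli_pmf(1, p), bernoulli_pmf(2.5, p))
```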
Distribution Functions
• The probability density function has three basic properties:
  1. Since the distribution function is monotone nondecreasing, it follows that the density function is nonnegative for all values of x.
  2. The distribution function may be recovered from the density function by integration, as shown by

     F_X(x) = \int_{-\infty}^{x} f_X(s) \, ds

  3. Property 2, together with F_X(\infty) = 1, implies that the total area under the curve of the density function is unity.

• The probability distribution function of the Bernoulli random variable is given by

  F_X(x) =
    \begin{cases}
      0, & x < 0 \\
      1 - p, & 0 \le x < 1 \\
      1, & x \ge 1
    \end{cases}

• A uniformly distributed random variable has the probability density function

  f_X(x) =
    \begin{cases}
      \dfrac{1}{b - a}, & a \le x \le b \\
      0, & \text{otherwise}
    \end{cases}
Several Random Variables
• We define the joint distribution function F_{X,Y}(x, y) as the probability that the random variable X is less than or equal to a specified value x and that the random variable Y is less than or equal to a specified value y.
• Suppose that the joint distribution function F_{X,Y}(x, y) is continuous everywhere, and that the partial derivative

  f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x \, \partial y}

  exists; f_{X,Y}(x, y) is called the joint probability density function.
Variance
• The variance is an estimate of the spread of the probability distribution about the mean.
• For discrete random variables, the variance is

  \sigma_X^2 = \mathrm{Var}(X) = E\big[(X - \mu_X)^2\big] = \sum_x (x - \mu_X)^2 \, P[X = x]

• For a continuous random variable with a density function f_X(x),

  \sigma_X^2 = \int_{-\infty}^{\infty} (x - \mu_X)^2 f_X(x) \, dx

Variance II
• For N observations of the random variable, we may estimate the variance as

  \hat{\sigma}_X^2 = \frac{1}{N - 1} \sum_{n=1}^{N} (x_n - \hat{\mu}_X)^2

• If we consider X as a random variable representing observations of the voltage of a random signal, then the variance represents the ac power of the signal.
• The second moment E[X^2], also called the mean-square value of the random signal, physically represents the total power of the signal.
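The variance estimator above translates directly into code. The data values here are hypothetical, standing in for the voltage readings mentioned in the text.

```python
def sample_variance(xs):
    """Estimate the variance from N observations using the 1/(N-1) estimator."""
    n = len(xs)
    mu_hat = sum(xs) / n                          # estimated mean
    return sum((x - mu_hat) ** 2 for x in xs) / (n - 1)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # hypothetical readings
print(sample_variance(data))  # 32/7, about 4.571
```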
Covariance
• The covariance of two random variables is given by

  \mathrm{Cov}(X, Y) = E\big[(X - \mu_X)(Y - \mu_Y)\big]

  or, equivalently,

  \mathrm{Cov}(X, Y) = E[XY] - \mu_X \mu_Y

• If the two random variables are continuous with joint density function f_{X,Y}(x, y), then

  E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x \, y \, f_{X,Y}(x, y) \, dx \, dy

• If the two variables are independent, then

  E[XY] = E[X] \, E[Y] \quad \text{and} \quad \mathrm{Cov}(X, Y) = 0

Transformation of Random Variables
• If Y = g(X) is a one-to-one transformation of the random variable X to the random variable Y, then the distribution function of Y is given by

  F_Y(y) = F_X\big(g^{-1}(y)\big)

  where g^{-1}(y) denotes the functional inverse of g.
• Also, if X has a density f_X(x), then Y has the density

  f_Y(y) = f_X\big(g^{-1}(y)\big) \left| \frac{d g^{-1}(y)}{dy} \right|
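The density-transformation rule can be sanity-checked by simulation. As an assumed example (not from the slides), take X uniform on (0, 1) and g(x) = x³, so that g⁻¹(y) = y^(1/3) and f_Y(y) = (1/3) y^(-2/3) on (0, 1).

```python
import random

def f_Y(y):
    """f_Y(y) = f_X(g^{-1}(y)) * |d g^{-1}(y)/dy| for Y = X**3, X ~ Uniform(0, 1):
    here f_X = 1 on (0, 1) and |d(y**(1/3))/dy| = (1/3) * y**(-2/3)."""
    return (1.0 / 3.0) * y ** (-2.0 / 3.0) if 0.0 < y < 1.0 else 0.0

def simulated_prob(lo, hi, n=200_000, seed=7):
    """Monte Carlo estimate of P[lo < Y <= hi] by cubing uniform samples."""
    rng = random.Random(seed)
    return sum(1 for _ in range(n) if lo < rng.random() ** 3 <= hi) / n

# Integrating f_Y over (0.1, 0.5] gives y**(1/3) evaluated at the endpoints,
# i.e. 0.5**(1/3) - 0.1**(1/3) ~ 0.33; the simulation should agree.
exact = 0.5 ** (1 / 3) - 0.1 ** (1 / 3)
print(exact, simulated_prob(0.1, 0.5))
```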
Transformation of Random Variables: Example
• If a random variable X with distribution F_X(x) is transformed to

  Y = aX + b

  then, applying the rule above with g(x) = ax + b (for a > 0),

  F_Y(y) = F_X\!\left( \frac{y - b}{a} \right)

Gaussian Random Variables
• A Gaussian random variable is a continuous random variable with a density function

  f_X(x) = \frac{1}{\sigma_X \sqrt{2\pi}} \exp\left\{ -\frac{(x - \mu_X)^2}{2 \sigma_X^2} \right\}
Properties of a Gaussian Random Variable
1. A Gaussian random variable is completely characterized by its mean and variance.
2. A Gaussian random variable plus a constant is another Gaussian random variable with the mean adjusted by the constant.
3. A Gaussian random variable multiplied by a constant is another Gaussian random variable where both the mean and variance are affected by the constant.
4. The sum of two independent Gaussian random variables is also a Gaussian random variable.
5. The weighted sum of N independent Gaussian random variables is a Gaussian random variable.
6. If two Gaussian random variables have zero covariance (uncorrelated), they are also independent.

Normalized Gaussian Random Variable
• A normalized Gaussian random variable has zero mean, \mu_X = 0, and unity variance, \sigma_X^2 = 1.
• In this case, its density function is

  f_X(x) = \frac{1}{\sqrt{2\pi}} \exp\left\{ -\frac{x^2}{2} \right\}, \quad -\infty < x < \infty

• Its distribution function is

  F_X(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} \exp\left\{ -\frac{s^2}{2} \right\} ds
The Q-Function
• The Q-function is defined as

  Q(x) = \frac{1}{\sqrt{2\pi}} \int_{x}^{\infty} \exp\left\{ -\frac{s^2}{2} \right\} ds = 1 - F_X(x)

  where F_X(x) is the distribution function of the normalized Gaussian random variable.

The Central Limit Theorem
• Let X_1, X_2, ..., X_n be a set of random variables with the following properties:
  1. The X_k with k = 1, 2, 3, ..., n are statistically independent.
  2. The X_k all have the same probability density function.
  3. Both the mean and the variance exist for each X_k.
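In practice the Q-function is evaluated through the complementary error function, using the identity Q(x) = ½ erfc(x/√2). A sketch with the standard library:

```python
import math

def Q(x):
    """Area under the normalized Gaussian tail from x to infinity:
    Q(x) = 1 - F_X(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

print(Q(0.0))  # 0.5: half the probability lies above the mean
print(Q(3.0))  # about 1.35e-3, a commonly tabulated tail value
```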
The Central Limit Theorem II
• Let Y be the sum X_1 + X_2 + ... + X_n. The normalized random variable is

  Z = \frac{Y - E[Y]}{\sigma_Y}

• As n becomes large,

  F_Z(z) \to \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{z} \exp\left\{ -\frac{s^2}{2} \right\} ds

• Therefore, the normalized distribution of the sum of independent, identically distributed random variables approaches a Gaussian distribution as the number of random variables increases, regardless of the individual distributions.

Random Processes
• In communications, the received time-varying signal is random in nature.
• Random processes represent the formal mathematical model of these random signals.
• Random processes have the following properties:
  1. Random processes are a function of time.
  2. Random processes are random in the sense that it is not possible to predict exactly what waveform will be observed in the future.
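The theorem can be observed numerically. The sketch below sums uniform variables; the choices of n = 30 terms and 100,000 samples are arbitrary.

```python
import math
import random

def normalized_sum(n, rng):
    """Z = (Y - E[Y]) / sigma_Y for Y the sum of n iid Uniform(0, 1) variables,
    using E[Y] = n/2 and Var(Y) = n/12."""
    y = sum(rng.random() for _ in range(n))
    return (y - n / 2.0) / math.sqrt(n / 12.0)

rng = random.Random(3)
zs = [normalized_sum(30, rng) for _ in range(100_000)]

# For a Gaussian limit, F_Z(0) = 0.5 and P[|Z| <= 1] is about 0.683.
below_zero = sum(1 for z in zs if z <= 0.0) / len(zs)
within_one = sum(1 for z in zs if abs(z) <= 1.0) / len(zs)
print(below_zero, within_one)
```

Even though each term is uniform, the empirical fractions land close to the Gaussian values, illustrating the "regardless of the individual distributions" claim.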
• Each sample point s_j of the random experiment is associated with a sample function of the process:

  x_j(t) = X(t, s_j)
Random Processes IV
• With a random variable, the outcome of a random experiment is mapped to a real number.
• With a random process, the outcome of a random experiment is mapped to a waveform that is a function of time.

Stationary Random Processes
• If the statistical characterization of a process is independent of the time at which the observations occur, then the process is said to be stationary.
Correlation of Random Processes II
• If X(t) is stationary to the second order or higher, then

  R_X(t, s) = R_X(t - s)

• For many applications, we often only require that:
  1. The mean of the random process is a constant independent of time: E[X(t)] = \mu_X for all t.
  2. The autocorrelation of the random process only depends upon the time difference for all t and \tau:

     E[X(t) X(t - \tau)] = R_X(\tau)

• If a random process has these two properties, then we say it is wide-sense stationary or weakly stationary.

Properties of the Autocorrelation Function
• The autocorrelation function of a wide-sense stationary random process has the following properties for a real-valued process:
  1. Power of a Wide-Sense Stationary Process: The second moment or mean-square value of a real-valued random process is given by

     R_X(0) = E\big[X(t)^2\big]

  2. Symmetry: The autocorrelation of a real-valued wide-sense stationary process has even symmetry:

     R_X(-\tau) = R_X(\tau)
Properties of the Autocorrelation Function II
3. Maximum Value: The autocorrelation of a real-valued wide-sense stationary process is maximum at the origin:

  R_X(0) \ge |R_X(\tau)|

Ergodicity
• The ensemble averages may be estimated by averaging across N sample functions:

  E[X(t_k)] \approx \frac{1}{N} \sum_{j=1}^{N} x_j(t_k)

  and

  E\big[X^2(t_k)\big] \approx \frac{1}{N} \sum_{j=1}^{N} x_j^2(t_k)
Ergodicity II
• The time average of only one continuous sample function x(t) drawn from the process is given by

  \mathcal{E}[x] = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t) \, dt

• The time-autocorrelation of the sample function x(t) is given by

  R_x(\tau) = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t) \, x(t - \tau) \, dt

Ergodicity III
• Random processes for which various time averages of the sample functions may be used to approximate the corresponding ensemble averages or expectations are said to be ergodic.
• Wide-sense stationary processes are assumed to be ergodic, in which case time averages and expectations can be used interchangeably.
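For an ergodic process these time averages match the ensemble averages. The sketch below uses a random-phase sinusoid as the sample function; the amplitude, frequency, and sampling step are assumed values. Its time-averaged mean should be near 0 and its time-autocorrelation at the origin near A²/2.

```python
import math
import random

def time_average(x):
    """Discrete approximation of the time average of one sample function."""
    return sum(x) / len(x)

def time_autocorrelation(x, lag):
    """Discrete approximation of R_x(tau) at tau = lag * dt from one sample function."""
    n = len(x) - lag
    return sum(x[k] * x[k + lag] for k in range(n)) / n

# One sample function of A*cos(2*pi*f*t + theta) with a random phase.
A, f, dt = 2.0, 5.0, 1e-3
theta = random.Random(11).uniform(0.0, 2.0 * math.pi)
x = [A * math.cos(2.0 * math.pi * f * k * dt + theta) for k in range(20_000)]

print(time_average(x))             # near the ensemble mean, 0
print(time_autocorrelation(x, 0))  # near E[X(t)^2] = A**2 / 2 = 2.0
```

At nonzero lags the estimate tracks (A²/2) cos(2πfτ), the known autocorrelation of a random-phase sinusoid, from just one realization, which is the practical content of ergodicity.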
Power Spectral Density
• Define the Fourier transform of a (truncated) sample function x_T(t) as

  \Xi_T(f) = \int_{-\infty}^{\infty} x_T(t) \, e^{-j 2\pi f t} \, dt

• The power spectral density of a random process X(t) is

  S_X(f) = \lim_{T \to \infty} \frac{1}{2T} \, E\big[ |\Xi_T(f)|^2 \big]
Properties of the Power Spectral Density
• The same Wiener-Khintchine relations that apply to deterministic processes also apply to wide-sense stationary random processes.
• Other properties of a wide-sense stationary process:
  1. Mean-Square Value:

     E\big[X^2(t)\big] = \int_{-\infty}^{\infty} S_X(f) \, df

  2. Nonnegativity:

     S_X(f) \ge 0 \quad \text{for all } f

  3. Symmetry:

     S_X(-f) = S_X(f)

  4. Filtering Random Processes (through a linear filter):

     S_Y(f) = |H(f)|^2 \, S_X(f)
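Property 4 can be checked in discrete time: passing unit-variance white noise through an FIR filter with taps h scales the output power to Σ h[k]², since integrating |H(f)|² S_X(f) with S_X(f) = 1 gives the sum of squared taps (Parseval's relation). The taps below are an arbitrary example.

```python
import random

def filtered_noise_power(h, n=200_000, seed=5):
    """Estimate E[Y^2] for Y the convolution of unit-variance white
    Gaussian noise with the FIR filter h."""
    rng = random.Random(seed)
    w = [rng.gauss(0.0, 1.0) for _ in range(n + len(h))]
    power = 0.0
    for m in range(len(h), n + len(h)):
        y = sum(h[k] * w[m - k] for k in range(len(h)))  # convolution sample
        power += y * y
    return power / n

h = [0.5, 0.3, 0.2]
# The measured output power should approach sum(h[k]**2) = 0.38.
print(filtered_noise_power(h), sum(c * c for c in h))
```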
Gaussian Processes
• A random process X(t), with t taking values in the set T, is said to be a Gaussian process if, for any integer k and any subset {t_1, t_2, ..., t_k} of T, the k random variables {X(t_1), X(t_2), ..., X(t_k)} are jointly Gaussian distributed; that is, their joint density has a Gaussian shape.

Properties of a Gaussian Process
1. If a Gaussian process is wide-sense stationary, then it is also stationary in the strict sense.
2. If a Gaussian process is applied to a stable linear filter, then the random process Y(t) produced at the output of the filter is also Gaussian.
3. If integration is defined in the mean-square sense, then we may interchange the order of the operations of integration and expectation with a Gaussian random process. For example:

  E[Y(t)] = E\left[ \int_0^t h(t - s) \, X(s) \, ds \right] = \int_0^t h(t - s) \, E[X(s)] \, ds = \mu_Y(t)
White Noise
• The power spectral density of white noise is independent of frequency.
• The power spectral density of a white noise process W(t) is

  S_W(f) = \frac{N_0}{2}

• The dimensions of N_0 are watts per hertz.
• White noise has no dc power.

White Noise II
• The autocorrelation of white noise is given by

  R_W(\tau) = \frac{N_0}{2} \, \delta(\tau)

• Any two different samples of white noise, no matter how close together in time they are taken, are uncorrelated.
• The effect of white noise is observed only after passing through a system of finite bandwidth.
• As long as the bandwidth of a noise process at the input of a system is appreciably larger than that of the system itself, we may model the noise process as white.
4. The in-phase and quadrature components N_I(t) and N_Q(t) have the same power spectral density, related to that of the narrowband noise N(t) by

  S_{N_I}(f) = S_{N_Q}(f) =
    \begin{cases}
      S_N(f - f_c) + S_N(f + f_c), & -B \le f \le B \\
      0, & \text{otherwise}
    \end{cases}

5. N_I(t) and N_Q(t) have the same variance as N(t).
Noise-Equivalent Bandwidth (Low-Pass Filter)
• Suppose a white noise source with spectrum S_W(f) = N_0/2 is connected to the input of an arbitrary low-pass filter of transfer function H(f).
• The average output noise power is

  P_N = N_0 \, B_N \, |H(0)|^2

  where the noise-equivalent bandwidth is defined as

  B_N = \frac{\displaystyle \int_0^{\infty} |H(f)|^2 \, df}{|H(0)|^2}
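As an assumed example (not from the slides), a first-order RC low-pass with |H(f)|² = 1/(1 + (f/f₀)²) has B_N = π f₀/2, somewhat wider than its 3-dB bandwidth f₀. Numerical integration of the definition confirms this.

```python
import math

def noise_equivalent_bandwidth(H2, f_max, steps=200_000):
    """B_N = (1 / |H(0)|^2) * integral_0^inf |H(f)|^2 df,
    truncated at f_max and evaluated with the trapezoidal rule."""
    h = f_max / steps
    area = 0.5 * h * (H2(0.0) + H2(f_max))
    area += h * sum(H2(k * h) for k in range(1, steps))
    return area / H2(0.0)

f0 = 1_000.0                                # assumed 3-dB cutoff in Hz
H2 = lambda f: 1.0 / (1.0 + (f / f0) ** 2)  # |H(f)|^2 of a first-order RC filter

print(noise_equivalent_bandwidth(H2, f_max=1_000_000.0))  # close to pi*f0/2
print(math.pi * f0 / 2.0)
```

Multiplying the resulting B_N by N_0 |H(0)|² then gives the average output noise power P_N of the formula above.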