
Communications Engineering

Chapter 2

Deterministic and Random Signal Analysis

Thái Truyển Đại Chấn
Outline
• Introduction
• 2.1 Bandpass and lowpass signal representation

• 2.2 Signal Space Representation of Waveforms
• 2.3 Some useful random variables
• 2.6 Complex random variables
Introduction
• Some signal definitions
Introduction
• Forward Fourier Transform: X(f) = ∫_{−∞}^{∞} x(t) e^{−j2πft} dt
• Inverse Fourier Transform: x(t) = ∫_{−∞}^{∞} X(f) e^{j2πft} df

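A quick numerical sanity check of the transform pair. This is only a sketch: it uses numpy's DFT as the discrete analogue of the integrals above, with an arbitrary random test signal.

```python
import numpy as np

# ifft(fft(x)) recovers x: the discrete counterpart of applying the
# forward and then the inverse Fourier transform.
rng = np.random.default_rng(0)
x = rng.standard_normal(32)       # arbitrary real test signal
X = np.fft.fft(x)                 # forward transform
x_rec = np.fft.ifft(X)            # inverse transform
# x_rec equals x up to numerical precision (imaginary part ~ 0)
```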
2.1 Bandpass and lowpass signal representation
• In an analog system: how do you transmit voice to a far-away destination?

[Block diagram]
Transmit path: human voice → Telephone → low-frequency electrical wave → Modulation → high-frequency electrical wave → Transmitter
Receive path: Receiver → high-frequency electrical wave → Demodulation → low-frequency electrical wave → Telephone → human voice
2.1 Bandpass and lowpass signal representation

Bandpass signal (real, narrowband, high-frequency) ↔ equivalently represented by ↔ Lowpass signal (complex, low-frequency)
2.1–1 Bandpass and Lowpass Signals
• Fourier transform and Hermitian functions
• The function f is real-valued iff the Fourier transform of f is Hermitian.
• The function f is Hermitian iff the Fourier transform of f is real-valued.
• f is a Hermitian function iff
• Re(f) is an even function and Im(f) is an odd function.
• Or |f| is an even function and ϕ(f) is an odd function
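The real ↔ Hermitian duality can be checked numerically. A minimal sketch, assuming numpy's DFT as the stand-in for the Fourier transform (negative frequency −k maps to DFT index (−k) mod N):

```python
import numpy as np

# A real signal has a Hermitian spectrum: X(-f) = X(f)*.
rng = np.random.default_rng(0)
N = 16
x = rng.standard_normal(N)        # real-valued signal
X = np.fft.fft(x)
k = np.arange(N)
hermitian = np.allclose(X[(-k) % N], np.conj(X[k]))
# hermitian is True: Re(X) is even and Im(X) is odd in the DFT sense
```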
2.1–1 Bandpass and Lowpass Signals
• Lowpass signal/baseband signal
• Bandwidth = ½ × (measure of the frequency support set); the factor ½ arises because the spectrum of a real signal occupies symmetric positive- and negative-frequency bands
• Positive spectrum X+(f) = X(f) u(f) and negative spectrum X−(f) = X(f) u(−f) of a signal x(t), where u(·) is the unit step function
2.1–2 Lowpass Equivalent of Bandpass Signals
• Bandpass signal x(t): real and narrowband, with spectrum concentrated around ±f0
• Analytic signal (positive-spectrum part): x+(t) = x(t) + j x̂(t), where x̂(t) is the Hilbert transform of x(t)
• Lowpass equivalent: xl(t) = x+(t) e^{−j2πf0t}, so that x(t) = Re[xl(t) e^{j2πf0t}]
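The lowpass-equivalent construction can be sketched numerically. This is an illustration, not the book's code: the analytic signal is built with a one-sided FFT mask (the discrete counterpart of x+(t) = x(t) + j x̂(t)) and then downconverted by the carrier; the carrier frequency, sample rate, and envelope are made-up values.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal x + j*x_hat via a one-sided FFT mask (N even)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = h[N // 2] = 1.0      # DC and Nyquist kept once
    h[1:N // 2] = 2.0           # positive frequencies doubled
    return np.fft.ifft(X * h)   # negative frequencies removed

fs, f0 = 1000.0, 100.0                      # sample rate, carrier (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
a = 1.0 + 0.5 * np.cos(2 * np.pi * 2 * t)   # slowly varying real envelope
x = a * np.cos(2 * np.pi * f0 * t)          # real bandpass signal
xl = analytic_signal(x) * np.exp(-2j * np.pi * f0 * t)  # lowpass equivalent
# |xl| recovers the envelope a(t)
```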
2.2 Signal Space Representation of Waveforms
• A vector is equivalent to a signal
• A signal constellation is a signal space representation
• 2.2–1 Vector Space Concepts
• Gram–Schmidt procedure for constructing a set of orthonormal vectors from a set of n-dimensional vectors
• 2.2–2 Signal Space Concepts
• 2.2–3 Orthogonal Expansions of Signals
2.2–1 Vector Space Concepts

• The inner product of two n-dimensional vectors: <v1, v2> = Σ_{i=1}^{n} v1i v2i*
• Properties: <v1, v2> = <v2, v1>*, and the inner product is linear in its first argument
• Vector represented as a linear combination of orthogonal unit vectors (orthonormal basis) ei, 1 ≤ i ≤ n: v = Σ_{i=1}^{n} vi ei
2.2–1 Vector Space Concepts
• Orthogonal:
• A set of m vectors vk, 1 ≤ k ≤ m, are orthogonal if
• <vi , vj> = 0 for all 1 ≤ i, j ≤ m, and i ≠ j.
• Norm: ||v|| = √<v, v> = (Σ_{i=1}^{n} |vi|²)^{1/2}
• Orthonormal: orthogonal and unit norm.
• Linearly independent: no vector in the set can be represented as a linear combination of the remaining vectors.
2.2–1 Vector Space Concepts
• Triangle inequality: ||v1 + v2|| ≤ ||v1|| + ||v2||
• Cauchy–Schwarz inequality: |<v1, v2>| ≤ ||v1||·||v2||, with equality iff v1 = a·v2 for some scalar a
• Properties:
• If orthogonal, <v1, v2> = 0 and ||v1 + v2||² = ||v1||² + ||v2||² (Pythagorean relation)
• Linear transformation: v′ = Av; if Av = λv, then
• v: an eigenvector of the transformation and
λ: the corresponding eigenvalue
2.2–1 Vector Space Concepts
• Gram–Schmidt procedure for constructing a set of orthonormal vectors from a set of n-dimensional vectors vi, 1 ≤ i ≤ m:
• First: u1 = v1 / ||v1||
• Then, for each subsequent vi, subtract its projections onto the previously constructed uj and normalize: ui′ = vi − Σ_j <vi, uj> uj, and ui = ui′ / ||ui′|| (discard ui′ if it is zero)
• By continuing this procedure, we construct a set of N orthonormal vectors, where N ≤ min(m, n).
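The procedure above can be sketched in a few lines. This is an illustrative implementation, not the book's code; the example vectors are made up, with the third deliberately dependent on the first two so that N < m.

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-10):
    """Gram-Schmidt: orthonormalize a set of n-dimensional vectors.

    Returns N <= min(m, n) orthonormal vectors; linearly dependent
    inputs leave a (near-)zero remainder and are discarded.
    """
    basis = []
    for v in vectors:
        u = np.asarray(v, dtype=float).copy()
        for e in basis:
            u = u - np.dot(u, e) * e    # remove component along e
        norm = np.linalg.norm(u)
        if norm > tol:                  # keep only independent remainders
            basis.append(u / norm)
    return basis

# Three 3-D vectors; the third is the sum of the first two, so N = 2
B = gram_schmidt([[1, 1, 0], [1, 0, 1], [2, 1, 1]])
```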
2.2–2 Signal Space Concepts
• Inner product: <x1(t), x2(t)> = ∫_{−∞}^{∞} x1(t) x2*(t) dt
• Orthogonal if their inner product is zero.
• Norm: ||x(t)|| = (∫ |x(t)|² dt)^{1/2} = √Ex, where Ex is the energy in x(t)
• Other properties as in a vector space:
(orthonormal, linearly independent, triangle inequality, and Cauchy–Schwarz inequality)
2.2–3 Orthogonal Expansions of Signals
• Suppose that s(t) is a deterministic signal with finite energy Es = ∫ |s(t)|² dt
• An orthonormal set of functions {φk(t), k = 1, …, K}: ∫ φn(t) φk*(t) dt = 1 for n = k, and 0 for n ≠ k
• Approximating s(t) by ŝ(t) = Σ_{k=1}^{K} sk φk(t)
• With approximation error e(t) = s(t) − ŝ(t)
2.2–3 Orthogonal Expansions of Signals
• Select the coefficients {sk} so as to minimize the energy Ee of the approximation error: Ee = ∫ |s(t) − Σ_k sk φk(t)|² dt
• The minimum of Ee with respect to the {sk} is obtained when the error is orthogonal to each of the functions in the series expansion: ∫ [s(t) − ŝ(t)] φn*(t) dt = 0, n = 1, …, K
• Or sk = <s(t), φk(t)> = ∫ s(t) φk*(t) dt
2.2–3 Orthogonal Expansions of Signals
• ŝ(t) = Σ_k sk φk(t) is the projection of s(t) onto the K-dimensional signal space spanned by the functions {φk(t)}; therefore <e(t), ŝ(t)> = 0
• The minimum mean-square approximation error is Emin = ∫ |e(t)|² dt = Es − Σ_{k=1}^{K} |sk|²
2.2–3 Orthogonal Expansions of Signals
• When the minimum mean-square approximation error Emin = 0: s(t) = Σ_{k=1}^{K} sk φk(t) (equality in the sense of zero-energy difference)
• and Es = Σ_{k=1}^{K} |sk|²
• When every finite-energy signal can be represented by a series expansion for which Emin = 0, the set of orthonormal functions {φk(t)} is said to be complete.
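The projection and the error formula Emin = Es − Σ|sk|² can be verified numerically. A minimal discrete-time sketch (inner products become sums with unit sample spacing; the signal and the two orthonormal "half-pulse" basis functions are made-up examples):

```python
import numpy as np

# Project s onto an orthonormal set {phi1, phi2} and check
# E_min = E_s - sum(|s_k|^2).
phi1 = np.r_[np.ones(4), np.zeros(4)] / 2.0    # ||phi1|| = 1
phi2 = np.r_[np.zeros(4), np.ones(4)] / 2.0    # ||phi2|| = 1, orthogonal to phi1
s = np.array([1., 2., 3., 4., 4., 3., 2., 1.])

s1, s2 = np.dot(s, phi1), np.dot(s, phi2)      # s_k = <s, phi_k>
s_hat = s1 * phi1 + s2 * phi2                  # projection of s
E_s = np.dot(s, s)                             # signal energy
E_min = np.dot(s - s_hat, s - s_hat)           # residual error energy
# E_min equals E_s - (s1**2 + s2**2)
```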
2.3 SOME USEFUL RANDOM VARIABLES
• The Bernoulli Random Variable: P(X = 1) = p, P(X = 0) = 1 − p; E[X] = p, VAR[X] = p(1 − p)
• The Binomial Random Variable: the sum of n iid Bernoulli(p) random variables; P(Y = k) = C(n, k) p^k (1 − p)^{n−k}, 0 ≤ k ≤ n; E[Y] = np, VAR[Y] = np(1 − p)
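The Bernoulli/Binomial relationship can be illustrated by simulation. A Monte Carlo sketch with made-up parameters n = 10, p = 0.3, so E[Y] = np = 3.0 and VAR[Y] = np(1 − p) = 2.1:

```python
import numpy as np

# Binomial(n, p) as a sum of n iid Bernoulli(p) trials.
rng = np.random.default_rng(3)
n, p, m = 10, 0.3, 100_000
bern = (rng.random((m, n)) < p).astype(int)   # m runs of n Bernoulli trials
Y = bern.sum(axis=1)                          # Binomial(n, p) samples
mean_est, var_est = Y.mean(), Y.var()
# mean_est ~ 3.0 and var_est ~ 2.1, matching n*p and n*p*(1-p)
```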
2.3 SOME USEFUL RANDOM VARIABLES
• The Uniform Random Variable on [a, b]: f(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise; E[X] = (a + b)/2, VAR[X] = (b − a)²/12
2.3 SOME USEFUL RANDOM VARIABLES
• The Gaussian (Normal) Random Variable: f(x) = (1/√(2πσ²)) exp(−(x − m)²/(2σ²)); E[X] = m, VAR[X] = σ²
2.3 SOME USEFUL RANDOM VARIABLES
• The Gaussian (Normal) Random Variable
• Height, birth weight, IQ, student grade,…
2.3 SOME USEFUL RANDOM VARIABLES
• The Gaussian (Normal) Random Variable
• The complementary error function: erfc(x) = (2/√π) ∫_{x}^{∞} e^{−t²} dt
• Gaussian tail probability: P(X > x) = Q((x − m)/σ), where Q(x) = ½ erfc(x/√2)
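The Q-function is available directly from the standard library's erfc. A short sketch:

```python
import math

# Gaussian tail probability via the complementary error function:
# Q(x) = P(N(0,1) > x) = 0.5 * erfc(x / sqrt(2)).
def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Q(0) = 0.5 by symmetry; Q(1) ~ 0.1587 (the familiar one-sigma tail)
```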
2.6 COMPLEX RANDOM VARIABLES
• Z = X + jY
• Complex random variable ↔ Two-dimensional random vector with components X and Y.
• The PDF of a zero-mean complex Gaussian random variable Z with iid real and imaginary parts, each N(0, σ²): f(z) = (1/(2πσ²)) exp(−|z|²/(2σ²))
• The mean and variance: E[Z] = E[X] + j E[Y] = 0 and VAR[Z] = E[|Z|²] − |E[Z]|² = 2σ²
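The variance relation VAR[Z] = 2σ² is easy to confirm by simulation. A Monte Carlo sketch with an assumed σ = 1.5:

```python
import numpy as np

# Z = X + jY with X, Y iid N(0, sigma^2): E[Z] = 0 and
# VAR[Z] = E[|Z|^2] = 2 * sigma^2.
rng = np.random.default_rng(1)
sigma, n = 1.5, 200_000
Z = sigma * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
var_est = np.mean(np.abs(Z) ** 2)       # ~ 2 * sigma^2 = 4.5
```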
2.6–1 Complex Random Vectors
• Z = X + jY, where X and Y are real-valued random vectors of size n.
• The covariance matrices of the real random vectors X and Y, respectively: C_X = E[(X − E[X])(X − E[X])ᵀ] and C_Y = E[(Y − E[Y])(Y − E[Y])ᵀ]
• Cross-covariance matrix: C_XY = E[(X − E[X])(Y − E[Y])ᵀ]
2.6–1 Complex Random Vectors
• Define the 2n-dimensional real vector Z̃ = [Xᵀ Yᵀ]ᵀ
• The PDF of the complex vector Z is the PDF of the real vector Z̃. It is clear that C_Z̃, the covariance matrix of Z̃, can be written in block form as
C_Z̃ = [ C_X  C_XY ; C_YX  C_Y ]
• The covariance and the pseudo-covariance of the complex random vector Z, respectively: C_Z = E[(Z − E[Z])(Z − E[Z])ᴴ] and C̃_Z = E[(Z − E[Z])(Z − E[Z])ᵀ]
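The two second-order quantities behave very differently for a vector with iid real and imaginary parts (a "proper" complex vector): the covariance is 2I while the pseudo-covariance vanishes. A Monte Carlo sketch with made-up dimensions:

```python
import numpy as np

# Empirical covariance E[Z Z^H] and pseudo-covariance E[Z Z^T] of a
# zero-mean complex vector with iid N(0,1) real and imaginary parts.
rng = np.random.default_rng(2)
n, m = 3, 100_000                        # vector dimension, sample count
Z = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
C_Z = (Z.T @ Z.conj()) / m               # covariance estimate, ~ 2*I
C_Zt = (Z.T @ Z) / m                     # pseudo-covariance estimate, ~ 0
```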
Exercises
• Orthogonality:
• 2.10, 2.11, 2.12
• Gaussian:
• 2.19, 2.32, 2.34, 2.35
