Background/Random Processes
• Eigenvalues and Eigenvectors
• Optimization Theory
• Ensemble Averages, Jointly Distributed RVs, Joint Moments
Eigenvalues and Eigenvectors
• Let A be an n × n matrix and consider the following set of homogeneous linear equations:
(A − λI)v = 0, i.e., Av = λv
where λ is a constant
• In order for a nonzero solution vector v to exist, the matrix (A − λI) must be singular => det(A − λI) must be zero
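As a quick numerical sketch of these two facts (the matrix below is an illustrative choice, not from the text), we can compute the eigenpairs of a small matrix and check that det(A − λI) vanishes at each eigenvalue and that each eigenvector satisfies Av = λv:

```python
import numpy as np

# Illustrative 2x2 matrix (not from the text)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)

for lam, v in zip(eigvals, eigvecs.T):
    # at an eigenvalue, (A - lam*I) is singular, so its determinant vanishes
    assert abs(np.linalg.det(A - lam * np.eye(2))) < 1e-9
    # and the corresponding eigenvector solves A v = lam v
    assert np.allclose(A @ v, lam * v)

print(sorted(eigvals.real))
```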
• Characteristic polynomial of A: the nth-order polynomial p(λ) = det(A − λI) in λ
• Eigenvalues of A: the n roots λi of p(λ) = 0
• For each eigenvalue λi, the matrix (A − λiI) will be singular and there will be at least one nonzero vector vi that solves Eq. (2.44); vi is called an eigenvector of A
• If A is singular, then det(A) = 0 and λ = 0 is an eigenvalue of A
• Furthermore, there will be k = n − r(A) linearly independent solutions to Eq. (2.48) => A will have
• r(A) nonzero eigenvalues
• n − r(A) zero eigenvalues
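A small sketch of the rank/eigenvalue count, using an illustrative rank-1 matrix A = a aᵀ (so n = 3 and r(A) = 1):

```python
import numpy as np

# Rank-1 example: A = a a^T, so n = 3 and r(A) = 1 (illustrative data)
a = np.array([1.0, 2.0, 3.0])
A = np.outer(a, a)

eigvals = np.linalg.eigvals(A)
rank = np.linalg.matrix_rank(A)
n_zero = int(np.sum(np.isclose(eigvals, 0.0)))

# r(A) nonzero eigenvalues and n - r(A) zero eigenvalues
print(rank, n_zero)   # 1 2
```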
Optimization Theory
• Minimization (or maximization) of a function of one or more variables
• Simplest form: finding the minimum of a scalar function f(x) of a single real variable x
• Assuming the objective function (OF) f(x) is differentiable, the stationary points of f(x), i.e., the local and global minima, must satisfy the condition df(x)/dx = 0 (Eq. 2.67)
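A minimal sketch of the stationary-point condition, using an assumed convex objective f(x) = (x − 2)² + 1 (an illustrative choice, not from the text):

```python
# Illustrative strictly convex objective: f(x) = (x - 2)^2 + 1
def f(x):
    return (x - 2.0) ** 2 + 1.0

def dfdx(x):
    return 2.0 * (x - 2.0)   # analytic derivative of f

x_star = 2.0                 # root of dfdx(x) = 0
print(dfdx(x_star))          # 0.0 -> x_star is a stationary point
print(f(x_star))             # 1.0, the global minimum since f is strictly convex
```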
• f(x) a strictly convex function: only one solution to Eq. (2.67), which is the global minimum of f(x)
• f(x) not convex: each stationary point must be checked to see whether or not it is the global minimum
• When the OF f(z) is a function of a complex variable z and is differentiable (analytic), the stationary points of f(z) are found as in the real case
• What if f(z) is not differentiable?
• Example:
Setting both derivatives to zero and solving the pair of equations, we obtain the solution z = 0
• Therefore, for an OF of this form, we can set either Eq. (2.69) or (2.70) to zero and solve for z
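The approach above can be sketched numerically. Here f(z) = |z|² = x² + y² is our assumed example of an OF that is not differentiable (analytic) in z; writing z = x + jy, we minimize over the two real variables instead:

```python
# f(z) = |z|^2 = x^2 + y^2, treated as a function of the real pair (x, y)
def df_dx(x, y):
    return 2.0 * x   # d/dx (x^2 + y^2)

def df_dy(x, y):
    return 2.0 * y   # d/dy (x^2 + y^2)

# Setting both partials to zero gives x = y = 0, i.e. z = 0,
# matching the solution quoted in the text
x0, y0 = 0.0, 0.0
print(df_dx(x0, y0), df_dy(x0, y0))   # 0.0 0.0
print(abs(complex(x0, y0)) ** 2)      # 0.0, the minimum of |z|^2
```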
• How do we find the minimum of a function of two or more variables?
• For a scalar function f(x) of n variables x = [x1, x2, …, xn]T, the stationary points must satisfy the vector condition ∇f(x) = 0 (Eq. 2.71)
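A sketch of the vector condition on an assumed strictly convex quadratic f(x) = xᵀAx − 2bᵀx (A symmetric positive definite; A and b are illustrative), where ∇f(x) = 2(Ax − b), so the unique stationary point solves Ax = b:

```python
import numpy as np

# Illustrative symmetric positive-definite A and vector b
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

x_star = np.linalg.solve(A, b)      # sets the gradient 2(Ax - b) to zero
grad = 2.0 * (A @ x_star - b)

print(np.allclose(grad, 0.0))       # True: x_star is the stationary point
```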
• f(x) strictly convex: the solution to Eq. (2.71) is unique and is the global minimum of f(x)
• For a function of complex vectors, f(z, z*): treat z and z* as independent variables; the stationary points may then be obtained using the corresponding theorem for complex gradients
• Example of a constrained minimization problem (encountered in array processing):
min_z zH Rz   s.t.   zH a = 1
• Solution: introduce a Lagrange multiplier λ and minimize the unconstrained objective function
Substituting Eq. (2.74) into (2.75)
Substituting this vector z into the quadratic form (main OF),
we obtain the minimum value of the quadratic form
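A numerical sketch of this constrained problem, using the standard Lagrange-multiplier closed form z = R⁻¹a / (aH R⁻¹a) with minimum value 1/(aH R⁻¹a); the matrix R and vector a below are illustrative placeholders, not values from the text:

```python
import numpy as np

# Illustrative Hermitian positive-definite R and constraint vector a
R = np.array([[2.0, 0.5],
              [0.5, 1.0]])
a = np.array([1.0, 1.0])

Ri_a = np.linalg.solve(R, a)        # R^{-1} a
denom = a.conj() @ Ri_a             # a^H R^{-1} a
z = Ri_a / denom                    # minimizing vector

print(np.isclose(z.conj() @ a, 1.0))              # True: constraint zH a = 1 holds
print(np.isclose(z.conj() @ R @ z, 1.0 / denom))  # True: minimum is 1/(aH R^{-1} a)
```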
Random Processes
Ensemble Averages
• Mean, or expected value
• The expected value of a discrete random variable (RV) x that assumes the value αk with probability Pr{x = αk} is
E{x} = Σk αk Pr{x = αk}
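A short sketch of the discrete expectation, with illustrative values and probabilities, checked against the average of many simulated draws:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative discrete RV: values alpha_k with probabilities p_k
alpha = np.array([1.0, 2.0, 5.0])
p = np.array([0.5, 0.3, 0.2])

E_x = np.sum(alpha * p)      # E{x} = sum_k alpha_k * Pr{x = alpha_k}
print(E_x)                   # 2.1

# The average of many independent draws approaches the expected value
samples = rng.choice(alpha, size=100_000, p=p)
print(abs(samples.mean() - E_x) < 0.05)   # True (with overwhelming probability)
```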
• For an RV x with pdf fx(α), if y = g(x), then
E{y} = E{g(x)} = ∫ g(α) fx(α) dα
• Variance of RV x: Var{x} = E{|x − mx|²}, where mx = E{x}
• Expectation is a linear operator: E{ax + by} = aE{x} + bE{y}
• When E{x} = 0, Var{x} = E{|x|²}
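Both properties can be sketched with sample averages as a Monte Carlo stand-in for the ensemble averages (the distributions and coefficients are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative RVs and coefficients
x = rng.normal(2.0, 1.0, size=100_000)
y = rng.normal(-1.0, 3.0, size=100_000)
a_c, b_c = 3.0, 2.0

# Linearity: E{a x + b y} = a E{x} + b E{y} (exact for sample means, too)
lhs = np.mean(a_c * x + b_c * y)
rhs = a_c * np.mean(x) + b_c * np.mean(y)
print(np.isclose(lhs, rhs))                               # True

# When E{x} = 0, Var{x} = E{|x|^2}: shown on (sample-)centered data
x0 = x - x.mean()
print(np.isclose(np.var(x0), np.mean(np.abs(x0) ** 2)))   # True
```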
Jointly Distributed RVs
• Joint distribution function for two RVs x(1) and x(2):
F(α1, α2) = Pr{x(1) ≤ α1, x(2) ≤ α2}
• Statistical characterization of complex RVs: if z = x + jy is a complex RV and α = a + jb is a complex number, then the distribution function for z is the joint distribution function
Fz(α) = Pr{x ≤ a, y ≤ b}
Joint Moments
• Correlation: the second-order joint moment
rxy = E{x y*}
• Covariance
cxy = E{(x − mx)(y − my)*} = rxy − mx my*
• Correlation coefficient: ρxy = cxy / (σx σy)
• Due to this normalization, the correlation coefficient is bounded by one in magnitude: |ρxy| ≤ 1
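A final sketch estimating the covariance and correlation coefficient from samples; the construction y = 0.8x + 0.6n (with n independent noise) is an illustrative choice giving a design correlation of 0.8:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two correlated RVs (illustrative construction): y = 0.8 x + 0.6 n
x = rng.normal(size=50_000)
y = 0.8 * x + 0.6 * rng.normal(size=50_000)

c_xy = np.mean((x - x.mean()) * (y - y.mean()))   # sample covariance
rho = c_xy / (x.std() * y.std())                  # correlation coefficient

print(abs(rho) <= 1.0)   # True: the normalization bounds |rho| by one
print(rho)               # close to the design value 0.8 = 0.8/sqrt(0.64 + 0.36)
```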