Problem 1.
a) Let $A$ be the matrix $A = \begin{pmatrix} \sqrt{2} & \sqrt{2} \\ 0 & \sqrt{3} \end{pmatrix}$. Find the singular values of $A$, and compute $\|A\|_2$.
Solution: We first compute $A^*A = \begin{pmatrix} 2 & 2 \\ 2 & 5 \end{pmatrix}$, and we find the eigenvalues of this to be $6$ and $1$. The singular values are thus $\sigma_1 = \sqrt{6}$ and $\sigma_2 = 1$. In particular $\|A\|_2 = \sigma_1 = \sqrt{6}$.
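The computation can be checked numerically; a small sketch in plain Python (standard library only):

```python
import math

# The matrix A from the problem.
A = [[math.sqrt(2), math.sqrt(2)],
     [0.0, math.sqrt(3)]]

# Form A^T A (A is real, so A^* = A^T).
AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]

# Eigenvalues of the symmetric 2x2 matrix [[a, b], [b, d]] are the roots of
# lambda^2 - (a + d) lambda + (a d - b^2) = 0.
a, b, d = AtA[0][0], AtA[0][1], AtA[1][1]
disc = math.sqrt((a + d) ** 2 - 4 * (a * d - b * b))
eigs = [(a + d + disc) / 2, (a + d - disc) / 2]  # 6 and 1

sigma = [math.sqrt(e) for e in eigs]  # singular values sqrt(6) and 1
norm2 = sigma[0]                      # ||A||_2 = sigma_1 = sqrt(6)
```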
b) Consider the matrix $A = \begin{pmatrix} 3 & \alpha \\ \alpha & 1 \end{pmatrix}$, where $\alpha$ is a real number. For which values of $\alpha$ is $A$ positive definite?
Solution: The eigenvalues of $A$ are the solutions to $(3-\lambda)(1-\lambda) - \alpha^2 = 0$, i.e. to $\lambda^2 - 4\lambda + 3 - \alpha^2 = 0$. The solutions to this are $\lambda = \frac{4 \pm \sqrt{16 - 4(3-\alpha^2)}}{2}$, and these values are both bigger than zero if and only if $\sqrt{16 - 4(3-\alpha^2)}$ is real and $< 4$, i.e. when $0 < 4(3-\alpha^2) \le 16$, i.e. when $-1 \le \alpha^2 < 3$, i.e. when $\alpha \in (-\sqrt{3}, \sqrt{3})$ (since $\alpha$ is real).
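Since the eigenvalues simplify to $2 \pm \sqrt{1+\alpha^2}$, the condition can be checked numerically; a small Python sketch (the function names are for illustration):

```python
import math

def eigenvalues(alpha):
    # Eigenvalues of [[3, alpha], [alpha, 1]]: 2 +/- sqrt(1 + alpha^2).
    r = math.sqrt(1 + alpha * alpha)
    return 2 + r, 2 - r

def is_positive_definite(alpha):
    # A symmetric matrix is positive definite iff all eigenvalues are > 0;
    # here 2 + r > 0 always, so the condition is 2 - r > 0, i.e. alpha^2 < 3.
    lam_max, lam_min = eigenvalues(alpha)
    return lam_min > 0
```

For example, `is_positive_definite(0.0)` is true (eigenvalues $3$ and $1$), while `is_positive_definite(2.0)` is false since $2^2 > 3$.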
c) We would like to fit the points $p_1 = (0,1)$, $p_2 = (1,0)$, $p_3 = (2,1)$ to a straight line in the plane. Find a line $p(t)$ which minimizes
$$\sum_{i=1}^{3} (p(x_i) - y_i)^2.$$
Denote the coefficient matrix by $A$, and the right hand side by $b$. We have that
$$A^T A = \begin{pmatrix} 3 & 3 \\ 3 & 5 \end{pmatrix}, \qquad A^T b = \begin{pmatrix} 2 \\ 2 \end{pmatrix}.$$
Solving the normal equations $A^T A c = A^T b$ gives $c_0 = 2/3$ and $c_1 = 0$, so the best line is the constant line $p(t) = 2/3$.
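With $p(t) = c_0 + c_1 t$ (a parametrization consistent with the matrices above, where the rows of $A$ are $(1, x_i)$), the normal equations can be set up and solved in a short Python sketch:

```python
# Least squares fit of p(t) = c0 + c1*t to the points (0,1), (1,0), (2,1).
xs = [0.0, 1.0, 2.0]
ys = [1.0, 0.0, 1.0]

# Coefficient matrix A with rows [1, x_i]; right hand side b = ys.
A = [[1.0, x] for x in xs]

# Normal equations A^T A c = A^T b (a 2x2 system).
AtA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
Atb = [sum(A[k][i] * ys[k] for k in range(3)) for i in range(2)]

# Solve the 2x2 system by Cramer's rule.
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
c0 = (Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / det  # 2/3
c1 = (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / det  # 0
```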
c) Explain how one can find plane rotations of the form $P_{i_1,n+1}$, $P_{i_2,n+1}, \ldots, P_{i_n,n+1}$ so that
$$P_{i_1,n+1} P_{i_2,n+1} \cdots P_{i_n,n+1} \begin{pmatrix} L^* \\ z^* \end{pmatrix} = \begin{pmatrix} R' \\ 0 \end{pmatrix}, \qquad (1)$$
with $R'$ upper triangular, and explain how to obtain a QR-decomposition of $\begin{pmatrix} L^* \\ z^* \end{pmatrix}$ from this. In particular you should write down the numbers $i_1, \ldots, i_n$. Is it possible to choose the plane rotations so that $R'$ in (1) also has positive diagonal entries?
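Finding the indices $i_1, \ldots, i_n$ is part of the exercise, but the basic mechanism of a plane (Givens) rotation, zeroing one entry against another, can be sketched as follows (the helper `givens` and the sample numbers are illustrative, not part of the exam):

```python
import math

def givens(a, b):
    # Rotation coefficients (c, s) with  c*a + s*b = r = hypot(a, b)
    # and -s*a + c*b = 0: the plane rotation zeroing b against a.
    r = math.hypot(a, b)
    if r == 0.0:
        return 1.0, 0.0
    return a / r, b / r

# Example: zeroing the entry 3 against 4; the rotated pair becomes (5, 0).
c, s = givens(4.0, 3.0)
top = c * 4.0 + s * 3.0      # hypot(4, 3) = 5
bottom = -s * 4.0 + c * 3.0  # 0 (up to rounding)
```

In equation (1), a rotation $P_{i,n+1}$ applies such a $2\times 2$ rotation to rows $i$ and $n+1$, leaving all other rows unchanged.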
Solution: Assume that $\{w_i\}_i$ is a basis for $W$. The first equation is satisfied if and only if
$$w_i^T A^T A \hat{x} = w_i^T A^T b$$
for all $i$. Any $\hat{x} \in W$ can be written uniquely as $\sum_j c_j w_j$, and inserting this gives $\sum_j w_i^T A^T A w_j c_j = w_i^T A^T b$. With $W$ the matrix with the $w_i$ as columns, this can be written
$$W^T A^T A W c = W^T A^T b. \qquad (2)$$
The matrix on the left hand side has entries $\langle A w_i, A w_j \rangle$, and we proved in a) that such a matrix is positive definite (the $A w_i$ are linearly independent whenever the $w_i$ are, since $A$ is nonsingular), and thus nonsingular. It follows that the $c_j$, and hence $\hat{x}$, are unique.
Equation (2) is also the normal equations for the least squares problem $\min \|AWc - b\|$. Thus, $c$ is the least squares solution to $AWc = b$, so that $\hat{x} = Wc$ is the least squares solution from $W$ to $A\hat{x} = b$. This is equivalent to the second statement. The least squares view also gives an alternative proof of the first statement: $AW$ has linearly independent columns, which ensures uniqueness of the least squares solution.
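The argument can be checked on a small example; the matrices below are illustrative choices, not from the exam. The sketch solves $W^T A^T A W c = W^T A^T b$ and verifies that the residual of $AWc = b$ is orthogonal to the columns of $AW$, which characterizes the least squares solution:

```python
# Illustrative example: A nonsingular, W with linearly independent columns.
A = [[2.0, 1.0, 0.0],
     [0.0, 1.0, 1.0],
     [1.0, 0.0, 3.0]]
W = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]
b = [1.0, 2.0, 4.0]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

M = matmul(A, W)  # M = A W; its columns are the vectors A w_i

# Normal equations M^T M c = M^T b, here a 2x2 system.
MtM = [[sum(M[k][i] * M[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
Mtb = [sum(M[k][i] * b[k] for k in range(3)) for i in range(2)]

# Solve by Cramer's rule.
det = MtM[0][0] * MtM[1][1] - MtM[0][1] * MtM[1][0]
c = [(Mtb[0] * MtM[1][1] - MtM[0][1] * Mtb[1]) / det,
     (MtM[0][0] * Mtb[1] - Mtb[0] * MtM[1][0]) / det]

# The residual r = M c - b should be orthogonal to the columns of M.
Mc = [sum(M[i][j] * c[j] for j in range(2)) for i in range(3)]
r = [Mc[i] - b[i] for i in range(3)]
orth = [sum(M[k][i] * r[k] for k in range(3)) for i in range(2)]
```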
c) In the rest of this problem we consider the situation above, but where the vector space $W$ is taken to be one of the Krylov spaces $W_k$. Use this to suggest how the search vectors $p_k$ can be computed recursively.
Solution: It follows from the definitions of the spaces $W_k$ and the properties of the vectors $p_j$ that $Ap_{k-1} \in W_{k+1}$. As a consequence, since $\{p_j\}_{j=0}^{k}$ is an orthonormal basis for $W_{k+1}$ with respect to $\langle \cdot, \cdot \rangle_A$, we must have that
$$Ap_{k-1} = \sum_{j=0}^{k} \langle Ap_{k-1}, p_j \rangle_A \, p_j = \sum_{j=0}^{k} \langle p_{k-1}, Ap_j \rangle_A \, p_j,$$
where we have used the symmetry of $A$ with respect to $\langle \cdot, \cdot \rangle_A$. However, for $j < k-2$ we have $Ap_j \in W_{k-1}$, and as a consequence $\langle p_{k-1}, Ap_j \rangle_A = 0$. We therefore obtain that
$$Ap_{k-1} = \sum_{j=k-2}^{k} \langle p_{k-1}, Ap_j \rangle_A \, p_j,$$
so that
$$\langle p_{k-1}, Ap_k \rangle_A \, p_k = Ap_{k-1} - \langle p_{k-1}, Ap_{k-2} \rangle_A \, p_{k-2} - \langle p_{k-1}, Ap_{k-1} \rangle_A \, p_{k-1}.$$
The right hand side now gives a vector $q$ with the same direction as $p_k$. Finally we compute $\|q\|_A$, and define $p_k = q / \|q\|_A$.
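The recursion can be sketched in Python (the symmetric positive definite matrix and starting vector below are illustrative choices; `inner_A` implements $\langle x, y \rangle_A = x^T A y$):

```python
import math

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(len(v))) for i in range(len(A))]

def inner_A(A, x, y):
    # <x, y>_A = x^T A y
    Ay = matvec(A, y)
    return sum(x[i] * Ay[i] for i in range(len(x)))

def search_vectors(A, b, m):
    # p_0 = b / ||b||_A; then, for k >= 1,
    # q   = A p_{k-1} - <p_{k-1}, A p_{k-2}>_A p_{k-2}
    #                 - <p_{k-1}, A p_{k-1}>_A p_{k-1},
    # p_k = q / ||q||_A   (for k = 1 only the p_0 term is subtracted).
    nb = math.sqrt(inner_A(A, b, b))
    ps = [[x / nb for x in b]]
    for k in range(1, m):
        q = matvec(A, ps[k - 1])
        for j in range(max(0, k - 2), k):
            coeff = inner_A(A, ps[k - 1], matvec(A, ps[j]))
            q = [q[i] - coeff * ps[j][i] for i in range(len(q))]
        nq = math.sqrt(inner_A(A, q, q))
        ps.append([x / nq for x in q])
    return ps

# Illustrative symmetric positive definite matrix and starting vector.
A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
ps = search_vectors(A, [1.0, 0.0, 0.0], 3)

# The Gram matrix of the p_k in the A-inner product should be the identity.
gram = [[inner_A(A, ps[i], ps[j]) for j in range(3)] for i in range(3)]
```

Note that only the two previous vectors $p_{k-2}$, $p_{k-1}$ enter each step, exactly as the derivation above shows.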
Good luck!