Answer : False. Let $A = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$; then $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ is an eigenvector of eigenvalue 1 and $\begin{pmatrix} 0 \\ 1 \end{pmatrix}$ is an eigenvector of eigenvalue 2, but their sum $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$ is not an eigenvector of $A$.
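Concretely (a one-line check added for illustration): $A \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 2 \end{pmatrix}$, which is not a scalar multiple of $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$.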
(15) If A is similar to a diagonalizable matrix B, then A is also diagonalizable.
Answer : True. Since $B$ is diagonalizable, there exists an invertible matrix $P$ and a diagonal matrix $D$ such that $B = PDP^{-1}$. Since $A$ is similar to $B$, there exists an invertible matrix $Q$ such that $A = QBQ^{-1}$. Then $A = QPDP^{-1}Q^{-1} = (QP)D(QP)^{-1}$. Thus $A$ is diagonalizable.
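As a concrete sanity check (the matrices below are chosen only for illustration): take $B = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$ (already diagonal, so $P = I$ and $D = B$) and $Q = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$. Then
$$A = QBQ^{-1} = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix} \begin{pmatrix} 1 & -1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix},$$
and $A = (QP)D(QP)^{-1}$ is an explicit diagonalization of $A$.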
(16) The matrix $\begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix}$ is similar to $\begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$.
Answer : False. If $\begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix}$ were similar to $\begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$, then it would be diagonalizable (since $\begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$ is a diagonal matrix). However, $\begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix}$ is not diagonalizable because the multiplicity of the eigenvalue 1 is 2, while $\dim E_1 = 1$.
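To spell out the last claim: the characteristic polynomial of $\begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix}$ is $(t-1)^2$, so the eigenvalue 1 has multiplicity 2, while $E_1 = \operatorname{Nul}\begin{pmatrix} 0 & 2 \\ 0 & 0 \end{pmatrix} = \operatorname{Span}\left\{\begin{pmatrix} 1 \\ 0 \end{pmatrix}\right\}$ has dimension 1.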
(17) There exists a 3 × 3 matrix A with real entries such that the eigenvalues of
A are 1, 2 + 3i, 4 − i.
Answer : False. The characteristic polynomial of a matrix with real coefficients is a polynomial with real coefficients, hence its roots are either real or come in conjugate pairs; here $2 + 3i$ would be an eigenvalue without its conjugate $2 - 3i$.
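Concretely, such a matrix would have characteristic polynomial $(t - 1)(t - (2 + 3i))(t - (4 - i))$, in which the coefficient of $t^2$ is $-(1 + (2 + 3i) + (4 - i)) = -(7 + 2i)$, which is not real.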
$$\operatorname{proj}_W b = \frac{b \cdot v_1}{v_1 \cdot v_1}\, v_1 + \cdots + \frac{b \cdot v_m}{v_m \cdot v_m}\, v_m.$$
Thus we have
$$\operatorname{proj}_V(\operatorname{proj}_W b) = \frac{(\operatorname{proj}_W b) \cdot v_1}{v_1 \cdot v_1}\, v_1 + \cdots + \frac{(\operatorname{proj}_W b) \cdot v_\ell}{v_\ell \cdot v_\ell}\, v_\ell = \frac{b \cdot v_1}{v_1 \cdot v_1}\, v_1 + \cdots + \frac{b \cdot v_\ell}{v_\ell \cdot v_\ell}\, v_\ell = \operatorname{proj}_V b,$$
where the middle equality holds because $b - \operatorname{proj}_W b$ is orthogonal to $W$, hence to each $v_i$, so $(\operatorname{proj}_W b) \cdot v_i = b \cdot v_i$.
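For a concrete instance (the vectors here are chosen only for illustration): in $\mathbb{R}^3$ take $v_1 = (1,0,0)$, $v_2 = (0,1,0)$, $V = \operatorname{Span}\{v_1\}$, $W = \operatorname{Span}\{v_1, v_2\}$, and $b = (1,2,3)$. Then $\operatorname{proj}_W b = (1,2,0)$ and $\operatorname{proj}_V(\operatorname{proj}_W b) = (1,0,0) = \operatorname{proj}_V b$.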
(14) The set of all vectors in Rn orthogonal to one fixed vector is a subspace of
Rn .
Answer : True. If $u \cdot w = v \cdot w = 0$, then $(cu + dv) \cdot w = c(u \cdot w) + d(v \cdot w) = 0$, so the set contains the zero vector and is closed under addition and scalar multiplication.
(15) If W is a subspace of Rn , then W and W ⊥ have no vectors in common.
Answer : False. They have (only) the zero vector in common: if $v \in W \cap W^\perp$, then $v \cdot v = 0$, so $v = 0$.
(16) Let 0 be the zero vector in Rn . Then {0}⊥ = Rn .
Answer : True. Every vector is orthogonal to the zero vector.
(17) Let W be a subspace of Rn . Then dim W + dim W ⊥ = n.
Answer : True. Let $W = \operatorname{Span}\{u_1, \ldots, u_m\}$ and $A = [u_1 \cdots u_m]$. Then $\dim W = \dim \operatorname{Col} A = \dim \operatorname{Row} A = \dim \operatorname{Col} A^T$. On the other hand, $W^\perp = (\operatorname{Col} A)^\perp = \operatorname{Nul} A^T$. Thus $\dim W + \dim W^\perp = \dim \operatorname{Col} A^T + \dim \operatorname{Nul} A^T = n$ (where the last equality follows from the Rank-Nullity Theorem applied to $A^T$).
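For example (illustration only): with $W = \operatorname{Span}\{(1,0,0), (0,1,0)\}$ in $\mathbb{R}^3$ we have $W^\perp = \operatorname{Span}\{(0,0,1)\}$, and indeed $\dim W + \dim W^\perp = 2 + 1 = 3$.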
(18) If {v1 , v2 , v3 } is an orthogonal set and if c1 , c2 , c3 are scalars, then {c1 v1 , c2 v2 , c3 v3 }
is an orthogonal set.
Answer : True. The assumption implies that $v_i \cdot v_j = 0$ if $i \neq j$. Thus $(c_i v_i) \cdot (c_j v_j) = c_i c_j (v_i \cdot v_j) = 0$ if $i \neq j$.
(19) Let $A$ be an $m \times n$ matrix such that the $i$th column is $u_i \in \mathbb{R}^m$. Then the $(i,j)$th entry in $A^TA$ is $u_i \cdot u_j$.
Answer : True. The $(i,j)$th entry of $A^TA$ is (row $i$ of $A^T$) times (column $j$ of $A$), which is $u_i^T u_j = u_i \cdot u_j$.
(20) If an $m \times n$ matrix $U$ has orthogonal columns, then necessarily $m \geq n$.
Answer : False. Consider $A = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$ or $A = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$; each is $2 \times 3$ with pairwise orthogonal columns. (By Theorem 4 of Section 6.2, the existence of a column of zeros is the only way a matrix with orthogonal columns may have more columns than rows.)
(21) If an m × n matrix U has orthonormal columns, then necessarily m ≥ n.
Answer : True. Suppose that U = [u1 · · · un ]. Each vector ui is nonzero
because it has norm 1. Then by Theorem 4 in Section 6.2, {u1 , . . . , un } is
a linearly independent set (of vectors in Rm ). Since Rm has dimension m,
we must have m ≥ n.
(22) If an $m \times n$ matrix $U$ has orthonormal columns, then $U^TU = I_n$.
Answer : True. Writing $U = [u_1 \cdots u_n]$, the $(i,j)$th entry of $U^TU$ is $u_i \cdot u_j$ (see (19)), which is 1 if $i = j$ and 0 otherwise; these are exactly the entries of $I_n$.
(23) If an $m \times n$ matrix $U$ has orthonormal columns, then $UU^T = I_m$.
Answer : False. For instance, $U = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ has orthonormal columns and $U^TU = I_1$, but $UU^T = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \neq I_2$. (When $m = n$, the statement is true; see (26).)
(24) A square matrix with orthogonal columns is an orthogonal matrix.
Answer : False. The columns must also be unit vectors, i.e. the columns must be orthonormal. For example, $2I_2$ has orthogonal columns but is not an orthogonal matrix.
(25) If a square matrix has orthogonal columns, then it also has orthogonal rows.
Answer : False. Consider $A = \begin{pmatrix} 0 & 1 \\ 0 & 1 \end{pmatrix}$: its columns are orthogonal (one of them is the zero vector), but its rows are equal and nonzero, hence not orthogonal.
(26) If a square matrix has orthonormal columns, then it also has orthonormal
rows.
Answer : True. If a matrix $A$ has orthonormal columns, then $A^TA = I$. By the IMT, $A$ is invertible and its inverse is $A^T$. Thus $AA^T = I$, which says exactly that the rows of $A$ are orthonormal.
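As a quick sanity check (example chosen for illustration): $A = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}$ has orthonormal columns, and indeed $AA^T = \frac{1}{2} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix} = I_2$, so its rows are orthonormal as well.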
(27) If $W$ is a subspace of $\mathbb{R}^n$, then $\|\operatorname{proj}_W v\|^2 + \|v - \operatorname{proj}_W v\|^2 = \|v\|^2$.
Answer : True. This is the Pythagorean Theorem, applied to the orthogonal vectors $\operatorname{proj}_W v \in W$ and $v - \operatorname{proj}_W v \in W^\perp$.
(28) If $\hat{x}$ is a least squares solution to $Ax = b$, then $A\hat{x} \neq b$.
Answer : False. If $b \in \operatorname{Col} A$, then $A\hat{x} = \operatorname{proj}_{\operatorname{Col} A}\, b = b$.
(29) If x̂ is a least squares solution to Ax = b, then Ax̂ ∈ Col A.
Answer : True. For any vector x, we have Ax ∈ Col A.
(30) If A is an orthogonal matrix, then A is diagonalizable.
Answer : False. Some orthogonal matrices with real coefficients, like $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$, have complex eigenvalues, hence are not diagonalizable with real coefficients (i.e. there does not exist an invertible matrix $P$ with real coefficients and a diagonal matrix $D$ with real coefficients such that $A = PDP^{-1}$).
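Indeed, the characteristic polynomial of this $A$ is $t^2 + 1$, which has no real roots, so $A$ has no real eigenvalues at all.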
If all the eigenvalues of an orthogonal matrix A are real, then we can
use Exercise 16 on page 337 (combined with the fact that any orthogonal
upper-triangular matrix is diagonal) to show that A is diagonalizable with
real coefficients.
The same exercise also shows that any unitary matrix $A$ (i.e. satisfying $\overline{A}^T A = I_n$) can be diagonalized with complex coefficients.
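For instance, the matrix $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ above has eigenvalues $\pm i$ with eigenvectors $\begin{pmatrix} i \\ 1 \end{pmatrix}$ and $\begin{pmatrix} -i \\ 1 \end{pmatrix}$, so over $\mathbb{C}$ it is diagonalized as $A = PDP^{-1}$ with $P = \begin{pmatrix} i & -i \\ 1 & 1 \end{pmatrix}$ and $D = \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}$.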