Practice Questions Math 3273
1 Practice Questions for final exam
1.1 Invertible matrices, subspaces, dimension, etc.
1. Let
$$A = \begin{pmatrix}
0 & 3 & 4 & 1 & -13 & -21 \\
0 & -1 & 2 & -1 & -9 & -9 \\
0 & 1 & 0 & -2 & -12 & -11 \\
0 & 1 & 2 & 2 & 2 & -3
\end{pmatrix}$$
and
$$B = \begin{pmatrix}
0 & 3 & 4 & 1 & -13 & -21 & 1 & 0 & 0 & 0 \\
0 & -1 & 2 & -1 & -9 & -9 & 0 & 1 & 0 & 0 \\
0 & 1 & 0 & -2 & -12 & -11 & 0 & 0 & 1 & 0 \\
0 & 1 & 2 & 2 & 2 & -3 & 0 & 0 & 0 & 1
\end{pmatrix}$$
You are given that the rref of B is:
$$S = \begin{pmatrix}
0 & 1 & 0 & 0 & -2 & -3 & \frac{2}{13} & -\frac{4}{13} & \frac{3}{13} & 0 \\
0 & 0 & 1 & 0 & -3 & -4 & \frac{3}{26} & \frac{7}{26} & -\frac{1}{13} & 0 \\
0 & 0 & 0 & 1 & 5 & 4 & \frac{1}{13} & -\frac{2}{13} & -\frac{5}{13} & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & -\frac{7}{13} & \frac{1}{13} & \frac{9}{13} & 1
\end{pmatrix}$$
(a) Give a matrix P such that A = P R where R is the rref of A.
(b) Write down a basis for the nullspace, rowspace, columnspace and left nullspace of A.
(c) Find the set of solutions to the equation
$$Ax = \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}$$
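Parts (a) and (b) of problem 1 can be sanity-checked with sympy (a sketch, not a substitute for the hand computation): row-reducing $B = [A \mid I]$ gives $[R \mid E]$ with $EA = R$, so $P = E^{-1}$ satisfies $A = PR$. Note that sympy may normalize the last row of the augmented block differently from the $S$ printed above, but $EA = R$ still holds.

```python
from sympy import Matrix, eye

A = Matrix([
    [0,  3, 4,  1, -13, -21],
    [0, -1, 2, -1,  -9,  -9],
    [0,  1, 0, -2, -12, -11],
    [0,  1, 2,  2,   2,  -3],
])
B = A.row_join(eye(4))          # B = [A | I], as in the problem
Srref, pivots = B.rref()        # Srref = [R | E] with E*A = R
R, E = Srref[:, :6], Srref[:, 6:]
P = E.inv()                     # E is a product of elementary matrices, so invertible
assert E * A == R
assert P * R == A               # A = P R as required in (a)
```

The pivot columns of R identify a basis of the column space of A, and the nonzero rows of R give a basis of the row space, which is the starting point for (b).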
2. (a) Let U, W be subspaces of a vector space V . Prove that U ∩ W and U + W are both subspaces of V .
(b) Let {Uα }α∈I be a collection of subspaces of V . Prove that ∩α∈I Uα is a subspace of V .
3. Let V be a finite dimensional vector space. Assume that for every subspace U ⊆ V and linearly
independent vectors u1 , . . . , uk ∈ U , there exists a basis u1 , u2 , . . . , uk , uk+1 , . . . , uN of U .
(a) Prove that for any two subspaces U, W ⊆ V we have
dim(U ) + dim(W ) = dim(U ∩ W ) + dim(U + W ).
(b) Prove that for any subspace U ⊆ V there exists a subspace W ⊆ V such that U + W = V and
$U \cap W = \{0\}$.
4. Let V = R[x] be the vector space consisting of polynomials with real coefficients. Let O = {f ∈
V | f (−x) = −f (x)} and E = {f ∈ V | f (−x) = f (x)}. Assume that O and E are subspaces of V .
Prove that O ⊕ E = V .
5. (a) Suppose A is an n-by-n matrix with real entries such that for all b ∈ Rn , there exists x ∈ Rn such
that Ax = b.
Prove that there exists a matrix B such that AB = In where In is the n-by-n identity matrix.
(b) Suppose A and B are n-by-n matrices with real entries such that AB = In .
Suppose that $y^T A = 0^T$.
Prove that y = 0.
1.2 Block matrices
6. Let $X = \begin{pmatrix} A & B \\ 0 & C \end{pmatrix}$ be a 2-by-2 block matrix.
(a) Suppose A is a 3-by-3 matrix and C is a 4-by-1 matrix.
What are the dimensions of B and the zero matrix occurring in X?
(b) Assume that A and C are square invertible matrices.
Find the inverse of X in terms of A, B, C.
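The standard candidate for 6(b) is $X^{-1} = \begin{pmatrix} A^{-1} & -A^{-1}BC^{-1} \\ 0 & C^{-1} \end{pmatrix}$; a numerical sanity check on random blocks (a sketch, not the requested derivation):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # random square blocks are invertible
B = rng.standard_normal((3, 2))   # with probability 1
C = rng.standard_normal((2, 2))

X = np.block([[A, B], [np.zeros((2, 3)), C]])
Ai, Ci = np.linalg.inv(A), np.linalg.inv(C)
Xinv = np.block([[Ai, -Ai @ B @ Ci], [np.zeros((2, 3)), Ci]])

assert np.allclose(X @ Xinv, np.eye(5))
assert np.allclose(Xinv @ X, np.eye(5))
```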
7. Let S : V → V be a linear transformation such that S² = S. Let T = I − S. Let α, β ∈ k be scalars.
Prove that if αβ ≠ 0 then αS + βT is invertible.
8. Let v1 , v2 , v3 , v4 , v5 be a basis for a vector space V . Suppose that
T (v1 ) = v2
T (v2 ) = v3
T (v3 ) = v4
T (v4 ) = v5
T (v5 ) = 0
Find the matrix of T relative to the basis B = v1 , . . . , v5 . Find the kernel of T .
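A numerical companion to problem 8 (the coordinate matrix, not the proof): column $j$ of the matrix of T holds the coordinates of $T(v_j)$ in the basis B, so the matrix has 1s on the subdiagonal and a zero last column.

```python
import numpy as np

# Matrix of T relative to B: T(v1)=v2 puts a 1 in position (2,1), etc.,
# and T(v5)=0 makes the last column zero.
M = np.eye(5, k=-1)

assert np.allclose(np.linalg.matrix_power(M, 5), 0)  # T is nilpotent
e5 = np.zeros(5); e5[4] = 1
assert np.allclose(M @ e5, 0)   # the kernel contains (the coordinates of) v5
```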
9. (a) Prove using block matrices that if $X = \begin{pmatrix} A & B \\ 0 & C \end{pmatrix}$ then
$$X^n = \begin{pmatrix} A^n & B' \\ 0 & C^n \end{pmatrix}$$
for some matrix $B'$.
(b) Assume that for any polynomial f and any block matrix $X = \begin{pmatrix} A & B \\ 0 & C \end{pmatrix}$ we have
$$f(X) = \begin{pmatrix} f(A) & B' \\ 0 & f(C) \end{pmatrix}$$
for some matrix $B'$.
Prove that if f(A) = 0 and g(C) = 0 then
$$h\begin{pmatrix} A & B \\ 0 & C \end{pmatrix} = 0$$
where $h = f \cdot g$.
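Assuming the block identity stated in (b), the Cayley–Hamilton theorem supplies concrete choices of f and g (the characteristic polynomials of A and C); a numerical sketch:

```python
import numpy as np

def poly_of_matrix(coeffs, M):
    # Evaluate a polynomial (coefficients highest degree first) at a square matrix M.
    out = np.zeros_like(M)
    for c in coeffs:
        out = out @ M + c * np.eye(len(M))
    return out

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 2))
C = rng.standard_normal((3, 3))
B = rng.standard_normal((2, 3))
X = np.block([[A, B], [np.zeros((3, 2)), C]])

f = np.poly(A)          # characteristic polynomial of A, so f(A) = 0
g = np.poly(C)          # characteristic polynomial of C, so g(C) = 0
h = np.polymul(f, g)    # h = f * g
assert np.allclose(poly_of_matrix(h, X), 0)
```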
10. Let $X = \begin{pmatrix} A & B \\ 0 & C \end{pmatrix}$ be a 2-by-2 block matrix.
(a) Suppose A is a 2-by-2 matrix and X is a 6-by-6 matrix.
What are the dimensions of B, C and the zero matrix occurring in X?
(b) Assume that A and C are invertible matrices.
Find the inverse of X in terms of A, B, C.
(c) Let b ∈ R2 and c ∈ R4 .
Solve for x, y.
Ax + By = b
Cy = c
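Part (c) is back substitution: solve the second equation for $y$ first, then the first for $x$; a numerical sketch with random invertible blocks:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((2, 2))
B = rng.standard_normal((2, 4))
C = rng.standard_normal((4, 4))
b = rng.standard_normal(2)
c = rng.standard_normal(4)

# Back substitution: the second equation involves only y.
y = np.linalg.solve(C, c)          # y = C^{-1} c
x = np.linalg.solve(A, b - B @ y)  # x = A^{-1}(b - By)

assert np.allclose(A @ x + B @ y, b)
assert np.allclose(C @ y, c)
```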
1.3 Inner product spaces
11. Let V = R2 .
Define the standard real inner product on V .
12. Let V be a real inner product space.
Prove that for all v, w ∈ V , we have
$$\|v + w\|^2 - \|v - w\|^2 = 4\langle v, w \rangle.$$
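This polarization identity is easy to sanity-check numerically in $\mathbb{R}^4$ with the standard inner product (a check, not the requested proof):

```python
import numpy as np

rng = np.random.default_rng(3)
v, w = rng.standard_normal(4), rng.standard_normal(4)

# ||v+w||^2 - ||v-w||^2 should equal 4<v, w>.
lhs = np.linalg.norm(v + w)**2 - np.linalg.norm(v - w)**2
assert np.isclose(lhs, 4 * np.dot(v, w))
```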
13. Suppose that V is a real inner product space.
Suppose that T : V → V is a linear transformation such that
$$\|T(v)\| = \|v\|$$
for all v ∈ V .
Prove that
$$\langle T(v), T(w) \rangle = \langle v, w \rangle.$$
14. Suppose that u ∈ Cn and w ∈ Cm are both unit vectors.
Let P = uw∗ .
Prove that
$$\operatorname{tr}(P^* P) = 1.$$
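A numerical sketch for problem 14 (assuming the standard complex inner product; `np.outer` with a conjugated second factor builds $uw^*$):

```python
import numpy as np

rng = np.random.default_rng(4)
u = rng.standard_normal(3) + 1j * rng.standard_normal(3)
w = rng.standard_normal(5) + 1j * rng.standard_normal(5)
u, w = u / np.linalg.norm(u), w / np.linalg.norm(w)   # make them unit vectors

P = np.outer(u, w.conj())                 # P = u w*, a 3-by-5 matrix
assert np.isclose(np.trace(P.conj().T @ P), 1)
```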
15. Let V = C3 .
Define the standard complex inner product on V .
16. Let V be a complex inner product space. Prove that for all v, w ∈ V , we have
$$|\langle v, w \rangle| \le \|v\| \, \|w\|.$$
17. Let u1 , u2 , u3 be orthonormal vectors in Cn .
Define an n-by-n matrix
$$P = u_1 u_1^* + u_2 u_2^* + u_3 u_3^*.$$
(a) Prove that P 2 = P = P ∗ .
(b) Let Q = I − P .
Prove that QP = P Q = 0.
(c) Prove that if v, w ∈ Cn , then P v is orthogonal to Qw.
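All three parts of problem 17 can be checked numerically; a sketch that builds orthonormal $u_1, u_2, u_3$ in $\mathbb{C}^5$ via a QR factorization (a sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(5)
# Orthonormal u1, u2, u3 in C^5: columns of the Q-factor of a random matrix.
M = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))
U3, _ = np.linalg.qr(M)

P = sum(np.outer(U3[:, k], U3[:, k].conj()) for k in range(3))
Q = np.eye(5) - P

assert np.allclose(P @ P, P) and np.allclose(P, P.conj().T)   # (a)
assert np.allclose(Q @ P, 0) and np.allclose(P @ Q, 0)        # (b)
v, w = rng.standard_normal(5), rng.standard_normal(5)
assert np.isclose(np.vdot(P @ v, Q @ w), 0)                   # (c): <Pv, Qw> = 0
```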
18. Let $x, y \in V$ where V is a real inner product space. Suppose that $\|x\| = \|y\| \neq 0$.
Prove that x + y is orthogonal to x − y.
19. Let $w = \begin{pmatrix} 2 \\ 1 \\ 1 \end{pmatrix}$.
Compute the Householder matrix $U = I - \frac{2}{w^* w}\, w w^*$ corresponding to $w$.
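For problem 19, $w^* w = 6$, so $U = I - \tfrac{1}{3} w w^T$; a short numerical check of the expected properties:

```python
import numpy as np

w = np.array([2.0, 1.0, 1.0])
U = np.eye(3) - 2 * np.outer(w, w) / np.dot(w, w)

# A Householder matrix is symmetric, orthogonal, and reflects w to -w.
assert np.allclose(U, U.T)
assert np.allclose(U @ U, np.eye(3))
assert np.allclose(U @ w, -w)
```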
1.4 Diagonalization, eigenvalues, etc
20. (a) Define what it means for a symmetric matrix A to be positive semi-definite.
(b) Suppose X is an m-by-n matrix.
Prove that $X^T X$ and $X X^T$ have the same positive eigenvalues.
21. (a) Define what it means for an n-by-n matrix with complex entries to be normal.
(b) Suppose that K is a real n-by-n matrix.
Suppose that K T = −K.
i. Prove that K is normal.
ii. Explain the notation U = eK .
iii. If U = eK then prove that U is an orthogonal matrix.
iv. Let $\theta \in \mathbb{R}$ and let $U = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$.
Find a real matrix K such that $U = e^K$.
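For (iv), a natural candidate is $K = \begin{pmatrix} 0 & -\theta \\ \theta & 0 \end{pmatrix}$; a numerical check with scipy's matrix exponential (a sketch, with $\theta = 0.7$ chosen arbitrarily):

```python
import numpy as np
from scipy.linalg import expm

theta = 0.7
K = np.array([[0.0, -theta], [theta, 0.0]])    # skew-symmetric: K^T = -K
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(expm(K), U)                 # U = e^K
assert np.allclose(U.T @ U, np.eye(2))         # e^K is orthogonal, as in (iii)
```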
22. Suppose that $A = A^T$ is a symmetric n-by-n matrix with real entries.
Suppose that $A = Q T Q^T$ where Q is orthogonal and T is an upper-triangular matrix.
Prove that T is in fact a diagonal matrix and that A is therefore diagonalizable.
23. Suppose that Q is an orthogonal n-by-n matrix. Prove that the complex eigenvectors and eigenvalues of Q occur in complex conjugate pairs. Deduce that
$$Q = \lambda_1 u_1 u_1^* + \bar{\lambda}_1 \bar{u}_1 \bar{u}_1^* + \cdots + \lambda_r u_r u_r^* + \bar{\lambda}_r \bar{u}_r \bar{u}_r^* + \lambda_{r+1} v_{r+1} v_{r+1}^T + \cdots + \lambda_{r+k} v_{r+k} v_{r+k}^T$$
where $\lambda_{r+1}, \ldots, \lambda_{r+k}$ are each $\pm 1$ and $v_{r+1}, \ldots, v_{r+k}$ are in $\mathbb{R}^n$.
24. (a) Let $u_1 = \frac{1}{\sqrt{3}}\begin{pmatrix} i \\ 1 \\ 1 \end{pmatrix}$. Let $v = \frac{1}{\sqrt{2}}\begin{pmatrix} 0 \\ 1 \\ -1 \end{pmatrix}$. Let $Q = i\, u_1 u_1^* + \bar{i}\, \bar{u}_1 \bar{u}_1^* - v v^T$. Find Q and prove that it is orthogonal.
(b) Let $u_1 = \frac{1}{\sqrt{3}}\begin{pmatrix} 1 \\ -1 \\ 1 \end{pmatrix}$, $u_2 = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}$ and $u_3 = \frac{1}{\sqrt{6}}\begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}$. Let $Q = \lambda_1 u_1 u_1^T + \lambda_2 u_2 u_2^T + u_3 u_3^T$. Find Q in the following cases: $\lambda_1 = \lambda_2 = -1$; $\lambda_1 = -1, \lambda_2 = 1$; $\lambda_1 = \lambda_2 = 1$.
1.5 Singular Value Decomposition
25. Let X be an m-by-n matrix with complex entries.
State the singular value decomposition of X.
26. Suppose that the singular values of X are 5, 1, 0.1 and the rest are zero.
Use the singular value decomposition of X to find a rank-two matrix $\tilde{X}$ such that $\|X - \tilde{X}\| \le 0.1$.
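For problem 26, truncating the SVD after the two largest singular values gives such an $\tilde{X}$; a numerical sketch with an X built to have singular values 5, 1, 0.1 (the shapes 6-by-4 are an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
# Build an X with singular values 5, 1, 0.1 (and the rest zero).
U, _ = np.linalg.qr(rng.standard_normal((6, 6)))
W, _ = np.linalg.qr(rng.standard_normal((4, 4)))
S = np.zeros((6, 4))
S[[0, 1, 2], [0, 1, 2]] = [5, 1, 0.1]
X = U @ S @ W.T

# Keep only the two largest singular values: a rank-two approximation.
u, s, vt = np.linalg.svd(X)
Xt = u[:, :2] @ np.diag(s[:2]) @ vt[:2, :]

assert np.linalg.matrix_rank(Xt) == 2
assert np.isclose(np.linalg.norm(X - Xt, 2), 0.1)  # operator norm = sigma_3
```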
27. Suppose that
$$A = U \Sigma W^*$$
where A is an m-by-n matrix with complex entries, U is an m-by-m unitary matrix, W is an n-by-n unitary matrix. Suppose that
$$\Sigma = \begin{pmatrix} \Sigma_r & 0 \\ 0 & 0 \end{pmatrix}$$
where $\Sigma_r = \begin{pmatrix} \sigma_1 & & \\ & \ddots & \\ & & \sigma_r \end{pmatrix}$ is a diagonal matrix. Let $U = (U_r \; U')$ and $W = (W_r \; W')$ where $U_r$ and $W_r$ both have r columns. Prove that
$$A = U_r \Sigma_r W_r^*.$$
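Problem 27's compact ("thin") factorization can be checked numerically on a random rank-two complex matrix (a sanity check, not the proof):

```python
import numpy as np

rng = np.random.default_rng(7)
# A random rank-2 complex 5-by-4 matrix, built as a product of thin factors.
A = (rng.standard_normal((5, 2)) + 1j * rng.standard_normal((5, 2))) @ \
    (rng.standard_normal((2, 4)) + 1j * rng.standard_normal((2, 4)))

U, s, Wh = np.linalg.svd(A)      # full SVD: A = U diag(s) W*
r = 2                            # rank of A, so s[r:] is (numerically) zero
Ur, Sr, Wr = U[:, :r], np.diag(s[:r]), Wh[:r, :].conj().T

assert np.allclose(A, Ur @ Sr @ Wr.conj().T)   # A = Ur Sigma_r Wr*
```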
28. Suppose that $A = U \Sigma W^*$ is a singular value decomposition of A. Prove that the columns of U are eigenvectors of $A A^*$.