
Linear Algebra Revision Notes

1 Vector Spaces
• Steinitz Exchange Lemma: Let V be a FDVS over F. Let {v1 , . . . , vm } be LI and
{w1 , . . . , wn } span V . Then:
– m ≤ n.
– after reordering, {v1 , . . . , vm , wm+1 , . . . , wn } spans V .
• If V is a FDVS and U ≤ V , then dim(V ) = dim(U ) + dim(V /U ).
• Direct Sum: For V1 , . . . , Vn ≤ V , the sum ∑_i Vi is direct iff Vj ∩ ∑_{i≠j} Vi = {0} ∀ j.
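This has a convenient numerical reformulation: the sum is direct iff dim(∑ Vi) = ∑ dim(Vi), i.e. iff the rank of the concatenated basis matrices equals the sum of the individual ranks. A minimal sketch with numpy (the basis matrices B1, B2 are illustrative):

```python
import numpy as np

# Columns of each Bi span Vi; the sum is direct iff
# rank[B1 | B2] = rank(B1) + rank(B2), i.e. dim(V1 + V2) = dim V1 + dim V2.
B1 = np.array([[1., 0.], [0., 1.], [0., 0.]])   # V1 = span{e1, e2}
B2 = np.array([[0.], [0.], [1.]])               # V2 = span{e3}
ranks = [np.linalg.matrix_rank(B) for B in (B1, B2)]
print(np.linalg.matrix_rank(np.hstack([B1, B2])) == sum(ranks))   # True: direct
```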

2 Linear Maps
• Rank-Nullity: For V a FDVS over F and α : V → W we have r(α) + n(α) = dim(V ).
• Every m × n matrix A is equivalent to

  [ Ir  0 ]
  [ 0   0 ],

where r = r(A).
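A quick numerical check of rank-nullity (a sketch assuming numpy; both rank and nullity can be read off the singular values):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])                 # a 2x3 matrix of rank 1
r = np.linalg.matrix_rank(A)
s = np.linalg.svd(A, compute_uv=False)       # rank = #(nonzero singular values)
n = A.shape[1] - np.count_nonzero(s > 1e-9)  # nullity = dim V - rank
assert r + n == A.shape[1]                   # r(α) + n(α) = dim V = 3
```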
• Any invertible matrix can be written as a product of elementary matrices.

3 Dual Spaces - Dual Maps


• Dual Space: V ∗ = L(V, F).
• Dual Basis: Let V be a FDVS over F. Consider basis B = {e1 , . . . , en } of V . Its dual
basis for V ∗ is B ∗ = {ε1 , . . . , εn } where
εj (∑_i ai ei) = aj .

• If V is FD, then dim(V ) = dim(V ∗ ).
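In coordinates the dual basis has a concrete recipe (a sketch with numpy): if the basis vectors ei are the columns of a matrix E, then the functionals εj are represented by the rows of E^{-1}, since row j of E^{-1} applied to column i of E gives δij.

```python
import numpy as np

E = np.array([[1., 1.],
              [0., 1.]])                 # basis e1 = (1,0), e2 = (1,1) as columns
Eps = np.linalg.inv(E)                   # row j represents the functional ε_j
assert np.allclose(Eps @ E, np.eye(2))   # ε_j(e_i) = δ_ij
```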


• Annihilator: For U ≤ V the annihilator of U is

U ◦ = {α ∈ V ∗ : ∀ u ∈ U, α(u) = 0}.

If V is FD, then dim(V ) = dim(U ) + dim(U ◦ ).


• Dual Map: For α : V → W its dual is α∗ : W∗ → V∗, ε ↦ ε ◦ α.

If V, W are FDVS over F with bases B, C, then

[α∗]C∗,B∗ = ([α]B,C)^T,

so r(α) = r(α∗ ).

• We have:
  – ker(α∗) = Im(α)◦,
  – Im(α∗) ≤ ker(α)◦, with equality if V, W are FD.
• There is a natural homomorphism ˆ· : V → V∗∗, v ↦ v̂ such that v̂(ε) = ε(v). If V is FD, ˆ· is an
isomorphism.
• Û ≤ U◦◦ (equality if V is FD; then U ≅ Û = U◦◦).
• For V a FDVS over F:
– (U1 + U2 )◦ = U1◦ ∩ U2◦ ,
– (U1 ∩ U2 )◦ = U1◦ + U2◦ .

4 Determinant - Volume Forms


• Volume Form: A function d : Fn × · · · × Fn → F (n copies of Fn) which is multilinear
(linear in every component) and alternating (if vi = vj for some i ≠ j, then d(v1 , . . . , vn ) = 0).
Example. Determinant (defined on the columns of a matrix).
det A = ∑_{σ∈Sn} ε(σ) ∏_{i=1}^{n} a_{σ(i)i} .
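The formula can be evaluated directly (a Python sketch, computing ε(σ) by counting inversions; exponential time, so illustration only):

```python
from itertools import permutations
from math import prod

def sign(sigma):
    # ε(σ): parity of the number of inversions of σ
    inv = sum(1 for i in range(len(sigma))
                for j in range(i + 1, len(sigma)) if sigma[i] > sigma[j])
    return -1 if inv % 2 else 1

def det(A):
    n = len(A)
    return sum(sign(s) * prod(A[s[i]][i] for i in range(n))
               for s in permutations(range(n)))

assert det([[1, 2], [3, 4]]) == -2
```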

• In a volume form d, swapping two entries changes the sign. Hence if σ ∈ Sn,

d(vσ(1) , . . . , vσ(n)) = ε(σ) d(v1 , . . . , vn).

• For d a volume form defined on the columns of A = (A(1) | · · · |A(n)) we have

d(A(1) , . . . , A(n)) = det A · d(e1 , . . . , en).

Use this to prove that det AB = det A det B by defining dA(v1 , . . . , vn) = det(Av1 | · · · |Avn)
and applying dA to the columns of B.


det [ A  B ] = det A det C.
    [ 0  C ]

• Adjugate: (adj A)ij = (−1)^{i+j} det Âji, where Âji is A with row j and column i deleted.

Then (adj A)A = det A · I.


Proof. For fixed j, det A = ∑_i aij (adj A)ji = ((adj A)A)jj .
For j ≠ k, replace the j’th column with the k’th:

0 = det(A(1) | · · · |A(k) | · · · |A(k) | · · · |A(n))
  = ∑_i (adj A)ji aik
  = ((adj A)A)jk .
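A direct sketch of the cofactor construction and of (adj A)A = det A · I, assuming numpy:

```python
import numpy as np

def adjugate(A):
    n = A.shape[0]
    adj = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # Â_ji: delete row j and column i
            minor = np.delete(np.delete(A, j, axis=0), i, axis=1)
            adj[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return adj

A = np.array([[2., 1., 0.], [0., 3., 1.], [1., 0., 1.]])
assert np.allclose(adjugate(A) @ A, np.linalg.det(A) * np.eye(3))
```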

5 Endomorphisms
Here V is a FDVS.

• α ∈ L(V ) is triangulable iff χα (t) can be written as a product of linear factors over F.

Proof. ⇐: Consider a root λ of χα(t) and a basis for Vλ. Extend to a basis for V; then α is
represented by

  [ λIk  ∗ ]
  [ 0    C ].

Consider the induced linear map ᾱ : V /Vλ → V /Vλ, which is represented by C. Apply
induction.

• α ∈ L(V ) is diagonalisable iff p(α) = 0 for some p(t) ∈ F[t] which is a product of distinct
linear factors.

Proof. ⇐: Define

qj(t) = ∏_{i≠j} (t − λi)/(λj − λi),   πj = qj(α).

Then

q(t) = ∑_j qj(t) = 1 ⇒ q(α) = ι

(both sides of the first equality have degree < k and agree at the k points λ1 , . . . , λk), so

V = ∑_j Im(πj).

(α − λj ι) πj = 0 ⇒ Im(πj) ≤ Vλj. Show directness by applying πj to v ∈ Vλj ∩ ∑_{i≠j} Vλi.
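The projections πj = qj(α) are easy to realise numerically (a numpy sketch for a 2×2 example with eigenvalues 1 and 3):

```python
import numpy as np

A = np.array([[2., 1.], [1., 2.]])    # eigenvalues 1 and 3
lams = [1.0, 3.0]

def proj(j):
    # π_j = q_j(A) = ∏_{i≠j} (A - λ_i I) / (λ_j - λ_i)
    P = np.eye(2)
    for i, lam in enumerate(lams):
        if i != j:
            P = P @ (A - lam * np.eye(2)) / (lams[j] - lam)
    return P

projs = [proj(j) for j in range(2)]
assert np.allclose(sum(projs), np.eye(2))        # q(α) = ι
for lam, P in zip(lams, projs):
    assert np.allclose(A @ P, lam * P)           # Im(π_j) ≤ V_λj
```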

• Diagonalisable α, β ∈ L(V ) are simultaneously diagonalisable iff they commute.

Proof. Consider an eigenspace Vλi of α, and let q be the minimal polynomial of β (a product
of distinct linear factors, since β is diagonalisable). Then β(Vλi) ≤ Vλi and
q(β|Vλi) = q(β)|Vλi = 0, so we can diagonalise β|Vλi.

• Cayley-Hamilton: For α ∈ L(V ), χα (α) = 0.

Proof. (adj B)B = det B · I. Choose B = tI − A, expand and equate coefficients.
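A quick symbolic check (a sympy sketch, evaluating χA at the matrix A by Horner's scheme):

```python
import sympy as sp

A = sp.Matrix([[1, 2], [3, 4]])
coeffs = A.charpoly().all_coeffs()   # coefficients of χ_A, highest degree first
chi_of_A = sp.zeros(2, 2)
for c in coeffs:                     # Horner evaluation of χ_A(A)
    chi_of_A = chi_of_A * A + c * sp.eye(2)
assert chi_of_A == sp.zeros(2, 2)    # Cayley-Hamilton
```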

• aλ : algebraic multiplicity (multiplicity of λ as a root of χα (t)).
gλ : geometric multiplicity (n(α − λι)).
cλ : multiplicity of λ as a root of mα (t).

Then 1 ≤ gλ ≤ aλ , 1 ≤ cλ ≤ aλ .

α is diagonalisable ⇔ aλ = gλ ∀ λ ⇔ cλ = 1 ∀ λ (when F = C).
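For example (a sympy sketch), a single Jordan block has aλ = 2 but gλ = 1, so it is not diagonalisable:

```python
import sympy as sp

t = sp.symbols('t')
J = sp.Matrix([[2, 1], [0, 2]])                             # Jordan block J_2(2)
assert sp.factor(J.charpoly(t).as_expr()) == (t - 2) ** 2   # a_2 = 2
assert len((J - 2 * sp.eye(2)).nullspace()) == 1            # g_2 = 1
assert not J.is_diagonalizable()                            # g_2 < a_2
```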

From now on F = C.

• For α ∈ L(V ) with JNF A, the number of Jordan blocks Jm (λ) with m ≥ r in A is

n((α − λι)^r) − n((α − λι)^{r−1}).

Proof. Note that

n((Jm(λ) − λIm)^r) = r if r ≤ m, and = m if r > m,

so

n((Jm(λ) − λIm)^r) − n((Jm(λ) − λIm)^{r−1}) = 1 if r ≤ m, and = 0 if r > m.

Moreover, for µ ≠ λ,

n((Jm(µ) − λIm)^r) = n(Jm(µ − λ)^r) = 0.

Adding up over the blocks in A gives the result.
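This gives a practical recipe for reading off the JNF from nullities (a numpy sketch; matrix_rank thresholds singular values, so this is for well-separated eigenvalues only):

```python
import numpy as np

def blocks_of_size_at_least(A, lam, r, tol=1e-9):
    # n((A - λI)^r) - n((A - λI)^{r-1}) = #{Jordan blocks J_m(λ) with m ≥ r}
    n = A.shape[0]
    B = A - lam * np.eye(n)
    nullity = lambda M: n - np.linalg.matrix_rank(M, tol=tol)
    return nullity(np.linalg.matrix_power(B, r)) - nullity(np.linalg.matrix_power(B, r - 1))

# One J_2(3) block and one J_1(3) block:
A = np.array([[3., 1., 0.], [0., 3., 0.], [0., 0., 3.]])
print([blocks_of_size_at_least(A, 3.0, r) for r in (1, 2, 3)])   # [2, 1, 0]
```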

• Generalised Eigenspace Decomposition: For V a FDVS over C and α ∈ L(V ), suppose that

mα(t) = (t − λ1)^{c1} · · · (t − λk)^{ck},

for distinct λi. Let Vj = ker((α − λj ι)^{cj}). Then V = ⊕_j Vj.

Proof. Define

pj(t) = ∏_{i≠j} (t − λi)^{ci}.

The pj have no common factor, so we can find qj ∈ C[t] such that

∑_j pj(t) qj(t) = 1.

So if πj = qj(α) pj(α), then ∑_j πj = ι ⇒ V = ∑_j Im(πj).

(α − λj ι)^{cj} πj = 0 ⇒ Im(πj) ≤ Vj. Also πj πi = 0 for i ≠ j, so πj = πj², i.e. πj is a
projection onto Vj. Directness follows.

6 Bilinear (etc.) Forms


6.1 Bilinear Form
• Bilinear Form: ϕ : U × V → F, linear in both arguments.

Induces linear maps:

ϕL : U → V∗, u ↦ ϕ(u, −),
ϕR : V → U∗, v ↦ ϕ(−, v).

• Rank: r(ϕ) = r(ϕL) = r(ϕR) = r([ϕ]B) for any basis B.
• Symmetric Bilinear Form: ϕ : V × V → F, ϕ(u, v) = ϕ(v, u) (represented by a
symmetric matrix A = AT ).
• Quadratic Form: Q(v) = ϕ(v, v) for some symmetric bilinear form ϕ.
Polarisation identity: ϕ(u, v) = ½ (Q(u + v) − Q(u) − Q(v)) (assuming 2 ≠ 0 in F).
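A one-line numerical sanity check of the identity (a numpy sketch with an arbitrary symmetric matrix M representing ϕ):

```python
import numpy as np

M = np.array([[2., 1.], [1., 3.]])            # matrix of a symmetric bilinear form
phi = lambda u, v: u @ M @ v
Q = lambda v: phi(v, v)

u, v = np.array([1., 2.]), np.array([3., -1.])
assert np.isclose(phi(u, v), (Q(u + v) - Q(u) - Q(v)) / 2)
```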

4
• Let ϕ be a symmetric bilinear form on a FDVS V over a field F in which 2 ≠ 0. Then
there exists a basis B of V such that [ϕ]B is diagonal.

Proof. Choose e1 ∈ V such that ϕ(e1 , e1) ≠ 0, consider U = {u ∈ V : ϕ(e1 , u) = 0}, apply
induction.

• For ϕ a symmetric bilinear form on V , a FDVS over C, there is a basis B of V such that

  [ϕ]B = [ Ir  0 ]
         [ 0   0 ],

where r = r(ϕ).
• For ϕ a symmetric bilinear form on V , a FDVS over R, there is a basis B of V such that

  [ϕ]B = [ Ip          ]
         [     −Iq     ]
         [          0  ],

where p + q = r(ϕ).

Signature: s(ϕ) = p − q.

Sylvester’s Law of Inertia: p, q are uniquely determined for each ϕ (so signature is
well-defined).

Proof. Show that p is the largest dimension of a subspace of V on which ϕ is positive


definite.
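By Sylvester's law (together with the spectral theorem of Section 7.2.2), p and q can be read off as the numbers of positive and negative eigenvalues of the representing matrix. A numpy sketch:

```python
import numpy as np

def signature(A, tol=1e-9):
    # returns (p, q) for a real symmetric A; the signature is s = p - q
    eig = np.linalg.eigvalsh(A)
    return int((eig > tol).sum()), int((eig < -tol).sum())

A = np.array([[1., 2.], [2., 1.]])   # eigenvalues 3 and -1
print(signature(A))                  # (1, 1), so s(ϕ) = 0
```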

• Alternating/Skew-symmetric Form: ϕ(u, v) = −ϕ(v, u).


• For ϕ an alternating bilinear form on V , a FDVS over F, there is a basis B of V such that

  [ϕ]B = [ 0    Im  0 ]
         [ −Im  0   0 ]
         [ 0    0   0 ],

for some m (hence 2 | r(ϕ)).

6.2 Sesquilinear Form


Here F = C.

• Sesquilinear Form: ϕ : V × W → C which is linear in the first argument and conjugate-linear
in the second:

ϕ(v, µ1 w1 + µ2 w2) = \overline{µ1} ϕ(v, w1) + \overline{µ2} ϕ(v, w2).

• Hermitian Form: ϕ : V × V → C, ϕ(u, v) = \overline{ϕ(v, u)} (represented by a Hermitian
matrix A = A†).

Polarisation identity: Given the function Q : V → R, v ↦ ϕ(v, v) for some Hermitian ϕ, we
can recover ϕ by

ϕ(u, v) = ¼ (Q(u + v) − Q(u − v) + iQ(u + iv) − iQ(u − iv)).

We can similarly define the rank and signature of a Hermitian form. Sylvester’s law of
inertia applies as well.

7 Inner Product Spaces


• Inner product: For V a FDVS over R/C, an inner product is a positive-definite sym-
metric bilinear/Hermitian form ϕ on V .

Inner products are equivalent to norms satisfying the parallelogram law (via the polarisation identity).
• Gram-Schmidt Process: For a countable LI set {v1 , v2 , . . . }, there exists a sequence
(en ) of ON vectors such that span{v1 , . . . , vk } = span{e1 , . . . , ek } ∀ k.
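A minimal implementation of the process (a numpy sketch; assumes the input vectors are LI, so no norm below is zero):

```python
import numpy as np

def gram_schmidt(vs):
    es = []
    for v in vs:
        # subtract the projections of v onto the e_i found so far
        w = v - sum(np.dot(v, e) * e for e in es)
        es.append(w / np.linalg.norm(w))
    return es

e1, e2 = gram_schmidt([np.array([1., 1., 0.]), np.array([1., 0., 1.])])
assert np.isclose(np.dot(e1, e2), 0) and np.isclose(np.linalg.norm(e2), 1)
```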

• Orthogonal Matrix: RR^T = R^T R = I ⇔ R^T = R^{-1}.

Unitary Matrix: UU† = U†U = I ⇔ U† = U^{-1}.

Gram-Schmidt implies that any matrix A ∈ Mn(R)/Mn(C) can be written as A = RT,
where R is orthogonal/unitary and T is upper triangular (the QR decomposition).
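numpy computes this factorisation directly (note that numpy calls the orthogonal factor Q and the triangular factor R, the reverse of the notation above):

```python
import numpy as np

A = np.random.default_rng(0).standard_normal((4, 4))
R_, T = np.linalg.qr(A)                   # R_ orthogonal, T upper triangular
assert np.allclose(R_ @ T, A)             # A = RT
assert np.allclose(R_.T @ R_, np.eye(4))  # R orthogonal
```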
• Projection Map: V = U ⊕ W , π : V → W , u + w ↦ w. We have π² = π.
If U = W⊥ (so π is the orthogonal projection onto W), π is given by

π(v) = ∑_i ⟨v, ei⟩ ei ,

for {ei} an ON basis of W, and ‖v − π(v)‖ ≤ ‖v − w‖ ∀ v ∈ V, w ∈ W , with equality iff π(v) = w.
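In coordinates (a numpy sketch, with onb a hypothetical list holding an ON basis of W):

```python
import numpy as np

def orthogonal_projection(v, onb):
    # π(v) = Σ_i <v, e_i> e_i for onb an ON basis of W
    return sum(np.dot(v, e) * e for e in onb)

onb = [np.array([1., 0., 0.]), np.array([0., 1., 0.])]   # W = span{e1, e2}
print(orthogonal_projection(np.array([3., 4., 5.]), onb))   # [3. 4. 0.]
```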

7.1 Adjoints, Orthogonal, Unitary maps


• Adjoint: For α ∈ L(V, W ), its adjoint is α∗ ∈ L(W, V ) such that ⟨αv, w⟩ = ⟨v, α∗w⟩ ∀ v, w.

If α is represented by A (wrt ON bases), its adjoint is represented by A†.

For V, W FDVS, V ≅ V∗ via v ↦ ⟨−, v⟩ and W ≅ W∗ via w ↦ ⟨−, w⟩ (conjugate-linear over C).
Under these identifications the adjoint α∗ : W → V corresponds to the dual map α∗ : W∗ → V∗.
• Self Adjoint: α ∈ L(V ) such that α = α∗, i.e. ⟨αu, v⟩ = ⟨u, αv⟩ ∀ u, v.

α ∈ L(V ) is self-adjoint iff its matrix wrt any ON basis is symmetric/Hermitian (F = R/C).


• Isometry: α ∈ L(V ) is an isometry iff ⟨αu, αv⟩ = ⟨u, v⟩ ∀ u, v ⇔ ‖αv‖ = ‖v‖ ∀ v ⇔ α^{-1} = α∗.

If V is real/complex, α is called orthogonal/unitary and is represented (wrt an ON basis) by an
orthogonal/unitary matrix. The orthogonal/unitary maps form the orthogonal group O(V )/the
unitary group U (V ).

7.2 Spectral Theory
7.2.1 Self-Adjoint Maps
• If α ∈ L(V ) is self-adjoint, then:
– α has real eigenvalues,
– eigenvectors of α with different eigenvalues are orthogonal.
• If V is a FD inner product space and α ∈ L(V ) is self-adjoint, then V has an orthonormal
basis of eigenvectors of α.

Proof. χα has a complex root λ, which is real by the above. Consider an eigenvector v, prove
that α preserves U = ⟨v⟩⊥, apply induction.
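numpy's eigh computes exactly this ON eigenbasis for symmetric/Hermitian matrices (a quick sketch):

```python
import numpy as np

A = np.array([[2., 1.], [1., 2.]])    # real symmetric, hence self-adjoint
eigvals, P = np.linalg.eigh(A)        # columns of P: ON basis of eigenvectors
assert np.allclose(P.T @ P, np.eye(2))             # P orthogonal
assert np.allclose(P.T @ A @ P, np.diag(eigvals))  # diagonal, real entries
```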

7.2.2 Application to symmetric bilinear/Hermitian forms:


• Let A ∈ Mn (R)/Mn (C) be symmetric/Hermitian. Then there is an orthogonal/unitary
matrix P such that P^T AP / P†AP is diagonal with real entries.

Hence a symmetric bilinear/Hermitian form ϕ : V × V → F can be represented by a


diagonal matrix with real entries wrt some orthonormal basis of V .
• The signature of ϕ with matrix A can be defined as s(ϕ) = # positive eigenvalues of A −
# negative eigenvalues of A.
• If φ, ψ are symmetric bilinear/Hermitian forms on a FDVS V over R/C, and φ is positive
definite, then there exists a basis B of V such that [φ]B , [ψ]B are both diagonal.

Proof. Use φ as an inner product (i.e. turn [φ] into I), then diagonalise ψ.

So for A, B ∈ Mn(R)/Mn(C) symmetric/Hermitian with A positive definite, there exists an
invertible matrix Q such that Q^T AQ / Q†AQ and Q^T BQ / Q†BQ are both diagonal.
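The proof translates into a two-step algorithm (a numpy sketch for the real case, using a Cholesky factorisation of A to "turn [φ] into I"):

```python
import numpy as np

def simultaneously_diagonalise(A, B):
    # A symmetric positive definite, B symmetric; returns invertible Q
    # with Q.T @ A @ Q = I and Q.T @ B @ Q diagonal.
    L = np.linalg.cholesky(A)          # A = L L^T, i.e. [φ] becomes I
    Linv = np.linalg.inv(L)
    C = Linv @ B @ Linv.T              # ψ in the new basis, still symmetric
    _, P = np.linalg.eigh(C)           # orthogonal P diagonalises C
    return Linv.T @ P

A = np.array([[2., 0.], [0., 1.]])
B = np.array([[1., 1.], [1., 0.]])
Q = simultaneously_diagonalise(A, B)
assert np.allclose(Q.T @ A @ Q, np.eye(2))
D = Q.T @ B @ Q
assert np.allclose(D, np.diag(np.diag(D)))
```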

7.2.3 Unitary Maps


Here F = C.
• If α ∈ L(V ) is unitary, then:
– the eigenvalues of α lie on the unit circle,
– eigenvectors of α with different eigenvalues are orthogonal.
• If V is a FD complex inner product space and α ∈ L(V ) is unitary, then V has an
orthonormal basis of eigenvectors of α.

Proof. χα has a complex root λ. Consider an eigenvector v, prove that α preserves U = ⟨v⟩⊥,
apply induction.

7.2.4 Normal maps


Self-adjoint and unitary maps are examples of normal maps. A normal map is a linear map
α ∈ L(V ) that commutes with its adjoint, i.e. αα∗ = α∗ α. If V is a FD complex inner product
space and α ∈ L(V ) is normal, then V has an orthonormal basis of eigenvectors of α.
Proof. χα has a complex root λ. For v ∈ Vλ , we have α∗ v ∈ Vλ . Use this to prove that α
preserves Vλ⊥ , apply induction.
