VECTOR SPACES
2019–2020
1 Definitions
4 Change of Basis
6 Gram-Schmidt Orthogonalisation
Definition 1
Let V be a nonempty set equipped with an addition and a scalar multiplication over a field K. We say (V, +, ., K) is a vector space if, for all u, v, w ∈ V and all α, β ∈ K:
1 u + v ∈ V
2 u + v = v + u
3 (u + v) + w = u + (v + w)
4 ∃0 ∈ V, ∀v ∈ V : v + 0 = v
5 ∀v ∈ V, ∃ −v ∈ V : v + (−v) = 0
6 α.v ∈ V
7 α.(u + v) = α.u + α.v
8 (α + β).v = α.v + β.v
9 (αβ).v = α.(βv)
10 1.v = v
Example 1
For any x = (x1 , x2 , . . . , xn ), y = (y1 , y2 , . . . , yn ) ∈ Rn and α ∈ R, we
define addition in Rn and scalar multiplication with an element of R by
x + y = (x1 + y1 , x2 + y2 , . . . , xn + yn ),  α.x = (αx1 , αx2 , . . . , αxn ).
Then Rn is a vector space over R.
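As a quick illustration of Example 1, the componentwise operations on Rn can be written out and spot-checked in Python (a minimal sketch; the helper names `add` and `scale` are ours, not part of the text):

```python
# Componentwise operations that make R^n a vector space over R.
def add(x, y):
    return tuple(a + b for a, b in zip(x, y))

def scale(alpha, x):
    return tuple(alpha * a for a in x)

x, y = (1.0, 2.0, 3.0), (4.0, 5.0, 6.0)
zero = (0.0, 0.0, 0.0)
assert add(x, y) == add(y, x)                 # axiom 2: commutativity
assert add(x, zero) == x                      # axiom 4: zero vector
assert scale(2.0, add(x, y)) == add(scale(2.0, x), scale(2.0, y))  # axiom 7
```

Checking a handful of instances like this illustrates the axioms; it does not, of course, prove them.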
Example 2
Let Km×n be the set of all m × n matrices with coefficients in K. For
any A = (aij )m×n , B = (bij )m×n ∈ Km×n and α ∈ K, we define addition
in Km×n and scalar multiplication with an element of K by
A + B = (aij + bij )m×n ,  α.A = (αaij )m×n .
Example 3
Let S be a nonempty set and FKS = {f | f : S → K}. For any
f, g ∈ FKS and α ∈ K, we define addition in FKS and scalar
multiplication with an element of K by
(f + g)(s) = f (s) + g(s),  (α.f )(s) = αf (s),  for all s ∈ S.
Theorem 1
1 The zero vector is unique.
2 0v = 0 for all v ∈ V .
3 k0 = 0 for all scalars k ∈ K.
4 The additive inverse of each element of V is unique.
5 For v ∈ V : −v = (−1)v.
6 kv = 0 ⇐⇒ k = 0 or v = 0.
Definition 2
Let ∅ ≠ S ⊂ V . If (S, +, ., K) is a vector space, then we say that S is a
subspace of (V, +, ., K).
VECTOR SPACE ITC 7 / 36
Subspaces
Theorem 2
Let S ⊂ V . Then S is a subspace of V iff
1 S ≠ ∅,
2 For u, v ∈ S, α ∈ K : u + v ∈ S and α.v ∈ S.
Theorem 3
Let ∅ ≠ S ⊂ V . Then S is a subspace of V iff
1 0 ∈ S,
2 u, v ∈ S, k ∈ K : u + kv ∈ S.
Theorem 4
If S1 , S2 , . . . , Sp are subspaces of V , then S1 ∩ S2 ∩ · · · ∩ Sp is also a
subspace of V .
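The one-step criterion of Theorem 3 can be spot-checked numerically. The sketch below tests the line S = {(t, 2t) : t ∈ R} ⊂ R2, a hypothetical example not taken from the text; random trials are evidence, not a proof:

```python
import random

def in_S(v, tol=1e-9):
    # S = {(t, 2t) : t in R}, a line through the origin in R^2
    return abs(v[1] - 2 * v[0]) < tol

# Theorem 3 criterion: 0 in S, and u + k*v stays in S.
assert in_S((0.0, 0.0))
for _ in range(100):
    t1, t2, k = (random.uniform(-5, 5) for _ in range(3))
    u, v = (t1, 2 * t1), (t2, 2 * t2)
    w = (u[0] + k * v[0], u[1] + k * v[1])
    assert in_S(w)
```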
Definition 3
Let v1 , v2 , . . . , vp ∈ V .
1 We say that v ∈ V is a linear combination of v1 , v2 , . . . , vp if
there exist k1 , k2 , . . . , kp ∈ K such that
v = k1 v1 + k2 v2 + · · · + kp vp
2 The set of all linear combinations of v1 , v2 , . . . , vp is called the
span of v1 , v2 , . . . , vp , and denoted by Span{v1 , v2 , . . . , vp }. That
is,
Span{v1 , v2 , . . . , vp } = {k1 v1 + · · · + kp vp : ki ∈ K, i = 1, 2, . . . , p}
3 A subspace S of V is said to be spanned by v1 , v2 , . . . , vp if
S = Span{v1 , v2 , . . . , vp }.
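Deciding whether a given v lies in Span{v1 , . . . , vp } amounts to solving the linear system k1 v1 + · · · + kp vp = v. A minimal sketch using exact rational arithmetic (the helper name `in_span` is ours):

```python
from fractions import Fraction

def in_span(vectors, v):
    """Decide whether v is a linear combination of `vectors`
    by row-reducing the augmented matrix [v1 ... vp | v]."""
    rows, cols = len(v), len(vectors)
    A = [[Fraction(vectors[j][i]) for j in range(cols)] + [Fraction(v[i])]
         for i in range(rows)]
    pivot = 0
    for col in range(cols):
        r = next((i for i in range(pivot, rows) if A[i][col] != 0), None)
        if r is None:
            continue
        A[pivot], A[r] = A[r], A[pivot]
        for i in range(rows):
            if i != pivot and A[i][col] != 0:
                f = A[i][col] / A[pivot][col]
                A[i] = [a - f * b for a, b in zip(A[i], A[pivot])]
        pivot += 1
    # inconsistent row 0 ... 0 | nonzero  <=>  v is not in the span
    return all(any(A[i][j] != 0 for j in range(cols)) or A[i][cols] == 0
               for i in range(rows))

assert in_span([(1, 0, 1), (0, 1, 1)], (2, 3, 5))      # 2*v1 + 3*v2
assert not in_span([(1, 0, 1), (0, 1, 1)], (0, 0, 1))
```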
Definition 4
The vectors v1 , v2 , . . . , vp ∈ V are said to be linearly independent if
the equation
k1 v1 + k2 v2 + · · · + kp vp = 0
has a unique solution k1 = · · · = kp = 0. Otherwise, v1 , v2 , . . . , vp are
said to be linearly dependent.
Theorem 5
Let v1 , v2 , . . . , vp ∈ V . Then
1 Span{v1 , v2 , . . . , vp } is a subspace of V and the smallest subspace
containing {v1 , v2 , . . . , vp }.
2 A vector v ∈ Span{v1 , v2 , . . . , vp } can be written uniquely as a
linear combination of v1 , v2 , . . . , vp if and only if v1 , v2 , . . . , vp are
linearly independent.
Theorem 6
The set {v1 , v2 , . . . , vp } ⊂ V is linearly dependent if and only if at least
one of the vectors of the set can be expressed as a linear combination
of the others.
Definition 5
Let A = (aij ) be an m × n matrix with entries in K. The row space of
A, denoted r(A), is the subspace of Kn spanned by the row vectors of A,
and the column space of A, denoted c(A), is the subspace of Km spanned
by the column vectors of A.
Theorem 7
If A and B are row-equivalent matrices, then r(A) = r(B).
Theorem 8
The set of nonzero row vectors in any row-echelon form of a matrix is
linearly independent.
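Theorems 7 and 8 give a concrete recipe for a basis of the row space: row-reduce the matrix and keep the nonzero rows. A small sketch (our own helper, using exact arithmetic):

```python
from fractions import Fraction

def row_echelon(M):
    """Gaussian elimination; returns a row-echelon form of M."""
    A = [[Fraction(x) for x in row] for row in M]
    pivot = 0
    for col in range(len(A[0])):
        if pivot >= len(A):
            break
        r = next((i for i in range(pivot, len(A)) if A[i][col] != 0), None)
        if r is None:
            continue
        A[pivot], A[r] = A[r], A[pivot]
        for i in range(pivot + 1, len(A)):
            f = A[i][col] / A[pivot][col]
            A[i] = [a - f * b for a, b in zip(A[i], A[pivot])]
        pivot += 1
    return A

E = row_echelon([[1, 2, 3], [2, 4, 6], [1, 0, 1]])
basis = [row for row in E if any(x != 0 for x in row)]
# row 2 was twice row 1, so the row space is 2-dimensional
assert len(basis) == 2
```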
Definition 6
A set {v1 , v2 , . . . , vn } ⊂ V is called a basis for V if:
1 v1 , v2 , . . . , vn are linearly independent,
2 Span{v1 , v2 , . . . , vn } = V .
Example 4
B = {e1 , e2 , . . . , en }, where ei = (0, . . . , 0, 1, 0, . . . , 0) with the 1 in the
i-th position, for i = 1, 2, . . . , n, is a basis of Rn . It is called the
standard basis or canonical basis for Rn .
Example 5
For n ∈ N, B = {1, x, . . . , xn } is a basis of Rn [X], the space of real
polynomials of degree at most n. It is called the standard basis or
canonical basis for Rn [X].
Definition 7
The dimension of a vector space V is the number of elements in a
basis. That is, if B = {v1 , v2 , . . . , vn } is a basis for V , then the
dimension of V equals n. We write
dim V = n.
Theorem 9
If Span{v1 , v2 , . . . , vn } = V , then any collection of m vectors in V ,
where m > n, is linearly dependent.
Theorem 11
Let B = {v1 , v2 , . . . , vn } ⊂ V , and dimV = n. The following statements
are equivalent.
1 B is a basis for V .
2 B is linearly independent.
3 B spans V .
Theorem 12
If S is a subspace of a finite dimensional vector space V , then
1 dim S ≤ dim V
2 dim S = dim V ⇐⇒ S = V.
Theorem 13
Let B = {v1 , v2 , . . . , vn } be a basis for V . Then every vector v ∈ V can
be written uniquely as a linear combination of v1 , v2 , . . . , vn . That is,
∃! (c1 , c2 , . . . , cn ) ∈ Kn , such that
v = c1 v1 + c2 v2 + · · · + cn vn . (1)
Definition 8
The unique n-tuple (c1 , c2 , . . . , cn ) ∈ Kn defined in (1) is called the
coordinates of v relative to the ordered basis
B = {v1 , v2 , . . . , vn }. We denote
[v]B = (c1 , c2 , . . . , cn )T ,
regarded as a column vector.
Theorem 14
Let B be an ordered basis for a finite dimensional vector space V . Let
u, v ∈ V and c ∈ K. Then,
1 [u + v]B = [u]B + [v]B ,
2 [cv]B = c [v]B .
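In R2 the coordinate vector [v]B can be found by solving a 2 × 2 linear system; a sketch via Cramer's rule (the function name and the worked basis are ours):

```python
def coords_2d(b1, b2, v):
    """Solve c1*b1 + c2*b2 = v by Cramer's rule (2x2 case)."""
    det = b1[0] * b2[1] - b2[0] * b1[1]
    assert det != 0, "b1, b2 must be linearly independent"
    c1 = (v[0] * b2[1] - b2[0] * v[1]) / det
    c2 = (b1[0] * v[1] - v[0] * b1[1]) / det
    return (c1, c2)

# v = (3, 5) relative to the basis B = {(1, 1), (1, -1)}
c1, c2 = coords_2d((1.0, 1.0), (1.0, -1.0), (3.0, 5.0))
assert abs(c1 - 4.0) < 1e-12 and abs(c2 - (-1.0)) < 1e-12
```

Indeed 4(1, 1) + (−1)(1, −1) = (3, 5), so [v]B = (4, −1)T.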
Definition 9
Let B = {v1 , v2 , . . . , vn } and C = {w1 , w2 , . . . , wn } be two ordered
bases for a vector space V . We call the n × n matrix defined by
PC←B = [ [v1 ]C [v2 ]C . . . [vn ]C ],
whose j-th column is the coordinate vector of vj relative to C, the
change-of-basis (or transition) matrix from B to C.
Theorem 15
Let A, B and C be three ordered bases for a finite dimensional vector
space V and v ∈ V . Then
1 [v]C = PC←B [v]B ,
2 PA←B = PA←C PC←B ,
3 PB←C = (PC←B )−1 .
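A change of basis in R2 can be checked numerically. Below, taking C to be the standard basis, the columns of the transition matrix are simply the B-vectors themselves (a hypothetical worked example, not from the text):

```python
def matvec(P, x):
    """Multiply matrix P (list of rows) by vector x."""
    return tuple(sum(P[i][j] * x[j] for j in range(len(x))) for i in range(len(P)))

# B = {(1, 1), (1, -1)}, C = standard basis of R^2.
# Columns of P are the B-vectors written in C-coordinates.
P = [[1.0, 1.0],
     [1.0, -1.0]]
v_B = (4.0, -1.0)          # coordinates of v relative to B
v_C = matvec(P, v_B)       # v in standard coordinates
assert v_C == (3.0, 5.0)   # 4*(1,1) + (-1)*(1,-1) = (3, 5)
```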
Theorem 16
The set of nonzero row vectors in any row-echelon form of an m × n
matrix A is a basis for the row space r(A).
Definition 10
Let S1 , S2 , . . . , Sp and S be subspaces of V .
1 The sum of S1 , S2 , . . . , Sp is defined by
S1 + · · · + Sp = {s1 + · · · + sp : si ∈ Si , i = 1, 2, . . . , p}
2 The sum is called a direct sum, written
S = S1 ⊕ S2 ⊕ · · · ⊕ Sp ,
if every s ∈ S can be written uniquely as
s = s1 + s2 + · · · + sp
where si ∈ Si , i = 1, 2, . . . , p.
Theorem 19
Let S1 , S2 be two subspaces of a finite dimensional vector space V and
let V = S1 + S2 . Then
dim V = dim S1 + dim S2 − dim(S1 ∩ S2 ).
Theorem 21
Let V be an n-dimensional vector space over K and let S1 , . . . , Sp be
subspaces of V . Let dj = dim Sj and let bj1 , . . . , bjdj be a basis for Sj .
Then the following statements are equivalent.
1 S1 + · · · + Sp = S1 ⊕ · · · ⊕ Sp .
2 b11 , . . . , b1d1 , . . . , bp1 , . . . , bpdp is a basis for S1 + · · · + Sp .
3 dim (S1 + · · · + Sp ) = dim (S1 ) + · · · + dim (Sp )
Definition 11
A map ⟨·, ·⟩ : V × V → K is called an inner product in V if it
satisfies the following properties. For all vectors u, v, w ∈ V and for all
scalars k ∈ K,
1 ⟨u, u⟩ ≥ 0, and ⟨u, u⟩ = 0 ⇐⇒ u = 0.
2 ⟨u, v⟩ = ⟨v, u⟩.
3 ⟨ku, v⟩ = k ⟨u, v⟩.
4 ⟨u + v, w⟩ = ⟨u, w⟩ + ⟨v, w⟩.
Definition 12
Any vector space V that is equipped with an inner product ⟨·, ·⟩ is
called an inner product space.
The norm of u is defined in terms of the inner product by
‖u‖ = √⟨u, u⟩,
and the distance between u and v by
d(u, v) = ‖u − v‖.
Theorem 22
Let V be an inner product space over K. Let u, v ∈ V and k ∈ K.
Then,
1 ‖kv‖ = |k| ‖v‖
2 ‖u + v‖ ≤ ‖u‖ + ‖v‖ (triangle inequality)
3 |⟨u, v⟩| ≤ ‖u‖ ‖v‖ (Cauchy-Schwarz inequality)
4 ‖u + v‖² + ‖u − v‖² = 2(‖u‖² + ‖v‖²) (parallelogram law)
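The inequalities and the identity of Theorem 22 can be spot-checked for the standard dot product on R4 (random trials illustrate, they do not prove):

```python
import math, random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

random.seed(0)
for _ in range(100):
    u = [random.uniform(-1, 1) for _ in range(4)]
    v = [random.uniform(-1, 1) for _ in range(4)]
    s = [a + b for a, b in zip(u, v)]
    d = [a - b for a, b in zip(u, v)]
    assert norm(s) <= norm(u) + norm(v) + 1e-12          # triangle inequality
    assert abs(dot(u, v)) <= norm(u) * norm(v) + 1e-12   # Cauchy-Schwarz
    # parallelogram law
    assert abs(norm(s)**2 + norm(d)**2 - 2 * (norm(u)**2 + norm(v)**2)) < 1e-9
```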
Theorem 23
1 If {v1 , v2 , . . . , vp } is an orthogonal set of nonzero vectors in an
inner product space V , then it is linearly independent.
2 If {v1 , v2 , . . . , vn } is an orthogonal basis for a finite-dimensional
inner product space V , then for every vector v ∈ V ,
v = (⟨v, v1 ⟩/‖v1 ‖²)v1 + (⟨v, v2 ⟩/‖v2 ‖²)v2 + · · · + (⟨v, vn ⟩/‖vn ‖²)vn .
Theorem 24 (Gram-Schmidt orthogonalisation)
Let u1 , u2 , . . . , up be linearly independent vectors in an inner product
space V . Set v1 = u1 and
vi = ui − (⟨ui , v1 ⟩/‖v1 ‖²)v1 − · · · − (⟨ui , vi−1 ⟩/‖vi−1 ‖²)vi−1 ,
i = 2, 3, . . . , p.
Then {v1 , v2 , . . . , vp } is an orthogonal set and
Span{v1 , . . . , vp } = Span{u1 , . . . , up }.
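The Gram-Schmidt process named in the outline can be sketched in a few lines of Python under the standard dot product; `gram_schmidt` below orthogonalises a list of vectors and then normalises them (our own helper, classical variant):

```python
import math

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthogonalise, then normalise."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    ortho = []
    for u in vectors:
        w = list(u)
        for v in ortho:
            c = dot(u, v) / dot(v, v)          # projection coefficient
            w = [a - c * b for a, b in zip(w, v)]
        ortho.append(w)
    return [[a / math.sqrt(dot(w, w)) for a in w] for w in ortho]

q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
dot = lambda u, v: sum(a * b for a, b in zip(u, v))
assert abs(dot(q[0], q[1])) < 1e-12        # orthogonal
assert abs(dot(q[0], q[0]) - 1.0) < 1e-12  # unit length
```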
Theorem 25
Let S be a finite-dimensional subspace of an inner product space V ,
and let y ∈ V . Then there exist unique vectors u ∈ S and z ∈ S ⊥ such
that y = u + z. Furthermore, if {v1 , v2 , . . . , vk } is an orthonormal basis
of S, then
u = ⟨y, v1 ⟩v1 + ⟨y, v2 ⟩v2 + · · · + ⟨y, vk ⟩vk .
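The decomposition y = u + z of Theorem 25 is computable once an orthonormal basis of S is known; a minimal sketch (the helper name `project` is ours):

```python
def project(y, onb):
    """Orthogonal projection of y onto Span(onb); onb must be orthonormal."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    u = [0.0] * len(y)
    for v in onb:
        c = dot(y, v)                     # coefficient <y, v_i>
        u = [a + c * b for a, b in zip(u, v)]
    return u

# Project y onto the xy-plane of R^3, spanned by e1, e2.
y = [3.0, 4.0, 5.0]
e1, e2 = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]
u = project(y, [e1, e2])
assert u == [3.0, 4.0, 0.0]
z = [a - b for a, b in zip(y, u)]         # z = y - u lies in S-perp
assert all(abs(sum(a * b for a, b in zip(z, v))) < 1e-12 for v in (e1, e2))
```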
Theorem 26
In the notation of Theorem 25, the vector u is the unique vector in S
that is "closest" to y; that is, for any x ∈ S, ‖y − x‖ ≥ ‖y − u‖.
Theorem 27
Suppose that S = {v1 , v2 , . . . , vk } is an orthonormal set in an
n-dimensional inner product space V . Then
1 S can be extended to an orthonormal basis
{v1 , . . . , vk , vk+1 , . . . , vn }
for V .
2 If W = Span S, then S1 = {vk+1 , . . . , vn } is an orthonormal basis
for W⊥ .
3 If W is any subspace of V , then V = W ⊕ W⊥ ; in particular,
dim W + dim W⊥ = n.