Review of Linear Algebra

Throughout these notes, F denotes a field (often called the scalars in this context).

1 Definition of a vector space

Definition 1.1. An F-vector space, or simply a vector space, is a triple (V, +, ·), where (V, +) is an abelian group (the vectors) and there is a function F × V → V (scalar multiplication), whose value at (t, v) is denoted t · v or tv, such that:

1. For all s, t ∈ F and v ∈ V, s(tv) = (st)v. (An analogue of the associative law for multiplication.)
2. For all s, t ∈ F and v ∈ V, (s + t)v = sv + tv. (Scalar multiplication distributes over scalar addition.)
3. For all t ∈ F and v, w ∈ V, t(v + w) = tv + tw. (Scalar multiplication distributes over vector addition.)
4. For all v ∈ V, 1 · v = v. (A kind of identity law for scalar multiplication.)

It is a straightforward consequence of the axioms that 0v = 0 for all v ∈ V, where the first 0 is the element 0 ∈ F and the second is 0 ∈ V; that, for all t ∈ F, t0 = 0 (both 0's here are the zero vector); and that, for all v ∈ V, (−1)v = −v.

Example 1.2. (1) The n-fold Cartesian product F^n is an F-vector space, where F^n is a group under componentwise addition, i.e.

(a1, . . . , an) + (b1, . . . , bn) = (a1 + b1, . . . , an + bn),

and scalar multiplication is defined by t(a1, . . . , an) = (ta1, . . . , tan).

(2) If X is a set and F^X denotes the group of functions from X to F, then F^X is an F-vector space. Here addition is defined pointwise: (f + g)(x) = f(x) + g(x), and similarly for scalar multiplication: (tf)(x) = t f(x).
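The axioms of Definition 1.1 can be checked mechanically for F^n. Below is a minimal sketch, not part of the notes, taking F = Q with exact rational arithmetic; the helper names `add` and `smul` are my own.

```python
from fractions import Fraction
from itertools import product

# Example 1.2 (1): F^3 for F = Q, with componentwise addition and
# scalar multiplication.
def add(v, w):
    return tuple(a + b for a, b in zip(v, w))

def smul(t, v):
    return tuple(t * a for a in v)

# Spot-check axioms 1-4 of Definition 1.1 on sample scalars and vectors.
scalars = [Fraction(1), Fraction(-2), Fraction(3, 4)]
vectors = [(Fraction(1), Fraction(0), Fraction(2)),
           (Fraction(-1), Fraction(5), Fraction(1, 3))]

for s, t in product(scalars, repeat=2):
    for v, w in product(vectors, repeat=2):
        assert smul(s, smul(t, v)) == smul(s * t, v)              # s(tv) = (st)v
        assert smul(s + t, v) == add(smul(s, v), smul(t, v))      # (s+t)v = sv + tv
        assert smul(t, add(v, w)) == add(smul(t, v), smul(t, w))  # t(v+w) = tv + tw
assert all(smul(Fraction(1), v) == v for v in vectors)            # 1·v = v
print("axioms (1)-(4) hold on all samples")
```

Of course, a finite check of samples is only an illustration, not a proof; the point is that each axiom is a concrete identity that can be evaluated.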

(3) The polynomial ring F[x] is an F-vector space under addition of polynomials, where we define tf(x) to be the product of the constant polynomial t ∈ F with the polynomial f(x) ∈ F[x]. Explicitly, if f(x) = a0 + a1x + · · · + anx^n, then tf(x) = ta0 + ta1x + · · · + tanx^n. The set

Pn = {a0 + a1x + · · · + anx^n : ai ∈ F}

of polynomials of degree at most n, together with the zero polynomial, is a vector subspace of F[x].

(4) More generally, if R is any ring containing F as a subring, then R becomes an F-vector space, using the fact that (R, +) is an abelian group and defining, for t ∈ F and r ∈ R, the scalar multiplication tr to be the usual multiplication in R applied to t and r. In this case, the properties (1)–(4) of a vector space become consequences of associativity, distributivity, and the fact that 1 ∈ F is the multiplicative identity in R. In particular, if E is a field containing F as a subfield, then E is an F-vector space in this way.

Definition 1.3. Let V be a vector space. An F-vector subspace of V is a subset W ⊆ V which is an additive subgroup of V and is closed under scalar multiplication (i.e. for all v ∈ W and for all t ∈ F, tv ∈ W). It then follows that (W, +, ·) is also a vector space. {0} and V are always vector subspaces of V.

Finally, we have the analogue of homomorphisms and isomorphisms:

Definition 1.4. Let V1 and V2 be two F-vector spaces and let f : V1 → V2 be a function (= map). Then f is linear, or a linear map, if it is a group homomorphism from (V1, +) to (V2, +), i.e. is additive, and satisfies: for all t ∈ F and v ∈ V1, f(tv) = tf(v). The function f is a linear isomorphism if it is both linear and a bijection. In this case, it is easy to check that f^{-1} is also linear.

There is an analogue of vector spaces for more general rings:

Definition 1.5. Let R be a ring (always assumed commutative, with unity). Then an R-module M is a triple (M, +, ·), where (M, +) is an abelian group, and there is a function R × M → M, whose value at (r, m) is denoted rm or r · m, such that:

1. For all r, s ∈ R and m ∈ M, r(sm) = (rs)m.
2. For all r, s ∈ R and m ∈ M, (r + s)m = rm + sm.
3. For all r ∈ R and m, n ∈ M, r(m + n) = rm + rn.
4. For all m ∈ M, 1 · m = m.
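As a concrete instance of the module axioms just listed, Z/6Z is a Z-module under the action r · m = rm mod 6. The following sketch (the function name `act` is my own, not from the notes) checks the four axioms on a range of samples:

```python
# Z/6Z as a Z-module: the action of r in Z on a residue m in Z/6Z
# is r·m = (r*m) mod 6.  Check the four module axioms on samples.
N = 6
def act(r, m):
    return (r * m) % N

residues = range(N)
ring_samples = range(-4, 5)

for r in ring_samples:
    for s in ring_samples:
        for m in residues:
            assert act(r, act(s, m)) == act(r * s, m)            # r(sm) = (rs)m
            assert act(r + s, m) == (act(r, m) + act(s, m)) % N  # (r+s)m = rm + sm
            for n_ in residues:
                assert act(r, (m + n_) % N) == (act(r, m) + act(r, n_)) % N  # r(m+n) = rm + rn
assert all(act(1, m) == m for m in residues)                     # 1·m = m
print("Z/6Z satisfies the Z-module axioms on the samples")
```

Note that Z/6Z is not a vector space over any subfield of Z, so this really does use the extra generality of modules.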

Example 1.6. R^n is an R-module, and, if I is an ideal of R, then both I and R/I are R-modules. Hence Z/nZ is a Z-module; in fact, a Z-module is the same thing as an abelian group. Submodules and homomorphisms of R-modules are defined in the obvious way. Despite the similarity with the definition of vector spaces, for a general ring R, R-modules can look quite complicated.

2 Linear independence and span

Let us introduce some terminology:

Definition 2.1. Let V be an F-vector space and let v1, . . . , vd ∈ V be a sequence of vectors. A linear combination of v1, . . . , vd is a vector of the form t1v1 + · · · + tdvd, where the ti ∈ F. The span of {v1, . . . , vd} is the set of all linear combinations of v1, . . . , vd:

span{v1, . . . , vd} = {t1v1 + · · · + tdvd : ti ∈ F for all i}.

By definition (or logic), span ∅ = {0}.
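Over a finite field the span can be listed exhaustively, which makes the definition very concrete. A sketch over F = F_2 (an example of my own, not from the notes; the helper `span` is hypothetical):

```python
from itertools import product

# Enumerate span{v1, v2} inside (F_2)^3: by definition, the set of ALL
# linear combinations t1*v1 + t2*v2 with t1, t2 in F_2 = {0, 1}.
def span(vectors, q=2):
    n = len(vectors[0])
    combos = set()
    for ts in product(range(q), repeat=len(vectors)):
        v = tuple(sum(t * w[i] for t, w in zip(ts, vectors)) % q
                  for i in range(n))
        combos.add(v)
    return combos

v1, v2 = (1, 0, 1), (0, 1, 1)
S = span([v1, v2])
print(sorted(S))          # the zero vector, v1, v2, and v1 + v2
assert len(S) == 4        # a subspace spanned by 2 independent vectors over F_2
```

Over an infinite field such as Q this enumeration is impossible, which is why membership in a span is instead decided by solving a linear system.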

We have the following properties of span:

Proposition 2.2. Let V be an F-vector space and let v1, . . . , vd ∈ V be a sequence of vectors. Then:

(i) span{v1, . . . , vd} is a vector subspace of V containing vi for every i.

(ii) If W is a vector subspace of V containing v1, . . . , vd, then span{v1, . . . , vd} ⊆ W. Thus span{v1, . . . , vd} is the smallest vector subspace of V containing v1, . . . , vd.

(iii) For every v ∈ V, span{v1, . . . , vd} ⊆ span{v1, . . . , vd, v}, and equality holds if and only if v ∈ span{v1, . . . , vd}.

Proof. (i) Taking tj = 0 for j ≠ i and ti = 1, we see that vi ∈ span{v1, . . . , vd}. Also, 0 = 0 · v1 + · · · + 0 · vd ∈ span{v1, . . . , vd}. Given v = t1v1 + · · · + tdvd and w = s1v1 + · · · + sdvd ∈ span{v1, . . . , vd}, we have v + w = (s1 + t1)v1 + · · · + (sd + td)vd ∈ span{v1, . . . , vd}, so span{v1, . . . , vd} is closed under addition; also −v = (−t1)v1 + · · · + (−td)vd ∈ span{v1, . . . , vd}. Hence span{v1, . . . , vd} is an additive subgroup of V. Finally, for all v = t1v1 + · · · + tdvd ∈ span{v1, . . . , vd} and t ∈ F, tv = (tt1)v1 + · · · + (ttd)vd ∈ span{v1, . . . , vd}. Hence span{v1, . . . , vd} is a vector subspace of V containing vi for every i.

(ii) If W is a vector subspace of V containing v1, . . . , vd, then, for all t1, . . . , td ∈ F, tivi ∈ W, and hence t1v1 + · · · + tdvd ∈ W. Hence span{v1, . . . , vd} ⊆ W.

(iii) We always have span{v1, . . . , vd} ⊆ span{v1, . . . , vd, v}, since we can write t1v1 + · · · + tdvd = t1v1 + · · · + tdvd + 0 · v. Now suppose that span{v1, . . . , vd} = span{v1, . . . , vd, v}. Then in particular v ∈ span{v1, . . . , vd}. Conversely, suppose that v ∈ span{v1, . . . , vd}. By (i), span{v1, . . . , vd} is a vector subspace of V containing v1, . . . , vd and v, and hence, by (ii), span{v1, . . . , vd, v} ⊆ span{v1, . . . , vd}. It follows that span{v1, . . . , vd, v} = span{v1, . . . , vd}.

Definition 2.3. A sequence of vectors w1, . . . , wℓ such that V = span{w1, . . . , wℓ} will be said to span V.

Definition 2.4. An F-vector space V is a finite dimensional vector space if there exist v1, . . . , vd ∈ V such that V = span{v1, . . . , vd}. The vector space V is infinite dimensional if it is not finite dimensional.

For example, F^n is finite dimensional, and the subspace Pn of F[x] defined in Example 1.2 is a finite dimensional vector space, as it is spanned by 1, x, . . . , x^n. But F[x] is not a finite dimensional vector space.
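For subspaces of F^n, membership in a span is a finite computation: v ∈ span{v1, . . . , vd} exactly when a linear system has a solution. A sketch over F = Q using row reduction (the helper name `in_span` is my own, and this is one of several reasonable ways to organize the computation):

```python
from fractions import Fraction

def in_span(vectors, v):
    """True iff v is a linear combination of `vectors` (tuples over Q).

    Row-reduce the spanning vectors into pivot rows, then reduce v by
    those pivots; v lies in the span iff it reduces to the zero vector.
    """
    pivots = []  # list of (pivot column, reduced row)
    for w in vectors:
        row = list(map(Fraction, w))
        for j, p in pivots:            # eliminate known pivots from this row
            if row[j]:
                c = row[j] / p[j]
                row = [a - c * b for a, b in zip(row, p)]
        for j, a in enumerate(row):    # record a new pivot if anything is left
            if a:
                pivots.append((j, row))
                break
    target = list(map(Fraction, v))
    for j, p in pivots:
        if target[j]:
            c = target[j] / p[j]
            target = [a - c * b for a, b in zip(target, p)]
    return all(a == 0 for a in target)

v1, v2 = (1, 0, 1), (0, 1, 1)
print(in_span([v1, v2], (2, 3, 5)))   # True: 2*v1 + 3*v2
print(in_span([v1, v2], (0, 0, 1)))   # False
```

The exact `Fraction` arithmetic matters here: with floating point, "reduces to zero" would need a tolerance.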

The next piece of terminology is there to deal with the fact that we might have chosen a highly redundant set of vectors to span V.

Definition 2.5. A sequence of vectors w1, . . . , wr ∈ V is linearly independent if the following holds: if there exist ti ∈ F such that t1w1 + · · · + trwr = 0, then ti = 0 for all i, 1 ≤ i ≤ r. The sequence w1, . . . , wr is linearly dependent if it is not linearly independent.

Note that the definition of linear independence does not depend only on the set {w1, . . . , wr}: if there are any repeated vectors wi = wj, j ≠ i, then we can express 0 as the nontrivial linear combination wi − wj, so the sequence is linearly dependent. Likewise, if one of the wi is zero, then the sequence is linearly dependent. If w1, . . . , wr are linearly independent, then so is any reordering of the vectors w1, . . . , wr, and likewise any smaller sequence (not allowing repeats), for example w1, . . . , ws with s ≤ r. Clearly, the empty set is linearly independent.

For example, the vectors e1, . . . , en ∈ F^n are linearly independent, since if t1e1 + · · · + tnen = 0, then (t1, . . . , tn) is the zero vector and thus ti = 0 for all i. More generally, for all j ≤ n, the vectors e1, . . . , ej are linearly independent.

Lemma 2.6. The vectors w1, . . . , wr are linearly independent if and only if, given ti, si ∈ F, 1 ≤ i ≤ r, such that t1w1 + · · · + trwr = s1w1 + · · · + srwr, we have ti = si for all i.

Proof. If w1, . . . , wr are linearly independent and t1w1 + · · · + trwr = s1w1 + · · · + srwr, then after subtracting and rearranging we have

(t1 − s1)w1 + · · · + (tr − sr)wr = 0.

Thus, by the definition of linear independence, ti − si = 0 for every i, i.e. ti = si. Conversely, if the last statement of the lemma holds and if t1w1 + · · · + trwr = 0, then it follows from t1w1 + · · · + trwr = 0 = 0 · w1 + · · · + 0 · wr that ti = 0 for all i. Hence w1, . . . , wr are linearly independent.

A related argument shows:

Lemma 2.7. The vectors w1, . . . , wr are not linearly independent if and only if we can write at least one of the wi as a linear combination of the wj, j ≠ i. The proof is left as an exercise.
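Linear independence in F^n is likewise decidable by row reduction: reduce each vector against the pivots found so far, and a vector that reduces to zero is a linear combination of its predecessors (compare Lemma 2.7). A sketch over F = Q (the function name is my own):

```python
from fractions import Fraction

def linearly_independent(vectors):
    """True iff the given tuples (over Q) are linearly independent."""
    pivots = []  # list of (pivot column, reduced row)
    for w in vectors:
        row = list(map(Fraction, w))
        for j, p in pivots:
            if row[j]:
                c = row[j] / p[j]
                row = [a - c * b for a, b in zip(row, p)]
        nonzero = [j for j, a in enumerate(row) if a]
        if not nonzero:
            return False          # w reduced to 0: dependent on predecessors
        pivots.append((nonzero[0], row))
    return True

e1, e2, e3 = (1, 0, 0), (0, 1, 0), (0, 0, 1)
print(linearly_independent([e1, e2, e3]))         # True
print(linearly_independent([e1, e2, (1, 1, 0)]))  # False: (1,1,0) = e1 + e2
print(linearly_independent([e1, (0, 0, 0)]))      # False: contains 0
```

This also illustrates the remarks above: a sequence containing the zero vector, or a repeat, is rejected immediately.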

Definition 2.8. The vectors v1, . . . , vd are a basis of V if they are linearly independent and V = span{v1, . . . , vd}. Thus, to say that the vectors v1, . . . , vr are a basis of V is to say that every x ∈ V can be uniquely written as x = t1v1 + · · · + trvr for ti ∈ F. For example, the standard basis vectors e1, . . . , en are a basis for F^n, and the elements 1, x, . . . , x^n are a basis for Pn.

Lemma 2.9 (Main counting argument). Suppose that w1, . . . , wb are linearly independent vectors contained in span{v1, . . . , va}. Then b ≤ a.

Proof. We shall show that, possibly after relabeling the vi,

span{v1, v2, . . . , va} = span{w1, v2, . . . , va} = span{w1, w2, v3, . . . , va} = · · · = span{w1, . . . , wb, vb+1, . . . , va}.

From this we will be able to conclude that b ≤ a. To begin, we may suppose that {w1, . . . , wb} ≠ ∅ (if {w1, . . . , wb} = ∅, then b = 0 and the conclusion b ≤ a is automatic). Moreover none of the wi is zero, since the wi are linearly independent; in particular {v1, . . . , va} ≠ ∅, i.e. a ≥ 1.

Since w1 ∈ span{v1, . . . , va}, we can write it as a linear combination of the vi:

w1 = t1v1 + · · · + tava.

Since w1 ≠ 0, at least one of the ti ≠ 0. After relabeling the vi, we may assume that t1 ≠ 0. Thus we can solve for v1 in terms of w1 and the vi, i > 1:

v1 = (1/t1)w1 − (t2/t1)v2 − · · · − (ta/t1)va.

It follows that v1 ∈ span{w1, v2, . . . , va}. Now, using some of the properties of span listed above, we have span{v1, v2, . . . , va} = span{w1, v1, v2, . . . , va} = span{w1, v2, . . . , va}, where the first equality holds since w1 ∈ span{v1, . . . , va} and the second since v1 ∈ span{w1, v2, . . . , va}. Next, write w2 as a vector in span{w1, v2, . . . , va}:

w2 = t1w1 + t2v2 + · · · + tava.

At least one of the ti, i ≥ 2, is nonzero, for otherwise we would have w2 = t1w1, and thus there would exist a nontrivial linear combination t1w1 + (−1)w2 = 0, contradicting the linear independence of the wi. Notice that, in particular, we must have a ≥ 2. After relabeling, we may assume that t2 ≠ 0, and thus we can solve for v2 in terms of w1, w2, and the vi, i ≥ 3:

v2 = −(t1/t2)w1 + (1/t2)w2 − (t3/t2)v3 − · · · − (ta/t2)va.

Arguing as before, span{w1, v2, v3, . . . , va} = span{w1, w2, v3, . . . , va}, and so span{v1, v2, . . . , va} = span{w1, w2, v3, . . . , va}.

By induction, suppose that we have showed that i ≤ a and that, after some relabeling of the vi, span{v1, v2, . . . , va} = span{w1, v2, . . . , va} = · · · = span{w1, . . . , wi, vi+1, . . . , va}. We claim that the same is true for i + 1. So, for a fixed i < b, write

wi+1 = t1w1 + · · · + tiwi + ti+1vi+1 + · · · + tava,

which is possible as wi+1 ∈ span{v1, . . . , va} = span{w1, . . . , wi, vi+1, . . . , va}. At least one of the numbers ti+1, . . . , ta is nonzero, for otherwise wi+1 = t1w1 + · · · + tiwi, which would say that the vectors w1, . . . , wi, wi+1 are not linearly independent, contradicting the linear independence of the wi. In particular, this says that i + 1 ≤ a. After relabeling, we may assume that ti+1 ≠ 0. Then, as before, we can solve for vi+1 in terms of the vectors w1, . . . , wi, wi+1 and vi+2, . . . , va. It follows that

span{w1, . . . , wi, vi+1, vi+2, . . . , va} = span{w1, . . . , wi, wi+1, vi+2, . . . , va},

and we have completed the inductive step. So, for all i ≤ b, after relabeling, span{v1, . . . , va} = span{w1, . . . , wi, vi+1, . . . , va}; in particular i ≤ a for every i ≤ b, and hence b ≤ a.

This rather complicated argument has the following consequences:

Corollary 2.10. (i) Suppose that V = span{v1, . . . , vn} and that w1, . . . , wℓ are linearly independent vectors in V. Then ℓ ≤ n.

(ii) If V is a finite dimensional F-vector space, then every two bases for V have the same number of elements; call this number the dimension of V, which we write as dim V, or dimF V if we want to emphasize the field F. Thus, for example, dim F^n = n and dim Pn = n + 1.

(iii) If V = span{v1, . . . , vd}, then some subsequence of v1, . . . , vd is a basis for V. Hence dim V ≤ d, and if dim V = d then v1, . . . , vd is a basis for V.

(iv) If V is a finite dimensional F-vector space and w1, . . . , wℓ are linearly independent vectors in V, then there exist vectors wℓ+1, . . . , wr ∈ V such that w1, . . . , wr is a basis for V. Hence dim V ≥ ℓ, and if dim V = ℓ then w1, . . . , wℓ is a basis for V.

(v) If V is a finite dimensional F-vector space and W is a vector subspace of V, then dim W ≤ dim V. Moreover dim W = dim V if and only if W = V.

(vi) If v1, . . . , vd is a basis for V, then the function f : F^d → V defined by f(t1, . . . , td) = t1v1 + · · · + tdvd is a linear isomorphism from F^d to V.

Proof. (i) Apply the lemma to the subspace V itself, which is the span of v1, . . . , vn, and to the linearly independent vectors w1, . . . , wℓ, to conclude that ℓ ≤ n.

(ii) If w1, . . . , wb and v1, . . . , va are two bases of V, then w1, . . . , wb are linearly independent vectors contained in V = span{v1, . . . , va}, so b ≤ a. But symmetrically, v1, . . . , va is a sequence of linearly independent vectors contained in V = span{w1, . . . , wb}, so a ≤ b. Thus a = b.

(iii) If v1, . . . , vd are not linearly independent, then, by Lemma 2.7, one of the vectors vi is expressed as a linear combination of the others. After relabeling, we may assume that vd is a linear combination of v1, . . . , vd−1, in which case span{v1, . . . , vd−1} = span{v1, . . . , vd} = V by (iii) of Proposition 2.2. Continue in this way until we find v1, . . . , vk with k ≤ d such that span{v1, . . . , vk} = span{v1, . . . , vd} = V and such that v1, . . . , vk are linearly independent, i.e. v1, . . . , vk is a basis for V. Then by definition k = dim V, and k ≤ d, so dim V ≤ d. Moreover k = d exactly when v1, . . . , vd are linearly independent; thus, if dim V = d, then v1, . . . , vd is a basis of V.

(iv) If span{w1, . . . , wℓ} = V, then w1, . . . , wℓ is a basis for V. Otherwise there exists a vector, call it wℓ+1 ∈ V, with wℓ+1 ∉ span{w1, . . . , wℓ}. Then w1, . . . , wℓ, wℓ+1 are still linearly independent: if there exist ti ∈ F, not all 0, such that 0 = t1w1 + · · · + tℓ+1wℓ+1, then we must have tℓ+1 = 0, for otherwise we could solve for wℓ+1 as a linear combination of w1, . . . , wℓ, and that would say that wℓ+1 ∈ span{w1, . . . , wℓ}, contradicting our choice. But then t1w1 + · · · + tℓwℓ = 0 with not all ti = 0, contradicting the linear independence of w1, . . . , wℓ. We continue in this way. Since the number of elements in a linearly independent sequence of vectors in V is at most dim V by (i), this procedure has to stop after at most dim V − ℓ stages. At that point we have found a linearly independent sequence which spans V, and thus is a basis. Hence dim V ≥ ℓ. To see the last statement, note that if span{w1, . . . , wℓ} ≠ V, then there exist ℓ + 1 linearly independent vectors in V, and hence dim V ≥ ℓ + 1; so if dim V = ℓ, then w1, . . . , wℓ span V and are a basis for V.

(v) Choosing a basis of W and applying (iv) (since the elements of a basis are linearly independent), we see that it can be completed to a basis of V. Thus dim W ≤ dim V. Moreover dim W = dim V if and only if the basis we chose for W was already a basis for V, i.e. if and only if W = V.

(vi) A straightforward calculation shows that f is linear. It is a bijection by definition of a basis: it is surjective since the vi span V, and it is injective since the vi are linearly independent, by Lemma 2.6.
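The subsequence extraction in the proof of Corollary 2.10 (iii) is effectively an algorithm: walk through a spanning sequence and discard each vector that lies in the span of the vectors kept so far. A sketch over F = Q, reusing row reduction (the helper name `extract_basis` is my own):

```python
from fractions import Fraction

def extract_basis(vectors):
    """Greedy subsequence of `vectors` (tuples over Q) forming a basis
    of their span: keep a vector iff it does not reduce to 0 against
    the pivot rows accumulated so far."""
    pivots = []   # (pivot column, reduced row)
    basis = []
    for w in vectors:
        row = list(map(Fraction, w))
        for j, p in pivots:
            if row[j]:
                c = row[j] / p[j]
                row = [a - c * b for a, b in zip(row, p)]
        nonzero = [j for j, a in enumerate(row) if a]
        if nonzero:                       # w is independent of those kept
            pivots.append((nonzero[0], row))
            basis.append(w)
    return basis

# A redundant spanning set of a plane in Q^3: (1,1,2) = v1 + v2 is dropped.
v1, v2 = (1, 0, 1), (0, 1, 1)
print(extract_basis([v1, v2, (1, 1, 2)]))   # [(1, 0, 1), (0, 1, 1)]
```

The number of vectors kept is the dimension of the span, matching (ii) and (iii).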

Proposition 2.11. If V is a finite dimensional F-vector space and W is a subset of V, then W is a vector subspace of V if and only if it is of the form span{v1, . . . , vd} for some v1, . . . , vd ∈ V.

Proof. We have already noted that a set of the form span{v1, . . . , vd} is a vector subspace of V. Conversely, let W be a vector subspace of V. The proof of (iv) of the above corollary shows how to find a basis of W: keep adjoining linearly independent vectors of W. Since the number of elements in a linearly independent sequence of vectors in V is at most dim V, this procedure has to stop, and when it stops we have found a basis of W. In particular, W is of the form span{v1, . . . , vd} for some v1, . . . , vd ∈ V.

3 Linear functions and matrices

Let f : F^n → F^k be a linear function. We can write f(ei) in terms of the basis e1, . . . , ek of F^k: suppose that

f(ei) = a1i e1 + · · · + aki ek = (a1i, . . . , aki).

We can then associate to f a k × n matrix A with coefficients in F as follows: define

        ( a11 a12 . . . a1n )
    A = (  .   .  . . .  .  )
        ( ak1 ak2 . . . akn )

Then the columns of A are the vectors f(ei), and f is completely determined by A, since f(t1, . . . , tn) = f(t1e1 + · · · + tnen) = t1f(e1) + · · · + tnf(en). Moreover, composition of linear maps corresponds to multiplication of matrices.

More generally, suppose that V1 and V2 are two finite dimensional vector spaces, and choose bases v1, . . . , vn of V1 and w1, . . . , wk of V2, so that n = dim V1 and k = dim V2. If f : V1 → V2 is linear, then, given the bases v1, . . . , vn and w1, . . . , wk, we again get a k × n matrix A by the formula A = (aij), where the aij are defined by

f(vi) = a1i w1 + · · · + aki wk.

We say that A is the matrix associated to the linear map f and the bases v1, . . . , vn and w1, . . . , wk. Again, composition of linear maps corresponds to multiplication of matrices.

Finally, let f : V → V be a linear map from a finite dimensional vector space V to itself, viewing V as both the domain and range of f. In this case, it is simplest to fix one basis v1, . . . , vn for V once and for all, and to write A = (aij), where f(vi) = a1i v1 + · · · + ani vn. Thus, having fixed the basis v1, . . . , vn, linear maps from V to itself correspond to n × n matrices, and composition of linear maps corresponds to matrix multiplication.

With the understanding that we have to choose bases to define the matrix associated to a linear map, we can define the determinant det A of an n × n matrix with coefficients in F by the usual formulas (for example, expansion by minors). In particular, the n × n matrix A is invertible ⇐⇒ det A ≠ 0, and in this case there is an explicit formula for A^{-1} (Cramer's rule).
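Both the recipe "column i of A is f(ei)" and the correspondence between composition and matrix multiplication can be verified directly. A small self-contained sketch (the function names `matrix_of` and `matmul` are my own):

```python
def matrix_of(f, n):
    """k x n matrix of a linear map f: F^n -> F^k, stored as a list of
    rows; column i is f(e_i), as in the text."""
    cols = [f(tuple(1 if j == i else 0 for j in range(n))) for i in range(n)]
    k = len(cols[0])
    return [[cols[i][j] for i in range(n)] for j in range(k)]

def matmul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Two linear maps F^2 -> F^2 (integer matrices, so any F containing Z).
f = lambda v: (v[0] + 2 * v[1], 3 * v[1])   # matrix [[1, 2], [0, 3]]
g = lambda v: (v[1], v[0])                  # matrix [[0, 1], [1, 0]]

A, B = matrix_of(f, 2), matrix_of(g, 2)
comp = matrix_of(lambda v: f(g(v)), 2)      # matrix of the composition f o g
assert comp == matmul(A, B)                 # composition <-> matrix product
print(A, B, comp)
```

Note the order: the matrix of f ∘ g is AB, with the matrix of the map applied second on the left.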
