7.2 Bases and Matrices in the SVD

1. The SVD produces orthonormal bases of $v$'s and $u$'s for the four fundamental subspaces.
2. Using those bases, $A$ becomes a diagonal matrix $\Sigma$ and $Av_i = \sigma_i u_i$: $\sigma_i$ = singular value.
3. The two-bases diagonalization $A = U\Sigma V^T$ often has more information than $A = X\Lambda X^{-1}$.
4. $U\Sigma V^T$ separates $A$ into rank-one matrices $\sigma_1 u_1 v_1^T + \cdots + \sigma_r u_r v_r^T$. $\sigma_1 u_1 v_1^T$ is the largest!

The Singular Value Decomposition is a highlight of linear algebra. $A$ is any $m$ by $n$ matrix, square or rectangular. Its rank is $r$. We will diagonalize this $A$, but not by $X^{-1}AX$. The eigenvectors in $X$ have three big problems: they are usually not orthogonal, there are not always enough eigenvectors, and $Ax = \lambda x$ requires $A$ to be a square matrix. The singular vectors of $A$ solve all those problems in a perfect way.

Let me describe what we want from the SVD: the right bases for the four subspaces. Then I will write about the steps to find those basis vectors in order of importance.

The price we pay is to have two sets of singular vectors, $u$'s and $v$'s. The $u$'s are in $\mathbf{R}^m$ and the $v$'s are in $\mathbf{R}^n$. They will be the columns of an $m$ by $m$ matrix $U$ and an $n$ by $n$ matrix $V$. I will first describe the SVD in terms of those basis vectors. Then I can also describe the SVD in terms of the orthogonal matrices $U$ and $V$.

(using vectors) The $u$'s and $v$'s give bases for the four fundamental subspaces:

    $u_1, \ldots, u_r$ is an orthonormal basis for the column space
    $u_{r+1}, \ldots, u_m$ is an orthonormal basis for the left nullspace $N(A^T)$
    $v_1, \ldots, v_r$ is an orthonormal basis for the row space
    $v_{r+1}, \ldots, v_n$ is an orthonormal basis for the nullspace $N(A)$

More than just orthogonality, these basis vectors diagonalize the matrix $A$:

    "$A$ is diagonalized"    $Av_1 = \sigma_1 u_1 \quad Av_2 = \sigma_2 u_2 \quad \cdots \quad Av_r = \sigma_r u_r$    (1)

Those singular values $\sigma_1$ to $\sigma_r$ will be positive numbers: $\sigma_i$ is the length of $Av_i$. The $\sigma$'s go into a diagonal matrix that is otherwise zero. That matrix is $\Sigma$.

(using matrices) Since the $u$'s are orthonormal, the matrix $U_r$
with those $r$ columns has $U_r^T U_r = I$. Since the $v$'s are orthonormal, the matrix $V_r$ has $V_r^T V_r = I$. Then the equations $Av_i = \sigma_i u_i$ tell us column by column that $AV_r = U_r \Sigma_r$:

    $AV_r = U_r \Sigma_r$    $(m \text{ by } n)(n \text{ by } r) = (m \text{ by } r)(r \text{ by } r)$    (2)

This is the heart of the SVD, but there is more. Those $v$'s and $u$'s account for the row space and column space of $A$. We have $n - r$ more $v$'s and $m - r$ more $u$'s, from the nullspace $N(A)$ and the left nullspace $N(A^T)$. They are automatically orthogonal to the first $v$'s and $u$'s (because the whole nullspaces are orthogonal). We now include all the $v$'s and $u$'s in $V$ and $U$, so these matrices become square. We still have $AV = U\Sigma$:

    $AV = U\Sigma$    $(m \text{ by } n)(n \text{ by } n) = (m \text{ by } m)(m \text{ by } n)$    (3)

The new $\Sigma$ is $m$ by $n$: the $r$ by $r$ matrix in equation (2) with extra zero rows and columns. Since $V$ is square and orthogonal, $V^{-1} = V^T$, and $AV = U\Sigma$ becomes the Singular Value Decomposition:

    $A = U\Sigma V^T = \sigma_1 u_1 v_1^T + \cdots + \sigma_r u_r v_r^T$    (4)

Since $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_r > 0$, the splitting in equation (4) gives the $r$ rank-one pieces of $A$ in order of importance. This is crucial.

Example 1  When is $A = U\Sigma V^T$ (singular values) the same as $X\Lambda X^{-1}$ (eigenvalues)?

Solution  $A$ needs orthonormal eigenvectors to allow $X = U = V$. $A$ also needs eigenvalues $\lambda \ge 0$ if $\Lambda = \Sigma$. So $A$ must be a positive semidefinite (or definite) symmetric matrix. Only then will $A = X\Lambda X^{-1}$, which is also $Q\Lambda Q^T$, coincide with $A = U\Sigma V^T$.

Example 2  If $A = xy^T$ (rank 1) with unit vectors $x$ and $y$, what is the SVD of $A$?

Solution  The reduced SVD in (2) is exactly $xy^T$, with rank $r = 1$. It has $u_1 = x$ and $v_1 = y$ and $\sigma_1 = 1$. For the full SVD, complete $u_1 = x$ to an orthonormal basis of $u$'s, and complete $v_1 = y$ to an orthonormal basis of $v$'s. No new $\sigma$'s, only $\sigma_1 = 1$.

Proof of the SVD

We need to show how those amazing $u$'s and $v$'s can be constructed. The $v$'s will be orthonormal eigenvectors of $A^T A$. This must be true because we are aiming for

    $A^T A = (U\Sigma V^T)^T (U\Sigma V^T) = V\Sigma^T U^T U \Sigma V^T = V \Sigma^T \Sigma V^T$    (5)

On the right you see the eigenvector matrix $V$ for the symmetric positive (semi)definite matrix $A^T A$. And $\Sigma^T \Sigma$ must be the eigenvalue matrix of $A^T A$: each $\sigma^2$ is a $\lambda(A^T A)$! Now $Av_i = \sigma_i u_i$ tells us the unit vectors $u_1$ to $u_r$. This is the key equation (1).
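The construction in this proof can be checked numerically. A minimal NumPy sketch (the matrix `A` below is a hypothetical example, not one from the text): the rows of `VT` are eigenvectors of $A^T A$ with eigenvalues $\sigma_i^2$, the columns of `U` are eigenvectors of $AA^T$ with the same eigenvalues, and the key equation (1) links them.

```python
import numpy as np

# Hypothetical 2 by 3 matrix, chosen only to illustrate the proof.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0]])

# Reduced SVD: U is 2 by 2, VT is 2 by 3, sigma holds the singular values.
U, sigma, VT = np.linalg.svd(A, full_matrices=False)

for i, s in enumerate(sigma):
    v = VT[i]          # right singular vector v_i
    u = U[:, i]        # left singular vector u_i
    # v_i is a unit eigenvector of A^T A with eigenvalue sigma_i^2.
    assert np.allclose(A.T @ A @ v, s**2 * v)
    # u_i is a unit eigenvector of A A^T with the same eigenvalue.
    assert np.allclose(A @ A.T @ u, s**2 * u)
    # The key equation (1): A v_i = sigma_i u_i.
    assert np.allclose(A @ v, s * u)
```

The assertions pass for any matrix in place of this `A`, which is the content of equation (5): the SVD factors are forced once we diagonalize $A^T A$.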
The essential point (the whole reason that the SVD succeeds) is that those unit vectors $u_1$ to $u_r$ are automatically orthogonal to each other (because the $v$'s are orthogonal):

    Key step    $u_i^T u_j = \left(\dfrac{Av_i}{\sigma_i}\right)^T \left(\dfrac{Av_j}{\sigma_j}\right) = \dfrac{v_i^T A^T A v_j}{\sigma_i \sigma_j} = \dfrac{\sigma_j}{\sigma_i}\, v_i^T v_j = 0 \quad (i \ne j)$    (6)

The $v$'s are eigenvectors of $A^T A$ (symmetric). They are orthogonal, and now the $u$'s are also orthogonal. Actually those $u$'s will be eigenvectors of $AA^T$.

Finally we complete the $v$'s and $u$'s to $n$ $v$'s and $m$ $u$'s with any orthonormal bases for the nullspaces $N(A)$ and $N(A^T)$. We have found $V$ and $\Sigma$ and $U$ in $A = U\Sigma V^T$.

An Example of the SVD

Here is an example to show the computation of all three matrices in $A = U\Sigma V^T$.

Example 3  Find the matrices $U$, $\Sigma$, $V$ for $A = \begin{bmatrix} 3 & 0 \\ 4 & 5 \end{bmatrix}$.

The rank is $r = 2$. With rank 2, this $A$ has two positive singular values $\sigma_1$ and $\sigma_2$. We will see that $\sigma_1$ is larger than $\lambda_{\max} = 5$, and $\sigma_2$ is smaller than $\lambda_{\min} = 3$. Begin with $A^T A$ and $AA^T$:

    $A^T A = \begin{bmatrix} 25 & 20 \\ 20 & 25 \end{bmatrix}$    $AA^T = \begin{bmatrix} 9 & 12 \\ 12 & 41 \end{bmatrix}$

Those have the same trace (50) and the same eigenvalues $\sigma_1^2 = 45$ and $\sigma_2^2 = 5$. The square roots are $\sigma_1 = \sqrt{45}$ and $\sigma_2 = \sqrt{5}$. Then $\sigma_1 \sigma_2 = 15$, and this is the determinant of $A$.

A key step is to find the eigenvectors of $A^T A$ (with eigenvalues 45 and 5):

    $\begin{bmatrix} 25 & 20 \\ 20 & 25 \end{bmatrix}\begin{bmatrix} 1 \\ 1 \end{bmatrix} = 45\begin{bmatrix} 1 \\ 1 \end{bmatrix}$    $\begin{bmatrix} 25 & 20 \\ 20 & 25 \end{bmatrix}\begin{bmatrix} -1 \\ 1 \end{bmatrix} = 5\begin{bmatrix} -1 \\ 1 \end{bmatrix}$

Then $v_1$ and $v_2$ are those orthogonal eigenvectors rescaled to length 1. Divide by $\sqrt{2}$:

    Right singular vectors    $v_1 = \dfrac{1}{\sqrt{2}}\begin{bmatrix} 1 \\ 1 \end{bmatrix}$    $v_2 = \dfrac{1}{\sqrt{2}}\begin{bmatrix} -1 \\ 1 \end{bmatrix}$    Left singular vectors    $u_i = \dfrac{Av_i}{\sigma_i}$

Now compute $Av_1$ and $Av_2$, which will be $\sigma_1 u_1 = \sqrt{45}\,u_1$ and $\sigma_2 u_2 = \sqrt{5}\,u_2$:

    $Av_1 = \dfrac{1}{\sqrt{2}}\begin{bmatrix} 3 \\ 9 \end{bmatrix} = \sqrt{45}\,\dfrac{1}{\sqrt{10}}\begin{bmatrix} 1 \\ 3 \end{bmatrix} = \sigma_1 u_1$

    $Av_2 = \dfrac{1}{\sqrt{2}}\begin{bmatrix} -3 \\ 1 \end{bmatrix} = \sqrt{5}\,\dfrac{1}{\sqrt{10}}\begin{bmatrix} -3 \\ 1 \end{bmatrix} = \sigma_2 u_2$

The division by $\sqrt{10}$ makes $u_1$ and $u_2$ orthonormal. Then $\sigma_1 = \sqrt{45}$ and $\sigma_2 = \sqrt{5}$ as expected. The Singular Value Decomposition of $A$ is $U$ times $\Sigma$ times $V^T$:

    $U = \dfrac{1}{\sqrt{10}}\begin{bmatrix} 1 & -3 \\ 3 & 1 \end{bmatrix}$    $\Sigma = \begin{bmatrix} \sqrt{45} & 0 \\ 0 & \sqrt{5} \end{bmatrix}$    $V = \dfrac{1}{\sqrt{2}}\begin{bmatrix} 1 & -1 \\ 1 & 1 \end{bmatrix}$    (7)

$U$ and $V$ contain orthonormal bases for the column space and the row space (both spaces are just $\mathbf{R}^2$). The real achievement is that those two bases diagonalize $A$: $AV$ equals $U\Sigma$. Then the matrix $A$ splits into a combination of two rank-one matrices, columns times rows:

    $\sigma_1 u_1 v_1^T + \sigma_2 u_2 v_2^T = \dfrac{\sqrt{45}}{\sqrt{20}}\begin{bmatrix} 1 & 1 \\ 3 & 3 \end{bmatrix} + \dfrac{\sqrt{5}}{\sqrt{20}}\begin{bmatrix} 3 & -3 \\ -1 & 1 \end{bmatrix} = \begin{bmatrix} 3 & 0 \\ 4 & 5 \end{bmatrix} = A$
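Example 3 can be confirmed with NumPy's built-in SVD. The signs of the computed singular vectors may differ from the hand computation, but the singular values $\sqrt{45}$ and $\sqrt{5}$ and the rank-one splitting do not:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

U, sigma, VT = np.linalg.svd(A)

# The singular values are sqrt(45) and sqrt(5); their product is det A = 15.
assert np.allclose(sigma, [np.sqrt(45.0), np.sqrt(5.0)])
assert np.isclose(sigma[0] * sigma[1], np.linalg.det(A))

# A^T A and A A^T share the trace 50 (and the eigenvalues 45 and 5).
assert np.isclose(np.trace(A.T @ A), 50.0)
assert np.isclose(np.trace(A @ A.T), 50.0)

# The two rank-one pieces sigma_i u_i v_i^T add back to A, largest piece first.
pieces = [sigma[i] * np.outer(U[:, i], VT[i]) for i in range(2)]
assert np.allclose(pieces[0] + pieces[1], A)
```

Keeping only `pieces[0]` gives the best rank-one approximation in this splitting, which is why listing the pieces in decreasing order of $\sigma_i$ matters.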

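The four fundamental subspaces at the start of this section can also be checked in code. A minimal sketch, using a hypothetical 3 by 4 matrix of rank 2 (an illustration, not a matrix from the text): the first $r$ singular vectors carry the column space and row space, and the leftover ones carry the two nullspaces.

```python
import numpy as np

# Hypothetical 3 by 4 matrix of rank 2 (row 3 = row 1 + row 2).
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 1.0],
              [1.0, 3.0, 1.0, 2.0]])
m, n = A.shape
r = np.linalg.matrix_rank(A)           # r = 2

U, sigma, VT = np.linalg.svd(A)        # full SVD: U is m by m, VT is n by n

# u_1, ..., u_r span the column space: projecting A onto them changes nothing.
Ur = U[:, :r]
assert np.allclose(Ur @ Ur.T @ A, A)

# u_{r+1}, ..., u_m span the left nullspace N(A^T).
assert np.allclose(A.T @ U[:, r:], 0)

# v_1, ..., v_r span the row space; v_{r+1}, ..., v_n span the nullspace N(A).
Vr = VT[:r].T
assert np.allclose(A @ Vr @ Vr.T, A)
assert np.allclose(A @ VT[r:].T, 0)
```

Because the matrix is rank deficient, only $r = 2$ singular values are (numerically) nonzero, and the extra columns of $U$ and rows of $V^T$ are exactly the orthonormal nullspace bases that complete the full SVD.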