2E1: Linear Algebra | Lecture Notes

3 Eigenvalues and eigenvectors

Many physical systems arising in engineering applications can be represented as discrete models involving matrices. Some key parameters describing physical systems (e.g., the resonance frequency) are closely related to eigenvalues of the matrix representing the system. That is why eigenvalue analysis is ubiquitous in all branches of modern engineering. For example, the natural frequency of a bridge is the eigenvalue of smallest magnitude of a system that models the bridge; engineers exploit this knowledge to ensure the stability of their constructions. Eigenvalue analysis is also used in the design of car stereo systems, where it helps to reduce the vibration of the car due to the music. In electrical engineering, eigenvalues and eigenvectors are used to decouple three-phase systems through the symmetrical component transformation.

3.1 Definitions of eigenvalues and eigenvectors

The eigenvectors of a square matrix A are the non-zero vectors x that, after being multiplied by the matrix, remain parallel to the original vector. For each eigenvector, the corresponding eigenvalue is the factor by which the eigenvector is scaled when multiplied by the matrix (see Figure 3.1).

The prefix eigen- is adopted from the German word "eigen" for "own" in the sense of a characteristic description (that is why the eigenvectors are sometimes also called characteristic vectors, and, similarly, the eigenvalues are also known as characteristic values).

[Figure 3.1: multiplication by A scales an eigenvector without changing its direction.]

Now we can give a formal mathematical description of this idea. Given a square matrix A, let us consider the problem of finding numbers λ (real or complex) and vectors (vector-columns) x (x ≠ 0) such that

    Ax = λx.                                                        (1)

This problem is called the eigenvalue problem, the numbers λ are called the eigenvalues of the matrix A, and the non-zero vectors x are called the eigenvectors corresponding to the eigenvalue λ.

Points to note:

- we are not interested in the trivial solution x = 0 of problem (1);
- eigenvectors are only unique up to a multiplicative factor, i.e., if x satisfies (1) for some λ then so does cx, where c is any non-zero constant.

3.2 Finding eigenvalues

First, we note that λx = λIx, where I is the identity matrix. Then we can rewrite equation (1) in the form Ax − λIx = 0, or

    (A − λI)x = 0.                                                  (2)

Matrix equation (2) (which, in fact, represents a linear system) has a non-trivial solution x ≠ 0 if and only if the matrix A − λI of this system is singular, which is the case if and only if

    det(A − λI) = 0.                                                (3)

Thus, we have an equation for finding the eigenvalues λ. Equation (3) is called the characteristic equation.

Points to note:

- if A is an n × n matrix, then (3) is a polynomial equation (in λ) of degree n and it has n (in general, complex) solutions;
- solutions to equation (3) may be repeated (e.g., the equation (λ − 1)² = 0 has two solutions which coalesce, λ₁ = λ₂ = 1); in this case we say that the eigenvalue λ has multiplicity m_λ > 1; if A is an n × n matrix, then m_λ ≤ n.
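As a quick numerical check (a sketch added here, not part of the original notes), the coefficients of the characteristic polynomial and its roots can be compared with the eigenvalues returned by numpy; the matrix used below is the one from Example 3.2 that follows.

    import numpy as np

    # Matrix from Example 3.2 below
    A = np.array([[1.0, 1.0],
                  [0.0, 2.0]])

    # Coefficients of det(A - lambda*I) = 0, highest power first
    char_poly = np.poly(A)        # for this A: [1, -3, 2], i.e. l^2 - 3l + 2
    roots = np.roots(char_poly)   # roots of the characteristic equation

    # Direct eigenvalue computation for comparison
    eigvals = np.linalg.eigvals(A)

    print(char_poly)              # [ 1. -3.  2.]
    print(np.sort(roots))         # [1. 2.]
    print(np.sort(eigvals))       # [1. 2.]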
Example 3.1 (diagonal matrix): find the eigenvalues of the matrix

    A = ( 1  0 )
        ( 0  2 ).

We have

    A − λI = ( 1  0 ) − ( λ  0 ) = ( 1−λ   0  ).
             ( 0  2 )   ( 0  λ )   (  0   2−λ )

Hence, we can write the characteristic equation:

    det(A − λI) = | 1−λ   0  | = 0.
                  |  0   2−λ |

This gives (1 − λ)(2 − λ) = 0, and we find two eigenvalues of the matrix A: λ₁ = 1 and λ₂ = 2.

Example 3.2 (triangular matrix): find the eigenvalues of the matrix

    A = ( 1  1 )
        ( 0  2 ).

We have

    A − λI = ( 1  1 ) − ( λ  0 ) = ( 1−λ   1  ).
             ( 0  2 )   ( 0  λ )   (  0   2−λ )

Hence, the characteristic equation is

    det(A − λI) = | 1−λ   1  | = (1 − λ)(2 − λ) = 0.
                  |  0   2−λ |

This gives two eigenvalues: λ₁ = 1 and λ₂ = 2.

Example 3.3 (symmetric matrix): find the eigenvalues of the matrix

    A = ( 1  1 )
        ( 1  2 ).

Characteristic equation:

    det(A − λI) = 0  =>  | 1−λ   1  | = 0,
                         |  1   2−λ |

    (1 − λ)(2 − λ) − 1 = 0,
    λ² − 3λ + 1 = 0,
    λ = (3 ± √5)/2.

Hence, the eigenvalues are λ₁ = (3 + √5)/2 and λ₂ = (3 − √5)/2.

Example 3.4 (non-symmetric matrix): find the eigenvalues of the matrix

    A = (  1  1 )
        ( −2  3 ).

Characteristic equation:

    det(A − λI) = 0  =>  | 1−λ   1  | = 0,
                         | −2   3−λ |

    (1 − λ)(3 − λ) + 2 = 0,
    λ² − 4λ + 5 = 0,
    λ = 2 ± i.

Hence, the eigenvalues are the complex numbers λ₁ = 2 + i and λ₂ = 2 − i.

Points to note:

- if A is a triangular matrix (either lower-triangular, upper-triangular or diagonal) then the eigenvalues of A are its diagonal entries (see Examples 3.1 and 3.2);
- if A is a symmetric matrix with real elements then the eigenvalues of A are always real (see Example 3.3);
- if A is a non-symmetric matrix then its eigenvalues are either real or occur in complex-conjugate pairs (see Example 3.4).

3.3 Finding eigenvectors

Having solved (3) to find the eigenvalues, we then substitute them into (1) (or into (2)) to obtain the corresponding eigenvectors.

Example 3.2 (triangular matrix; continued): find the eigenvectors of the matrix

    A = ( 1  1 )
        ( 0  2 ).

The first eigenvalue: λ₁ = 1,

    Ax = 1·x  =>  ( 1  1 ) ( x₁ ) = ( x₁ ).
                  ( 0  2 ) ( x₂ )   ( x₂ )

This gives the system of equations

    x₁ + x₂ = x₁,
    2x₂ = x₂,

which has infinitely many solutions x₂ = 0, x₁ = a (a is arbitrary). Hence, any vector x = (a, 0)ᵀ with a ≠ 0 is an eigenvector of A corresponding to the eigenvalue λ₁ = 1.

Analogously, for the second eigenvalue: λ₂ = 2,

    Ax = 2x  =>  ( 1  1 ) ( x₁ ) = 2 ( x₁ ).
                 ( 0  2 ) ( x₂ )     ( x₂ )

This gives the system of equations

    x₁ + x₂ = 2x₁,
    2x₂ = 2x₂,

which has infinitely many solutions x₁ = x₂ = a (a is arbitrary). Then, any vector x = (a, a)ᵀ with a ≠ 0 is an eigenvector of A corresponding to the eigenvalue λ₂ = 2.

We can scale the eigenvectors (e.g., by choosing a = 1 in the above example) to obtain a unique eigenvector for each eigenvalue. In Example 3.2 given above, the eigensystem of the matrix A is then given by

    {1, (1, 0)ᵀ},  {2, (1, 1)ᵀ}.

Sometimes it is useful to normalise eigenvectors. The eigensystem of the matrix A in Example 3.2 is then written as follows:

    {1, (1, 0)ᵀ},  {2, (1/√2, 1/√2)ᵀ}.
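The eigensystems found by hand can be checked with numpy's eig routine. The sketch below (not part of the original notes) uses the matrix of Example 3.2; note that numpy normalises the returned eigenvectors, so they may differ from the hand-computed ones by a scalar factor.

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [0.0, 2.0]])

    # Columns of V are eigenvectors; eigenvalues and columns correspond in order
    eigvals, V = np.linalg.eig(A)

    print(eigvals)   # [1. 2.]
    print(V)         # columns proportional to (1, 0)^T and (1, 1)^T

    # Check A x = lambda x for each eigenpair
    for lam, x in zip(eigvals, V.T):
        assert np.allclose(A @ x, lam * x)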
Example 3.3 (symmetric matrix; continued): find the eigenvectors of the matrix

    A = ( 1  1 )
        ( 1  2 ).

1st eigenvalue: λ₁ = (3 + √5)/2.

    Ax = λ₁x  =>  ( 1  1 ) ( x₁ ) = (3 + √5)/2 ( x₁ ),
                  ( 1  2 ) ( x₂ )              ( x₂ )

which gives the system

    x₁ + x₂ = ((3 + √5)/2) x₁,                                      (i)
    x₁ + 2x₂ = ((3 + √5)/2) x₂.                                     (ii)

These two equations are the same: to see this, take x₂ = ((1 + √5)/2) x₁ from (i) and substitute it into (ii). Thus, we can fix x₂ = α; then from (i) we get x₁ = ((−1 + √5)/2) α. Therefore,

    x = ( ((−1 + √5)/2) α )
        (        α        )

with α ≠ 0 is an eigenvector of A corresponding to λ₁ = (3 + √5)/2.

2nd eigenvalue: λ₂ = (3 − √5)/2.

    Ax = λ₂x  =>  ( 1  1 ) ( x₁ ) = (3 − √5)/2 ( x₁ ),
                  ( 1  2 ) ( x₂ )              ( x₂ )

which gives the system

    x₁ + x₂ = ((3 − √5)/2) x₁,
    x₁ + 2x₂ = ((3 − √5)/2) x₂.

These two equations are again the same. Thus, we can fix x₂ = β; then, from the first equation, x₁ = ((−1 − √5)/2) β. Therefore, for any β ≠ 0, the eigenvector y of A corresponding to λ₂ is

    y = ( ((−1 − √5)/2) β )
        (        β        ).

The eigensystem of the matrix A in Example 3.3 is given by

    {(3 + √5)/2, x},  {(3 − √5)/2, y},

with x and y as above (e.g., with α = β = 1). Observe that x and y are orthogonal, i.e., x · y = 0, where x · y denotes the scalar (dot) product.

Point to note:

- in general, the eigenvectors associated with distinct eigenvalues of a symmetric matrix are mutually orthogonal.

Example 3.5: find the eigenvectors of the matrix

    A = ( 2  1 )
        ( 0  2 ).

First, we need to find the eigenvalues:

    A − λI = ( 2−λ   1  ),
             (  0   2−λ )

and the characteristic equation is

    | 2−λ   1  | = 0  =>  (2 − λ)² = 0  =>  λ = 2.
    |  0   2−λ |

There is only one (repeated) eigenvalue. The eigenvectors satisfy Ax = 2x:

    ( 2  1 ) ( x₁ ) = 2 ( x₁ )  =>  2x₁ + x₂ = 2x₁,                  (i)
    ( 0  2 ) ( x₂ )     ( x₂ )      2x₂ = 2x₂.                       (ii)

Equation (ii) is always true, so we can discard it. From equation (i) we have x₂ = 0, while x₁ can be arbitrary. Thus, the eigenvectors corresponding to λ = 2 are the vectors (α, 0)ᵀ for any α ≠ 0. We can scale by selecting α = 1. Hence, the eigensystem of the matrix A is

    {2, (1, 0)ᵀ}.

Note that in this example A is a 2 × 2 matrix; however, there is only one (repeated) eigenvalue and only one eigenvector.

3.4 Properties of eigenvalues and eigenvectors

Eigenvalues and eigenvectors have a number of useful properties (we have already mentioned some of them in §§3.1–3.3).

Property 1. Let A be an n × n matrix, and λ₁, λ₂, ..., λₙ be the eigenvalues of A. Then

(i) the matrix Aᵀ has the same eigenvalues λ₁, λ₂, ..., λₙ;
(ii) the inverse matrix A⁻¹ (if it exists) has eigenvalues λ₁⁻¹, λ₂⁻¹, ..., λₙ⁻¹;
(iii) the matrix A − αI has eigenvalues λ₁ − α, λ₂ − α, ..., λₙ − α;
(iv) for any non-negative integer k, the matrix Aᵏ has eigenvalues λ₁ᵏ, λ₂ᵏ, ..., λₙᵏ.

Proof.

(i) Indeed, recalling that det(A) = det(Aᵀ) for any square matrix A (see §1.5), the characteristic equation (3) for the matrix A implies (see also §1.4)

    det(A − λI) = det((A − λI)ᵀ) = det(Aᵀ − λI) = 0.

This immediately shows that λ is also an eigenvalue of Aᵀ.

(ii) If A has an inverse A⁻¹, we can left-multiply the eigenvalue problem (1) by A⁻¹ to obtain

    A⁻¹(Ax) = λA⁻¹x.

This gives

    x = λA⁻¹x,

or, dividing by the scalar λ,

    A⁻¹x = λ⁻¹x.

This shows that if λ and x are, respectively, an eigenvalue and an eigenvector of A, then λ⁻¹ and x are, respectively, an eigenvalue and an eigenvector of A⁻¹.

Properties (iii) and (iv) are proved similarly to property (ii).

Property 2. For any square matrix A, the sum of the eigenvalues is equal to the sum of the diagonal elements of A (which is called the trace of A).

Property 3. For any square matrix A, the product of the eigenvalues is equal to the determinant of A.

N.B. When using Properties 2 and 3, one must count repeated eigenvalues according to their multiplicity.
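Properties 1–3 are easy to verify numerically. A minimal sketch (not part of the original notes), using the symmetric matrix of Example 3.3:

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, 2.0]])           # matrix from Example 3.3

    lam = np.linalg.eigvals(A)           # (3 +/- sqrt(5)) / 2

    # Property 2: the sum of the eigenvalues equals the trace
    assert np.isclose(lam.sum(), np.trace(A))

    # Property 3: the product of the eigenvalues equals the determinant
    assert np.isclose(lam.prod(), np.linalg.det(A))

    # Property 1(ii): the eigenvalues of the inverse are the reciprocals
    lam_inv = np.linalg.eigvals(np.linalg.inv(A))
    assert np.allclose(np.sort(lam_inv), np.sort(1.0 / lam))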
Property 4. The eigenvectors of a square matrix A corresponding to distinct eigenvalues are linearly independent, i.e., one eigenvector cannot be written as a linear combination of the other eigenvectors.

3.5 Diagonalization of matrices

Diagonalization means transforming a non-diagonal matrix into an equivalent matrix which is diagonal and hence simpler to deal with.

3.5.1 Diagonalizing matrices with distinct eigenvalues

Let A be an n × n matrix with real elements. Assume that A has n distinct eigenvalues λ₁, λ₂, ..., λₙ with the corresponding eigenvectors x₁, x₂, ..., xₙ, so that

    Ax₁ = λ₁x₁ = x₁λ₁,
    Ax₂ = λ₂x₂ = x₂λ₂,
    ...
    Axₙ = λₙxₙ = xₙλₙ.

Let us write these equations as columns of a matrix:

    (Ax₁ Ax₂ ... Axₙ) = (x₁λ₁ x₂λ₂ ... xₙλₙ),

or

    A (x₁ x₂ ... xₙ) = (x₁ x₂ ... xₙ) ( λ₁  0  ...  0 )
                                      (  0  λ₂ ...  0 )
                                      (  ...          )
                                      (  0   0 ... λₙ ),

which can be written as the matrix equation

    AP = PD,                                                        (4)

where

    P = (x₁ x₂ ... xₙ)

is the matrix formed by the eigenvectors of A and

    D = ( λ₁  0  ...  0 )
        (  0  λ₂ ...  0 )
        (  ...          )
        (  0   0 ... λₙ )

is the diagonal matrix of eigenvalues.

Since the matrix A has n distinct eigenvalues, the eigenvectors x₁, x₂, ..., xₙ are linearly independent (see Property 4 in §3.4). Therefore, the matrix P (which is formed by these eigenvectors) is not singular, i.e., det P ≠ 0 and the inverse matrix P⁻¹ exists. Then, we left-multiply (4) by P⁻¹ to obtain

    P⁻¹AP = P⁻¹PD,

which implies (due to P⁻¹P = I) the matrix equation

    P⁻¹AP = D.                                                      (5)

Points to note:

- any n × n matrix with n distinct eigenvalues can be diagonalized;
- the matrix P in (4) is called the modal matrix of A;
- since D is a diagonal matrix with eigenvalues λ₁, ..., λₙ, which are the same as those of A, the matrices D and A are called similar matrices;
- the transformation of A into D using (5) is called a similarity transformation;
- if we right-multiply (4) by P⁻¹ we obtain

      A = PDP⁻¹,                                                    (6)

  which is an alternative form of (5).

Example 3.2 (continued): find the similarity transformation for the matrix

    A = ( 1  1 ).
        ( 0  2 )

We have (see previous calculations for this example in §3.3)

    P = ( 1  1 ),   D = ( 1  0 ),   P⁻¹ = ( 1  −1 ).
        ( 0  1 )        ( 0  2 )          ( 0   1 )

Find AP and PD:

    AP = ( 1  1 ) ( 1  1 ) = ( 1  2 ),
         ( 0  2 ) ( 0  1 )   ( 0  2 )

    PD = ( 1  1 ) ( 1  0 ) = ( 1  2 ).
         ( 0  1 ) ( 0  2 )   ( 0  2 )

Hence AP = PD, and the similarity transformation is

    P⁻¹AP = D:  ( 1  −1 ) ( 1  1 ) ( 1  1 ) = ( 1  0 ),
                ( 0   1 ) ( 0  2 ) ( 0  1 )   ( 0  2 )

or, equivalently, A = PDP⁻¹:

    ( 1  1 ) = ( 1  1 ) ( 1  0 ) ( 1  −1 ).
    ( 0  2 )   ( 0  1 ) ( 0  2 ) ( 0   1 )

3.5.2 Diagonalization of symmetric matrices

Let A be a symmetric n × n matrix with real elements, i.e., A = Aᵀ. From §§3.2 and 3.3 we know that

- the eigenvalues of A are always real;
- the eigenvectors associated with distinct eigenvalues of A are mutually orthogonal.

Key point: if A is symmetric then the modal matrix P formed from suitably normalized eigenvectors satisfies the property

    P⁻¹ = Pᵀ   or, equivalently,   PPᵀ = PᵀP = I.                   (7)

Due to this property, the diagonalization of symmetric matrices is simple:

    (5), (7)  =>  P⁻¹AP = PᵀAP = D.

Points to note:

- any symmetric matrix with real elements can be diagonalized (even if some of its eigenvalues are repeated);
- square matrices satisfying (7) are called orthogonal matrices.
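For symmetric matrices, numpy's eigh routine returns an orthogonal modal matrix directly, so the similarity transformation reduces to PᵀAP. A short sketch (not part of the original notes), using the matrix of Example 3.6 below:

    import numpy as np

    A = np.array([[0.0, 1.0],
                  [1.0, 0.0]])                       # symmetric matrix of Example 3.6

    # eigh is intended for symmetric matrices: the eigenvalues are real and
    # the columns of P are orthonormal eigenvectors
    lam, P = np.linalg.eigh(A)

    print(lam)                                       # [-1.  1.]
    print(np.allclose(P.T @ P, np.eye(2)))           # True: P is orthogonal, P^{-1} = P^T
    print(np.allclose(P.T @ A @ P, np.diag(lam)))    # True: P^T A P = D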
Example 3.6: find the orthogonal modal matrix for the symmetric matrix

    A = ( 0  1 ),
        ( 1  0 )

and diagonalize A.

First, we need to find the eigenvalues of A:

    det(A − λI) = | −λ   1 | = λ² − 1 = 0.
                  |  1  −λ |

Hence, there are two eigenvalues: λ₁ = 1 and λ₂ = −1.

Find the eigenvector corresponding to λ₁ = 1:

    Ax = 1·x  =>  ( 0  1 ) ( x₁ ) = ( x₁ )  =>  x₂ = x₁.
                  ( 1  0 ) ( x₂ )   ( x₂ )

Hence x = (α, α)ᵀ for any α ≠ 0 is an eigenvector corresponding to λ₁. We normalise it, i.e., fix α = 1/√2 such that ||x|| = √(x₁² + x₂²) = 1:

    x = ( 1/√2 ).
        ( 1/√2 )

Find the eigenvector corresponding to λ₂ = −1:

    Ay = −y  =>  ( 0  1 ) ( y₁ ) = −( y₁ )  =>  y₂ = −y₁.
                 ( 1  0 ) ( y₂ )    ( y₂ )

For example, fix y₁ = 1; then y₂ = −1, and the normalised eigenvector corresponding to λ₂ = −1 is

    y = (  1/√2 ).
        ( −1/√2 )

The modal matrix

    P = ( 1/√2   1/√2 )
        ( 1/√2  −1/√2 )

is orthogonal because Pᵀ = P⁻¹; indeed,

    PPᵀ = ( 1/√2   1/√2 ) ( 1/√2   1/√2 ) = ( 1  0 ) = I.
          ( 1/√2  −1/√2 ) ( 1/√2  −1/√2 )   ( 0  1 )

The diagonal matrix similar to A is

    D = ( 1   0 ),
        ( 0  −1 )

and then A = PDPᵀ.

3.5.3 Matrices with repeated eigenvalues

Let us briefly address the case of non-symmetric matrices with at least one repeated eigenvalue. As we can see from the two examples below, some matrices of this type are diagonalizable but others are not.

Example 3.5 (continued): prove that the similarity transformation is not possible for the matrix

    A = ( 2  1 )
        ( 0  2 )

(i.e., that A is not diagonalizable).

As we have seen in §3.3 (see Example 3.5 therein), the matrix A has one eigenvalue λ = 2 of multiplicity 2, and the corresponding eigenvectors are multiples of (1, 0)ᵀ. If we attempt to form a modal matrix P from any two of these eigenvectors, e.g., (1, 0)ᵀ and (2, 0)ᵀ, it will always have zero determinant. Hence, the inverse matrix P⁻¹ does not exist, and the similarity transformation P⁻¹AP that we use to diagonalize a matrix is not possible here.

Example 3.7: the matrix

    A = (  5   −4   4 )
        ( 12  −11  12 )
        (  4   −4   5 )

has eigenvalues −3, 1, 1. The eigenvector corresponding to the eigenvalue −3 is x = (1, 3, 1)ᵀ (or any multiple). Investigate the eigenvectors associated with the repeated eigenvalue λ = 1 and deduce whether A is diagonalizable or not.

For the required eigenvector y = (y₁, y₂, y₃)ᵀ we need to solve the equation Ay = 1·y, i.e.,

    (  5   −4   4 ) ( y₁ )   ( y₁ )
    ( 12  −11  12 ) ( y₂ ) = ( y₂ ).
    (  4   −4   5 ) ( y₃ )   ( y₃ )

After simplification, each equation here gives y₁ − y₂ + y₃ = 0. So we have just one equation in three unknowns, and we can choose any two values arbitrarily. For example, the choices y₁ = 1, y₂ = 0 (and hence y₃ = −1) and y₁ = 0, y₂ = 1 (and hence y₃ = 1) give rise to the linearly independent eigenvectors

    y⁽¹⁾ = (  1 )          y⁽²⁾ = ( 0 )
           (  0 )   and           ( 1 ).
           ( −1 )                 ( 1 )

We can thus form a non-singular modal matrix P from y⁽¹⁾ and y⁽²⁾ together with the given vector x:

    P = ( 1   1  0 )
        ( 3   0  1 ),   det P = −1 ≠ 0.
        ( 1  −1  1 )

We can then indeed diagonalize A through the transformation

    P⁻¹AP = D = ( −3  0  0 )
                (  0  1  0 ),
                (  0  0  1 )

with

    P⁻¹ = ( −1   1  −1 )
          (  2  −1   1 ).
          (  3  −2   3 )

Points to note:

- non-symmetric matrices with repeated eigenvalues may be diagonalizable, but they may not be;
- an n × n matrix with repeated eigenvalues can be diagonalized provided we can obtain n linearly independent eigenvectors for it; this will be the case if, for each repeated eigenvalue λᵢ of multiplicity mᵢ > 1, we can obtain mᵢ linearly independent eigenvectors.
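Whether a repeated eigenvalue provides enough linearly independent eigenvectors can be checked through the rank of A − λI. The sketch below (not part of the original notes) does this for the matrices of Examples 3.5 and 3.7; the helper function name is an arbitrary choice.

    import numpy as np

    def num_independent_eigvecs(A, lam):
        """Dimension of the eigenspace of A for eigenvalue lam (its geometric multiplicity)."""
        n = A.shape[0]
        return n - np.linalg.matrix_rank(A - lam * np.eye(n))

    A5 = np.array([[2.0, 1.0],
                   [0.0, 2.0]])               # Example 3.5: lambda = 2, multiplicity 2
    A7 = np.array([[5.0, -4.0, 4.0],
                   [12.0, -11.0, 12.0],
                   [4.0, -4.0, 5.0]])         # Example 3.7: eigenvalues -3, 1, 1

    print(num_independent_eigvecs(A5, 2.0))   # 1 -> only one eigenvector, not diagonalizable
    print(num_independent_eigvecs(A7, 1.0))   # 2 -> two eigenvectors, diagonalizable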
3.6 Powers of a matrix

Let A be a diagonalizable square matrix. Then the similarity transformation allows us to write (cf. (6))

    A = PDP⁻¹.

We can use this result to obtain the powers of A, a process which is sometimes useful in control theory. Note that A² = AA, A³ = AAA, etc. Clearly, obtaining high powers of A directly would in general involve many multiplications. The process is quite straightforward, however, for a diagonal matrix D:

    D = ( a₁₁  0  ...  0  )          D² = DD = ( a₁₁²  0   ...  0   )
        (  0  a₂₂ ...  0  ),                   (  0   a₂₂² ...  0   ),
        (  ...            )                    (  ...              )
        (  0   0  ... aₙₙ )                    (  0    0   ... aₙₙ² )

and so on. Now, using the relation A = PDP⁻¹ we can obtain a formula for powers of A in terms of the easily calculated powers of the diagonal matrix D:

    A² = AA = (PDP⁻¹)(PDP⁻¹) = PD(P⁻¹P)DP⁻¹ = PDIDP⁻¹ = PD²P⁻¹.

Similarly,

    A³ = A²A = (PD²P⁻¹)(PDP⁻¹) = PD²(P⁻¹P)DP⁻¹ = PD²IDP⁻¹ = PD³P⁻¹.

The general result: let A be a diagonalizable matrix with eigenvalues λ₁, λ₂, ..., λₙ and let P be the modal matrix of A. Then, for any positive integer k,

    Aᵏ = P ( λ₁ᵏ  0  ...  0  ) P⁻¹.
           (  0  λ₂ᵏ ...  0  )
           (  ...            )
           (  0   0  ... λₙᵏ )

Example 3.2 (continued): given

    A = ( 1  1 ),
        ( 0  2 )

find Aᵏ. From previous calculations for this example in §3.5.1 we have

    A = ( 1  1 ) ( 1  0 ) ( 1  −1 ).
        ( 0  1 ) ( 0  2 ) ( 0   1 )

Then

    Aᵏ = PDᵏP⁻¹ = ( 1  1 ) ( 1   0 ) ( 1  −1 )
                  ( 0  1 ) ( 0  2ᵏ ) ( 0   1 )

                = ( 1  2ᵏ ) ( 1  −1 ) = ( 1  2ᵏ − 1 ).
                  ( 0  2ᵏ ) ( 0   1 )   ( 0    2ᵏ   )
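A quick numerical check of the formula Aᵏ = PDᵏP⁻¹ for the matrix of Example 3.2 (a sketch, not part of the original notes):

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [0.0, 2.0]])

    lam, P = np.linalg.eig(A)          # eigenvalues and modal matrix P
    k = 5

    # A^k = P D^k P^{-1}, with D^k computed entrywise on the diagonal
    Ak = P @ np.diag(lam**k) @ np.linalg.inv(P)

    print(np.round(Ak))                                    # [[ 1. 31.] [ 0. 32.]], i.e. [[1, 2^k - 1], [0, 2^k]]
    print(np.allclose(Ak, np.linalg.matrix_power(A, k)))   # True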
3.7 Applications of eigenvalues to solving systems of differential equations

In this section we shall apply matrix analysis, eigenvalues, and eigenvectors to solve systems of first order ordinary differential equations. Differential equations and their systems arise in many areas of mathematics and engineering, for example in control theory and in the analysis of electrical circuits. The unknowns in these equations are functions (e.g., of the time variable t). A number of techniques (analytical and numerical) have been developed to solve such systems of equations. The Laplace transform is one example of an analytical technique. We shall study another analytical technique, based on eigenvalues and eigenvectors.

Our first step will be to recast the system of ordinary differential equations in the matrix form ẋ = Ax, where A is an n × n coefficient matrix of constants, x is the n × 1 column vector of unknown functions, and ẋ is the n × 1 column vector containing the derivatives of the unknowns. Then, the main step will be to use the modal matrix of A to diagonalize the system of differential equations. After a change of variables, this will transform the system ẋ = Ax into a new system ẏ = Dy, where D is a diagonal matrix. We shall find that this new diagonal system of differential equations can be easily solved, and its solution will allow us to obtain the solution of the original system.

We start with a simple example of an uncoupled system of differential equations.

Example 3.8: solve the following system of linear differential equations

    x₁' = 2x₁,
    x₂' = 3x₂,                                                      (8)

where x₁ and x₂ are functions of time t, and x₁' is shorthand for dx₁/dt. This system is uncoupled (the first equation in (8) involves only the unknown x₁ and the second equation involves only x₂). Therefore, we can simply solve each equation separately:

    x₁(t) = C₁e^{2t},
    x₂(t) = C₂e^{3t},

where C₁ and C₂ are constants determined by the initial conditions (e.g., x₁(0) and x₂(0)).

Remark: when solving system (8) we needed no knowledge of matrix theory. However, we should note that, introducing the vector

    x(t) = ( x₁(t) ),
           ( x₂(t) )

system (8) can be written in the following matrix form:

    ẋ = Ax,   A = ( 2  0 ).
                  ( 0  3 )

The solution is

    x(t) = ( C₁e^{2t} ).
           ( C₂e^{3t} )

We now describe how the matrix theory works for a coupled system of differential equations.

Example 3.9: solve the following system of linear differential equations

    x₁' = 4x₁ + 2x₂,
    x₂' = −x₁ + x₂,                                                 (9)

with initial conditions x₁(0) = 1 and x₂(0) = 0.

First, write (9) in matrix form ẋ = Ax:

    ( x₁ )' = (  4  2 ) ( x₁ ).
    ( x₂ )    ( −1  1 ) ( x₂ )

Our next goal is to find the modal matrix P. To that end, we need to find the eigenvalues and eigenvectors of the matrix A. Write the characteristic equation for A:

    det(A − λI) = 0  =>  | 4−λ   2  | = 0,
                         | −1   1−λ |

    (4 − λ)(1 − λ) + 2 = 0,
    λ² − 5λ + 6 = 0,

and find the eigenvalues: λ₁ = 2 and λ₂ = 3.

Find the eigenvector corresponding to λ₁:

    λ₁ = 2  =>  Ax = 2x  =>  4x₁ + 2x₂ = 2x₁,   =>  2x₁ + 2x₂ = 0,
                             −x₁ + x₂ = 2x₂         −x₁ − x₂ = 0.

Fix x₁ = 1; then x₂ = −x₁ = −1, i.e., the eigenvector is (1, −1)ᵀ.

Find the eigenvector corresponding to λ₂:

    λ₂ = 3  =>  Ax = 3x  =>  4x₁ + 2x₂ = 3x₁,   =>  x₁ + 2x₂ = 0,
                             −x₁ + x₂ = 3x₂         −x₁ − 2x₂ = 0.

Fix x₂ = 1; then x₁ = −2, i.e., the eigenvector is (−2, 1)ᵀ. Thus

    P = (  1  −2 ),   D = ( 2  0 ).
        ( −1   1 )        ( 0  3 )

We then use the similarity transformation (see (6)) A = PDP⁻¹ to rewrite (9) as ẋ = PDP⁻¹x, or, after left-multiplying both sides by P⁻¹,

    P⁻¹ẋ = DP⁻¹x.                                                   (10)

Now, let us introduce a "transformed" vector y = P⁻¹x. Then ẏ = P⁻¹ẋ. After this change of variables equation (10) becomes

    ẏ = Dy.

We can solve this simple equation (see Example 3.8):

    y(t) = ( C₁e^{2t} ).
           ( C₂e^{3t} )

Then we need to "transform back":

    x = Py = (  1  −2 ) ( C₁e^{2t} ) = (  C₁e^{2t} − 2C₂e^{3t} ).
             ( −1   1 ) ( C₂e^{3t} )   ( −C₁e^{2t} +  C₂e^{3t} )

Hence

    x₁(t) = C₁e^{2t} − 2C₂e^{3t},
    x₂(t) = −C₁e^{2t} + C₂e^{3t}.

It remains to find the constants C₁ and C₂ using the initial conditions. We have

    x₁(0) = C₁ − 2C₂ = 1,
    x₂(0) = −C₁ + C₂ = 0,

which gives C₁ = −1 and C₂ = −1. Thus, the solution to the original system of differential equations is

    x₁(t) = 2e^{3t} − e^{2t},
    x₂(t) = e^{2t} − e^{3t}.

Points to note:

- for any system of differential equations of the form ẋ = Ax, where A is an n × n coefficient matrix with distinct eigenvalues λ₁, λ₂, ..., λₙ and t is the independent variable, the solution can be written as x = Py, where P is the modal matrix of A and

      y = (C₁e^{λ₁t}, C₂e^{λ₂t}, ..., Cₙe^{λₙt})ᵀ;

  a numerical sketch of this procedure for Example 3.9 is given after this list;
- the decoupling method discussed above can be readily extended to systems of second order differential equations ẍ = Ax, which could arise, for example, in a mechanical system consisting of coupled springs (here, ẍ is shorthand for d²x/dt²).
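The decoupling procedure of Example 3.9 can also be reproduced numerically. The sketch below (not part of the original notes) builds the solution x(t) = Py(t) from the eigen-decomposition of A and compares it with the closed form found above; the chosen evaluation time t = 0.7 is arbitrary.

    import numpy as np

    A = np.array([[4.0, 2.0],
                  [-1.0, 1.0]])
    x0 = np.array([1.0, 0.0])             # initial conditions x1(0) = 1, x2(0) = 0

    lam, P = np.linalg.eig(A)             # eigenvalues 2 and 3, modal matrix P
    C = np.linalg.solve(P, x0)            # constants from x(0) = P y(0) = P C

    def x(t):
        # x(t) = P y(t), with y_i(t) = C_i * exp(lambda_i * t)
        return P @ (C * np.exp(lam * t))

    t = 0.7
    print(x(t))                                      # numerical solution at time t
    print([2 * np.exp(3 * t) - np.exp(2 * t),        # closed form from Example 3.9
           np.exp(2 * t) - np.exp(3 * t)])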
