
ABSTRACT. The following are proofs you should be familiar with for the midterm and final exam. On both the midterm and final exam there will be a proof to write out which will be similar to one of the following proofs.

1. PROOFS YOU ARE RESPONSIBLE FOR ON THE MIDTERM AND FINAL

Theorem 1.1. For A, B n×n matrices, and c a scalar:
(1) Tr(A + B) = Tr(A) + Tr(B)
(2) Tr(cA) = c Tr(A)
(3) Tr(A^T) = Tr(A)

Proof. Note that the (i,i)-th entry of A + B is a_ii + b_ii, the (i,i)-th entry of cA is c a_ii, and the (i,i)-th entry of A^T is a_ii. Then consider the following:

(a) Tr(A + B) = (a_11 + b_11) + ... + (a_nn + b_nn)
              = (a_11 + ... + a_nn) + (b_11 + ... + b_nn)
              = Tr(A) + Tr(B)

(b) Tr(cA) = c a_11 + ... + c a_nn
           = c (a_11 + ... + a_nn)
           = c Tr(A)

(c) Tr(A^T) = a_11 + ... + a_nn
            = Tr(A)
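The three trace identities can be spot-checked numerically. This is a minimal Python sketch using nested lists for matrices; the helper names (trace, madd, smul, transpose) are ours for illustration, not from the text:

```python
# Spot-check the three trace identities of Theorem 1.1
# with small hand-rolled matrix helpers (no external libraries).

def trace(M):
    """Sum of the diagonal entries of a square matrix."""
    return sum(M[i][i] for i in range(len(M)))

def madd(A, B):
    """Entrywise sum A + B."""
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def smul(c, A):
    """Scalar multiple cA."""
    return [[c * x for x in row] for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
c = 3

print(trace(madd(A, B)) == trace(A) + trace(B))  # identity (1)
print(trace(smul(c, A)) == c * trace(A))         # identity (2)
print(trace(transpose(A)) == trace(A))           # identity (3)
```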

Theorem 1.2. If you have a set of n vectors S = {v_1, ..., v_n} in R^m where n > m, then the set S of vectors is linearly dependent.

Proof. Begin by constructing the m×n matrix

A = [ v_1 | v_2 | ... | v_n ]

whose columns are the vectors of S. Since this matrix has m rows and there can be at most one pivot per row, it follows that Rk(A) <= m < n = number of columns. By a theorem from the book (Thm 1.8) we know that the columns of a matrix are linearly independent if and only if the rank of the matrix is equal to the number of columns. The negation of this statement is that the columns of a matrix are linearly dependent if and only if the rank of the matrix is strictly less than the number of columns (by definition it is never possible for the rank of a matrix to be strictly more than the number of columns, as there is at most one pivot per column). However, above we have shown that for the matrix A the rank is in fact strictly less than the number of columns, thereby immediately implying that the columns of A are linearly dependent; in other words, the set S = {v_1, ..., v_n} in R^m is linearly dependent.

Theorem 1.3. If A, B are invertible n×n matrices, then:
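The rank argument above can be illustrated with m = 2, n = 3: any three vectors in R^2 give a 2×3 matrix of rank at most 2. A minimal sketch (the rank function is ours, a plain Gaussian elimination with exact rational arithmetic):

```python
# Three vectors in R^2 must be linearly dependent (Theorem 1.2 with
# m = 2, n = 3): the 2x3 matrix of their columns has rank <= 2 < 3.
from fractions import Fraction

def rank(M):
    """Rank via Gaussian elimination, using Fractions to avoid rounding."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0  # index of the next pivot row
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue  # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(rows):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# Columns are v1 = (1, 0), v2 = (0, 1), v3 = (2, 3) in R^2.
A = [[1, 0, 2],
     [0, 1, 3]]
print(rank(A))  # 2, strictly less than the 3 columns => dependent
```

Indeed v3 = 2 v1 + 3 v2 is an explicit dependence relation here.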



(1) A^{-1} is invertible with inverse A
(2) AB is invertible with inverse B^{-1}A^{-1}
(3) A^T is invertible with inverse (A^{-1})^T

Proof. Part (1): Since A is invertible, there exists A^{-1} such that AA^{-1} = A^{-1}A = I_n. However, by definition this immediately implies that A^{-1} is invertible with inverse A.

Part (2): Since A, B are invertible, there exist A^{-1}, B^{-1} such that AA^{-1} = A^{-1}A = I_n = BB^{-1} = B^{-1}B. Then consider the following computations:

(AB)(B^{-1}A^{-1}) = A(BB^{-1})A^{-1} = A I_n A^{-1} = AA^{-1} = I_n
(B^{-1}A^{-1})(AB) = B^{-1}(A^{-1}A)B = B^{-1} I_n B = B^{-1}B = I_n

which imply that AB is invertible with inverse B^{-1}A^{-1}.

Part (3): Since A is invertible, there exists A^{-1} such that AA^{-1} = A^{-1}A = I_n. Then consider the following computations:

A^T (A^{-1})^T = (A^{-1}A)^T = I_n^T = I_n
(A^{-1})^T A^T = (AA^{-1})^T = I_n^T = I_n

Which implies that A^T is invertible with inverse (A^{-1})^T.

Theorem 1.4. Let A be an m×n matrix and B an n×p matrix. Then (AB)^T = B^T A^T.

Proof. Let the matrices be A = (a_ij), B = (b_ij), i.e. the notation is that the (i,j)-th entry of a matrix M is given by m_ij. Then note that A^T = (a_ji), B^T = (b_ji). Then consider the following to complete the proof:

((AB)^T)_ij = (AB)_ji
            = sum_{k=1}^n a_jk b_ki
            = sum_{k=1}^n b_ki a_jk
            = sum_{k=1}^n (B^T)_ik (A^T)_kj
            = (B^T A^T)_ij
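The two product identities just proved, (AB)^{-1} = B^{-1}A^{-1} and (AB)^T = B^T A^T, can be spot-checked on 2×2 examples. A minimal sketch; the helpers (matmul, transpose, inv2) are ours, with inv2 using the standard 2×2 adjugate formula:

```python
# Numerical spot-checks of Theorems 1.3(2) and 1.4 on 2x2 matrices.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2.0, 1.0], [1.0, 1.0]]
B = [[1.0, 3.0], [0.0, 2.0]]
AB = matmul(A, B)

print(inv2(AB) == matmul(inv2(B), inv2(A)))               # (AB)^-1 = B^-1 A^-1
print(transpose(AB) == matmul(transpose(B), transpose(A)))  # (AB)^T = B^T A^T
```

The entries here are chosen so that every intermediate value is exact in floating point, making direct equality comparison safe.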

Note: (AB)_ij stands for the (i,j)-th entry of the matrix AB, and similarly, (B^T A^T)_ij stands for the (i,j)-th entry of the matrix B^T A^T.

Theorem 1.5. Let A be an invertible n×n matrix. Then A^{-1} is unique.


Proof. Assume that there are two inverses A_1^{-1} and A_2^{-1}. Since they are both inverses, we have the following: AA_1^{-1} = I_n = AA_2^{-1}. Then:

A_1^{-1}(AA_1^{-1}) = A_1^{-1}(I_n) = A_1^{-1}(AA_2^{-1})
(A_1^{-1}A)A_1^{-1} = A_1^{-1} = (A_1^{-1}A)A_2^{-1}
I_n A_1^{-1} = A_1^{-1} = I_n A_2^{-1}
A_1^{-1} = A_2^{-1}

thereby completing the proof.

Theorem 1.6. Let A be an n×n matrix and let B be the n×n matrix gotten by interchanging the i-th and j-th rows of A. Then det(A) = -det(B).

Proof. By induction. For the base case, consider the case where n = 2; then direct computation shows that

det [ a b ] = ad - bc = -(cb - ad) = -det [ c d ]
    [ c d ]                               [ a b ]

Next let n >= 3, assume the result is true for (n-1)×(n-1) matrices, and we will complete the induction by proving the result for n×n matrices. Let

A = [ a_11 ... a_1n        B = [ a_11 ... a_1n
      ...                        ...
      a_j1 ... a_jn              a_k1 ... a_kn
      ...                        ...
      a_k1 ... a_kn              a_j1 ... a_jn
      ...                        ...
      a_n1 ... a_nn ]            a_n1 ... a_nn ]

i.e. B is obtained from A by interchanging the j-th and k-th rows. Choose a row p with p ≠ j, k (we can always do this since in this case we have n >= 3, so there must exist some row that is left fixed). Then consider the following computation to complete the proof (where the determinant of both matrices is calculated by expanding across the p-th row):

det(A) = sum_{m=1}^n a_pm C_pm
       = sum_{m=1}^n a_pm (-C'_pm)    [by the induction assumption for (n-1)×(n-1) matrices]
       = - sum_{m=1}^n a_pm C'_pm
       = - det(B)

Notation: In the computation above,
C_pm  = (-1)^{p+m} det((n-1)×(n-1) minor of A with the p-th row and m-th column removed)
C'_pm = (-1)^{p+m} det((n-1)×(n-1) minor of B with the p-th row and m-th column removed)
Since p ≠ j, k, row p of B equals row p of A, and each minor of B differs from the corresponding minor of A by one row interchange, so by the induction assumption C_pm = -C'_pm.
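The sign flip under a row interchange can be checked directly on a 3×3 example. A minimal sketch; det is ours, a direct cofactor expansion along the first row (the same expansion used in the proof, just fixed at p = 1):

```python
# Check Theorem 1.6 on a 3x3 example: swapping two rows negates det.

def det(M):
    """Determinant by cofactor expansion along row 0 (fine for small n)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for m in range(n):
        minor = [row[:m] + row[m + 1:] for row in M[1:]]
        total += (-1) ** m * M[0][m] * det(minor)
    return total

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
B = [A[0], A[2], A[1]]   # interchange the 2nd and 3rd rows of A

print(det(A), det(B))  # -3 3, so det(A) = -det(B)
```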


Theorem 1.7. Let A be an invertible n×n matrix. Then det(A^{-1}) = 1/det(A).

Proof. First note that the identity matrix is a diagonal matrix, so its determinant is just the product of the diagonal entries. Since all the diagonal entries are 1, it follows that det(I_n) = 1. Next consider the following computation to complete the proof:

1 = det(I_n) = det(AA^{-1})
  = det(A) det(A^{-1})        [using the fact that det(AB) = det(A)det(B)]

=> det(A^{-1}) = 1/det(A)     [note that det(A) ≠ 0 since A is invertible]
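A quick check of this identity on a 2×2 example, using exact rational arithmetic so the comparison is exact (the adjugate inverse formula below is the standard one for 2×2 matrices):

```python
# Check Theorem 1.7, det(A^-1) = 1/det(A), on a 2x2 example.
from fractions import Fraction

A = [[Fraction(2), Fraction(1)],
     [Fraction(6), Fraction(4)]]

det_A = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # = 2

# Adjugate formula for the 2x2 inverse.
A_inv = [[ A[1][1] / det_A, -A[0][1] / det_A],
         [-A[1][0] / det_A,  A[0][0] / det_A]]

det_A_inv = A_inv[0][0] * A_inv[1][1] - A_inv[0][1] * A_inv[1][0]
print(det_A_inv == 1 / det_A)  # True: both equal 1/2
```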

2. PROOFS THAT YOU ARE RESPONSIBLE FOR ON THE FINAL ONLY

Theorem 2.1. Similar matrices have the same eigenvalues with the same multiplicities.

Proof. Let A and B be similar n×n matrices. That is, there exists an invertible n×n matrix P such that B = P^{-1}AP. Since the eigenvalues of a matrix are precisely the roots of the characteristic equation of the matrix, in order to prove that A and B have the same eigenvalues it suffices to show that they have the same characteristic polynomials (and hence the same characteristic equations). To see this consider the following:

chi_B(t) = det(B - tI_n)
         = det(P^{-1}AP - tP^{-1}(I_n)P)
         = det(P^{-1}AP - P^{-1}(tI_n)P)
         = det(P^{-1}(A - tI_n)P)
         = det(P^{-1}) det(A - tI_n) det(P)
         = (1/det(P)) det(A - tI_n) det(P)
         = det(A - tI_n)
         = chi_A(t)

Corollary 2.2. Let A and B be similar 2×2 matrices; then Tr(A) = Tr(B) and det(A) = det(B). [Note: this result holds true for n×n matrices as well.]

Proof. Since A and B are similar, in the proof of Theorem 2.1 we saw that A and B have the same characteristic polynomial. However, for an arbitrary 2×2 matrix M = [a b; c d], we know that the characteristic polynomial is given by

chi_M(t) = t^2 - (a + d)t + (ad - bc) = t^2 - Tr(M) t + det(M)

Thus, since the similar 2×2 matrices A and B have the same characteristic polynomial, this implies that

t^2 - Tr(A) t + det(A) = t^2 - Tr(B) t + det(B)

Then, equating the coefficients of the equivalent polynomials, it follows that Tr(A) = Tr(B) and det(A) = det(B).
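Corollary 2.2 can be checked by conjugating a concrete 2×2 matrix. A minimal sketch with exact rational arithmetic; the matrices A and P below are arbitrary choices of ours:

```python
# Check Corollary 2.2: for B = P^-1 A P (2x2), Tr and det agree.
from fractions import Fraction

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[Fraction(1), Fraction(2)], [Fraction(3), Fraction(4)]]
P = [[Fraction(1), Fraction(1)], [Fraction(0), Fraction(1)]]

det_P = P[0][0] * P[1][1] - P[0][1] * P[1][0]
P_inv = [[ P[1][1] / det_P, -P[0][1] / det_P],   # 2x2 adjugate formula
         [-P[1][0] / det_P,  P[0][0] / det_P]]

B = matmul(P_inv, matmul(A, P))   # similar to A

tr = lambda M: M[0][0] + M[1][1]
det = lambda M: M[0][0] * M[1][1] - M[0][1] * M[1][0]
print(tr(A) == tr(B), det(A) == det(B))  # True True
```

Here B = [[-2, -4], [3, 7]] looks nothing like A, yet both have trace 5 and determinant -2.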


Theorem 2.3. Let v, w, z ∈ R^n be orthonormal vectors. Then {v, w, z} is linearly independent.

Proof. Assume that there are coefficients a, b, c ∈ R such that

(2.1)  av + bw + cz = 0

Then since v, w, z ∈ R^n are orthonormal, we have that

(2.2)  v·w = v·z = w·z = 0
(2.3)  v·v = w·w = z·z = 1

Using facts (2.2) and (2.3) in conjunction with equation (2.1), it follows that:

a = a + 0 + 0 = a(v·v) + b(v·w) + c(v·z) = v·(av) + v·(bw) + v·(cz) = v·(av + bw + cz) = v·0 = 0
b = 0 + b + 0 = a(w·v) + b(w·w) + c(w·z) = w·(av) + w·(bw) + w·(cz) = w·(av + bw + cz) = w·0 = 0
c = 0 + 0 + c = a(z·v) + b(z·w) + c(z·z) = z·(av) + z·(bw) + z·(cz) = z·(av + bw + cz) = z·0 = 0

Since the only linear combination of v, w, z equal to 0 is the trivial one, the set {v, w, z} is linearly independent.
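The dot-product trick in this proof says that for orthonormal vectors, dotting a combination av + bw + cz with each vector recovers the corresponding coefficient. A minimal sketch (the standard basis of R^3 is used as the orthonormal triple for simplicity; dot and the coefficient values are ours):

```python
# Illustrate the dot-product trick from Theorem 2.3: for orthonormal
# v, w, z, dotting av + bw + cz with v, w, z recovers a, b, c.

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

# An orthonormal triple in R^3 (the standard basis, for simplicity).
v = (1.0, 0.0, 0.0)
w = (0.0, 1.0, 0.0)
z = (0.0, 0.0, 1.0)

a, b, c = 2.0, -5.0, 0.5
combo = tuple(a * vi + b * wi + c * zi for vi, wi, zi in zip(v, w, z))

print(dot(combo, v), dot(combo, w), dot(combo, z))  # 2.0 -5.0 0.5
```

In particular, if the combination were the zero vector, each of these dot products, and hence each coefficient, would have to be 0.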
