Solving linear systems, Gaussian Elimination, back substitution, cost of elimination (Sec 1.3 - 1 hr)
- instill the process of Gaussian elimination and back substitution (see the elimination sketch after this block of lectures)
- terms: pivot, inconsistent system, triangular system, singular system
- understand how elimination can break down (no row exchanges yet)
- introduce the idea of computing the cost of large computations

Column vectors, addition, scalar multiplication, the two geometric interpretations (Sec 1.2 - 1 hr)
- terms: vectors, vector addition, scalar multiplication, linear combos
- understanding vectors as arrows and the geometry of linear combos
- solving 2 x 2 (3 x 3) systems as the intersection of lines (planes)
- solving 2 x 2 (3 x 3) systems as a linear combo problem
- understanding geometrically how systems can be singular: unique solution, no solution, infinitely many solutions

Matrices, addition, scalar and matrix multiplication, matrix form of a linear system (Sec 1.4 - 1 hr)
- terms: matrix, matrix addition, scalar multiplication, matrix multiplication
- introduce linear systems as the matrix equation Ax = b
- instill the associativity and non-commutativity of multiplication, and the distributive laws
- terms: identity matrix, elementary matrix (and role in row operations)

Elementary matrices, permutation matrices, LU factorization (Sec 1.5 - 1 hr)
- terms: triangular matrix, elementary matrix
- introduce the idea of an inverse via elementary matrices
- derive the triangular factorization of a matrix (when no row exchanges are needed)
- instill the importance of one linear system = two triangular systems
- terms: row exchanges, permutation matrices
- derive the general elimination principle PA = LU for non-singular systems

Inverses, Gauss-Jordan elimination, transposes, symmetric matrices (Sec 1.6 - 1 hr)
- terms: inverse
- Gauss-Jordan elimination and computing inverses
- instill the fact that inverse exists = there is a full set of pivots = non-singular
- terms: transpose, symmetric matrix

Application to tridiagonal systems (Sec 1.7 - 1 hr)
- terms: two-point boundary value problem, difference equations
- showing how a two-point boundary value problem can be approximated by a matrix equation that is tridiagonal and symmetric
- explain how this reduces the cost of computation (see the tridiagonal sketch below)

Vector spaces, subspaces, column space and null space of a matrix (Sec 2.1 - 1 hr)
- instill the idea of a vector space as a set of objects whose addition and scalar multiplication behave as for real numbers, but with lists of reals instead
- examples: column vectors, 2 x 3 matrices, polynomials of degree less than n
- terms: subspace, closure rules, column space, null space
- instill the importance of column and null spaces in studying linear systems

Finding the column space and null space, echelon form, general factorization, pivot and free variables, superposition, rank, efficient solution methods (Sec 2.2 - 2 hrs)
- terms: row echelon form, reduced row echelon form
- state the general factorization rule PA = LU
- terms: pivot variables, free variables, special solutions, complete solution
- introduce the idea of dimension informally: dimension of the null space = # of free variables
- terms: superposition, complete solution, particular solution, null space solution
- drive home the idea that the reduced row echelon form Rx = d of Ax = b reveals all (see the RREF sketch below)
- terms: span, rank
- relate the dimensions of the column space and null space to the rank r and to m and n
- describing the column space in terms of pivot columns
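For the Sec 1.3 and 1.5 lectures, a short demo can make the pivot and multiplier bookkeeping concrete. Below is a minimal NumPy sketch (not part of the course materials), assuming a made-up 3 x 3 example whose pivots are all nonzero so that no row exchanges are needed:

```python
import numpy as np

def lu_no_exchanges(A):
    """Elimination without row exchanges: returns L, U with A = L @ U.
    Assumes every pivot is nonzero (the no-breakdown case)."""
    n = A.shape[0]
    U = A.astype(float)              # working copy that becomes upper triangular
    L = np.eye(n)
    for k in range(n - 1):           # pivot row k
        for i in range(k + 1, n):    # rows below the pivot
            L[i, k] = U[i, k] / U[k, k]         # multiplier l_ik
            U[i, k:] -= L[i, k] * U[k, k:]      # row_i <- row_i - l_ik * row_k
    return L, U

def back_substitute(U, c):
    """Solve the upper triangular system Ux = c from the bottom row up."""
    n = len(c)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (c[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

# one linear system = two triangular systems: solve Lc = b, then Ux = c
A = np.array([[2., 1., 1.], [4., -6., 0.], [-2., 7., 2.]])
b = np.array([5., -2., 9.])
L, U = lu_no_exchanges(A)
c = np.linalg.solve(L, b)            # forward substitution (library call for brevity)
print(back_substitute(U, c))         # (1, 1, 2), agreeing with np.linalg.solve(A, b)
```

Counting the work in the nested loops also recovers the cost estimates of Sec 1.3: roughly n^3/3 multiplications for elimination versus n^2/2 for each triangular solve.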
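For the Sec 1.7 lecture, the same elimination applied to a tridiagonal matrix needs only one multiplier per row, which is exactly the cost saving the lecture is after. A hedged sketch; the boundary value problem -u'' = 1, u(0) = u(1) = 0 is a made-up instance of the section's setup:

```python
import numpy as np

def solve_tridiagonal(sub, diag, sup, rhs):
    """O(n) elimination + back substitution for a tridiagonal system.
    sub/sup have length n-1; assumes no row exchanges are needed."""
    d, b = diag.astype(float), rhs.astype(float)
    n = len(d)
    for i in range(1, n):                    # forward sweep: one multiplier per row
        m = sub[i - 1] / d[i - 1]
        d[i] -= m * sup[i - 1]
        b[i] -= m * b[i - 1]
    x = np.zeros(n)
    x[-1] = b[-1] / d[-1]
    for i in range(n - 2, -1, -1):           # back substitution along the band
        x[i] = (b[i] - sup[i] * x[i + 1]) / d[i]
    return x

# -u'' = 1 on (0, 1) with u(0) = u(1) = 0, discretized at n interior points:
# the symmetric tridiagonal (-1, 2, -1) system of Sec 1.7
n, h = 5, 1.0 / 6
u = solve_tridiagonal(-np.ones(n - 1), 2 * np.ones(n), -np.ones(n - 1),
                      h * h * np.ones(n))
print(u)    # matches the true solution u(x) = x(1 - x)/2 at the grid points
```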
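For the Sec 2.2 lectures, SymPy's exact rref shows how Rx = d "reveals all": pivot columns, special solutions, and rank. A minimal sketch with a made-up rank-2 matrix:

```python
from sympy import Matrix

A = Matrix([[1, 2, 0, 3],
            [2, 4, 1, 7],
            [3, 6, 1, 10]])      # 3 x 4, rank 2: columns 1 and 3 are pivot columns

R, pivot_cols = A.rref()         # reduced row echelon form and pivot column indices
print(R)                         # [[1, 2, 0, 3], [0, 0, 1, 1], [0, 0, 0, 0]]
print(pivot_cols)                # (0, 2): those columns of A span the column space
print(A.nullspace())             # one special solution per free variable (x2 and x4)
print(A.rank(), A.cols - A.rank())   # r = 2 and dim N(A) = n - r = 2
```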

Linear independence, basis, dimension, coordinates relative to a basis (Sec 2.3 - 1 hr)
- terms: linear independence and dependence
- instill how to check for independence
- terms: spanning set
- examples: column vectors, 2 x 3 matrices, symmetric matrices, polynomials
- terms: basis, minimal spanning set, maximal independent set
- recall now how to find bases for the column space and the null space
- uniqueness of representation in terms of a basis, and coordinates of a vector relative to a basis

Four fundamental subspaces (Sec 2.4 - 1 hr)
- terms: column space, null space, row space, left null space, row rank, column rank
- give examples and find the dimensions of each subspace and the space each lives in
- instill the geometric picture of A as a mapping

Application to networks (Sec 2.5 - 1 hr)
- terms: edge-node incidence matrix for a digraph (see the incidence-matrix sketch below)
- interpretation of the null space as giving the non-uniqueness of potentials and connectedness
- interpretation of the column space as giving restrictions on potential differences
- interpretation of the left null space as giving current loops
- interpretation of the row space as giving a spanning tree for the graph

Linear Transformations (Sec 2.6 - 2 hrs)
- terms: linear transformation
- geometric examples in 2D
- abstract examples in function space
- derive the method of associating a matrix with a linear transformation (its coordinates) and instill how it is used
- give several examples
- terms: rotation, projection, reflection

Orthogonal vectors and subspaces, orthogonal complement, Fundamental Theorem Part II (Sec 3.1 - 1 hr)
- terms: length, inner product, orthogonal vectors, unit vectors, orthonormal basis
- derive the result that mutually orthogonal implies independent
- terms: orthogonal subspaces, orthogonal complements
- establish the unique decomposition into orthogonal components
- establish the orthogonality results for the four fundamental subspaces

Projection onto a vector (Sec 3.2 - 1 hr)
- Schwarz inequality; establish the connection between angles and the inner product
- derive the formula for orthogonal projection onto a line (see the projection sketch below)
- derive the alternate formula that uses a projection matrix
- derive the inner product property of transposes of matrices

Projections onto subspaces and least squares approximation (Sec 3.3 - 2 hrs)
- explain how least squares approximation is related to orthogonal projections
- terms: normal equations, best estimate, projection onto a subspace
- connection between invertibility of A^T A and independence of the columns of A
- terms: projection matrix
- discuss the least-squares fitting of data (see the least-squares sketch below)
- discuss data fitting with models other than linear
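For the Sec 2.5 lecture, building an incidence matrix for a small made-up digraph lets the four-subspace interpretations be checked numerically (an illustrative sketch, not the course's own network):

```python
import numpy as np

# 4 nodes, 5 edges; row e has -1 at the start node and +1 at the end node of edge e
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
A = np.zeros((len(edges), 4))
for e, (i, j) in enumerate(edges):
    A[e, i], A[e, j] = -1.0, 1.0

print(np.linalg.matrix_rank(A))   # 3 = (number of nodes) - 1: the graph is connected
print(A @ np.ones(4))             # constant potentials lie in N(A): all zeros
y = np.array([1., -1., 1., 0., 0.])   # currents around the loop 0 -> 1 -> 2 -> 0
print(A.T @ y)                    # a left null space vector: Kirchhoff's current law
# dim N(A^T) = m - r = 5 - 3 = 2: the graph has two independent current loops
```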
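For the Sec 3.2 lecture, the line-projection formula and its projection-matrix form can be compared directly; a minimal sketch with made-up vectors:

```python
import numpy as np

a = np.array([1., 2., 2.])
b = np.array([1., 1., 1.])

p = (a @ b) / (a @ a) * a          # projection of b onto the line through a
P = np.outer(a, a) / (a @ a)       # rank-one projection matrix P = a a^T / a^T a
print(p, P @ b)                    # the two formulas agree
print(a @ (b - p))                 # the error b - p is orthogonal to a: ~0
cos_theta = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
print(abs(cos_theta) <= 1)         # the Schwarz inequality, in angle form
```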
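For the Sec 3.3 lectures, fitting a line by least squares shows the normal equations and the projection matrix in action (a sketch with made-up data points):

```python
import numpy as np

t = np.array([0., 1., 2., 3.])
b = np.array([1., 2., 2., 4.])               # b is not in the column space of A
A = np.column_stack([np.ones_like(t), t])    # columns 1 and t for the model C + D t

x_hat = np.linalg.solve(A.T @ A, A.T @ b)    # normal equations A^T A x = A^T b
print(x_hat)                                 # best estimate: C = 0.9, D = 0.9
p = A @ x_hat                                # projection of b onto the column space
print(A.T @ (b - p))                         # the error is orthogonal to both columns

P = A @ np.linalg.inv(A.T @ A) @ A.T         # projection matrix (columns independent)
print(np.allclose(P @ P, P))                 # P^2 = P
```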

Orthogonal bases, orthogonal matrices, Gram-Schmidt, function spaces and Fourier series (Sec 3.4 - 2 hrs)
- terms: orthogonal matrix, orthonormal basis
- expressing b as a linear combo of an orthonormal basis
- length in terms of orthonormal coordinates
- projection matrices in the orthogonal case
- description of the Gram-Schmidt procedure (see the Gram-Schmidt sketch below)
- description of the A = QR factorization
- discuss orthogonality in function spaces and Fourier series

Complex Numbers and Matrices (Sec 5.5 - 1 hr)
- crash course on complex variables
- terms: Hermitian matrix
- discuss the topic of roots of unity and their properties (Sec 3.5)

Definition and properties of the determinant (Sec 4.2 - 1 hr)
- describe the three properties defining the determinant
- describe seven additional properties that derive from the first three (product rule, transpose rule, inverse rule, singularity rule, etc.)

Formulas for the determinant, cofactor expansions, Cramer's rule, adjoint (Sec 4.3 - 1 hr)
- derive the product of pivots rule (see the pivots sketch below)
- derive the permutation form (optional)
- derive the cofactor expansion form of the determinant (important for next section)

Applications of determinants (Sec 4.4 - 1 hr)
- cofactor matrix formula for the inverse of a matrix
- Cramer's rule
- volume of a box in terms of determinants

Eigenvalues, eigenvectors, computation examples (Sec 5.1 - 1 hr)
- motivate the eigenvalue problem for A using constant-coefficient differential systems
- present in detail a 2 x 2 example and its full solution
- discuss what the eigenvalue problem looks like for diagonal, triangular and projection matrices
- discuss the sum and product rules for eigenvalues

Diagonalization of a matrix (Sec 5.2 - 1 hr)
- present the diagonalization results for a square matrix and a justification
- point out that not all matrices are diagonalizable, but those with distinct eigenvalues are
- discuss the issue of possibly not enough eigenvectors when eigenvalues are repeated
- discuss the fact that complex eigenvalues can arise even with real matrices
- discuss how to find powers of matrices using diagonalization (see the diagonalization sketch below)

Matrix exponentials, constant-coefficient differential equations, stability (Sec 5.4 - 2 hrs)
- introduce the matrix exponential and the solution of a constant-coefficient linear ODE system
- cover the diffusion model approximation for 2 x 2 systems (connections with previous application)
- terms: equilibrium, stable, neutrally stable, unstable

Complex matrices, Hermitian and unitary matrices, spectral theorem (Sec 5.5 - 1 hr)
- terms: Hermitian transpose of a matrix, Hermitian matrix, unitary matrix
- present the special properties of the eigenvalue problem for Hermitian matrices
- present the spectral theorem (Sec 5.6) and decomposition into a sum of rank 1 matrices
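For the Sec 3.4 lectures, the Gram-Schmidt procedure is short enough to run in class; a minimal sketch, assuming the columns of the made-up A are independent:

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt on the columns of A: returns Q with orthonormal
    columns and upper triangular R with A = Q R."""
    m, n = A.shape
    Q, R = np.zeros((m, n)), np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float)
        for i in range(j):                   # subtract components along earlier q's
            R[i, j] = Q[:, i] @ A[:, j]
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.array([[1., 1., 0.], [1., 0., 1.], [0., 1., 1.]])
Q, R = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(3)))       # orthonormal columns
print(np.allclose(Q @ R, A))                 # the A = QR factorization
# with an orthonormal basis, projection simplifies: P = Q Q^T, coefficients are Q^T b
```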
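For the Sec 4.3 lecture, the product-of-pivots rule can be checked with a library LU routine; a sketch using SciPy (note SciPy factors A = P L U, so its P is the transpose of the P in the course's PA = LU, which leaves det P unchanged):

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[0., 2., 1.],       # made-up example that needs a row exchange
              [1., 1., 1.],
              [2., 3., 0.]])
P, L, U = lu(A)                   # A = P @ L @ U
pivots = np.diag(U)
sign = np.linalg.det(P)           # +1 or -1: one sign change per row exchange
print(sign * pivots.prod())       # det A = +/- (product of the pivots)
print(np.linalg.det(A))           # agrees
```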
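For the Sec 5.2 and 5.4 lectures, diagonalization gives both matrix powers and the matrix exponential; a sketch with two made-up 2 x 2 examples:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.8, 0.3], [0.2, 0.7]])       # eigenvalues 1 and 0.5
lam, S = np.linalg.eig(A)                    # columns of S are eigenvectors
Ak = S @ np.diag(lam ** 50) @ np.linalg.inv(S)       # A^k = S Lambda^k S^(-1)
print(np.allclose(Ak, np.linalg.matrix_power(A, 50)))

B = np.array([[-2., 1.], [1., -2.]])         # eigenvalues -1 and -3: a stable system
mu, T = np.linalg.eig(B)
t = 1.0
Et = T @ np.diag(np.exp(mu * t)) @ np.linalg.inv(T)  # e^(Bt) = S e^(Lambda t) S^(-1)
print(np.allclose(Et, expm(B * t)))          # u(t) = e^(Bt) u(0) solves u' = B u
```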

Similarity transformations, change of basis (Sec 5.6 - 2 hrs)
- terms: similar matrices, similarity transformation
- derive the eigenvalue and eigenvector properties of similar matrices
- develop the change of basis formulae for going from one basis to another (handout)

Minima, maxima, saddle points, definite and semi-definite quadratic forms (Sec 6.1 - 1 hr)
- terms: quadratic form, stationary point, positive definite, saddle point
- illustrate completing the square for minimum tests in 2-variable problems
- derive the quadratic form for minimum tests using Taylor approximations

Tests for positive definiteness (Sec 6.2 - 2 hrs)
- relate positive definiteness for 2-variable problems to signs of eigenvalues
- state the positive definiteness theorem for n-variable systems (see the definiteness sketch below)
- show how the symmetric factorization of A produces a sum of squares formula
- introduce the idea of a square root of a positive definite matrix
- describe semi-definiteness and corresponding tests
- develop the connection between quadratic forms, ellipsoids, and coordinate changes

SVD (Sec 6.3 - 2 hrs)
- present the singular value decomposition and its connection to the four fundamental subspaces (see the SVD sketch below)
- application to image processing and expansions in terms of rank 1 matrices
- application to continuum physics and robotics via the Polar Decomposition theorem
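For the Sec 6.2 lectures, the equivalent tests for positive definiteness can all be run on one matrix; a sketch using the made-up (-1, 2, -1) matrix, which is symmetric positive definite:

```python
import numpy as np

A = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  2.]])

print(np.linalg.eigvalsh(A))                          # all eigenvalues > 0
print([np.linalg.det(A[:k, :k]) for k in (1, 2, 3)])  # upper-left dets 2, 3, 4 > 0
L = np.linalg.cholesky(A)                             # succeeds only if A is pos def
x = np.array([1., 2., -1.])
print(x @ A @ x, np.sum((L.T @ x) ** 2))              # x^T A x as a sum of squares
```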
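For the Sec 6.3 lectures, the SVD and the rank-one expansion behind image compression can be demonstrated on a random low-rank matrix (an illustrative sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3)) @ rng.standard_normal((3, 8))   # rank at most 3

U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(s)                          # three nonzero singular values: rank(A) = 3
# columns of U / rows of Vt give orthonormal bases for the four fundamental subspaces

k = 2                             # best rank-k approximation keeps the k largest sigmas
Ak = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(k))
print(np.linalg.norm(A - Ak, 2))  # equals the first discarded singular value s[2]
```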