Basic Equations

Network Flows
1. the flow in an arc is only in one direction
2. flow into a node = flow out of a node
3. flow into the network = flow out of the network

Balancing Chemical Equations
1. add x's before each compound on both sides
2. balance each element (e.g. carbon: x1 = 2(x3)), set up as a system, solve

Matrix
augmented matrix: variables and solution (rhs)
coefficient matrix: coefficients only, no rhs

Vectors, Norm, Dot Product (cont)

u·v = u1v1 + u2v2 + ... + unvn = ||u|| ||v|| cos(θ)    dot product
u and v are orthogonal if u·v = 0 (cos(θ) = 0)
a set of vectors is an orthogonal set iff vi·vj = 0, if i≠j
a set of vectors is an orthonormal set iff vi·vj = 0, if i≠j, and ||vi|| = 1 for all i
(u·v)2 ≤ ||u||2||v||2 or |u·v| ≤ ||u|| ||v||    Cauchy-Schwarz Inequality
d(u,v) ≤ d(u,w) + d(w,v)    Triangle Inequality
||u+v|| ≤ ||u|| + ||v||
||v1 + v2 + ... + vk|| ≤ ||v1|| + ||v2|| + ... + ||vk||

Matrix Algebra, Identity and Inverse

(A + B)ij = (A)ij + (B)ij    (A - B)ij = (A)ij - (B)ij
(cA)ij = c(A)ij    (AT)ij = (A)ji
(AB)ij = ai1b1j + ai2b2j + ... + aikbkj
Inner Product (number) is uTv = u·v; u and v must be the same size
Outer Product (matrix) is uvT; u and v can be any size
(AT)T = A    (kA)T = k(A)T
(A+B)T = AT + BT    (AB)T = BTAT
tr(AT) = tr(A)    tr(AB) = tr(BA)
uTv = tr(uvT)    tr(uvT) = tr(vuT)
tr(A) = a11 + a22 + ... + ann

Vectors, Norm, Dot Product
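The dot-product and norm identities can be spot-checked numerically; a minimal sketch in plain Python (the vectors u and v are illustrative values, not from the sheet):

```python
import math

def dot(u, v):
    # u·v = u1v1 + u2v2 + ... + unvn
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(v):
    # ||v|| = sqrt(v1^2 + v2^2 + ...)
    return math.sqrt(dot(v, v))

u = [1.0, 2.0, 2.0]
v = [3.0, 0.0, 4.0]

# Cauchy-Schwarz: |u·v| <= ||u|| ||v||
assert abs(dot(u, v)) <= norm(u) * norm(v)

# Triangle inequality: ||u+v|| <= ||u|| + ||v||
w = [ui + vi for ui, vi in zip(u, v)]
assert norm(w) <= norm(u) + norm(v)

# angle from u·v = ||u|| ||v|| cos(θ)
theta = math.acos(dot(u, v) / (norm(u) * norm(v)))
print(dot(u, v), norm(u), norm(v))  # 11.0 3.0 5.0
```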
magnitude (norm) of vector v is ||v||; ||v|| ≥ 0
if k>0, kv same direction as v, magnitude = k||v||
if k<0, kv opposite direction to v, magnitude = |k| ||v||
vectors in Rn (n = dimension): v = (v1, v2, ..., vn)
v = P1P2 = OP2 - OP1    displacement vector
norm/magnitude of vector: ||v|| = sqrt((v1)2 + (v2)2 + ...)
||v|| = 0 iff v = 0    ||kv|| = |k| ||v||
unit vector u in same direction as v: u = (1/||v||) v
e1 = (1,0,...), ..., en = (0,...,1)    standard unit vectors in Rn
d(u,v) = sqrt((u1-v1)2 + (u2-v2)2 + ... + (un-vn)2) = ||u-v||
d(u,v) = 0 iff u = v

Lines and Planes

vector equation of a line: x = x0 + tv, parameter t, -∞ < t < +∞
the solution set of a linear equation in 3 dimensions is a plane
if x is a point on this plane: n·(x-x0) = 0 (point-normal equation)
A(x-x0) + B(y-y0) + C(z-z0) = 0, where x0 = (x0,y0,z0), n = (A, B, C)
general/algebraic equation: Ax + By + Cz = D
two planes are parallel if n1 = kn2, orthogonal if n1·n2 = 0

Matrix Algebra, Identity and Inverse (cont)

Identity matrix In is a square matrix with 1's along the diagonal
If A is m x n, AIn = A and ImA = A
a square matrix A is invertible (nonsingular) if: AB = BA = I
B is the inverse of A: B = A-1
if A has no inverse, A is not invertible (singular)
a 2x2 matrix with det(A) = ad - bc ≠ 0 is invertible
if A and B are invertible: (AB)-1 = B-1A-1
(An)-1 = A-n = (A-1)n    (AT)-1 = (A-1)T
(kA)-1 = (1/k)(A-1), k≠0

Elementary Matrix and Unifying Theorem

elementary matrices are invertible
A-1 = Ek Ek-1 ... E2 E1
[ A | I ] -> [ I | A-1 ]    (how to find the inverse of A)
Ax = b; x = A-1b
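The [ A | I ] -> [ I | A-1 ] procedure can be sketched in plain Python: a toy Gauss-Jordan elimination for small matrices (no pivoting refinements; the matrix A below is an illustrative example):

```python
def inverse(A):
    """Row reduce the augmented matrix [A | I] to [I | A^-1]."""
    n = len(A)
    # build [A | I]
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        # swap in a row with a nonzero pivot
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[pivot] = M[pivot], M[col]
        # scale the pivot row so the leading entry is 1
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        # eliminate the column from every other row
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]

A = [[2.0, 1.0], [5.0, 3.0]]   # det = 2*3 - 1*5 = 1, so invertible
Ainv = inverse(A)
print(Ainv)  # [[3.0, -1.0], [-5.0, 2.0]]
```

With Ainv in hand, x = A-1 b solves Ax = b, matching the last line above.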
Elementary Matrix and Unifying Theorem (cont)

IF:
- A -> RREF = I
- A can be expressed as a product of elementary matrices
- A is invertible
- Ax = 0 has only the trivial solution
- Ax = b is consistent for every vector b in Rn
- Ax = b has exactly 1 solution for every b in Rn
- column and row vectors of A are linearly independent
- det(A) ≠ 0
- λ = 0 is not an eigenvalue of A
- TA is one to one and onto
If not, then all no.

Consistency

EAx = Eb -> Rx = b', where b' = Eb
(Ax=b) [ A | b ] -> [ EA | Eb ] (Rx = b')
(but treat b as unknown: b1, b2...)
For the system to be consistent: if R has zero rows at the bottom, the entries of b' in those rows must equal zero
Ax = b is consistent if and only if b is a linear combination of col(A)

Homogeneous Systems

Ax = 0    Homogeneous
Ax = b    Non-homogeneous
x = t1v1 + t2v2 + ... + tkvk    Homogeneous solution
x = x0 + t1v1 + t2v2 + ... + tkvk    Non-homogeneous solution
x = xp + xh, where xp is any solution of the NH system and xh is a solution of the H system

Examples of Subspaces

- IF w1, w2 are within S, THEN w1+w2 is within S and kw1 is within S
- the zero vector 0 itself is a subspace
- Rn is a subspace of all vectors
- lines and planes through the origin are subspaces
- the set of all vectors b such that Ax = b is consistent is a subspace
- if {v1, v2, ..., vk} is any set of vectors in Rn, then the set W of all linear combinations of these vectors is a subspace
W = {c1v1 + c2v2 + ... + ckvk}; c are within the real numbers

Span

- the span of a set of vectors {v1, v2, ..., vk} is the set of all linear combinations of these vectors
span{v1, v2, ..., vk} = {t1v1 + t2v2 + ... + tkvk}
If S = {v1, v2, ..., vk}, then W = span(S) is a subspace
Linear Combination of the vectors: v = c1v1 + c2v2 + ... + cnvn
(use a matrix to find c)

Linear Independent

- if c1v1 + c2v2 + ... + cnvn = 0 has a unique solution (all c = 0), the set of vectors is linearly independent
- for dependent, not all the c = 0
Dependent if:
- a vector is a linear combination of the other vectors
- a vector is a scalar multiple of another
- a set of more than n vectors in Rn
Independent if:
- (two vectors) the span of these two vectors forms a plane

Linear Independent (cont)
- list the vectors as the columns of a matrix, row reduce it; if there are many solutions, then it is dependent
- after RREF, the columns with leading 1's are a maximally linearly independent subset, according to the Pivot Theorem

Diagonal, Triangular, Symmetric Matrices

Diagonal Matrices: all zeros except along the diagonal
Lower Triangular: zeros above the diagonal
Upper Triangular: zeros below the diagonal
Symmetric if: AT = A
Skew-Symmetric if: AT = -A

Determinants

det(A) = a1jC1j + a2jC2j + ... + anjCnj    expansion along the jth column
det(A) = ai1Ci1 + ai2Ci2 + ... + ainCin    expansion along the ith row
Cij = (-1)i+j Mij
Mij = determinant of the matrix with the ith row and jth column deleted
- pick the row/column with the most zeros to calculate easier
det(AT) = det(A)    det(A-1) = 1/det(A)
det(AB) = det(A)det(B)    det(kA) = kndet(A)
- A is invertible iff det(A) ≠ 0
- det of a triangular or diagonal matrix is the product of the diagonal entries
det(A) for a 2x2 matrix: ad - bc
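The cofactor expansion translates directly into a short recursive routine (plain Python, expanding along the first row; fine for small matrices, and the example matrices are illustrative):

```python
def det(A):
    """Cofactor (Laplace) expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # M_1j: delete the first row and the jth column
        minor = [row[:j] + row[j+1:] for row in A[1:]]
        # C_1j = (-1)^(1+j) * M_1j  (0-indexed sign: (-1)^j)
        total += (-1) ** j * A[0][j] * det(minor)
    return total

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
print(det(A))       # -3
assert det(A) != 0  # so A is invertible

# det of a triangular matrix = product of the diagonal entries
T = [[2, 9, 9], [0, 3, 9], [0, 0, 4]]
assert det(T) == 2 * 3 * 4
```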
Cross Product

u x v = -v x u    k(u x v) = (ku) x v = u x (kv)
u x u = 0    parallel vectors have 0 for the cross product
u·(u x v) = 0    v·(u x v) = 0
u x v is perpendicular to span{u, v}
||u x v|| = ||u|| ||v|| sin(θ), where θ is the angle between the vectors

Eigenvalues and Eigenvectors

Ax = λx
det(λI - A) = (-1)n det(A - λI)
pa(λ) = 3x3: det(A - λI); 2x2: det(λI - A)
- solve (λI - A)x = 0 for eigenvectors
Work Flow:
- form the matrix
- compute pa(λ) = det(λI - A)
- find roots of pa(λ) -> eigenvalues of A
- plug in the roots, then solve the equation for the eigenvectors

Transformations

compression in y-direction: [x, y] -> [x, ky]
shear in x-direction: T(x,y) = (x+ky, y); matrix rows (1, k), (0, 1)
shear in y-direction: T(x,y) = (x, y+kx); matrix rows (1, 0), (k, 1)
orthogonal projection on the xy-plane: [x, y, 0]
orthogonal projection on the xz-plane: [x, 0, z]
orthogonal projection on the yz-plane: [0, y, z]
reflection about the xy-plane: [x, y, -z]
reflection about the xz-plane: [x, -y, z]
reflection about the yz-plane: [-x, y, z]
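For a 2x2 matrix the eigenvalue workflow can be done in closed form: pa(λ) = λ2 - tr(A)λ + det(A), and its roots are the eigenvalues. A plain-Python sketch (the example matrix is illustrative, and it assumes real eigenvalues):

```python
import math

def eigenvalues_2x2(A):
    """Roots of p(l) = l^2 - tr(A)*l + det(A) (real case only)."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det      # assumes disc >= 0 (real eigenvalues)
    r = math.sqrt(disc)
    return (tr + r) / 2, (tr - r) / 2

A = [[2.0, 1.0], [1.0, 2.0]]
l1, l2 = eigenvalues_2x2(A)
print(l1, l2)  # 3.0 1.0

# check Ax = λx for the eigenvector x = (1, 1) of λ = 3
x = [1.0, 1.0]
Ax = [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]
assert Ax == [l1 * x[0], l1 * x[1]]
```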
Kernel and Range

ker(T) is the set of all vectors x such that T(x) = 0
ker(T) = span{(v)} (row reduce, find the vectors)
the solution space of Ax = 0 is the null space; null(A) = ker(TA)
range of T: ran(T) is the set of vectors y such that y = T(x) for some x
ran(T) = col([T]) = span{ [col1], [col2], ... }; Ax = b

Important Facts:
1. T is one to one iff ker(T) = {0}
2. Ax = b, if consistent, has a unique solution iff null(A) = {0}; Ax = 0 has only the trivial solution iff null(A) = {0}

Important Facts 2:
1. T: Rn -> Rm is onto iff the system Tx = y has a solution x in Rn for every y in Rm
2. Ax = b is consistent for every b in Rm (A is onto) iff col(A) = Rm

Composition

The composition of T2 with T1 is: T2 ◦ T1
(T2 ◦ T1)(x) = T2(T1(x)); T2 ◦ T1: Rn -> Rm
composition of linear transformations corresponds to matrix multiplication: [T2 ◦ T1] = [T2][T1]

Fundamental Spaces and Rank

col(A) = the columns of the original A whose RREF columns have leading ones
null(A) = the free variables' vectors (solutions of Ax = 0)
null(AT) = the free variables' vectors after transposing
The Rank Theorem: rank(A) = rank(AT) for any matrix; row space and column space have the same dimension
rank(A) = # of leading 1's (pivots) in the RREF
dim(row(A)) = dim(col(A)) = rank(A)
dim(null(A)) = nullity(A)

Orthogonal Complement, Dimension Theorem

S⟂ = {v ∈ Rn | v · w = 0 for all w ∈ S}
S⟂ is a subspace of Rn; S⟂ = span(S)⟂ = W⟂
row(A)⟂ = null(A)    null(A)⟂ = row(A)
(S⟂)⟂ = S iff S is a subspace
col(A)⟂ = null(AT)    null(AT)⟂ = col(A)
The Dimension Theorem: rank(A) + nullity(A) = n, where A is an m x n matrix (k + (n-k) = n)
if W is a subspace of Rn: dim(W) + dim(W⟂) = n
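The composition rule multiplies matrices in right-to-left order, [T2 ◦ T1] = [T2][T1]. A quick plain-Python check with a shear followed by a reflection (matrices and the test vector chosen for illustration):

```python
def matmul(A, B):
    # 2x2 matrix product: (AB)ij = ai1*b1j + ai2*b2j
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(A, x):
    return [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]

T1 = [[1, 2], [0, 1]]    # shear in the x-direction, k = 2
T2 = [[1, 0], [0, -1]]   # reflection about the x-axis

x = [3, 4]
step_by_step = apply(T2, apply(T1, x))        # T2(T1(x))
composed = apply(matmul(T2, T1), x)           # [T2][T1] x
assert step_by_step == composed               # right-to-left order matches
assert apply(matmul(T1, T2), x) != composed   # [T1][T2] is the wrong order
print(composed)  # [11, -4]
```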