Lin Alg 20 Part 1
1. Vectors
2. Linear (in)dependence and basis
3. Matrices
4. Linear equations
5. Solving linear equations by Gaussian elimination
6. Linear transformations (identity, composition, inverse)
7. Kernels and ranges of matrices, vector subspaces in general
8. Dot product and orthogonality
Vectors in R^n

A vector in R^n is a column of real numbers v_1, v_2, \dots, v_n:

v = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix}

The canonical basis vectors are

e_1 = \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \quad
e_2 = \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{pmatrix}, \quad \dots, \quad
e_n = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{pmatrix}
Basic operations: addition of vectors and multiplication by a scalar

v + w = \begin{pmatrix} v_1 + w_1 \\ \vdots \\ v_n + w_n \end{pmatrix}, \qquad
\alpha v = \begin{pmatrix} \alpha v_1 \\ \vdots \\ \alpha v_n \end{pmatrix}, \quad \alpha \in \mathbb{R}
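These componentwise operations are straightforward to carry out in code; here is a minimal Python sketch (the helper names vec_add and vec_scale are our own):

```python
def vec_add(v, w):
    # componentwise addition: (v + w)_i = v_i + w_i
    return [vi + wi for vi, wi in zip(v, w)]

def vec_scale(alpha, v):
    # scalar multiplication: (alpha v)_i = alpha * v_i
    return [alpha * vi for vi in v]

print(vec_add([1, 2, 3], [4, 5, 6]))   # [5, 7, 9]
print(vec_scale(2, [1, -1, 0]))        # [2, -2, 0]
```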
w is a linear combination of v_1, \dots, v_k if

w = \alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_k v_k.

The vectors v_1, \dots, v_k are linearly dependent if there are coefficients \alpha_1, \dots, \alpha_k, not all zero, such that

0 = \alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_k v_k.
Linear dependence - alternative statements

or, in matrix form,

\begin{pmatrix} v_{11} & v_{12} & \dots & v_{1k} \\ \vdots & & & \vdots \\ v_{n1} & v_{n2} & \dots & v_{nk} \end{pmatrix}
\begin{pmatrix} \alpha_1 \\ \vdots \\ \alpha_k \end{pmatrix}
= \begin{pmatrix} 0 \\ \vdots \\ 0 \end{pmatrix}

with \alpha_1, \dots, \alpha_k not all zero.
Linear independence and basis
Matrices
Example of a matrix
A = \begin{pmatrix} 1 & 1 & 1 \\ 2 & -3 & -1 \\ -1 & 1 & 2 \end{pmatrix}.
Question: Are the columns (rows) of A linearly dependent?
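One way to decide this numerically: for a square matrix the columns (equivalently, the rows) are linearly dependent exactly when the determinant is zero, a standard criterion not covered in these slides. A short Python check, expanding the 3 by 3 determinant by cofactors along the first row (the helper det3 is our own):

```python
def det3(A):
    # cofactor expansion of a 3x3 determinant along the first row
    (a, b, c), (d, e, f), (g, h, i) = A
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

A = [[1, 1, 1],
     [2, -3, -1],
     [-1, 1, 2]]
print(det3(A))  # -9: nonzero, so the columns (and rows) are linearly independent
```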
Multiplication of a vector by a matrix
Given

A = \begin{pmatrix} v_1 & v_2 & \dots & v_k \end{pmatrix}, \qquad
\alpha = \begin{pmatrix} \alpha_1 \\ \vdots \\ \alpha_k \end{pmatrix},

we have

A\alpha = \alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_k v_k.

In other words, A\alpha is a linear combination of the columns of A with the coefficients \alpha_1, \alpha_2, \dots, \alpha_k.

Also

A\alpha = \begin{pmatrix} \alpha_1 v_{11} + \alpha_2 v_{12} + \dots + \alpha_k v_{1k} \\ \alpha_1 v_{21} + \alpha_2 v_{22} + \dots + \alpha_k v_{2k} \\ \vdots \\ \alpha_1 v_{n1} + \alpha_2 v_{n2} + \dots + \alpha_k v_{nk} \end{pmatrix}
= \begin{pmatrix} \sum_{i=1}^k \alpha_i v_{1i} \\ \sum_{i=1}^k \alpha_i v_{2i} \\ \vdots \\ \sum_{i=1}^k \alpha_i v_{ni} \end{pmatrix}
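The two descriptions of A\alpha (row-by-row sums versus a combination of the columns) can be checked against each other in code; a small Python sketch with our own helper names:

```python
def matvec(A, alpha):
    # row-by-row: entry i is sum_j alpha_j * A[i][j]
    return [sum(aij * aj for aij, aj in zip(row, alpha)) for row in A]

def column_combination(A, alpha):
    # alpha_1 * (column 1) + ... + alpha_k * (column k)
    n, k = len(A), len(A[0])
    return [sum(alpha[j] * A[i][j] for j in range(k)) for i in range(n)]

A = [[1, 1, 1],
     [2, -3, -1],
     [-1, 1, 2]]
alpha = [1, 2, -1]
assert matvec(A, alpha) == column_combination(A, alpha) == [2, -3, -1]
```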
Matrix multiplication
Systems of linear equations
Suppose

A = (a_{ij})_{i=1,\dots,n,\; j=1,\dots,k} \quad \text{and} \quad b = \begin{pmatrix} b_1 \\ \vdots \\ b_n \end{pmatrix}

are given. We look for solutions of

Ax = b,

where

x = \begin{pmatrix} x_1 \\ \vdots \\ x_k \end{pmatrix}

is an unknown vector.
Familiar form

Writing a_1, a_2, \dots, a_k for the columns of A, the system Ax = b reads

x_1 a_1 + x_2 a_2 + \dots + x_k a_k = b.
A concrete (high school level) example:

x + y + z = 1
2x - 3y - z = 0
-x + y + 2z = -1.
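Eliminating by hand gives the candidate solution x = 7/9, y = 2/3, z = -4/9; this value is our own computation, but it is easy to verify exactly with Python's fractions module:

```python
from fractions import Fraction as F

# candidate solution obtained by hand elimination (our own computation)
x, y, z = F(7, 9), F(2, 3), F(-4, 9)

# substitute into all three equations of the system above
assert x + y + z == 1
assert 2 * x - 3 * y - z == 0
assert -x + y + 2 * z == -1
print("solution checks out")
```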
Gaussian elimination

Given a system of equations

a_{11} x_1 + a_{12} x_2 + \dots + a_{1k} x_k = b_1
a_{21} x_1 + a_{22} x_2 + \dots + a_{2k} x_k = b_2
\vdots
a_{n1} x_1 + a_{n2} x_2 + \dots + a_{nk} x_k = b_n,

Gaussian elimination consists of two types of elementary operations that are carried out in succession:

(RI) Interchange of two equations,
(RA) Adding a multiple of one equation to another equation and replacing the latter by the result.

The system obtained by any of these operations has the same solutions.

Note: Multiplying an equation by a nonzero scalar also does not change the solutions and can be used in numerical approaches.
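The two operations (RI) and (RA), followed by back-substitution, can be sketched in Python with exact fractions. This is a minimal implementation for systems with a unique solution; all names are our own:

```python
from fractions import Fraction

def gaussian_eliminate(A, b):
    # forward elimination using only (RI) row interchanges and
    # (RA) adding a multiple of one row to another; assumes a unique solution
    n = len(A)
    M = [[Fraction(x) for x in row] + [Fraction(v)] for row, v in zip(A, b)]
    for c in range(n):
        # (RI): bring a row with a nonzero pivot into position c
        p = next(r for r in range(c, n) if M[r][c] != 0)
        M[c], M[p] = M[p], M[c]
        # (RA): clear the entries below the pivot
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [a - f * q for a, q in zip(M[r], M[c])]
    # back-substitution on the resulting upper-triangular system
    x = [Fraction(0)] * n
    for i in reversed(range(n)):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

# the first worked example below: x1 = -5/3, x2 = -1/3, x3 = 4/3
print(gaussian_eliminate([[0, 1, 1], [1, 1, 3], [2, 3, 4]], [1, 2, 1]))
```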
Gaussian elimination on the level of matrices
Examples

(A, b) = \begin{pmatrix} 0 & 1 & 1 & | & 1 \\ 1 & 1 & 3 & | & 2 \\ 2 & 3 & 4 & | & 1 \end{pmatrix}
\to (U, c) = \begin{pmatrix} 1 & 1 & 3 & | & 2 \\ 0 & 1 & 1 & | & 1 \\ 0 & 0 & -3 & | & -4 \end{pmatrix}

Solution: x_3 = 4/3, x_2 = -1/3, x_1 = -5/3.

(A, b) = \begin{pmatrix} 1 & 1 & 2 & | & 1 \\ 0 & 1 & 1 & | & 1 \\ 2 & 3 & 5 & | & 1 \end{pmatrix}
\to (U, c) = \begin{pmatrix} 1 & 1 & 2 & | & 1 \\ 0 & 1 & 1 & | & 1 \\ 0 & 0 & 0 & | & -2 \end{pmatrix}

The last row of (U, c) reads 0 = -2, so this system has no solution.
(A, b) = \begin{pmatrix} 1 & 1 & 2 & | & 1 \\ 0 & 1 & 1 & | & 1 \\ 2 & 3 & 5 & | & 3 \end{pmatrix}
\to (U, c) = \begin{pmatrix} 1 & 1 & 2 & | & 1 \\ 0 & 1 & 1 & | & 1 \\ 0 & 0 & 0 & | & 0 \end{pmatrix}

Here the last row reads 0 = 0: x_3 can be chosen freely, so the system has infinitely many solutions.
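The three examples (unique solution, no solution, infinitely many solutions) can be distinguished mechanically by comparing the rank of A with the rank of the augmented matrix, the Rouche-Capelli criterion, which is not stated explicitly in these slides. A Python sketch with our own helper names:

```python
from fractions import Fraction

def echelon_rank(M):
    # forward elimination with row swaps and row additions; count the pivots
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols, rank = len(M), len(M[0]), 0
    for c in range(cols):
        p = next((r for r in range(rank, rows) if M[r][c] != 0), None)
        if p is None:
            continue
        M[rank], M[p] = M[p], M[rank]
        for r in range(rank + 1, rows):
            f = M[r][c] / M[rank][c]
            M[r] = [a - f * q for a, q in zip(M[r], M[rank])]
        rank += 1
    return rank

def classify(A, b):
    # Rouche-Capelli: compare rank(A) with the rank of the augmented matrix
    rA = echelon_rank(A)
    rAb = echelon_rank([row + [v] for row, v in zip(A, b)])
    if rA < rAb:
        return "no solution"
    return "unique solution" if rA == len(A[0]) else "infinitely many solutions"

A = [[1, 1, 2], [0, 1, 1], [2, 3, 5]]
print(classify(A, [1, 1, 1]))  # no solution
print(classify(A, [1, 1, 3]))  # infinitely many solutions
```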
Return to linear independence

Consider:

v_1, \dots, v_k \quad \text{and} \quad A = \begin{pmatrix} v_1 & v_2 & \dots & v_k \end{pmatrix}.

Note that
• the zero vector 0 is always a solution of Ax = 0,
• linear independence of v_1, \dots, v_k is equivalent to the requirement that 0 is the only solution of Ax = 0,
• linear independence of v_1, \dots, v_k is equivalent to the requirement that all the pivots of U (the echelon form obtained from A by Gaussian elimination) are nonzero,
• consequently (by the Theorem above), if A is an n by n matrix with linearly independent columns, then Ax = b has a unique solution for any choice of b.
Linear transformations
Composition and identity
Inverse

The inverse of an n by n matrix A is the matrix A^{-1} with A A^{-1} = I. Writing A^{-1} = \begin{pmatrix} \alpha_1 & \alpha_2 & \dots & \alpha_n \end{pmatrix} column by column, this means A \alpha_j = e_j for j = 1, \dots, n.
Notes
(1) A^{-1} can be found by Gaussian elimination by solving for the columns \alpha_j.
(2) (AB)^{-1} = B^{-1} A^{-1}.
(3) A^{-1} A = I.

Proof of (3)
• We have defined A^{-1} by the requirement A A^{-1} = I and showed that it can be found by Gaussian elimination.
• Also by Gaussian elimination we can find a matrix B such that B A = I.
Now it is ‘easy’ to show that B = A^{-1}:

B = B I = B (A A^{-1}) = (B A) A^{-1} = I A^{-1} = A^{-1}.
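Note (1) can be turned into code directly: each column \alpha_j of A^{-1} is found by solving A\alpha_j = e_j with Gaussian elimination. A self-contained Python sketch with exact fractions (the helper names are our own):

```python
from fractions import Fraction

def solve(A, b):
    # Gaussian elimination (row interchanges + row additions), then back-substitution
    n = len(A)
    M = [[Fraction(x) for x in row] + [Fraction(v)] for row, v in zip(A, b)]
    for c in range(n):
        p = next(r for r in range(c, n) if M[r][c] != 0)
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [a - f * q for a, q in zip(M[r], M[c])]
    x = [Fraction(0)] * n
    for i in reversed(range(n)):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

def inverse(A):
    # column j of the inverse solves A alpha_j = e_j, as in note (1)
    n = len(A)
    cols = [solve(A, [int(i == j) for i in range(n)]) for j in range(n)]
    return [[cols[j][i] for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

A = [[1, 1, 1], [2, -3, -1], [-1, 1, 2]]
Ainv = inverse(A)
I = [[1 if i == j else 0 for j in range(3)] for i in range(3)]
assert matmul(A, Ainv) == I   # the defining property A A^{-1} = I
assert matmul(Ainv, A) == I   # note (3): A^{-1} A = I as well
```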
The image and the kernel of a matrix

Im(A) = {b ∈ R^n : Ax = b has a solution},
ker(A) = {x ∈ R^k : Ax = 0}.
Vector subspace of Rn
Spanning set and basis of a subspace
Example
Let

A = \begin{pmatrix} 1 & 1 & 2 & 0 \\ 0 & 1 & 1 & 1 \\ 2 & 3 & 5 & 1 \end{pmatrix}

(1) Find the k for which ker(A) is a subspace of R^k.
(2) What is the dimension of ker(A)?
(3) What is rank(A)?
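The three questions reduce to a pivot count. One way to do this in Python with exact arithmetic, using the rank-nullity relation dim ker(A) = k - rank(A) (a standard fact not stated on this slide); all helper names are our own:

```python
from fractions import Fraction

def rank(M):
    # count pivots after forward elimination with exact fractions
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols, r = len(M), len(M[0]), 0
    for c in range(cols):
        p = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if p is None:
            continue
        M[r], M[p] = M[p], M[r]
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * q for a, q in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 1, 2, 0],
     [0, 1, 1, 1],
     [2, 3, 5, 1]]
k = len(A[0])        # (1) ker(A) is a subspace of R^4, so k = 4
r = rank(A)          # (3) rank(A) = 2
nullity = k - r      # (2) dim ker(A) = 4 - 2 = 2
print(k, r, nullity)
```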
The transpose of a matrix
By Gaussian elimination:
(1) The number of pivots in U equals the number of independent rows in A.
(2) The number of pivots in U equals the number of independent columns in A.
Consequently,

rank(A) = rank(A^T).
Dot product (inner product), length and orthogonality

For v, w ∈ R^n the dot product is

v \cdot w = v_1 w_1 + v_2 w_2 + \dots + v_n w_n.
Triangle inequality and Cauchy-Schwarz inequality

|v| = \ell(v) = \sqrt{v \cdot v} \qquad \text{norm of } v
|v + w| \le |v| + |w| \qquad \text{triangle inequality}
|v \cdot w| \le |v|\,|w| \qquad \text{Cauchy-Schwarz inequality}
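Both inequalities are easy to check numerically for any pair of vectors; a small Python sketch (the vector values are our own arbitrary choices):

```python
import math

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def norm(v):
    # |v| = sqrt(v . v)
    return math.sqrt(dot(v, v))

v = [1.0, 2.0, -2.0]   # |v| = 3
w = [3.0, 0.0, 4.0]    # |w| = 5
vw = [a + b for a, b in zip(v, w)]

assert norm(vw) <= norm(v) + norm(w)          # triangle inequality
assert abs(dot(v, w)) <= norm(v) * norm(w)    # Cauchy-Schwarz inequality
print(norm(v), norm(w), norm(vw))
```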
A note on dot product

v \cdot w = v^T w,

i.e. the dot product of v and w is the matrix product of the row vector v^T with the column vector w.
Orthogonality of vectors and spaces

Vectors

v = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix} \quad \text{and} \quad
w = \begin{pmatrix} w_1 \\ w_2 \\ \vdots \\ w_n \end{pmatrix}

are orthogonal if v \cdot w = 0.

Example: v = \begin{pmatrix} 1 \\ 2 \end{pmatrix} and w = \begin{pmatrix} 2 \\ -1 \end{pmatrix} are orthogonal.

A vector v is perpendicular to a vector subspace W if v is perpendicular to each vector in W.

Vector spaces V and W are orthogonal if v \cdot w = 0 for every v in V and w in W.
Orthogonal complement
If {v_1, \dots, v_k} is a basis of a subspace V of R^n and {w_1, \dots, w_m} is a basis of its orthogonal complement V^⊥ = {w ∈ R^n : w \cdot v = 0 for all v ∈ V}, then

{v_1, \dots, v_k, w_1, \dots, w_m}

is a basis of R^n.
The image of a matrix and the kernel of its transpose

Av \cdot w = v \cdot A^T w

Proof

Av \cdot w = (Av)^T w = v^T A^T w = v \cdot A^T w.

(Im(A))^⊥ = ker(A^T).
Exercises
1. Given

A = \begin{pmatrix} 2 & -3 & 0 \\ 4 & -5 & 1 \\ 2 & -1 & -3 \end{pmatrix} \quad \text{and} \quad b = \begin{pmatrix} 3 \\ 7 \\ 5 \end{pmatrix},

solve Ax = b by Gaussian elimination.
2. Find elimination matrices L_1, \dots, L_k such that

U = L_k L_{k-1} \dots L_1 A = L A.

3. Given

A = \begin{pmatrix} 2 & -3 & 0 \\ 4 & -5 & d \\ 2 & -1 & -3 \end{pmatrix}

find the number d such that

Ax = 0

has multiple solutions. What can you say about the solutions of Ax = b, where b is from exercise 1?
4. For A from exercise 1 find M such that A = MU.
Exercises cont.