Linear Algebra (2), Course Material 7-4
In the plane, we would like to find a line $y = a_0 + a_1 x$ that best fits the data of $n$ points $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$.
When the points are collinear, the line on which they all lie clearly best fits the data.
If they are not collinear, we look for a line such that the sum of the squared vertical distances of the data points from it is smaller than from any other line. That is, we must find $a_0$ and $a_1$ such that the quantity
$$E = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i \right)^2$$
is minimized. The technique for finding the best line is called the method of least squares, $E$ is called the error sum of squares, and the line for which $E$ is minimized is called the least-squares line.
Let
$$\mathbf{y} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix}, \qquad \mathbf{a} = \begin{bmatrix} a_0 \\ a_1 \end{bmatrix}.$$
Note that $\sqrt{E} = \lVert \mathbf{y} - C\mathbf{a} \rVert$ is the distance from $\mathbf{y}$ to the vector $C\mathbf{a}$, which lies in the subspace $W = \operatorname{Col} C$ of $\mathbb{R}^n$. In order to minimize $E$, we need only choose $C\mathbf{a}$ to be the vector in $W$ that is nearest to $\mathbf{y}$. Since the $x_i$ are not all equal, it follows that the set of columns of $C$ is linearly independent, and hence is a basis for $W$. Therefore $C^T C$ is invertible. By the closest vector property, $C\mathbf{a}$ is the orthogonal projection of $\mathbf{y}$ on $W$; hence
$$\mathbf{a} = (C^T C)^{-1} C^T \mathbf{y}.$$
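As a numerical sketch (the data points here are hypothetical), the least-squares line can be computed directly from the formula $\mathbf{a} = (C^T C)^{-1} C^T \mathbf{y}$:

```python
import numpy as np

# Hypothetical data points (x_i, y_i); the x_i are not all equal.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 2.9, 4.2, 4.8])

# C has a column of ones and a column of the x_i.
C = np.column_stack([np.ones_like(x), x])

# Solve the normal equations C^T C a = C^T y
# (numerically preferable to forming the inverse explicitly).
a0, a1 = np.linalg.solve(C.T @ C, C.T @ y)
print(f"least-squares line: y = {a0} + {a1} x")
```

Solving the normal equations rather than inverting $C^T C$ gives the same $\mathbf{a}$ with better numerical behavior.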
Example 1. (p. 468)
The method of least squares can also be developed for finding a quadratic polynomial $y = a_0 + a_1 x + a_2 x^2$ that best fits the data of $n$ points $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$.
Notice that the new error sum of squares is
$$E = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right)^2.$$
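A sketch with hypothetical data: the only change from the linear case is an extra column of $x_i^2$ in $C$, so the same normal-equation solve applies.

```python
import numpy as np

# Hypothetical data points (x_i, y_i).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 0.5, 2.0, 4.6, 9.1])

# Columns of C: x_i^0, x_i^1, x_i^2.
C = np.column_stack([np.ones_like(x), x, x**2])

# Coefficients (a0, a1, a2) of the least-squares quadratic.
a = np.linalg.solve(C.T @ C, C.T @ y)
```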
More generally, let $W$ be a subspace of $\mathbb{R}^n$ with a basis $\{\mathbf{w}_1, \mathbf{w}_2, \ldots, \mathbf{w}_k\}$, and let $C = [\,\mathbf{w}_1 \; \mathbf{w}_2 \; \cdots \; \mathbf{w}_k\,]$. Thus, for every $\mathbf{v} \in \mathbb{R}^n$, the orthogonal projection of $\mathbf{v}$ on $W$ is $C(C^T C)^{-1} C^T \mathbf{v}$; that is, the orthogonal projection matrix for $W$ is $P_W = C(C^T C)^{-1} C^T$.
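As a small sketch (with an arbitrarily chosen basis), the matrix $C(C^T C)^{-1} C^T$ behaves as an orthogonal projection should: it is idempotent, and the residual $\mathbf{v} - P_W\mathbf{v}$ is orthogonal to $W$.

```python
import numpy as np

# A hypothetical basis for a 2-dimensional subspace W of R^3.
w1 = np.array([1.0, 0.0, 1.0])
w2 = np.array([0.0, 1.0, 1.0])
C = np.column_stack([w1, w2])

# Orthogonal projection matrix P_W = C (C^T C)^{-1} C^T.
P = C @ np.linalg.inv(C.T @ C) @ C.T

v = np.array([1.0, 2.0, 6.0])
proj = P @ v           # orthogonal projection of v on W
residual = v - proj    # orthogonal to both w1 and w2
```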
A system of linear equations $A\mathbf{x} = \mathbf{b}$ often has no solution (i.e., it is inconsistent).
In this circumstance, we are interested in finding a vector $\mathbf{z}$ for which $\lVert A\mathbf{z} - \mathbf{b} \rVert$ is minimized. Let $W$ be the column space of $A$. By the closest vector property, to minimize the error $\lVert A\mathbf{z} - \mathbf{b} \rVert$, the vector $A\mathbf{z}$ must be the orthogonal projection of $\mathbf{b}$ on $W$. It follows that $\mathbf{z}$ must satisfy the normal equations $A^T A \mathbf{z} = A^T \mathbf{b}$; such a vector is called a least-squares solution of $A\mathbf{x} = \mathbf{b}$. When there is more than one least-squares solution, the unique one of least norm is called the least-norm solution.
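A sketch with a hypothetical inconsistent system of three equations in two unknowns: since the columns of $A$ are linearly independent here, the normal equations have a unique solution, which matches NumPy's built-in least-squares routine.

```python
import numpy as np

# Hypothetical inconsistent system Ax = b (3 equations, 2 unknowns).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 1.0, 2.0])

# Least-squares solution from the normal equations A^T A z = A^T b.
z = np.linalg.solve(A.T @ A, A.T @ b)

# Az is the orthogonal projection of b on Col A, so the residual
# b - Az is orthogonal to every column of A.
residual = b - A @ z
```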
Homework: #1, #11, #17, #21 on page 473