
Linear Algebra (2) Lecture Notes, 2020.4.21

Sec 7.4 Least-squares Approximation (PP. 466~475)

In the plane $\mathbb{R}^2$, we would like to find a line $y = a_0 + a_1 x$ that best fits the data of $n$ points $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$, where the $x_i$ are not all equal.

When the points all lie on one line $y = a_0 + a_1 x$, it is clear that this line best fits the data. If they are not collinear, we look for a line $y = a_0 + a_1 x$ such that the sum of the squared vertical distances of the data points from it is smaller than for any other line. That is, we must find $a_0$ and $a_1$ such that the quantity

$E = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$

is minimized. The technique for finding the best line is called the method of least squares, $E$ is called the error sum of squares, and the line for which $E$ is minimized is called the least-squares line.

Theorem. Let $C = \begin{bmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix}$, $\mathbf{y} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix}$, and $\mathbf{a} = \begin{bmatrix} a_0 \\ a_1 \end{bmatrix}$. Then $E = \|\mathbf{y} - C\mathbf{a}\|^2$.

Pf. Let $\mathbf{c}_1 = (1, 1, \ldots, 1)$ and $\mathbf{c}_2 = (x_1, x_2, \ldots, x_n)$ denote the columns of $C$. Then

$C\mathbf{a} = a_0 \mathbf{c}_1 + a_1 \mathbf{c}_2 = (a_0 + a_1 x_1, a_0 + a_1 x_2, \ldots, a_0 + a_1 x_n)$,

so $E = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2 = \|\mathbf{y} - C\mathbf{a}\|^2$.

Note that $\|\mathbf{y} - C\mathbf{a}\|$ is the distance from $\mathbf{y}$ to the vector $C\mathbf{a}$, which lies in the subspace $W = \mathrm{Span}\{\mathbf{c}_1, \mathbf{c}_2\}$ of $\mathbb{R}^n$. In order to minimize $E$, we need only choose the vector $C\mathbf{a}$ in $W$ that is nearest to $\mathbf{y}$. Since the $x_i$ are not all equal, $\{\mathbf{c}_1, \mathbf{c}_2\}$ is linearly independent and hence is a basis for $W$. Therefore $C^{T}C$ is invertible. By the closest vector property, $C\mathbf{a}$ is the orthogonal projection of $\mathbf{y}$ on $W$; i.e., $C\mathbf{a} = P_W \mathbf{y}$. Since $P_W = C(C^{T}C)^{-1}C^{T}$, we have

$C\mathbf{a} = C(C^{T}C)^{-1}C^{T}\mathbf{y}$.

Multiplying both sides on the left by $C^{T}$, it follows that $C^{T}C\mathbf{a} = C^{T}\mathbf{y}$, called the normal equations. Hence $\mathbf{a} = (C^{T}C)^{-1}C^{T}\mathbf{y}$.
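As a numerical illustration (not taken from the textbook; the data points and variable names below are invented), the following Python/NumPy sketch fits a least-squares line by solving the normal equations $C^{T}C\mathbf{a} = C^{T}\mathbf{y}$:

    # Least-squares line via the normal equations (illustrative data only).
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0])   # the x_i, not all equal
    y = np.array([2.1, 2.9, 4.2, 4.8])   # the y_i

    # C has columns c1 = (1,...,1) and c2 = (x_1,...,x_n)
    C = np.column_stack((np.ones_like(x), x))

    # Solve C^T C a = C^T y for a = (a0, a1)
    a = np.linalg.solve(C.T @ C, C.T @ y)
    a0, a1 = a
    E = np.sum((y - C @ a) ** 2)         # error sum of squares
    print(f"least-squares line: y = {a0:.4f} + {a1:.4f} x,  E = {E:.4f}")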

Example 1. (P.468)

The method of least squares can also be developed to find a quadratic polynomial $y = a_0 + a_1 x + a_2 x^2$ that best fits the data of $n$ points $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$. Notice that the new error sum of squares is

$E = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i - a_2 x_i^2)^2$.

In this case, we let $C = \begin{bmatrix} 1 & x_1 & x_1^2 \\ 1 & x_2 & x_2^2 \\ \vdots & \vdots & \vdots \\ 1 & x_n & x_n^2 \end{bmatrix}$, $\mathbf{y} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix}$, and $\mathbf{a} = \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix}$. Then

$E = \|\mathbf{y} - C\mathbf{a}\|^2$.

Let $W$ be the subspace of $\mathbb{R}^n$ spanned by the columns $\mathbf{c}_1, \mathbf{c}_2, \mathbf{c}_3$ of $C$; when these columns are linearly independent they form a basis for $W$, and $C^{T}C$ is invertible. Thus, we have the normal equation $C^{T}C\mathbf{a} = C^{T}\mathbf{y}$, and hence $\mathbf{a} = (C^{T}C)^{-1}C^{T}\mathbf{y}$.
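The same sketch adapts to the quadratic case; only the matrix $C$ changes (again, the data below are invented for illustration):

    # Least-squares quadratic via the normal equations (illustrative data only).
    import numpy as np

    x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
    y = np.array([3.9, 1.1, 0.2, 0.9, 4.1])

    # Columns of C: (1,...,1), (x_1,...,x_n), (x_1^2,...,x_n^2)
    C = np.column_stack((np.ones_like(x), x, x**2))

    a = np.linalg.solve(C.T @ C, C.T @ y)     # a = (a0, a1, a2)
    print("coefficients a0, a1, a2:", a)
    print("error sum of squares:", np.sum((y - C @ a) ** 2))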


Example 2. (P.469)

A system $A\mathbf{x} = \mathbf{b}$ of linear equations often has no solution (i.e., it is inconsistent). In this circumstance, we are interested in finding a vector $\mathbf{z}$ for which $\|A\mathbf{z} - \mathbf{b}\|$ is minimized. Let $W$ be the column space of $A$. By the closest vector property, to minimize the error $\|A\mathbf{z} - \mathbf{b}\|$, the vector $A\mathbf{z}$ must be the orthogonal projection of $\mathbf{b}$ on $W$. It follows that

$A\mathbf{z} = P_W \mathbf{b} = C(C^{T}C)^{-1}C^{T}\mathbf{b}$,

which is guaranteed to be consistent, since $P_W \mathbf{b} \in W$. Thus, we can solve for the general solution $\mathbf{z}$ of the system $A\mathbf{z} = C(C^{T}C)^{-1}C^{T}\mathbf{b}$, where the columns of $C$ come from the columns of $A$ that form a basis for $W$.
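A sketch of this situation, with a made-up inconsistent system; here the columns of $A$ are already linearly independent, so we may take $C = A$ and the minimizer is unique, $\mathbf{z} = (A^{T}A)^{-1}A^{T}\mathbf{b}$:

    # Minimizing ||Az - b|| for an inconsistent system (illustrative data only).
    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0, 2.0])        # b is not in Col A, so Ax = b is inconsistent

    # Columns of A are linearly independent, so C = A and z = (A^T A)^{-1} A^T b
    z = np.linalg.solve(A.T @ A, A.T @ b)
    proj_b = A @ z                       # orthogonal projection of b on W = Col A
    print("z =", z)
    print("minimal error ||Az - b|| =", np.linalg.norm(A @ z - b))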
Consider any consistent nonhomogeneous system $A\mathbf{x} = \mathbf{b}$ with $\mathbf{b} \neq \mathbf{0}$. Let $\mathbf{v}$ be any solution of the system, and let $W = \{\mathbf{x} : A\mathbf{x} = \mathbf{0}\}$ be the null space of $A$. Then $\mathbf{u}$ is a solution of the system $A\mathbf{x} = \mathbf{b}$ if and only if $\mathbf{u} = \mathbf{v} + \mathbf{w}$ for some $\mathbf{w} \in W$. Hence we wish to select such a vector $\mathbf{u}$ so that $\|\mathbf{u}\|$ is minimized. Since $\|\mathbf{u}\|^2 = \|P_W \mathbf{v} + \mathbf{w}\|^2 + \|P_{W^{\perp}} \mathbf{v}\|^2 \geq \|P_{W^{\perp}} \mathbf{v}\|^2$, to minimize $\|\mathbf{u}\|$ the vector $\mathbf{u}$ must be the orthogonal projection of $\mathbf{v}$ on $W^{\perp}$; that is, $\mathbf{u} = P_{W^{\perp}} \mathbf{v} = \mathbf{v} - P_W \mathbf{v}$. Thus,

$\mathbf{u} = P_{W^{\perp}} \mathbf{v}$

is the unique solution of the system $A\mathbf{x} = \mathbf{b}$ of least norm, called the least-norm solution.

Equivalently, $\mathbf{u}$ is an optimal solution of the optimization problem: minimize $\|\mathbf{x}\|$ subject to $A\mathbf{x} = \mathbf{b}$.
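A sketch of the least-norm solution for a made-up consistent system. It uses the standard fact (not derived in this section, so treat it as an added assumption) that when the rows of $A$ are linearly independent, the projection of any solution onto $W^{\perp} = \mathrm{Row}\,A$ equals $A^{T}(AA^{T})^{-1}\mathbf{b}$; NumPy's lstsq returns the same minimum-norm vector:

    # Least-norm solution of a consistent underdetermined system (illustrative data).
    import numpy as np

    A = np.array([[1.0, 1.0, 1.0],
                  [1.0, 2.0, 4.0]])      # rows are linearly independent
    b = np.array([3.0, 7.0])

    # u lies in Row A = W_perp and satisfies Au = b, so it is the least-norm solution
    u = A.T @ np.linalg.solve(A @ A.T, b)

    # np.linalg.lstsq also returns the minimum-norm solution for this system
    u_check = np.linalg.lstsq(A, b, rcond=None)[0]

    print("u =", u, " Au =", A @ u, " ||u|| =", np.linalg.norm(u))
    print("lstsq agrees:", np.allclose(u, u_check))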


Examples 3 and 4. (P.472, P.473)

Homework: #1, #11, #17, #21 on page 473
