Lecture 7
Regularized least-squares and Gauss-Newton method
multi-objective least-squares
regularized least-squares
nonlinear least-squares
Gauss-Newton method
Multi-objective least-squares
in many problems we have two (or more) objectives: we want J1 = ‖Ax − y‖^2 small and also J2 = ‖Fx − g‖^2 small (x ∈ R^n is the variable); usually the objectives are competing: we can make one smaller, at the expense of making the other larger
[figure: achievable objective pairs (J2(x), J1(x)) for three example choices x^(1), x^(2), x^(3); the lower-left boundary of the achievable set is the optimal tradeoff curve]

note that x ∈ R^n, but this plot is in R^2; the point labeled x^(1) is really (J2(x^(1)), J1(x^(1)))
weighted-sum objective: J1 + μJ2 = ‖Ax − y‖^2 + μ‖Fx − g‖^2, where the parameter μ ≥ 0 gives the relative weight between J1 and J2

[figure: level sets J1 + μJ2 = α are straight lines of slope −μ in the (J2, J1) plane; minimizing the weighted sum picks out the point where a level line touches the tradeoff curve]
the weighted-sum objective can be expressed as ‖Ãx − ỹ‖^2, where

    Ã = [  A    ]        ỹ = [  y    ]
        [ √μ F  ],           [ √μ g  ]

hence (assuming Ã full rank) the optimal x is

    x = (Ã^T Ã)^{-1} Ã^T ỹ = (A^T A + μ F^T F)^{-1} (A^T y + μ F^T g)
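As a sketch (with random placeholder data, not from the lecture), the stacked formulation and the closed-form expression can be checked against each other numerically:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical small problem: J1 = ||A x - y||^2, J2 = ||F x - g||^2
A = rng.standard_normal((20, 5))
y = rng.standard_normal(20)
F = rng.standard_normal((8, 5))
g = rng.standard_normal(8)
mu = 2.0  # relative weight on J2

# stacked formulation: minimize || [A; sqrt(mu) F] x - [y; sqrt(mu) g] ||^2
A_til = np.vstack([A, np.sqrt(mu) * F])
y_til = np.concatenate([y, np.sqrt(mu) * g])
x_stack, *_ = np.linalg.lstsq(A_til, y_til, rcond=None)

# closed form: x = (A^T A + mu F^T F)^{-1} (A^T y + mu F^T g)
x_form = np.linalg.solve(A.T @ A + mu * F.T @ F, A.T @ y + mu * F.T @ g)
```

The two solutions agree, since stacking simply rewrites the weighted sum as a single least-squares objective.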
Regularized least-squares

special case: F = I, g = 0; the two objectives are

    J1 = ‖Ax − y‖^2,    J2 = ‖x‖^2

the minimizer of the weighted-sum objective ‖Ax − y‖^2 + μ‖x‖^2,

    x = (A^T A + μI)^{-1} A^T y,

is called the regularized least-squares (approximate) solution of Ax ≈ y; this is also known as Tikhonov regularization, and works for any A (A^T A + μI > 0 for any μ > 0, so no rank assumption is needed)

[figure: optimal tradeoff curve of J1 versus J2 = ‖x‖^2 for an example]
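Sweeping μ traces out the optimal tradeoff curve between J1 and J2. A minimal sketch, assuming random placeholder data:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))  # placeholder data, not from the lecture
y = rng.standard_normal(30)

def x_reg(mu):
    # regularized LS solution: x = (A^T A + mu I)^{-1} A^T y
    return np.linalg.solve(A.T @ A + mu * np.eye(A.shape[1]), A.T @ y)

mus = [0.01, 0.1, 1.0, 10.0]
J1 = [float(np.sum((A @ x_reg(mu) - y) ** 2)) for mu in mus]  # ||Ax - y||^2
J2 = [float(np.sum(x_reg(mu) ** 2)) for mu in mus]            # ||x||^2
# larger mu: smaller J2 (smaller x), at the cost of larger J1 (worse fit)
```

Plotting the pairs (J2, J1) over a fine grid of μ values would reproduce the tradeoff curve in the figure.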
estimation/inversion interpretation: Ax − y is the sensor residual; prior information (or model validity) says x is small; the regularized solution trades off sensor fit against the size of x
Nonlinear least-squares

nonlinear least-squares (NLLS) problem: find x ∈ R^n that minimizes

    ‖r(x)‖^2 = Σ_{i=1}^m r_i(x)^2,

where r : R^n → R^m; r(x) is a vector of residuals (reduces to linear LS when r(x) = Ax − y)
example (position estimation from ranges): estimate position x ∈ R^2 from approximate distances to beacons at known locations b_1, …, b_m; we measure ρ_i = ‖x − b_i‖ + v_i (v_i is range error, unknown but assumed small)
NLLS estimate: choose x̂ ∈ R^n to minimize ‖r(x)‖^2 = Σ_{i=1}^m r_i(x)^2, with residuals r_i(x) = ρ_i − ‖x − b_i‖
Gauss-Newton method: given a starting guess for x, repeat:

    1. linearize r near the current guess x^(k):  r(x) ≈ r(x^(k)) + Dr(x^(k)) (x − x^(k))
    2. take the new guess x^(k+1) to be the solution of the linear LS problem for the linearized r

until convergence
example: 10 beacons; true position (3.6, 3.2), initial guess (1.2, 1.2); range estimates accurate to ±0.5

[figure: beacon locations, true position, and initial guess in the square −5 ≤ x_1, x_2 ≤ 5]
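A minimal Gauss-Newton iteration for this range-localization example might look like the following; the beacon locations are not given in the text, so random ones are assumed:

```python
import numpy as np

rng = np.random.default_rng(2)

# setup from the example: 10 beacons, true position (3.6, 3.2),
# range errors up to 0.5; beacon locations are made up (not in the text)
beacons = rng.uniform(-5, 5, size=(10, 2))
x_true = np.array([3.6, 3.2])
rho = np.linalg.norm(beacons - x_true, axis=1) + rng.uniform(-0.5, 0.5, 10)

def residual(x):
    # r_i(x) = rho_i - ||x - b_i||
    return rho - np.linalg.norm(beacons - x, axis=1)

def jacobian(x):
    # Dr(x): row i is -(x - b_i)^T / ||x - b_i||
    d = x - beacons
    return -d / np.linalg.norm(d, axis=1, keepdims=True)

x = np.array([1.2, 1.2])  # initial guess from the example
for _ in range(10):
    r, J = residual(x), jacobian(x)
    # linearize r about x, then solve the linear LS problem for the step:
    # minimize ||r + J dx||^2  =>  dx = lstsq(J, -r)
    dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
    x = x + dx
```

Each pass performs exactly the two steps above: linearize, then solve a linear least-squares problem for the new guess.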
[figures: objective ‖r(x)‖^2 over the region; ‖r(x)‖^2 versus iteration number for the Gauss-Newton iterates]
shrinking-region variation (as in the Levenberg-Marquardt method): add a regularization term λ‖x − x^(k)‖^2 to the linearized LS problem, so that the next iterate is not too far from the previous one (hence, the linearized model is still pretty accurate)
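The modified step can be sketched by adding λ‖x − x^(k)‖^2 to the linearized problem; the weight λ and any test data are illustrative assumptions:

```python
import numpy as np

def lm_step(r, J, lam):
    """Regularized Gauss-Newton step: minimize ||r + J dx||^2 + lam ||dx||^2."""
    n = J.shape[1]
    # normal equations of the regularized linearized problem:
    # dx = -(J^T J + lam I)^{-1} J^T r
    return np.linalg.solve(J.T @ J + lam * np.eye(n), -J.T @ r)

# lam = 0 recovers the plain Gauss-Newton step; increasing lam
# shrinks the step, keeping the next iterate near the previous one
# (so the linearized model stays accurate)
```

With λ = 0 this is the ordinary Gauss-Newton step; as λ grows the step length shrinks toward zero, which is exactly the shrinking-region effect described above.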