RVN 2021 Lecture 1B
Show how to:
• Use least-squares estimation to determine unknown parameters from a set of measurements
• Extend least-squares estimation to nonlinear problems
• Account for variation in measurement quality in a least-squares solution
Apply these techniques to some example problems
Let's have some fun!

[Figure: three measured points (y1, z1), (y2, z2), (y3, z3) on z–y axes, with a curve fitted through them.]
1. Formulating the Problem
The Problem (3)

$$z = G(y)$$

There are lots of options for G. Even more if you assume the observations have errors.

[Figure: four candidate fits to the points (y1, z1), (y2, z2), (y3, z3): a straight line, a curve, another curve, and another straight line.]
1. Formulating the Problem
The Problem (4)

We need to know what the shape of the function should be. We use the physics of the problem to propose a suitable model, e.g. a wall forms a straight line in the horizontal plane.

[Figure: the same measured points (y1, z1), (y2, z2), (y3, z3) with several candidate functions; the physics of the problem picks out the appropriate shape.]
1. Formulating the Problem
Example 1: A Straight Line Function (1)

$$z = h(x, y) = \begin{pmatrix} h(x, y_1) \\ h(x, y_2) \end{pmatrix} = \begin{pmatrix} x_1 + x_2 y_1 \\ x_1 + x_2 y_2 \end{pmatrix}$$

where:

$$x = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}, \quad y = \begin{pmatrix} y_1 \\ y_2 \end{pmatrix}, \quad z = \begin{pmatrix} z_1 \\ z_2 \end{pmatrix}$$

As z is a linear function of both x1 and x2, this can be written as

$$z = H(y)\,x, \quad H(y) = \begin{pmatrix} 1 & y_1 \\ 1 & y_2 \end{pmatrix}$$

[Figure: the line z = x1 + x2 y passing through the two measured points (y1, z1) and (y2, z2).]
1. Formulating the Problem
Example 1: A Straight Line Function (2)

We are solving $z = Hx$, where $z = x_1 + x_2 y$ and

$$x = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}, \quad z = \begin{pmatrix} z_1 \\ z_2 \end{pmatrix}, \quad H(y) = \begin{pmatrix} 1 & y_1 \\ 1 & y_2 \end{pmatrix}$$

The measurement matrix, H, is square and non-singular (provided $y_1 \neq y_2$), so it can be inverted. The solution is thus $x = H^{-1}z$.

[Figure: the fitted line through (y1, z1) and (y2, z2).]
Let $(y_1, z_1) = (4, 4)$ and $(y_2, z_2) = (12, 6)$. Therefore:

$$\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 1 & 4 \\ 1 & 12 \end{pmatrix}^{-1} \begin{pmatrix} 4 \\ 6 \end{pmatrix} = \begin{pmatrix} 1.5 & -0.5 \\ -0.125 & 0.125 \end{pmatrix} \begin{pmatrix} 4 \\ 6 \end{pmatrix} = \begin{pmatrix} 3 \\ 0.25 \end{pmatrix}$$
See RVN Least-Squares Examples.xlsx on Moodle
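The spreadsheet holds the course's own working; as a cross-check, here is a minimal NumPy sketch of the same two-point solve (NumPy is my choice of tool here, not part of the lecture materials):

```python
import numpy as np

# Two exact observations of the line z = x1 + x2*y (Example 1 values)
y = np.array([4.0, 12.0])
z = np.array([4.0, 6.0])

# Square measurement matrix H = [[1, y1], [1, y2]]
H = np.column_stack([np.ones_like(y), y])

# With as many measurements as states, x = H^-1 z
x = np.linalg.solve(H, z)
print(x)  # [3.   0.25] -> intercept x1 = 3, slope x2 = 0.25
```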
1. Formulating the Problem
General Problem Formulation

A measurement, z1, can depend on multiple known parameters, y1, y2, ...
A known parameter, y1, can impact multiple measurements, z1, z2, ...
Any component of z and h can be a function of any component of y. Different components of z and h can also be functions of different states.

$$x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \end{pmatrix}, \quad y = \begin{pmatrix} y_1 \\ y_2 \\ \vdots \end{pmatrix}, \quad z = h(x, y) = \begin{pmatrix} h_1(x, y) \\ h_2(x, y) \\ \vdots \end{pmatrix}$$

$$H(y) = \begin{pmatrix} \dfrac{\partial h_1(x, y)}{\partial x_1} & \dfrac{\partial h_1(x, y)}{\partial x_2} & \cdots \\ \dfrac{\partial h_2(x, y)}{\partial x_1} & \dfrac{\partial h_2(x, y)}{\partial x_2} & \cdots \\ \vdots & \vdots & \ddots \end{pmatrix}$$
1. Formulating the Problem
Handling Real Measurements (1)

Measurements are always subject to error:

$$\tilde{z} = z + \varepsilon$$

where $\tilde{z}$ is the measured value, z is the true value, and ~ is called 'tilde'.

Therefore, states or parameters determined from those measurements are also subject to error:

$$\hat{x} = x + e$$

where $\hat{x}$ is the estimated value, x is the true value, and ^ is called 'caret'.

For a linear system, if $\hat{x} = H^{-1}\tilde{z}$ and $x = H^{-1}z$, then $e = H^{-1}\varepsilon$.

But we cannot use $x = H^{-1}\tilde{z}$ if there are more measurements than states:
1. Only square matrices can be inverted
2. The simultaneous equations will contradict each other because of the measurement errors

We need a new approach: least-squares estimation.
What do we do?
2. Linear Least-Squares Estimation
Adjusting the Measurements to Fit

We assume that z is subject to measurement error, but ignore errors in y. We make an adjustment to each z observation to make z fit the function h(x, y). This adjustment is called the residual, v. (Sometimes the opposite sign convention is used: $v = -\delta z^{+}$.)

The tilde denotes a measurement, $\tilde{z}$, not the exact value.

[Figure: the measured values $\tilde{z}_1$, $\tilde{z}_2$, $\tilde{z}_3$ adjusted by residuals to $\tilde{z}_1 + v_1$, $\tilde{z}_2 + v_2$, $\tilde{z}_3 + v_3$ so that the adjusted points lie on the fitted function.]
2. Linear Least-Squares Estimation
Modifying the Measurement Model

General measurement model: $\tilde{z} + v = h(x, y)$
Linear measurement model: $\tilde{z} + v = H(y)\,x$

where

$$x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}, \quad y = \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{pmatrix}, \quad z = \begin{pmatrix} z_1 \\ z_2 \\ \vdots \\ z_m \end{pmatrix}, \quad v = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_m \end{pmatrix}$$

n = number of states, m = number of measurements
Expanding the sum of squared residuals, $J = v^{\mathsf{T}}v = (H\hat{x}^{+} - \tilde{z})^{\mathsf{T}}(H\hat{x}^{+} - \tilde{z})$:

$$J = \hat{x}^{+\mathsf{T}} H^{\mathsf{T}} H \hat{x}^{+} - \hat{x}^{+\mathsf{T}} H^{\mathsf{T}} \tilde{z} - \tilde{z}^{\mathsf{T}} H \hat{x}^{+} + \tilde{z}^{\mathsf{T}} \tilde{z} \qquad (4)$$

Differentiating with respect to $\hat{x}^{+}$ and setting the result to zero:

$$2\hat{x}^{+\mathsf{T}} H^{\mathsf{T}} H - 2\tilde{z}^{\mathsf{T}} H = 0 \qquad (5)$$

noting that $a^{\mathsf{T}} b = b^{\mathsf{T}} a$.

Transposing and rearranging gives $H^{\mathsf{T}} H \hat{x}^{+} = H^{\mathsf{T}} \tilde{z}$ (6). Multiplying both sides by $(H^{\mathsf{T}} H)^{-1}$:

$$\left(H^{\mathsf{T}} H\right)^{-1} H^{\mathsf{T}} H \hat{x}^{+} = \left(H^{\mathsf{T}} H\right)^{-1} H^{\mathsf{T}} \tilde{z} \qquad (7)$$

Cancelling:

$$\hat{x}^{+} = \left(H^{\mathsf{T}} H\right)^{-1} H^{\mathsf{T}} \tilde{z} \qquad (8)$$

The residuals are then

$$v = \left(H \left(H^{\mathsf{T}} H\right)^{-1} H^{\mathsf{T}} - I\right) \tilde{z}$$
[Figure: the measured values $\tilde{z}_1$, $\tilde{z}_2$, $\tilde{z}_3$ and the adjusted values $\tilde{z}_i + v_i$; the red line shows $\hat{z} = H(y)\,\hat{x}^{+}$.]
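A minimal numerical sketch of equations (4) to (8), using a made-up overdetermined problem (the data here are random, chosen only to check the algebra; NumPy is assumed):

```python
import numpy as np

# A made-up overdetermined problem: 6 measurements, 2 states
rng = np.random.default_rng(1)
H = rng.normal(size=(6, 2))
z_tilde = rng.normal(size=6)

# Equation (8): x+ = (H^T H)^-1 H^T z~
x_hat = np.linalg.inv(H.T @ H) @ H.T @ z_tilde

# Residuals two ways: v = H x+ - z~ and v = (H (H^T H)^-1 H^T - I) z~
v1 = H @ x_hat - z_tilde
v2 = (H @ np.linalg.inv(H.T @ H) @ H.T - np.eye(6)) @ z_tilde

assert np.allclose(v1, v2)
# Agrees with a library least-squares solver:
assert np.allclose(x_hat, np.linalg.lstsq(H, z_tilde, rcond=None)[0])
```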
2. Linear Least-Squares Estimation
Example 2: A Straight Line (1)

We have y, z coordinates of four points along a wall. We assume:
1. The y coordinates are exact
2. The z coordinates have measurement errors
3. The wall is straight

A straight line is represented by $z = x_1 + x_2 y$, so

$$\tilde{z} = \begin{pmatrix} \tilde{z}_1 \\ \tilde{z}_2 \\ \tilde{z}_3 \\ \tilde{z}_4 \end{pmatrix} = \begin{pmatrix} 3 \\ 7 \\ 11 \\ 18 \end{pmatrix}, \quad H(y) = \begin{pmatrix} 1 & y_1 \\ 1 & y_2 \\ 1 & y_3 \\ 1 & y_4 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 1 & 1 \\ 1 & 2 \\ 1 & 3 \end{pmatrix}$$

[Figure: the four measured points (0, 3), (1, 7), (2, 11), (3, 18) with the fitted line.]

The least-squares solution gives $\hat{x}_1 = 2.4$, $\hat{x}_2 = 4.9$.
See RVN Least-Squares Examples.xlsx on Moodle
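Alongside the spreadsheet, the same fit can be reproduced in a few lines of NumPy (a sketch, not the course's own implementation):

```python
import numpy as np

# Example 2 data: y exact, z measured
y = np.array([0.0, 1.0, 2.0, 3.0])
z = np.array([3.0, 7.0, 11.0, 18.0])

# H(y) has columns [1, y]
H = np.column_stack([np.ones_like(y), y])

# Equation (8): x+ = (H^T H)^-1 H^T z~
x_hat = np.linalg.inv(H.T @ H) @ H.T @ z
print(x_hat)  # [2.4 4.9] -> fitted wall: z = 2.4 + 4.9*y
```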
3. Applying Least Squares to Nonlinear Problems
Nonlinear Problems (2)

Observations may instead follow a nonlinear model, e.g. a circular arc with centre (x1, x2) and radius x3:

$$z = x_2 + \sqrt{x_3^2 - (y - x_1)^2}$$

Example B: determining positions from range measurements to two known landmarks:

$$z = \begin{pmatrix} z_{r1} \\ z_{r2} \end{pmatrix} = \begin{pmatrix} h_1(x, y) \\ h_2(x, y) \end{pmatrix} = \begin{pmatrix} \sqrt{(y_{E1} - x_E)^2 + (y_{N1} - x_N)^2} \\ \sqrt{(y_{E2} - x_E)^2 + (y_{N2} - x_N)^2} \end{pmatrix}$$
3. Applying Least Squares to Nonlinear Problems
Nonlinear Problems (3)

Unfortunately, observations are not always linear functions of the states.

Example C: determining positions from optical angle measurements.

[Figure: a user at (xE, xN) measuring Bearing 1 and Bearing 2, with respect to north, to Landmark 1 at (yE1, yN1) and Landmark 2 at (yE2, yN2).]

$$z = \begin{pmatrix} z_1 \\ z_2 \end{pmatrix} = h(x, y) = \begin{pmatrix} h_1(x, y) \\ h_2(x, y) \end{pmatrix} = \begin{pmatrix} \arctan2\!\left((y_{E1} - x_E),\ (y_{N1} - x_N)\right) \\ \arctan2\!\left((y_{E2} - x_E),\ (y_{N2} - x_N)\right) \end{pmatrix}$$

The least-squares method can only solve linear problems.
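To make the bearing model concrete, here is a short sketch that evaluates h(x, y) for two landmarks. The coordinates are hypothetical, chosen only for illustration, and the function name is mine:

```python
import numpy as np

def bearings(x, landmarks):
    """Nonlinear measurement model h(x, y): bearing from the user at
    x = (xE, xN) to each landmark (yE, yN), measured from north."""
    xE, xN = x
    return np.array([np.arctan2(yE - xE, yN - xN) for yE, yN in landmarks])

# Hypothetical coordinates, purely for illustration
landmarks = [(3.0, 8.0), (9.0, 6.0)]          # (yE1, yN1), (yE2, yN2)
z = bearings(np.array([5.0, 2.0]), landmarks)
print(np.degrees(z))  # the two predicted bearings in degrees
```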
3. Applying Least Squares to Nonlinear Problems
Finding an Equivalent Linear Problem

We cannot solve a nonlinear problem using least-squares estimation directly:

$$z = \begin{pmatrix} h_1(x, y) \\ h_2(x, y) \\ \vdots \\ h_m(x, y) \end{pmatrix} = h(x, y) \neq H(y)\,x$$

Instead, we must formulate an equivalent linear problem, such as

$$\delta z = H(x, y)\,\delta x$$

where δz is the change in z and δx is the change in x.

[Figure: a nonlinear curve z1(x1) with its local gradient $\mathrm{d}z_1/\mathrm{d}x_1 = H_{11}$ relating a small change δx1 to the resulting change δz1.]

To use least-squares we must essentially turn a nonlinear problem into a linear one.
3. Applying Least Squares to Nonlinear Problems
Linearisation using Taylor's Theorem

Applying Taylor's theorem to the measurement model:

$$h(x', y) \approx h(x, y) + \frac{\partial h(x, y)}{\partial x}\,(x' - x)$$

or

$$h(x', y) \approx h(x, y) + H(x, y)\,(x' - x), \quad \text{where } H(x, y) = \frac{\partial h(x, y)}{\partial x}$$

Rearranging:

$$h(x', y) - h(x, y) \approx H(x, y)\,(x' - x)$$

Each row of H corresponds to one measurement:

$$H = \begin{pmatrix} \partial h_1/\partial x_1 & \partial h_1/\partial x_2 & \cdots & \partial h_1/\partial x_n \\ \vdots & & & \vdots \\ \partial h_m/\partial x_1 & \partial h_m/\partial x_2 & \cdots & \partial h_m/\partial x_n \end{pmatrix}$$

Subtracting the prediction $h(\hat{x}^{-}, y)$ from the measurement model $\tilde{z} + v = h(\hat{x}^{+}, y)$ gives

$$\tilde{z} - h(\hat{x}^{-}, y) + v = h(\hat{x}^{+}, y) - h(\hat{x}^{-}, y)$$

We can then use this to model a linear relationship.
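A quick way to sanity-check a hand-derived H is a finite-difference Jacobian. This sketch (my own helper, not from the lecture) approximates each column of H by perturbing one state at a time:

```python
import numpy as np

def numerical_jacobian(h, x, eps=1e-6):
    """Finite-difference approximation of H(x) = dh/dx:
    column j holds (h(x + eps*e_j) - h(x)) / eps."""
    h0 = np.asarray(h(x), dtype=float)
    H = np.zeros((h0.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x, dtype=float)
        dx[j] = eps
        H[:, j] = (np.asarray(h(x + dx), dtype=float) - h0) / eps
    return H

# First-order check on a simple range model h(x) = sqrt(x1^2 + x2^2)
h = lambda x: np.array([np.hypot(x[0], x[1])])
x = np.array([3.0, 4.0])
print(numerical_jacobian(h, x))  # ~[[0.6, 0.8]], the unit vector along x
```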
3. Applying Least Squares to Nonlinear Problems
Linearising the Problem (2)

From the previous slide, using the first-order Taylor series approximation (linearisation):

$$\tilde{z} - h(\hat{x}^{-}, y) + v = h(\hat{x}^{+}, y) - h(\hat{x}^{-}, y) \approx H(\hat{x}^{-}, y)\left(\hat{x}^{+} - \hat{x}^{-}\right)$$

The measurement innovation, b (or $\delta z^{-}$), is the measurements minus the predictions:

$$b = \delta z^{-} = \tilde{z} - h(\hat{x}^{-}, y)$$

so

$$\begin{pmatrix} b_1 \\ \vdots \\ b_m \end{pmatrix} + \begin{pmatrix} v_1 \\ \vdots \\ v_m \end{pmatrix} \approx H(\hat{x}^{-}, y)\,\delta x, \quad \text{where } \delta x = \hat{x}^{+} - \hat{x}^{-}$$

(the higher-order Taylor terms, the $\sum_{r=2}^{\infty}$ part of the series, are neglected).

For Example 3 (total station positioning, below), the six range measurements have the model

$$\tilde{z} = \begin{pmatrix} \tilde{z}_1 \\ \vdots \\ \tilde{z}_6 \end{pmatrix} = \begin{pmatrix} h_1(x, y) \\ \vdots \\ h_6(x, y) \end{pmatrix} = \begin{pmatrix} \sqrt{(x_1 - y_1)^2 + (x_2 - y_2)^2} \\ \vdots \\ \sqrt{(x_3 - x_1)^2 + (x_4 - x_2)^2} \end{pmatrix}$$

See RVN Least-Squares Examples.xlsx on Moodle
3. Applying Least Squares to Nonlinear Problems
Example 3: Total Station Positioning (3)

Step 1: Determine the measurement model – Bearing Measurement z14, the bearing AB from north, where A has known coordinates y1, y2 and B has unknown coordinates x1, x2:

$$z_{14} = h_{14}(x, y) = \arctan2\!\left((x_1 - y_1),\ (x_2 - y_2)\right)$$

Step 2: Predict the states:

Point   Easting   Northing
B       10.10     7.10
C       6.50      9.90
D       6.60      6.60
E       6.60      5.40
F       8.30      4.40

Step 3: Calculate the measurement innovation, $b = \tilde{z} - \hat{z}^{-}$:

Measurement        Measured   Predicted   b
Range AB = z1      2.882      2.902       -0.020
Range BC = z6      4.491      4.561       -0.070
Bearing AB = z14   3.124      3.107       0.017

Step 4: Calculate the measurement matrix – Range. Use the predicted states for the measurement matrix, $x = \hat{x}^{-}$:

$$H_{11}(\hat{x}^{-}) = \left.\frac{\partial h_1(x)}{\partial x_1}\right|_{x = \hat{x}^{-}} = \frac{\hat{x}_1^{-} - y_1}{\sqrt{(\hat{x}_1^{-} - y_1)^2 + (\hat{x}_2^{-} - y_2)^2}}$$

Simplifying:

$$H_{11}(\hat{x}^{-}) = \frac{\hat{x}_1^{-} - y_1}{\hat{z}_1^{-}}, \quad \text{as } \hat{z}_1^{-} = \sqrt{(\hat{x}_1^{-} - y_1)^2 + (\hat{x}_2^{-} - y_2)^2}$$

See RVN Least-Squares Examples.xlsx on Moodle
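The range partial derivative above is easy to verify numerically. A sketch: the coordinates of A below are hypothetical (the extraction does not give them), while B uses the predicted value from the Step 2 table:

```python
import numpy as np

def range_AB(x, yA):
    """h1(x, y): range from known point A to predicted point B."""
    return np.hypot(x[0] - yA[0], x[1] - yA[1])

yA = np.array([8.0, 8.0])         # hypothetical coordinates of A
x_pred = np.array([10.10, 7.10])  # predicted coordinates of B (Step 2 table)

z1_pred = range_AB(x_pred, yA)
H11 = (x_pred[0] - yA[0]) / z1_pred   # analytic: (x1- - y1) / z1-

eps = 1e-7
H11_fd = (range_AB(x_pred + np.array([eps, 0.0]), yA) - z1_pred) / eps
print(H11, H11_fd)  # should agree to ~1e-6
```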
3. Applying Least Squares to Nonlinear Problems
Example 3: Total Station Positioning (5)

Step 4 (continued): Calculate the measurement matrix – Bearing. Simplifying:

$$H_{14,1}(\hat{x}^{-}) = \frac{\hat{x}_2^{-} - y_2}{(\hat{x}_1^{-} - y_1)^2 + (\hat{x}_2^{-} - y_2)^2} = \frac{\hat{x}_2^{-} - y_2}{(\hat{z}_1^{-})^2}, \quad \text{as } \hat{z}_1^{-} = \sqrt{(\hat{x}_1^{-} - y_1)^2 + (\hat{x}_2^{-} - y_2)^2}$$

See RVN Least-Squares Examples.xlsx on Moodle
3. Applying Least Squares to Nonlinear Problems
Example 3: Total Station Positioning (6)

Step 5: Solve:

$$\hat{x}^{+} \approx \hat{x}^{-} + \left(H^{\mathsf{T}} H\right)^{-1} H^{\mathsf{T}} b$$

[Figure: the site plan with points A to F.]
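Steps 2 to 5 are typically repeated until δx becomes negligible. A minimal sketch of that iteration, with user-supplied model and Jacobian functions (the landmark coordinates in the usage example are hypothetical):

```python
import numpy as np

def iterated_least_squares(h, jac, z_tilde, x0, iters=10):
    """Unweighted nonlinear least squares by repeated linearisation:
    b = z~ - h(x-);  dx = (H^T H)^-1 H^T b;  x+ = x- + dx."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        b = z_tilde - h(x)              # measurement innovation
        H = jac(x)                      # measurement matrix at x-
        dx = np.linalg.solve(H.T @ H, H.T @ b)
        x = x + dx
    return x

# Example use: position from ranges to two hypothetical landmarks
lm = np.array([[0.0, 0.0], [10.0, 0.0]])
h = lambda x: np.hypot(*(x - lm).T)            # ranges to each landmark
jac = lambda x: (x - lm) / h(x)[:, None]       # unit vectors landmark->user
z = h(np.array([3.0, 4.0]))                    # noise-free "measurements"
print(iterated_least_squares(h, jac, z, np.array([1.0, 1.0])))  # -> [3. 4.]
```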
4. Weighted Least-Squares Estimation

The expectation operator, E( ), gives the mean value of an infinitely large sample:

$$E(\varepsilon) = 0, \quad E(\tilde{z}) = z$$

$$C_{z,ij} = E\!\left((\tilde{z}_i - z_i)(\tilde{z}_j - z_j)\right) = \sigma_{zi}\,\sigma_{zj}\,\rho_{ij}$$

where $C_{z,ij}$ is the covariance of the ith and jth measurement errors, $\sigma_{zi}$ and $\sigma_{zj}$ are the measurement error standard deviations, and $\rho_{ij}$ is the correlation coefficient, which varies between:
−1: fully anticorrelated
0: uncorrelated
+1: fully correlated
4. Weighted Least-Squares Estimation
Measurement Error Covariance Matrix

The measurement error covariance matrix, $C_z = E(\varepsilon\varepsilon^{\mathsf{T}})$, is the expectation of the square of the measurement error vector. It comprises the variances and covariances of all of the measurement errors.

The weighted least-squares solution is

$$\hat{x}^{+} = \left(H^{\mathsf{T}} C_z^{-1} H\right)^{-1} H^{\mathsf{T}} C_z^{-1} \tilde{z}$$

(unweighted solution for comparison: $\hat{x}^{+} = (H^{\mathsf{T}} H)^{-1} H^{\mathsf{T}} \tilde{z}$)

Residuals:

$$v = H\hat{x}^{+} - \tilde{z} = \left(H \left(H^{\mathsf{T}} C_z^{-1} H\right)^{-1} H^{\mathsf{T}} C_z^{-1} - I\right) \tilde{z}$$
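A sketch of the weighted solution, reusing Example 2's data with hypothetical unequal error variances (the variances are my assumption, not from the slides):

```python
import numpy as np

# Example 2's data, now with hypothetical unequal measurement quality
H = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
z_tilde = np.array([3.0, 7.0, 11.0, 18.0])
Cz = np.diag([0.1**2, 0.1**2, 0.5**2, 0.5**2])  # assumed error variances

W = np.linalg.inv(Cz)                            # Cz^-1
x_hat = np.linalg.inv(H.T @ W @ H) @ H.T @ W @ z_tilde
v = H @ x_hat - z_tilde                          # residuals
print(x_hat, v)  # the accurate first two points dominate the fit
```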
The expectation operator likewise gives the mean of the state estimates over an infinitely large sample:

$$E(\hat{x}^{+}) = x, \quad E(e) = 0$$

The state estimation error variances are

$$\sigma_{x1}^2 = E(e_1^2) = E\!\left((\hat{x}_1^{+} - x_1)^2\right), \quad \ldots, \quad \sigma_{xn}^2 = E(e_n^2) = E\!\left((\hat{x}_n^{+} - x_n)^2\right)$$

Different state estimates will usually have different variances.

$$C_{x,ij} = E\!\left((\hat{x}_i^{+} - x_i)(\hat{x}_j^{+} - x_j)\right) = \sigma_{xi}\,\sigma_{xj}\,\rho_{ij}$$

where $C_{x,ij}$ is the covariance of the ith and jth state estimation errors, $\sigma_{xi}$ and $\sigma_{xj}$ are the state estimation error standard deviations, and $\rho_{ij}$ is the correlation coefficient, which will be non-zero unless the state estimation can be partitioned into separate problems.
4. Weighted Least-Squares Estimation
State Estimation Error Covariance Matrix

The state estimation error covariance matrix, $C_x = E(ee^{\mathsf{T}})$, is the expectation of the square of the error in the state vector. It comprises the variances and covariances of all of the state estimation errors.

Weighted nonlinear solution:

$$\hat{x}^{+} \approx \hat{x}^{-} + \left(H^{\mathsf{T}} C_z^{-1} H\right)^{-1} H^{\mathsf{T}} C_z^{-1} b$$

Substituting into the definition of $C_x$:

$$C_x = \left(H^{\mathsf{T}} C_z^{-1} H\right)^{-1} H^{\mathsf{T}} C_z^{-1}\, E(\varepsilon\varepsilon^{\mathsf{T}})\, C_z^{-1} H \left(H^{\mathsf{T}} C_z^{-1} H\right)^{-1}$$

As $E(\varepsilon\varepsilon^{\mathsf{T}}) = C_z$:

$$C_x = \left(H^{\mathsf{T}} C_z^{-1} H\right)^{-1} H^{\mathsf{T}} C_z^{-1}\, C_z\, C_z^{-1} H \left(H^{\mathsf{T}} C_z^{-1} H\right)^{-1} = \left(H^{\mathsf{T}} C_z^{-1} H\right)^{-1} \left(H^{\mathsf{T}} C_z^{-1} H\right) \left(H^{\mathsf{T}} C_z^{-1} H\right)^{-1}$$

so

$$C_x = \left(H^{\mathsf{T}} C_z^{-1} H\right)^{-1}$$
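Continuing the weighted sketch above, the state error covariance falls straight out of the same matrices (again with the assumed variances, not lecture values):

```python
import numpy as np

H = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
Cz = np.diag([0.1**2, 0.1**2, 0.5**2, 0.5**2])  # assumed error variances

# State estimation error covariance: Cx = (H^T Cz^-1 H)^-1
Cx = np.linalg.inv(H.T @ np.linalg.inv(Cz) @ H)
sigma = np.sqrt(np.diag(Cx))                 # state standard deviations
rho = Cx[0, 1] / (sigma[0] * sigma[1])       # state error correlation
print(Cx, sigma, rho)
```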
4. Weighted Least-Squares Estimation
Example 4: Total Station Positioning (1)

Building on Example 3:
• A total station measures 13 ranges between 6 points
• Coordinates of point A are known
• Coordinates of the other 5 points are to be determined
• The bearing of A to B (with respect to north) is also measured

[Figure: the site plan with points A to F.]