THE METHOD OF LEAST SQUARES

D. E. WELLS
E. J. KRAKIWSKY

May 1971

TECHNICAL REPORT
LECTURE NOTES NO. 18
Latest reprinting February 1997
PREFACE
In order to make our extensive series of lecture notes more readily available, we have
scanned the old master copies and produced electronic versions in Portable Document
Format. The quality of the images varies depending on the quality of the originals. The
images have not been converted to searchable text.
PREFACE
These notes have been compiled in an attempt to integrate a description of the method of least squares as used in surveying with
(a) statistical concepts
(b)
(c)
INDEX

1. Introduction
   1.1, 1.2, 1.3, 1.4
2.
   2.1 Statistical Terms
   2.2 Frequency Functions
   2.3, 2.4, 2.5, 2.6
3.
   3.1 (3.1.1 - 3.1.6)
   3.2 (3.2.1 - 3.2.4)
   3.3 (3.3.1 - 3.3.3)
   3.4 (3.4.1 - 3.4.3)
   3.5
4.
   4.1 - 4.11
5.
   5.1 Introduction
   5.2 - 5.10
   5.11 Examination of the Ratio of Two Variances (σ₂²/σ₁²) in Terms of s₁² and s₂²
   5.12, 5.13, 5.14
6.
   6.1 - 6.5
   Covariance Matrix of X̂
   6.6 Summary
7.
   7.1
   7.2 Linearization Examples
   Normal Equations
8.
   8.1 Introduction
   8.2 - 8.5
9.
   9.1
   9.2 Additional Observations
   9.3, 9.4
10. Step by Step
   10.1, 10.2
References
Appendix A: Numerical Example
   A.1, A.2, A.3 Solution
Appendix B:
   B.1 Normal
   B.2 Chi-Square
   B.3 Student's t
   B.4 F
Appendix C
Appendix D
Appendix E

LIST OF FIGURES

Fig. 2-1, Fig. 2-2, Fig. 3-1, Fig. 3-2 (Probability - Normal Distribution), Fig. 3-3, Fig. 3-4, Fig. 3-5, Fig. 7-1

LIST OF TABLES

Table 4-1, Table 5-1, Table 8-1
INTRODUCTION
concepts
of modern
A description
1.1
There will
degree
to
the
The method
The method of
And finally
(or probability density) function which expresses how often each of
these values appears in the situation under discussion.
The most
1.2

The system of linear equations

A X = L    (1-1)

has a solution only if

L ≠ 0 (the system is nonhomogeneous),    (1-2a)

rank of A = dimension of X,    (1-2b)

rank of [A : L] = rank of A (the system is consistent).    (1-2c)
In the case where there are no redundant equations, criterion (1-2b) will
mean that A is square and nonsingular, and therefore has an inverse.
The solution is then given by

X = A^-1 L .    (1-3)

In the case where there are redundant equations, A will not be square,
but A^T A will be square and nonsingular, and the solution is given by

X = (A^T A)^-1 A^T L .    (1-4)

(See course notes on "Matrices" for a more detailed treatment of the
above ideas.)
Let us consider the case where the elements of L are the results
of physical measurements, which are to be related to the elements of
X by equation (1l).
In general these measurements will be inconsistent, and no unique
solution for X will exist.
There are a
number of possible criteria, but the one commonly used is the least
squares criterion; that the sum of the squares of the inconsistencies
be a minimum.
A X - L = V ,    (1-5)

where V is the vector of residuals (inconsistencies). These residuals are
not known and must be solved for, since we have no way of knowing what
the inconsistent parts of each measurement will be. The least squares
criterion requires that we choose the estimate X̂ so that

V̂^T V̂ = (A X̂ - L)^T (A X̂ - L) = minimum .    (1-6)

The estimate satisfying this criterion is

X̂ = (A^T A)^-1 A^T L ,    (1-7)

and the "best" estimate of the observation errors or residuals is
given by

V̂ = A X̂ - L .    (1-8)
These estimates are the simple least squares estimates (also called the
equally weighted least squares estimates).
Often the physical measurements which make up the elements of L
do not all have the same precision (some could have been made using
different instruments or under different conditions). This fact should
be reflected in our least squares estimation process, so we assign to
each measurement a known "weight" and call P the matrix whose elements
are these weights, the weight matrix. We modify the least squares
criterion to state that the best estimate is that which minimizes the
sum of the squares of the weighted residuals, that is

V̂^T P V̂ = minimum .    (1-9)
And as we will see in Chapter 6, the estimate which satisfies this
criterion is

X̂ = (A^T P A)^-1 A^T P L .    (1-10)
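As a concrete illustration of equation (1-10), the following sketch (an assumed example, not taken from these notes) computes X̂ = (A^T P A)^-1 A^T P L for a small overdetermined system — a straight line fit through four points with equal weights. Plain Python lists are used so the steps mirror the matrix algebra.

```python
# Hypothetical illustration of the weighted least squares estimate
# X = (A^T P A)^-1 A^T P L for a 2-unknown problem.

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

def inv2(M):
    # inverse of a 2x2 matrix (A^T P A is 2x2 when there are 2 unknowns)
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def wls(A, P, L):
    AtP = matmul(transpose(A), P)
    N = matmul(AtP, A)          # normal-equation matrix A^T P A
    U = matmul(AtP, L)          # right-hand side A^T P L
    return matmul(inv2(N), U)

# fit y = m x + b through (0,1), (1,3), (2,5), (3,7): exact slope 2, intercept 1
A = [[0.0, 1.0], [1.0, 1.0], [2.0, 1.0], [3.0, 1.0]]
P = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]  # equal weights
L = [[1.0], [3.0], [5.0], [7.0]]
X = wls(A, P, L)
```

With consistent data the residuals vanish and the weighted estimate reduces to the simple estimate (1-7).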
In Chapter 7 we will consider more general mathematical models, of the
forms

i)  F(X̂) - L = V̂ ,    (1-11)

and

ii)  F(X̂, L + V̂) = 0 .    (1-12)
1.3
inversion of large matrices (and even the multiplication of matrices)
requires a considerable number of computation steps.
Until the advent of large fast digital computers the solution of
large systems of equations was a formidable and incredibly tedious
task, attempted only when absolutely necessary.
Therefore,
We will
1.4
"If the astronomical observations and other quantities
on which the computation of orbits is based were absolutely
correct, the elements also, whether deduced from three or
four observations, would be strictly accurate (so far
indeed as the motion is supposed to take place exactly
according to the laws of Kepler) and, therefore, if other
observations were used, they might be confirmed but not
corrected. But since all our measurements and observations
are nothing more than approximations to the truth, the same
must be true of all calculations resting upon them, and
the highest aim of all computations made concerning concrete
phenomena must be to approximate, as nearly as practicable,
to the truth. But this can be accomplished in no other way
than by a suitable combination of more observations than
the number absolutely requisite for the determination of
the unknown quantities. This problem can only be properly
undertaken when an approximate knowledge of the orbit has
been already attained, which is afterwards to be corrected
so as to satisfy all the observations in the most accurate
manner possible."
Note that this single paragraph, written over 150 years ago,
embodies the concepts that
a)
b)
c)
d) measurement inconsistencies,
e)
2.

Statistical terms are in everyday use, and as such are often used
imprecisely or erroneously.
2.1 STATISTICAL TERMS
Statistical
A statistic is a quantitative
A discrete
A constant is a discrete
In general, the result of a
An individual
Usually sample
boundaries
boundaries is called the class interval.
The
The first is to
φ(x₀) dx = Pr(x₀ ≤ x ≤ x₀ + dx) ,    (2-1)

where the term on the right is read "the probability that the value of
the variate x lies between x₀ and x₀ + dx inclusive".
Φ(x₀) = ∫_{-∞}^{x₀} φ(x) dx = Pr(x ≤ x₀) ,    (2-2)

where the term on the right is read "the probability that the value of
the variate x is less than or equal to x₀".
This is illustrated in Figure 2-1. Probability is represented by an
area under the curve φ(x). For example

Pr(a ≤ x ≤ b) = Φ(b) - Φ(a) .    (2-3)

Note that the probability that x lies somewhere between the extreme
limits is certainty (or probability of unity):
Pr(-∞ < x < +∞) = ∫_{-∞}^{+∞} φ(x) dx = 1 .    (2-4)
Figure 2-1.
The probability that x is less than or equal to x₀ is the value of
Φ at x₀. Common measures of dispersion are the standard
deviation and the range, of which the standard deviation is most often
used.

The expected value of a function f(x) is an arithmetic average of
f(x) weighted according to the frequency distribution of the variate x,
and is defined

E[f(x)] = ∫_{-∞}^{+∞} f(x) φ(x) dx .    (2-5)
The expectation operator has the properties

E[E[f(x)]] = E[f(x)] ,    (2-6a)

E[c f(x)] = c E[f(x)] ,    (2-6b)

E[f(x) + g(x)] = E[f(x)] + E[g(x)] ,    (2-6c)

E[Σᵢ fᵢ(x)] = Σᵢ E[fᵢ(x)] .    (2-6d)
The mean of the distribution of x itself is

μ = E[x] = ∫_{-∞}^{+∞} x φ(x) dx .    (2-7)
Figure 2-2. φ(x)
The nth moment of a distribution about its origin is defined

E[xⁿ] = ∫_{-∞}^{+∞} xⁿ φ(x) dx ,    (2-8)

and the nth moment about its mean is defined

E[(x - μ)ⁿ] = ∫_{-∞}^{+∞} (x - μ)ⁿ φ(x) dx .    (2-9)

The moment generating function of x is defined

M(t) = E[e^{tx}] = ∫_{-∞}^{+∞} e^{tx} φ(x) dx ,    (2-10a)

and has the property that its nth derivative evaluated at t = 0 gives
the nth moment about the origin

M⁽ⁿ⁾(0) = E[xⁿ] .    (2-10b)

When it exists, the moment generating function uniquely determines φ(x).
2.3

(So far we have been discussing only distributions of a single variate x.)
Let X be a vector of n variates x₁, x₂, ..., xₙ. The multivariate
frequency function is defined

φ(X₀) dx₁ dx₂ ... dxₙ = Pr(x₁⁰ ≤ x₁ ≤ x₁⁰ + dx₁, ..., xₙ⁰ ≤ xₙ ≤ xₙ⁰ + dxₙ) ,    (2-11)

and the variates x₁, x₂, ..., xₙ are statistically independent if

φ(X) = φ₁(x₁) φ₂(x₂) ... φₙ(xₙ) .    (2-12)
The expected value of a multivariate function f(X) is defined

E[f(X)] = ∫_{-∞}^{+∞} ... ∫_{-∞}^{+∞} f(X) φ(X) dx₁ dx₂ ... dxₙ .    (2-13)
The mean of the multivariate distribution is the vector

μ = E[X] = [E[x₁], E[x₂], ..., E[xₙ]]^T = [μ₁, μ₂, ..., μₙ]^T ,    (2-14)

the covariance between xᵢ and xⱼ is

σᵢⱼ = E[(xᵢ - μᵢ)(xⱼ - μⱼ)] ,    (2-15)

and the covariance matrix is

Σ_X = E[(X - μ)(X - μ)^T] ,    (2-16)

whose diagonal elements are the variances σᵢ² and whose off-diagonal
elements are the covariances σᵢⱼ. The correlation coefficient between
xᵢ and xⱼ is defined

ρᵢⱼ = σᵢⱼ / (σᵢ σⱼ) ,    (2-17a)
and has values

-1 ≤ ρᵢⱼ ≤ +1 .    (2-17b)

If xᵢ and xⱼ are statistically independent then

φ(xᵢ, xⱼ) = φ₁(xᵢ) φ₂(xⱼ)

and

E[xᵢ xⱼ] = E[xᵢ] E[xⱼ] ,

so that σᵢⱼ = 0. In fact the covariance σᵢⱼ and the correlation
coefficient ρᵢⱼ are measures of the statistical dependence or
correlation between xᵢ and xⱼ.
2.4

Let Y be a vector of variates linearly related to X by

Y = C X .    (2-18)

Then the mean of Y is

μ_Y = E[Y] = E[C X] = C E[X] = C μ_X ,    (2-19)

and the covariance matrix of Y is

Σ_Y = E[(Y - μ_Y)(Y - μ_Y)^T] = E[(C X - C μ_X)(C X - C μ_X)^T]
    = C E[(X - μ_X)(X - μ_X)^T] C^T ,

or

Σ_Y = C Σ_X C^T .    (2-20)

This is known as the covariance law (also called the law of covariances
and the law of propagation of covariances).
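The covariance law can be illustrated with a small numerical sketch (an assumed example, not from the notes): for Y = C X, the product C Σ_X C^T reproduces the variances and covariance obtained by expanding var(x₁ ± x₂) directly.

```python
# Hypothetical illustration of the covariance law Sigma_Y = C Sigma_X C^T
# (equation 2-20), using plain Python lists.

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

def transpose(M):
    return [list(r) for r in zip(*M)]

def propagate(C, Sx):
    # covariance law: Sigma_Y = C Sigma_X C^T
    return matmul(matmul(C, Sx), transpose(C))

C_mat = [[1.0, 1.0], [1.0, -1.0]]   # y1 = x1 + x2,  y2 = x1 - x2
Sx = [[2.0, 0.5], [0.5, 1.0]]       # assumed covariance matrix of X
Sy = propagate(C_mat, Sx)
# check by hand: var(x1 + x2) = 2 + 1 + 2(0.5) = 4,
#                var(x1 - x2) = 2 + 1 - 2(0.5) = 2,
#                cov(y1, y2) = var(x1) - var(x2) = 1
```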
If Y is nonlinearly related to X,

Y = F(X) ,    (2-21)

then we choose some value X⁰ and replace F(X) by its Taylor's series
linear approximation about X⁰, that is

Y ≈ F(X⁰) + (∂F/∂X)|_{X⁰} (X - X⁰) .    (2-22)

Then

Y - μ_Y ≈ C (X - μ_X) ,

where C = ∂F/∂X evaluated at X⁰. Thus

Σ_Y = C Σ_X C^T ,    (2-23)

which is identical to the covariance law (equation 2-20), with

C = ∂F/∂X .

2.5
A distinction is made between a population parameter ε and its sample
estimate, denoted ε̂. The most often used estimators are the sample mean

x̄ = (1/n) Σᵢ xᵢ    (2-24)

and the sample variance

s² = (1/(n-1)) Σᵢ (xᵢ - x̄)² .    (2-25)
We see that the value of a sample statistic will in general vary from
sample to sample, that is, the sample statistic itself is a variate and
will have a distribution, called its sampling distribution. We now
have three distributions under consideration: the distribution of
the population, the distribution of the sample, and the sampling
distribution of the statistic.
Consider the sampling distribution of the sample mean statistic,
called the sampling distribution of means. This distribution itself has
a mean

E[x̄] = E[(1/n) Σᵢ xᵢ] = (1/n) Σᵢ E[xᵢ] = (1/n) n μ = μ ,    (2-26)

that is, the expected value of the sample mean is the population mean.
The variance is

var(x̄) = E[x̄²] - μ² = (1/n²) E[(Σᵢ xᵢ)²] - μ² .

But

E[xᵢ²] = σ² + μ² ,

and for i ≠ j

E[xᵢ xⱼ] = E[xᵢ] E[xⱼ] = μ² , so that Σ_{i≠j} E[xᵢ xⱼ] = n(n-1) μ² .

Therefore

var(x̄) = (1/n²) [n(σ² + μ²) + n(n-1) μ²] - μ² = σ²/n .    (2-27)

Therefore

E[x̄] = μ ,    (2-28)

where

x̄ = (1/n) Σᵢ xᵢ ,    (2-29)

and similarly it can be shown that E[s²] = σ², where

s² = (1/(n-1)) Σᵢ (xᵢ - x̄)² .    (2-30)
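The two results above can be checked numerically. The following sketch (an assumed illustration using only the standard library, not part of the notes) draws repeated samples from an n(μ, σ²) population and computes the mean and variance of the resulting sample means, which should be near μ and σ²/n respectively.

```python
# Monte Carlo check (hypothetical example) of E[xbar] = mu and
# var(xbar) = sigma^2 / n for the sampling distribution of means.
import random
import statistics

def sampling_distribution_of_means(mu, sigma, n, trials, seed=42):
    rng = random.Random(seed)                     # fixed seed for repeatability
    means = [statistics.fmean(rng.gauss(mu, sigma) for _ in range(n))
             for _ in range(trials)]
    return statistics.fmean(means), statistics.pvariance(means)

m, v = sampling_distribution_of_means(mu=10.0, sigma=2.0, n=25, trials=20000)
# m should be near mu = 10, and v near sigma^2/n = 4/25 = 0.16
```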
2.6

If the probability that the population parameter ε lies in the interval
between e₁ and e₂ is α,

Pr(e₁ < ε < e₂) = α ,    (2-31)

then the interval between e₁ and e₂ is called the 100α% confidence
interval for ε.
There
b)
c)
d)
If the probability of a Type I error is α,

Pr(hypothesis true but rejected) = α .    (2-32)

Similarly, if the probability of a Type II error is β,

Pr(hypothesis false but accepted) = β .    (2-33)
Therefore,
3.

3.1

3.1.1

Consider the integral

I = ∫_{-∞}^{+∞} exp(-y²/2) dy .    (3-1)

Its square is

I² = ∫_{-∞}^{+∞} ∫_{-∞}^{+∞} exp(-(y² + z²)/2) dy dz ,    (3-2)
and then transforming from Cartesian to polar coordinates as follows:

[y, z]^T = r [cosθ, sinθ]^T ,    (3-3)

we obtain

I² = ∫_0^{2π} ∫_0^{∞} exp(-r²/2) r dr dθ = ∫_0^{2π} dθ = 2π ,    (3-4)
and

I = (2π)^{1/2} .    (3-5)

Knowing the value of the integral I, 3-1 becomes

∫_{-∞}^{+∞} (1/(2π)^{1/2}) exp(-y²/2) dy = 1 .    (3-6)

Making the change of variable

y = (x - a)/b ,    (3-7)

with

b > 0 ,    (3-8)

this becomes

∫_{-∞}^{+∞} (1/(b(2π)^{1/2})) exp[-(x - a)²/(2b²)] dx = 1 ,    (3-9)

where -∞ < x < ∞. The integrand

φ(x) = (1/(b(2π)^{1/2})) exp[-(x - a)²/(2b²)]

is said to be the p.d.f. of a continuous normal random variable.
3.1.2

The moment generating function of x is

M(t) = ∫_{-∞}^{+∞} e^{tx} φ(x) dx
     = ∫_{-∞}^{+∞} e^{tx} (1/(b(2π)^{1/2})) exp[-(x - a)²/(2b²)] dx ,    (3-10)

and by letting

y = (x - a)/b - bt ,  that is  x = by + b²t + a ,

we have

M(t) = exp[at + b²t²/2] ∫_{-∞}^{+∞} (1/(2π)^{1/2}) exp(-y²/2) dy ,    (3-11)

so that

M(t) = exp[at + b²t²/2] .    (3-12)

From equation 2-10b, the mean is

μ = M'(0) = a ,    (3-13)

thus the variance is

σ² = M''(0) - [M'(0)]² = b² .    (3-14)

Therefore the parameters a and b are the mean μ and standard deviation σ,
and the normal p.d.f. is written

φ(x) = (1/(σ(2π)^{1/2})) exp[-(x - μ)²/(2σ²)] .    (3-15)
3.1.3

The standardized normal random variable is defined

w = (x - μ)/σ ,    (3-16)

with cumulative distribution function

Φ(w) = Pr((x - μ)/σ ≤ w) = Pr(x ≤ wσ + μ) ,

or in integral form

Φ(w) = ∫_{-∞}^{wσ+μ} (1/(σ(2π)^{1/2})) exp[-(x - μ)²/(2σ²)] dx .

Changing the variable of integration to w = (x - μ)/σ, the p.d.f. of w is

φ(w) = Φ'(w) = (1/(2π)^{1/2}) exp(-w²/2) ,    (3-17)

which is the normal p.d.f. with μ = 0 and σ² = 1; that is, w is n(0, 1).

3.1.4

Setting μ = 0 and σ² = 1 we get:
1) the curve is symmetric about x = 0,
2) the maximum value of 1/(2π)^{1/2} occurs at x = 0,
3) the curve approaches zero asymptotically as x → ±∞.
3.1.5

We have seen that the mean μ and variance σ² are two parameters
of a univariate normal p.d.f. The probability that x is less than or
equal to some value c is

Pr(x ≤ c) = Pr((x - μ)/σ ≤ (c - μ)/σ) = N((c - μ)/σ) ,

where

N((c - μ)/σ) = ∫_{-∞}^{(c-μ)/σ} (1/(2π)^{1/2}) exp(-w²/2) dw .
Figure 3-2. Probability - Normal Distribution. φ(w)
The value N for the above integral is tabulated for a random
variable n(0, 1) in Table B-1. Normalization, that is, working with
(x - μ)/σ, allows probabilities for any normal random variable to be
evaluated from this single table.

Example 1 - Direct Problem

Given: x is n(2, 16), that is μ = 2, σ² = 16.
Required: Pr(x ≤ 4).
Solution:

Pr(x ≤ 4) = Pr((x - μ)/σ ≤ (4 - 2)/4) = N(0.5) ;

from Table B-1, N(0.5) = 0.6915.
Example 2 - Direct Problem

Given: x is n(2, 16), that is μ = 2, σ² = 16.
Required: Pr(1 ≤ x ≤ 4).
Solution:

Pr(1 ≤ x ≤ 4) = Pr((1 - 2)/4 ≤ (x - μ)/σ ≤ (4 - 2)/4)
             = N(0.5) - N(-0.25)
             = N(0.5) + N(0.25) - 1 .

From Table B-1,

Pr(1 ≤ x ≤ 4) = 0.6915 + 0.5987 - 1 = 0.2902 .
Example 3 - Inverse Problem

Given: x is n(2, 16), that is μ = 2, σ² = 16.
Required: the value c such that Pr(x ≤ c) = 0.95.
Solution:

Pr(x ≤ c) = Pr((x - μ)/σ ≤ (c - μ)/σ) = N((c - μ)/σ) = 0.95 .

From Table B-1, (c - μ)/σ = 1.645, so that

c = 1.645 σ + μ = 1.645(4) + 2 = 8.58 .
Example 4 - Inverse Problem

Given: x is n(μ, σ²).
Required: the value c such that Pr(μ - c ≤ x ≤ μ + c) = 0.95.
Solution:

Pr(-c/σ ≤ (x - μ)/σ ≤ c/σ) = 0.95 ,

N(c/σ) - N(-c/σ) = 0.95 .

But N(-c/σ) = 1 - N(c/σ), therefore

2 N(c/σ) - 1 = 0.95 ,  N(c/σ) = 0.975 .

From Table B-1, c/σ = 1.96, so that

c = 1.96 σ .

(Note that c ≈ 2σ for Pr = 0.95.)
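The four examples above can be verified with the standard library's NormalDist, whose cdf plays the role of Table B-1 (this check is an added illustration, not part of the original notes; small differences in the last decimal come from table rounding).

```python
# Check of the worked normal-distribution examples using statistics.NormalDist.
from statistics import NormalDist

x = NormalDist(mu=2, sigma=4)          # x is n(2, 16)
p1 = x.cdf(4)                          # Example 1: N(0.5) = 0.6915
p2 = x.cdf(4) - x.cdf(1)               # Example 2: N(0.5) - N(-0.25) = 0.2902
c = x.inv_cdf(0.95)                    # Example 3: c with Pr(x <= c) = 0.95
```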
these parameters together is called a multivariate normal distribution.
An example of this is the case of a geodetic control network where
the coordinates of all the stations are being estimated, and are thus
considered as random variables and are said to have a multivariate
normal distribution.

For m random variables, the m-dimensional multivariate normal
p.d.f. is

φ(X) = C exp[-(X - μ)^T Σ^-1 (X - μ)/2] ,    (3-22)

where

X  is the m×1 vector of random variables,    (3-23)

μ = E[X]  is the m×1 vector of means,    (3-24)

Σ  is the m×m covariance matrix,    (3-25)

and the constant

C = 1/((2π)^{m/2} |Σ|^{1/2}) .    (3-26)
Note the similarities of the univariate normal p.d.f. (3-15) with the
multivariate normal p.d.f. (3-22), namely

a) 1/(2π)^{1/2} vs. 1/(2π)^{m/2} ,

b) 1/σ vs. 1/|Σ|^{1/2} ,

c) (x - μ)²/(2σ²) vs. (X - μ)^T Σ^-1 (X - μ)/2 .

For the case μ = 0,

φ(X) = C exp(-X^T Σ^-1 X / 2) .
3.2

3.2.1

The gamma function is defined

Γ(α) = ∫_0^{∞} y^{α-1} e^{-y} dy ,    (3-28)

where the integral exists for α > 0 and has a positive value. When α = 1,

Γ(1) = ∫_0^{∞} e^{-y} dy = 1 .    (3-29)

Integrating by parts,

Γ(α) = (α - 1) ∫_0^{∞} y^{α-2} e^{-y} dy = (α - 1) Γ(α - 1) .    (3-30)

Further, if α is a positive integer and greater than one,

Γ(α) = (α - 1)! .

Substituting y = x/β, β > 0, yields

Γ(α) = ∫_0^{∞} (x/β)^{α-1} exp(-x/β) (1/β) dx ,

and therefore

1 = ∫_0^{∞} (1/(Γ(α) β^α)) x^{α-1} exp(-x/β) dx .    (3-31)

The integrand

φ(x) = (1/(Γ(α) β^α)) x^{α-1} exp(-x/β)   (0 < x < ∞)
     = 0  elsewhere    (3-32)

is a p.d.f., called the gamma p.d.f., with parameters α and β.
Setting α = ν/2 and β = 2: from 3-32, a random variable x with p.d.f. of
the form

φ(x) = (1/(Γ(ν/2) 2^{ν/2})) x^{(ν/2)-1} exp(-x/2)   (0 < x < ∞)
     = 0  elsewhere    (3-33)

is called a chi-square random variable with ν degrees of freedom, written
χ²(ν). The number of degrees of freedom ν is a practical quantity and has
a relationship to the least squares estimation problem discussed in
Chapter 6.
3.2.2

The moment generating function of a chi-square random variable is

M(t) = ∫_0^{∞} e^{tx} φ(x) dx
     = ∫_0^{∞} e^{tx} (1/(Γ(ν/2) 2^{ν/2})) x^{(ν/2)-1} exp(-x/2) dx .    (3-34)

The substitution y = x(1 - 2t)/2, or x = 2y/(1 - 2t), yields

M(t) = (1/(1 - 2t)^{ν/2}) (1/Γ(ν/2)) ∫_0^{∞} y^{(ν/2)-1} e^{-y} dy
     = (1 - 2t)^{-ν/2} .    (3-35)

Computing M'(t) and M''(t), the mean is

mean = M'(0) = ν ,    (3-36)

and the variance is

variance = M''(0) - [M'(0)]² = 2ν .
3.2.4

The chi-square p.d.f. has the properties:
a)
b)
c)
d)

If a random variable x is χ²(ν), then

Pr(x ≤ χ²_p) = ∫_0^{χ²_p} φ(x) dx .

The above integral has been precomputed and the results tabulated in
the body of the table (Table B-2) for particular values of χ²_p, which
correspond to probabilities Pr taking values between 0 and 1.

Figure 3-3.
Example 1 - Direct Problem, One Abscissa

Given: x is χ²(10), that is ν = 10.
Required: Pr(x ≤ 18.31).
Solution: From Table B-2, χ²_{0.95}(10) = 18.31, therefore Pr = 0.95.
Example 2 - Direct Problem, Two Abscissae

Given: x is χ²(20), that is ν = 20.
Required: Pr(9.59 ≤ x ≤ 34.17).
Solution: From Table B-2, χ²_{0.975}(20) = 34.17 and χ²_{0.025}(20) = 9.59,
therefore

Pr(χ²_{0.025} ≤ x ≤ χ²_{0.975}) = Pr(x ≤ χ²_{0.975}) - Pr(x ≤ χ²_{0.025})
                               = 0.975 - 0.025 = 0.95 .
Example 3 - Inverse Problem, One Abscissa

Given: x is χ²(10), that is ν = 10.
Required: the value χ²_p such that Pr(x ≤ χ²_p) = 0.90.
Solution: From Table B-2, χ²_{0.90}(10) = 15.99.
Example 4 - Inverse Problem, Two Abscissae

Given: x is χ²(20), that is ν = 20.
Required: the values χ²_{p1} and χ²_{p2} such that
Pr(χ²_{p1} ≤ x ≤ χ²_{p2}) = 0.99, with the remaining probability split
equally between the two tails.
Solution: p₁ = 0.005 and p₂ = 0.995, so that

Pr(x ≤ χ²_{0.995}) - Pr(x ≤ χ²_{0.005}) = 0.995 - 0.005 = 0.99 .

From Table B-2, χ²_{0.005}(20) = 7.43.
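The table values used in these examples can be spot-checked by simulation (an assumed illustration, not part of the notes): a χ²(ν) variate is by definition a sum of ν squared n(0, 1) variates, so the proportion of simulated values below 18.31 should be near 0.95 for ν = 10.

```python
# Monte Carlo spot-check (hypothetical) of Example 1:
# Pr(x <= 18.31) for x ~ chi-square(10) should be close to 0.95.
import random

def chi2_cdf_mc(x0, dof, trials=100_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        v = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(dof))  # chi-square draw
        if v <= x0:
            hits += 1
    return hits / trials

p = chi2_cdf_mc(18.31, 10)   # near 0.95
```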
3.3.1

Let w be a normal random variable n(0, 1) and v an independent chi-square
random variable χ²(ν). Their joint p.d.f. is

φ(w, v) = (1/(2π)^{1/2}) exp(-w²/2) (1/(Γ(ν/2) 2^{ν/2})) v^{(ν/2)-1} exp(-v/2)    (3-38)

for -∞ < w < ∞, 0 < v < ∞,
φ(w, v) = 0 elsewhere.    (3-39)

Define the new random variable

t = w / (v/ν)^{1/2} .    (3-40)

The transformation equations are

w = t (u/ν)^{1/2} ,  v = u ,    (3-41)

with Jacobian

J = det [∂w/∂t  ∂w/∂u ; ∂v/∂t  ∂v/∂u] = (u/ν)^{1/2} .    (3-42)

The joint p.d.f. of t and u is then

φ(t, u) = φ(w, v) |det J| ,    (3-43)

that is

φ(t, u) = (1/((2π)^{1/2} Γ(ν/2) 2^{ν/2})) u^{(ν/2)-1} (u/ν)^{1/2} exp[-(u/2)(1 + t²/ν)] ,    (3-44)
φ(t, u) = 0 elsewhere.

Since we are interested in t only, u is integrated out of the
above expression; the following result is then the marginal p.d.f.
corresponding to t:

φ(t) = ∫_0^{∞} φ(t, u) du .    (3-45)

The substitution

z = u[1 + t²/ν]/2    (3-46)

yields    (3-47)

φ(t) = (Γ[(ν+1)/2] / (Γ(ν/2) (νπ)^{1/2})) (1 + t²/ν)^{-(ν+1)/2} ,  -∞ < t < ∞ .    (3-48)

This is the p.d.f. of the random variable

t = w / (v/ν)^{1/2} ,

where w is n(0, 1) and v is χ²(ν), and is written in the abbreviated form
t(ν).
3.3.2

The t p.d.f. is similar in shape to the normal curve, and has the
following properties (Figure 3-4):
1) it extends over -∞ < t < ∞,
2) it is symmetric about t = 0,
3) it attains its maximum at t = 0,
4) as ν → ∞ it approaches the n(0, 1) curve.

Values t_p such that Pr(x ≤ t_p) = ∫_{-∞}^{t_p} φ(t) dt = p are
tabulated in Table B-3.

Example 1 - Direct Problem

Given: x is t(10), that is ν = 10.
Required: Pr(x ≤ 1.372).
Solution: From Table B-3, t_{0.90}(10) = 1.372, therefore Pr = 0.90.

Figure 3-4. φ(t)
Example 2 - Direct Problem

Given: x is t(10), that is ν = 10.
Required: Pr(|x| ≥ 2.228).
Solution: From Table B-3, t_{0.975}(10) = 2.228;

Pr(x ≥ t_{0.975}) = 1 - Pr(x ≤ t_{0.975}) = 1 - 0.975 = 0.025 ,

and by symmetry

Pr = 2(0.025) = 0.05 .
Example 3 - Inverse Problem

Given: x is t(14), that is ν = 14.
Required: the value t_p such that Pr(-t_p ≤ x ≤ t_p) = 0.90.
Solution:

Pr(-t_p ≤ x ≤ t_p) = Pr(x ≤ t_p) - Pr(x ≤ -t_p)
                   = Pr(x ≤ t_p) - [1 - Pr(x ≤ t_p)]
                   = 2 Pr(x ≤ t_p) - 1 = 0.90 ,

or Pr(x ≤ t_p) = 0.95. From Table B-3, t_{0.95}(14) = 1.761.
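The defining ratio t = n(0,1)/[χ²(ν)/ν]^{1/2} from Section 3.3.1 gives a direct way to simulate t variates and spot-check Example 2 (an assumed illustration, not part of the notes).

```python
# Monte Carlo spot-check (hypothetical) of Example 2:
# Pr(|x| >= 2.228) for x ~ t(10) should be close to 0.05.
import math
import random

def t_two_tail_mc(t0, dof, trials=100_000, seed=7):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        w = rng.gauss(0.0, 1.0)                                # n(0, 1) draw
        v = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(dof))  # chi-square(dof)
        t = w / math.sqrt(v / dof)                             # t(dof) by definition
        if abs(t) >= t0:
            hits += 1
    return hits / trials

p = t_two_tail_mc(2.228, 10)   # near 0.05
```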
3.4 THE F DISTRIBUTION

3.4.1

Let us first consider two independent chi-square random variables
u and v having ν₁ and ν₂ degrees of freedom, respectively. The joint
p.d.f. of u and v is

φ(u, v) = (1/(Γ(ν₁/2) 2^{ν₁/2})) u^{(ν₁/2)-1} e^{-u/2}
          (1/(Γ(ν₂/2) 2^{ν₂/2})) v^{(ν₂/2)-1} e^{-v/2}    (3-49)

for 0 < u < ∞, 0 < v < ∞,
φ(u, v) = 0 elsewhere.

Define the new random variable

f = (u/ν₁) / (v/ν₂) .    (3-50)

The transformation equations are

u = f ν₁ z / ν₂ ,  v = z ,    (3-51)

with Jacobian    (3-52)

J = det [∂u/∂f  ∂u/∂z ; ∂v/∂f  ∂v/∂z] = det [ν₁z/ν₂  ν₁f/ν₂ ; 0  1] = ν₁ z / ν₂ .    (3-53)

The joint p.d.f. of f and z is then

φ(f, z) = φ(u, v) |det J| ,    (3-54)

which contains the exponential factor exp[-(z/2)(ν₁f/ν₂ + 1)].

Since we are interested in f only, z is integrated out, namely

φ(f) = ∫_0^{∞} φ(f, z) dz .    (3-56)

With the substitution y = (z/2)(ν₁f/ν₂ + 1), φ(f) becomes

φ(f) = (Γ[(ν₁+ν₂)/2] (ν₁/ν₂)^{ν₁/2} / (Γ(ν₁/2) Γ(ν₂/2)))
       f^{(ν₁/2)-1} (1 + ν₁f/ν₂)^{-(ν₁+ν₂)/2}   (0 < f < ∞)
     = 0 elsewhere.    (3-57)

This is the p.d.f. of the random variable

f = (u/ν₁) / (v/ν₂) ,

where u is χ²(ν₁) and v is χ²(ν₂); it is written F(ν₁, ν₂). Values F_p
such that

Pr(x ≤ F_p) = ∫_0^{F_p} φ(f) df = p

are tabulated in Tables B-4; the arguments of the tables (Tables B-4)
are the probability Pr, the abscissa value F_p, and the two degrees of
freedom ν₁ and ν₂. Note also the relation

F_p(ν₁, ν₂) = 1 / F_{1-p}(ν₂, ν₁) .

Figure 3-5. φ(f) - F distribution
Example 1 - Direct Problem

Given: x is F(5, 10), that is ν₁ = 5 and ν₂ = 10.
Required: Pr(x ≤ 2.52).
Solution: From Tables B-4, F_{0.90}(5, 10) = 2.52, therefore Pr = 0.90.
Example 2 - Inverse Problem

Given: x is F(4, 8), that is ν₁ = 4 and ν₂ = 8.
Required: F_{0.95}(4, 8).
Solution: From Tables B-4, F_{0.95}(4, 8) = 3.84.
Example 3 - Inverse Problem

Given: x is F(4, 8).
Required: F_p such that Pr(x ≤ F_p) = 0.05.
Solution:

Pr(x ≤ F_p) = 0.05  means  Pr(1/x ≤ 1/F_p) = 0.95 .

Recall that if x is F(4, 8) then 1/x is F(8, 4), which gives a second
probability statement for 1/x; therefore, equating the two probability
statements for 1/x, we can solve for F_p, that is

1/F_p = F_{0.95}(8, 4) = 6.04 ,

F_p = F_{0.05}(4, 8) = 1/6.04 = 0.166 .
Example 4 - Inverse Problem, Two Abscissae

Given: x is F(5, 10).
Required: F_{p1} and F_{p2} such that Pr(F_{p1} ≤ x ≤ F_{p2}) = 0.90.
Solution:

Pr(F_{p1} ≤ x ≤ F_{p2}) = Pr(x ≤ F_{p2}) - Pr(x ≤ F_{p1}) ,

and F_{p1} and F_{p2} will be chosen such that the remaining probability
of 0.10 is divided equally, thus

a) Pr(x ≤ F_{p2}) = 0.95, where F_{p2} = F_{0.95}(5, 10) = 3.33 from
Tables B-4;

b) Pr(x ≤ F_{p1}) = 0.05, where F_{p1} = F_{0.05}(5, 10).

Taking b),

Pr(x ≤ F_{p1}) = Pr(1/x ≥ 1/F_{p1}) = 1 - Pr(1/x ≤ 1/F_{p1}) = 0.05 ,

so that

Pr(1/x ≤ 1/F_{p1}) = 0.95 ,

and, since 1/x is F(10, 5) as in Example 3, from the second of
Tables B-4 we have

1/F_{p1} = F_{0.95}(10, 5) = 4.74 ,

F_{p1} = F_{0.05}(5, 10) = 1/4.74 = 0.211 .
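Examples 3 and 4 both rest on the reciprocal relation F_p(ν₁, ν₂) = 1/F_{1-p}(ν₂, ν₁). A short arithmetic check of that relation, using the table values quoted above (the variable names here are illustrative):

```python
# Check the reciprocal-relation arithmetic from Examples 3 and 4.
F95_8_4 = 6.04            # F_0.95(8, 4), quoted from Tables B-4
F05_4_8 = 1.0 / F95_8_4   # then F_0.05(4, 8) = 1/6.04

F95_10_5 = 4.74           # F_0.95(10, 5), quoted from Tables B-4
F05_5_10 = 1.0 / F95_10_5 # then F_0.05(5, 10) = 1/4.74
```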
The four distributions introduced in this chapter are related as follows:

Normal:       n(0, 1)
Chi-square:   χ²(ν)
Student's t:  t(ν) = n(0, 1) / [χ²(ν)/ν]^{1/2}
F:            F(ν₁, ν₂) = [χ²(ν₁)/ν₁] / [χ²(ν₂)/ν₂]
4.

In least squares estimation we deal with functions of random
variables xᵢ, for example

y = Σ_{i=1}^{n} xᵢ ,  x̄ = (1/n) Σᵢ xᵢ ,  s² = (1/(n-1)) Σᵢ (xᵢ - x̄)² .

Two examples are the sample mean and sample variance shown above.
In this chapter we derive the distributions of such functions, which
will be used in Chapter 5 for hypothesis testing.

4.1

Given: a random sample x₁, x₂, ..., xₙ from a population which is
n(μ, σ²).

Required to prove: (x - μ)/σ ~ n(0, 1).

Proof: The change of variable y = (x - μ)/σ in the integration of the
p.d.f. of x (as was done in Section 3.1.3) yields the result.

Comment: The standardized random variable has mean 0 and variance 1
(equation 3-17).
4.2

Given: xᵢ ~ n(μ, σ²), statistically independent.

Required to prove: x̄ ~ n(μ, σ²/n).

Proof: The moment generating function of x̄ is

M(t) = E[exp(t(x₁ + x₂ + ... + xₙ)/n)] ,

and for xᵢ statistically independent

M(t) = E[e^{t x₁/n}] E[e^{t x₂/n}] ... E[e^{t xₙ/n}] = Π_{i=1}^{n} E[e^{t xᵢ/n}] .

For xᵢ ~ n(μᵢ, σᵢ²) the m.g.f. is (3-12)

M(t) = E[e^{tx}] = exp[μᵢ t + σᵢ² t²/2] ,

so that

M(t) = Π_{i=1}^{n} exp[μᵢ (t/n) + σᵢ² (t/n)²/2]
     = exp[(Σ μᵢ/n) t + (Σ σᵢ²/n²) t²/2] ,

and since μ₁ = μ₂ = ... = μₙ = μ and σ₁² = σ₂² = ... = σₙ² = σ²,

M(t) = exp[μ t + (σ²/n) t²/2] .

We recognize that the form of the m.g.f. is still normal and has
parameters μ and σ²/n, thus it is proved that x̄ is n(μ, σ²/n).

Comment: This result is used in Section 4.3.
4.3

Given: a sample mean x̄ ~ n(μ, σ²/n).

Required to prove: (x̄ - μ)/(σ/√n) ~ n(0, 1).

Proof: From Section 4.1, if x ~ n(μ, σ²) then (x - μ)/σ ~ n(0, 1).
The only difference in this case is that the variance is scaled by 1/n.

Comment: This result is used in Sections 5.4 and 5.5.
4.4

Given: x ~ n(μ, σ²).

Required to prove: [(x - μ)/σ]² ~ χ²(1).

Proof: From Section 4.1, w = (x - μ)/σ ~ n(0, 1), so the c.d.f. of
v = w² is

Φ(v) = Pr(w² ≤ v) = ∫_{-√v}^{√v} (1/(2π)^{1/2}) e^{-w²/2} dw ,  0 ≤ v ,

Φ(v) = 0 ,  v < 0 .

Differentiating with respect to v (using w = y^{1/2} in the integral),
φ(v) = Φ'(v), and the result is

φ(v) = (1/(π^{1/2} 2^{1/2})) v^{(1/2)-1} e^{-v/2} ,  0 < v < ∞ ,
φ(v) = 0 elsewhere.

Comment: Recalling that the gamma function gives Γ(1/2) = π^{1/2}, this
is the χ²(ν) p.d.f. (3-33) with ν = 1, and [(x - μ)/σ]² ~ χ²(1)
is verified.
4.5 SUMS OF CHI-SQUARE RANDOM VARIABLES

Given: independent random variables yᵢ ~ χ²(νᵢ).

Required to prove: Σ_{i=1}^{n} yᵢ ~ χ²(ν₁ + ν₂ + ... + νₙ).

Proof: The moment generating function of Σ yᵢ is

M(t) = E[e^{t Σ yᵢ}] = E[e^{t y₁}] E[e^{t y₂}] ... E[e^{t yₙ}] ,

and the m.g.f. of each chi-square random variable is (3-35)

Mᵢ(t) = (1 - 2t)^{-νᵢ/2} ,

so that

M(t) = (1 - 2t)^{-(ν₁ + ν₂ + ... + νₙ)/2} ,

which is the m.g.f. of a chi-square random variable with Σ νᵢ degrees
of freedom.
4.6 SUM OF SQUARES OF NORMALIZED RANDOM VARIABLES

Given: independent xᵢ ~ n(μ, σ²).

Required to prove:

Σ_{i=1}^{n} [(xᵢ - μ)/σ]² ~ χ²(n) .

Proof: From Section 4.4, if yᵢ = [(xᵢ - μ)/σ]² then yᵢ ~ χ²(1).
In our case ν₁ = ν₂ = ... = νₙ = 1, thus from Section 4.5

Σ_{i=1}^{n} yᵢ ~ χ²(n) .

Comment:
4.7

Given: xᵢ ~ n(μ, σ²), and the sample variance

s² = Σ_{i=1}^{n} (xᵢ - x̄)² / (n - 1) .

Required to prove:

(n - 1) s² / σ² ~ χ²(n - 1) .

Proof: We begin by writing

Σ (xᵢ - μ)² = Σ (xᵢ - x̄ + x̄ - μ)²
            = Σ [(xᵢ - x̄)² + (x̄ - μ)² + 2(xᵢ - x̄)(x̄ - μ)]
            = Σ (xᵢ - x̄)² + n(x̄ - μ)² + 2(x̄ - μ) Σ (xᵢ - x̄) .

But

Σ (xᵢ - x̄) = Σ xᵢ - n x̄ = Σ xᵢ - Σ xᵢ = 0 .

Therefore

Σ (xᵢ - μ)² = Σ (xᵢ - x̄)² + n(x̄ - μ)² .

Dividing by σ² yields

Σ (xᵢ - μ)²/σ² = Σ (xᵢ - x̄)²/σ² + n(x̄ - μ)²/σ²
               = (n - 1) s²/σ² + (x̄ - μ)²/(σ²/n) .

In terms of moment generating functions,

E[exp(t Σ (xᵢ - μ)²/σ²)] = E[exp(t (n-1)s²/σ²)] E[exp(t (x̄ - μ)²/(σ²/n))] ,

where the left side is the m.g.f. of a χ²(n) random variable
(Section 4.6) and the last factor is the m.g.f. of a χ²(1) random
variable (Section 4.4, since x̄ is n(μ, σ²/n)). Therefore

(1 - 2t)^{-n/2} = E[exp(t (n-1)s²/σ²)] (1 - 2t)^{-1/2} ,

so that

E[exp(t (n-1)s²/σ²)] = (1 - 2t)^{-(n-1)/2} ,

which is the m.g.f. of a chi-square random variable with (n - 1)
degrees of freedom. Therefore

(n - 1) s²/σ² ~ χ²(n - 1) .

Comment: This result is used in Sections 4.8 and 4.9.
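The key algebraic identity in the proof above, Σ(xᵢ-μ)² = Σ(xᵢ-x̄)² + n(x̄-μ)², holds for any data set, and can be checked numerically (the data here are an assumed example):

```python
# Numerical check of the decomposition used in the proof of Section 4.7.
def decompose(xs, mu):
    n = len(xs)
    xbar = sum(xs) / n
    left = sum((x - mu) ** 2 for x in xs)                       # sum about mu
    right = sum((x - xbar) ** 2 for x in xs) + n * (xbar - mu) ** 2
    return left, right

left, right = decompose([4.1, 3.9, 4.3, 4.0, 4.2], mu=4.0)  # xbar = 4.1
```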
4.8 DISTRIBUTION OF THE SAMPLE MEAN RELATIVE TO (s/√n)

Given:
a) x̄ ~ n(μ, σ²/n) ,
b) (x̄ - μ)/(σ/√n) ~ n(0, 1) ,
c) (n - 1)s²/σ² ~ χ²(n - 1) .

Required to prove:

(x̄ - μ) √n / s ~ t(n - 1) .

Proof: From Section 3.3,

n(0, 1) / [χ²(ν)/ν]^{1/2} ~ t(ν) ,

and in the present case we have

[(x̄ - μ)/(σ/√n)] / [χ²(n-1)/(n-1)]^{1/2} = (x̄ - μ) √n / s ~ t(n - 1) .

Comment:
4.9 DISTRIBUTION OF THE RATIO OF TWO SAMPLE VARIANCES FROM THE SAME POPULATION

Given:
a) (n₁ - 1)s₁²/σ² ~ χ²(n₁ - 1) ,
b) (n₂ - 1)s₂²/σ² ~ χ²(n₂ - 1) .

Required to prove:

s₁²/s₂² ~ F(n₁ - 1, n₂ - 1) .

Proof: Recall from Section 3.4 that

[χ²(n₁-1)/(n₁-1)] / [χ²(n₂-1)/(n₂-1)]

is an F random variable, F(n₁ - 1, n₂ - 1). In the present case this
ratio is

{[(n₁-1)s₁²/σ²]/(n₁-1)} / {[(n₂-1)s₂²/σ²]/(n₂-1)} = s₁²/s₂² .

Comment:
4.10

Given: X (m×1) multivariate normal with mean 0 and covariance matrix
Σ (m×m).

Required to prove:

X^T Σ^-1 X ~ χ²(m) .

Proof: There exists a transformation

Y = T X ,  X = T^-1 Y ,

such that in the process the components of Y are made independent
(T Σ T^T is diagonal, with diagonal elements σᵢ²). Then

X^T Σ^-1 X = Y^T (T Σ T^T)^-1 Y = y₁²/σ₁² + y₂²/σ₂² + ... + yₘ²/σₘ² ,

and since each yᵢ²/σᵢ² ~ χ²(1), by Section 4.5 the sum is distributed
as χ²(m).

Comments:
4.11

In Chapter 3 we introduced four basic distributions; these are special
distributions and are the basis for the distributions of functions of
random variables given in this Chapter. The latter are summarized in
Table 4-1 in such a way as to show their uses in:
1) subsequent derivations,
2) hypothesis testing.
Table 4-1.

Section | Given | Derived Random Variable | Used for Statistical Test (Chap. 5) | Used in Derivation
4.1  | xᵢ ~ n(μ, σ²)        | (xᵢ - μ)/σ ~ n(0, 1)                         | 5.2, 5.3   | 4.2
4.2  | xᵢ ~ n(μ, σ²)        | x̄ = (1/n) Σ xᵢ ~ n(μ, σ²/n)                 | no         | 4.3
4.3  | x̄ ~ n(μ, σ²/n)       | (x̄ - μ)/(σ/√n) ~ n(0, 1)                    | 5.4, 5.5   | no
4.4  | xᵢ ~ n(μ, σ²)        | [(xᵢ - μ)/σ]² ~ χ²(1)                        | no         | 4.6
4.5  | yᵢ ~ χ²(νᵢ)          | Σ yᵢ ~ χ²(ν₁ + ν₂ + ... + νₙ)                | no         | 4.6
4.6  | xᵢ ~ n(μ, σ²)        | Σ [(xᵢ - μ)/σ]² ~ χ²(n)                      | 5.8        | 4.7
4.7  | xᵢ ~ n(μ, σ²)        | (n-1)s²/σ² = Σ(xᵢ - x̄)²/σ² ~ χ²(n-1)        | 5.9, 5.10  | 4.8, 4.9
4.8  | x̄ ~ n(μ, σ²/n)       | (x̄ - μ)√n/s ~ t(n-1)                        | 5.6, 5.7   | no
4.9  | xᵢ ~ n(μ, σ²)        | s₁²/s₂² ~ F(n₁-1, n₂-1)                      | 5.11, 5.12 | no
4.10 | X multivariate normal | X^T Σ^-1 X ~ χ²(m)                          | basis for multivariate testing (Chap. 8) | yes
5.

5.1 INTRODUCTION

Recall from Section 2.5 that point estimation deals with the
estimation of population parameters from a knowledge of sample
statistics. For example, the sample mean x̄ and sample variance s² are
point estimators of the population mean μ and population variance σ²,
that is,

x̂ = μ̂ = (1/n) Σ_{i=1}^{n} xᵢ ,

σ̂² = s² = (1/(n-1)) Σ_{i=1}^{n} (xᵢ - x̄)² .

Interval estimation, on the other hand, makes probability statements
of the form (2-31)

Pr(e₁ < ε < e₂) = α ,

from which the 100α% confidence interval [e₁, e₂] for the parameter ε
can be made.
Such confidence intervals are used to test hypotheses. The hypothesis
H₀: μ = μ_H, for example, states that the mean is hypothesized to have
the value μ_H; the alternative hypothesis is H₁: μ ≠ μ_H.

In this chapter, confidence intervals are developed for examining:
1. a single observation x,
2. the mean μ,
3. the sample mean x̄,
4. the variance σ²,
5. the sample variance s²,
6. the ratio of two population variances (σ₂²/σ₁²),
7. the ratio of two sample variances (s₂²/s₁²).
5.2

Given xᵢ ~ n(μ, σ²), from Section 4.1 we know that

(xᵢ - μ)/σ ~ n(0, 1) .

The associated probability statement is

Pr(-c ≤ (xᵢ - μ)/σ ≤ c) = α ,

while the confidence interval for xᵢ is

[μ - cσ ≤ xᵢ ≤ μ + cσ] .

To evaluate this interval we must know:
(1) the tabulated value of c corresponding to n(0, 1) and α,
(2) the known mean μ,
(3) the known variance σ².

The above is used to test whether a particular observation xᵢ is
compatible with the known mean and variance.
5.3 THE MEAN μ IN TERMS OF AN OBSERVATION xᵢ

The mean μ can be examined in terms of a single observation xᵢ
as follows. From Section 4.1, we know that

(xᵢ - μ)/σ ~ n(0, 1) .

The associated probability statement is

Pr(-c ≤ (xᵢ - μ)/σ ≤ c) = α ,

while the confidence interval for μ is

[xᵢ - cσ ≤ μ ≤ xᵢ + cσ] .

To evaluate this interval we must know:
(1) the tabulated value of c corresponding to n(0, 1) and α,
(2) the observed value xᵢ,
(3) the known variance σ².

The above is used to test the hypothesis H₀: μ = μ_H.
5.4 THE MEAN μ IN TERMS OF THE SAMPLE MEAN x̄ AND VARIANCE σ²/n

From Section 4.3,

(x̄ - μ)/(σ/√n) ~ n(0, 1) .

The associated probability statement is

Pr(-c ≤ (x̄ - μ)/(σ/√n) ≤ c) = α ,

while the confidence interval for μ is

[x̄ - cσ/√n ≤ μ ≤ x̄ + cσ/√n] .

To evaluate this interval we must know:
(1) the tabulated value of c corresponding to n(0, 1) and α,
(2) the computed sample mean x̄ and sample size n,
(3) the known variance σ².

This confidence interval is used to test the hypothesis
H₀: μ = μ_H.

5.5 THE SAMPLE MEAN x̄ IN TERMS OF THE MEAN μ AND VARIANCE σ²/n

The sample mean x̄ can be examined in terms of the known mean μ and
variance σ²/n as follows. From Section 4.3,

(x̄ - μ)/(σ/√n) ~ n(0, 1) ,

with the associated probability statement

Pr(-c ≤ (x̄ - μ)/(σ/√n) ≤ c) = α ,

while the confidence interval for x̄ is

[μ - cσ/√n ≤ x̄ ≤ μ + cσ/√n] .

To evaluate this interval we must know:
(1) the tabulated value of c corresponding to n(0, 1) and α,
(2) the known variance σ² and sample size n,
(3) the known mean μ.

The above is used to test the hypothesis H₀: x̄ = x̄_H.
5.6 THE MEAN μ IN TERMS OF THE SAMPLE MEAN x̄ AND SAMPLE VARIANCE s²

The mean μ can be examined in terms of the sample mean x̄ and sample
variance s² as follows. From Section 4.8,

(x̄ - μ)/(s/√n) ~ t(n - 1) .

The associated probability statement is

Pr(-t_p ≤ (x̄ - μ)/(s/√n) ≤ t_p) = α ,

while the confidence interval for μ is

[x̄ - t_p s/√n ≤ μ ≤ x̄ + t_p s/√n] .

To evaluate this interval we must know:
(1) the tabulated value of t_p corresponding to t(n - 1) and α,
(2) the computed sample mean x̄ and sample variance s², and the sample
size n.

The above is used to test the hypothesis H₀: μ = μ_H.
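A numerical sketch of this interval (assumed example data, not from the notes): for n = 15 observations there are 14 degrees of freedom, and a 90% two-sided interval uses the value t_{0.95}(14) = 1.761 found in Example 3 of Section 3.3.

```python
# Hypothetical example of the interval [xbar - t_p s/sqrt(n), xbar + t_p s/sqrt(n)].
import math
import statistics

def t_confidence_interval(xs, t_p):
    n = len(xs)
    xbar = statistics.fmean(xs)
    s = statistics.stdev(xs)            # sample standard deviation, (n-1) divisor
    half = t_p * s / math.sqrt(n)
    return xbar - half, xbar + half

xs = [10.2, 9.8, 10.1, 9.9, 10.4, 10.0, 9.7, 10.3, 10.1, 9.9,
      10.2, 10.0, 9.8, 10.2, 10.4]     # n = 15, so 14 degrees of freedom
lo, hi = t_confidence_interval(xs, t_p=1.761)   # 90% interval using t_0.95(14)
```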
5.7 THE SAMPLE MEAN x̄ IN TERMS OF THE MEAN μ AND SAMPLE VARIANCE s²

The sample mean x̄ can be examined in terms of the mean μ and
sample variance s² as follows. Again from Section 4.8,

(x̄ - μ)/(s/√n) ~ t(n - 1) .

The associated probability statement is

Pr(-t_p ≤ (x̄ - μ)/(s/√n) ≤ t_p) = α ,

while the confidence interval for x̄ is

[μ - t_p s/√n ≤ x̄ ≤ μ + t_p s/√n] .

To evaluate this interval we must know:
(1) the tabulated value of t_p corresponding to t(n - 1) and α,
(2) the computed value for the sample variance s², and sample size n,
(3) the known mean μ.

The above is used to test the hypothesis H₀: x̄ = x̄_H.
5.8 THE VARIANCE σ² IN TERMS OF OBSERVATIONS AND THE MEAN μ

From Section 4.6, we know that

Σ_{i=1}^{n} [(xᵢ - μ)/σ]² ~ χ²(n) .

The associated probability statement is

Pr(χ²_{p1} ≤ Σ [(xᵢ - μ)/σ]² ≤ χ²_{p2}) = α ,

while the confidence interval for σ² is

[Σ (xᵢ - μ)²/χ²_{p2} ≤ σ² ≤ Σ (xᵢ - μ)²/χ²_{p1}] .

To evaluate this interval we must know:
(1) the tabulated values χ²_{p1} and χ²_{p2} corresponding to χ²(n),
p₁ = (1-α)/2 and p₂ = (1+α)/2,
(2) the observations xᵢ and sample size n,
(3) the known mean μ.

The above is used to test the hypothesis H₀: σ² = σ²_H.
5.9 THE VARIANCE σ² IN TERMS OF THE SAMPLE VARIANCE s²

The variance σ² can be examined in terms of the sample variance s²
as follows. From Section 4.7,

(n - 1) s²/σ² ~ χ²(n - 1) .

The associated probability statement is

Pr(χ²_{p1} ≤ (n - 1)s²/σ² ≤ χ²_{p2}) = α ,

while the confidence interval for σ² is

[(n - 1)s²/χ²_{p2} ≤ σ² ≤ (n - 1)s²/χ²_{p1}] .

To evaluate this interval we must know:
(1) the tabulated values χ²_{p1} and χ²_{p2} corresponding to χ²(n - 1),
p₁ = (1-α)/2 and p₂ = (1+α)/2,
(2) the computed sample variance s²,
(3) the sample size n.

The above is used to test the hypothesis H₀: σ² = σ²_H.
5.10 THE SAMPLE VARIANCE s² IN TERMS OF THE VARIANCE σ²

The sample variance s² can be examined in terms of the variance σ²
as follows. From Section 4.7,

(n - 1) s²/σ² ~ χ²(n - 1) .

The associated probability statement is

Pr(χ²_{p1} ≤ (n - 1)s²/σ² ≤ χ²_{p2}) = α ,

while the confidence interval for s² is

[σ² χ²_{p1}/(n - 1) ≤ s² ≤ σ² χ²_{p2}/(n - 1)] .

To evaluate this interval we must know:
(1) the tabulated values χ²_{p1} and χ²_{p2} (Appendix B-2) corresponding
to χ²(n - 1), p₁ = (1-α)/2 and p₂ = (1+α)/2,
(2) the known variance σ²,
(3) the sample size n.

The above is used to test the hypothesis H₀: s² = s²_H.
5.11 THE RATIO OF TWO VARIANCES (σ₂²/σ₁²) IN TERMS OF s₁² AND s₂²

From Section 4.9, we know that

{[(n₁-1)s₁²/σ₁²]/(n₁-1)} / {[(n₂-1)s₂²/σ₂²]/(n₂-1)}
  = (s₁²/s₂²)(σ₂²/σ₁²) ~ F(n₁ - 1, n₂ - 1) .

The associated probability statement is

Pr(F_{p1} ≤ (s₁²/s₂²)(σ₂²/σ₁²) ≤ F_{p2}) = α ,

while the confidence interval for σ₂²/σ₁² is

[F_{p1} (s₂²/s₁²) ≤ σ₂²/σ₁² ≤ F_{p2} (s₂²/s₁²)] .

To evaluate this interval we must know:
(1) the tabulated values F_{p1} and F_{p2} corresponding to
F(n₁ - 1, n₂ - 1), p₁ = (1-α)/2 and p₂ = (1+α)/2,
(2) the computed sample variances s₁² and s₂², and sample sizes n₁
and n₂.

The above is used to test the hypothesis H₀: σ₁² = σ₂², since if
σ₁² = σ₂², then s₁² and s₂² are simply sample variances of the same
population (n(μ, σ²)), and the ratio σ₂²/σ₁² = 1 should lie within the
confidence interval.
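A numerical sketch of this interval (assumed sample variances, not from the notes): with samples of sizes n₁ = 6 and n₂ = 11, F(5, 10) applies, and the values F_{0.05}(5, 10) = 0.211 and F_{0.95}(5, 10) = 3.33 from the Section 3.4 examples give a 90% interval.

```python
# Hypothetical example of the Section 5.11 interval
# [F_p1 (s2^2/s1^2), F_p2 (s2^2/s1^2)] for the ratio sigma2^2/sigma1^2.
def variance_ratio_interval(s1_sq, s2_sq, f_lo, f_hi):
    ratio = s2_sq / s1_sq
    return f_lo * ratio, f_hi * ratio

# assumed s1^2 = 2.0 (n1 = 6) and s2^2 = 1.5 (n2 = 11), so F(5, 10) applies
lo, hi = variance_ratio_interval(s1_sq=2.0, s2_sq=1.5, f_lo=0.211, f_hi=3.33)
```

If the interval contains 1, the hypothesis σ₁² = σ₂² is not rejected at the chosen level.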
5.12 THE RATIO OF TWO SAMPLE VARIANCES (s₁²/s₂²) IN TERMS OF σ₁² AND σ₂²

The ratio s₁²/s₂² can be examined in terms of σ₁² and σ₂² as follows.
From Section 4.9, we know that

(s₁²/s₂²)(σ₂²/σ₁²) ~ F(n₁ - 1, n₂ - 1) .

The associated probability statement is

Pr(F_{p1} ≤ (s₁²/s₂²)(σ₂²/σ₁²) ≤ F_{p2}) = α ,

while the confidence interval for s₁²/s₂² is

[F_{p1} (σ₁²/σ₂²) ≤ s₁²/s₂² ≤ F_{p2} (σ₁²/σ₂²)] .

To evaluate this interval we must know:
(1) the tabulated values F_{p1} and F_{p2} corresponding to
F(n₁ - 1, n₂ - 1), p₁ = (1-α)/2 and p₂ = (1+α)/2,
(2) the known variances σ₁² and σ₂², and the sample sizes n₁ and n₂.

The above is used to test the hypothesis H₀: s₁² = s₂².
5.13 THE RATIO OF TWO VARIANCES (σ₂²/σ₁²) IN TERMS OF OBSERVATIONS AND THE MEANS μ₁ AND μ₂

The ratio of two variances σ₂²/σ₁² can be examined in terms of
measurements x₁, x₂, ..., x_{n₁} from a population which is n(μ₁, σ₁²)
and x₁, x₂, ..., x_{n₂} from a population which is n(μ₂, σ₂²). From
Section 4.6,

Σ_{i=1}^{n₁} (xᵢ - μ₁)²/σ₁² ~ χ²(n₁)  and  Σ_{i=1}^{n₂} (xᵢ - μ₂)²/σ₂² ~ χ²(n₂) ,

so that from Section 3.4

[Σ (xᵢ - μ₁)²/(n₁σ₁²)] / [Σ (xᵢ - μ₂)²/(n₂σ₂²)] ~ F(n₁, n₂) .

The associated probability statement is

Pr(F_{p1} ≤ [Σ(xᵢ - μ₁)²/(n₁σ₁²)] / [Σ(xᵢ - μ₂)²/(n₂σ₂²)] ≤ F_{p2}) = α ,

while the confidence interval for σ₂²/σ₁² is

[F_{p1} n₁ Σ(xᵢ - μ₂)² / (n₂ Σ(xᵢ - μ₁)²) ≤ σ₂²/σ₁²
  ≤ F_{p2} n₁ Σ(xᵢ - μ₂)² / (n₂ Σ(xᵢ - μ₁)²)] .

To evaluate this interval we must know:
(1) the tabulated values F_{p1} and F_{p2} corresponding to F(n₁, n₂),
p₁ = (1-α)/2 and p₂ = (1+α)/2,
(2) the observations and sample sizes n₁ and n₂,
(3) the known means μ₁ and μ₂.

The above is used to test the hypothesis H₀: σ₂²/σ₁² = 1.
5.14

Table 5-1 summarizes the results of this chapter, listing in:
a) column one the section number,
b) column two the quantity examined by the confidence interval,
c) column three the quantities which must be known for the confidence
interval,
d) column four the statistic used,
e) column five the confidence interval.
TABLE 5-1.

Section | Quantity Examined | Known Quantities | Statistic | Confidence Interval
5.2  | single observation xᵢ | μ, σ | (xᵢ - μ)/σ ~ n(0,1) | [μ - cσ ≤ xᵢ ≤ μ + cσ]
5.3  | mean μ | xᵢ, σ | as above | [xᵢ - cσ ≤ μ ≤ xᵢ + cσ]
5.4  | mean μ | n, x̄, σ | (x̄ - μ)/(σ/√n) ~ n(0,1) | [x̄ - cσ/√n ≤ μ ≤ x̄ + cσ/√n]
5.5  | sample mean x̄ | n, μ, σ | as above | [μ - cσ/√n ≤ x̄ ≤ μ + cσ/√n]
5.6  | mean μ | n, x̄, s | (x̄ - μ)/(s/√n) ~ t(n-1) | [x̄ - t_p s/√n ≤ μ ≤ x̄ + t_p s/√n]
5.7  | sample mean x̄ | n, μ, s | as above | [μ - t_p s/√n ≤ x̄ ≤ μ + t_p s/√n]
5.8  | variance σ² | n, xᵢ, μ | Σ[(xᵢ - μ)/σ]² ~ χ²(n) | [Σ(xᵢ-μ)²/χ²_{p2} ≤ σ² ≤ Σ(xᵢ-μ)²/χ²_{p1}]
5.9  | variance σ² | n, s | (n-1)s²/σ² ~ χ²(n-1) | [(n-1)s²/χ²_{p2} ≤ σ² ≤ (n-1)s²/χ²_{p1}]
5.10 | sample variance s² | n, σ | as above | [σ²χ²_{p1}/(n-1) ≤ s² ≤ σ²χ²_{p2}/(n-1)]
5.11 | ratio of two variances σ₂²/σ₁² | n₁, n₂, s₁²/s₂² | (s₁²/s₂²)(σ₂²/σ₁²) ~ F(n₁-1, n₂-1) | [F_{p1}(s₂²/s₁²) ≤ σ₂²/σ₁² ≤ F_{p2}(s₂²/s₁²)]
5.12 | ratio of two sample variances s₁²/s₂² | n₁, n₂, σ₁²/σ₂² | as above | [F_{p1}(σ₁²/σ₂²) ≤ s₁²/s₂² ≤ F_{p2}(σ₁²/σ₂²)]
5.13 | ratio of two variances σ₂²/σ₁² | n₁, n₂, xᵢ, μ₁, μ₂ | [Σ(xᵢ-μ₁)²/(n₁σ₁²)]/[Σ(xᵢ-μ₂)²/(n₂σ₂²)] ~ F(n₁, n₂) | [F_{p1} n₁Σ(xᵢ-μ₂)²/(n₂Σ(xᵢ-μ₁)²) ≤ σ₂²/σ₁² ≤ F_{p2} n₁Σ(xᵢ-μ₂)²/(n₂Σ(xᵢ-μ₁)²)]
6.

The linear mathematical model is

A X - L = V ,    (6.1)

where L (n×1) is called the observation vector and is the column vector
whose elements are the observed values; V (n×1) is called the residual
vector and is the column vector whose elements are the unknown
measurement errors (inconsistencies); X (u×1) is called the solution
vector, for which we want a point estimate and whose elements are the
unknown parameters; and A (n×u) is the matrix of known coefficients.
6.1

The least squares criterion is

φ = V̂^T P V̂ = minimum ,    (6.2)

where

V̂ = A X̂ - L ,    (6.3)

so that

φ = (A X̂ - L)^T P (A X̂ - L) .    (6.4)

Setting

∂φ/∂X̂ = 2 (A X̂ - L)^T P A = 0    (6.5)

yields the normal equations, and if (A^T P A), called the matrix of the
normal equations, has an inverse, the least squares estimate is

X̂ = (A^T P A)^-1 A^T P L .    (6.6)

The estimator X̂ is unbiased if

E[X̂] = X .    (6.7)
88
=0
E [VJ
(6.8)
= E[AXL] = E[AX]
 E[L]
= AX
 E[L]
=0
or
E [L]
= AX
(6.9)
= X,
If E [V]
=0
the residuals is
(6.10)
Also if E [V]
= 0,
E [L]
=A X
= V
89
z:L
=E
=E
[VV ]
= z:v
(6.11)
,..
mean that E1 =
l:~
z:L =
(Jl2
(J21
(J2
2
where
a?1.
is variance,
and
and
1.
determined.
The weight matrix P is related to the covariance matrix of the
observations by

P = σ₀² Σ_L^-1 ,    (6.12)

that is, if

Σ_L = σ₀² Q ,    (6.13)

we know the relative covariance matrix Q but not the variance factor
σ₀². Substituting 6.12 into the least squares estimate 6.6,

X̂ = (A^T σ₀² Σ_L^-1 A)^-1 A^T σ₀² Σ_L^-1 L    (6.14)

  = (A^T Σ_L^-1 A)^-1 A^T Σ_L^-1 L ,    (6.15)

that is, the variance factor drops out, and either weights 6.12 or 6.14
result in the same estimator X̂.
6.3

If

X̂ = B L ,    (6.16)

that is, the elements of X̂ are linear functions of the observations, then
X̂ is called a linear estimator of X.

The minimum variance estimator X̂ of X is the linear unbiased
estimator whose covariance matrix

Σ_X̂ = E[(X̂ - E[X̂])(X̂ - E[X̂])^T]    (6.17)

is in some sense a minimum. Since Σ_X̂ is a matrix, we need some
criterion by which we can decide when one matrix is "less than"
another matrix.
91
variance condition to be
Trace (I:x)
= minimum
(6.18)
and we will now proceed to find the matrix B in Equation 6.16 which will
satisfy this condition.
We have already seen that if E [V]
=0
,..
E [X] = X.
From the linear condition of Equation 6.16, and from the assumption E[V]
,.
E [X]
=E
[BL]
=B E
[L]
=BAX
Therefore
B A= I
From the
co~"ariance.
(6.19}
l13.w,. since
X
= BL
then
(6.20)
The problem may now be stated: we want to find that value of B
such that

Trace(B Σ_L B^T) = minimum

subject to the condition

B A - I = 0 .

We will use the standard method for solving such minimization problems,
called the method of Lagrange, which will be fully explained in Chapter
7. Form the function

φ = B Σ_L B^T + 2 (B A - I) K ,

where K is a matrix of undetermined constants, called Lagrange
multipliers, and we then set

∂ Tr(φ)/∂B = 0 .

Using

∂ Tr(B Σ_L B^T)/∂B = 2 B Σ_L ,  ∂ Tr(B A K)/∂B = K^T A^T ,
∂ Tr(K)/∂B = 0 ,

we obtain

∂ Tr(φ)/∂B = 2 B Σ_L + 2 K^T A^T = 0 ,

or

B = -K^T A^T Σ_L^-1 .

Then

B A = I = -K^T A^T Σ_L^-1 A ,

or

K^T = -(A^T Σ_L^-1 A)^-1 ,

and

B = (A^T Σ_L^-1 A)^-1 A^T Σ_L^-1 ,    (6.21)

finally giving us

X̂ = B L = (A^T Σ_L^-1 A)^-1 A^T Σ_L^-1 L    (6.22)

as the minimum variance estimator of X. Comparing equations
6.22 and 6.6, we see that the least squares estimator is the minimum
variance estimator when P = Σ_L^-1. The equivalence of
equations 6.22 and 6.6 is still valid, since the variance factor drops
out of equation 6.6.
6.4

If the residuals V have a multivariate normal distribution,

φ(V) = C exp[-½ (V - E[V])^T Σ_V^-1 (V - E[V])] .    (6.23)

If E[V] = 0,

φ(V) = C exp[-½ V^T Σ_V^-1 V] .    (6.24)

It can be seen that the least squares criterion

V̂^T Σ_L^-1 V̂ = minimum    (6.25)

maximizes φ(V̂); that is, the least squares estimate is also the
maximum likelihood estimate.
6.5

In this section we will show that unbiased estimators exist for the
variance factor σ₀² and the covariance matrix of the unknowns Σ_X̂,
given by

σ̂₀² = V̂^T P V̂ / (n - u) ,    (6.26)

Σ̂_X = σ̂₀² (A^T P A)^-1 ,    (6.27)

where

V̂ = A X̂ - L .    (6.28)

The covariance matrix for X̂ is given by

Σ_X̂ = E[(X̂ - E[X̂])(X̂ - E[X̂])^T] .    (6.29)
If E[V] = 0, then

E[X̂] = X ,    (6.30)

therefore

Σ_X̂ = E[(X̂ - X)(X̂ - X)^T] .    (6.31)

Now from Equation 6.6 the expression for X̂ is

X̂ = (A^T P A)^-1 A^T P L ,    (6.32)

which can be written

X̂ = B L ,    (6.33)

where

B = (A^T P A)^-1 A^T P .    (6.34)

From the covariance law we have

Σ_X̂ = B Σ_L B^T = (A^T P A)^-1 A^T P Σ_L P A (A^T P A)^-1 ,    (6.40)

and if

P = σ₀² Σ_L^-1 ,    (6.41)

then

Σ_X̂ = σ₀² (A^T P A)^-1 .    (6.42)

Therefore

Σ̂_X = σ̂₀² (A^T P A)^-1    (6.44)

is an unbiased estimator of Σ_X̂, provided an unbiased estimator
of σ₀² can be found.
We now show that

σ̂₀² = V̂^T P V̂ / (n - u)    (6.45)

is such an estimator, or equivalently that

E[V̂^T P V̂] = (n - u) σ₀² .    (6.46)
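The chain from equation 6.6 through 6.45 can be illustrated with a small sketch (an assumed example, not from the notes): estimate X̂, form the residuals V̂ = AX̂ - L, and compute σ̂₀² = V̂ᵀPV̂/(n - u) for a four-point line fit (n = 4 observations, u = 2 unknowns).

```python
# Hypothetical least squares pipeline: X = (A^T P A)^-1 A^T P L,
# V = A X - L, and sigma0^2 = V^T P V / (n - u).

def transpose(M):
    return [list(r) for r in zip(*M)]

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

def inv2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def least_squares(A, P, L):
    AtP = matmul(transpose(A), P)
    X = matmul(inv2(matmul(AtP, A)), matmul(AtP, L))   # equation 6.6
    AX = matmul(A, X)
    V = [[AX[i][0] - L[i][0]] for i in range(len(L))]  # residuals, equation 6.28
    n, u = len(L), len(X)
    s02 = matmul(matmul(transpose(V), P), V)[0][0] / (n - u)  # equation 6.45
    return X, V, s02

A = [[0.0, 1.0], [1.0, 1.0], [2.0, 1.0], [3.0, 1.0]]
P = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
L = [[1.1], [2.9], [5.1], [6.9]]       # assumed slightly inconsistent data
X, V, s02 = least_squares(A, P, L)
```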
First we recall the normal equations (Equation 6.5)

(A X̂ - L)^T P A = 0 ,    (6.47)

which can be written

V̂^T P A = 0    (6.48)

or

A^T P V̂ = 0 .    (6.49)

It follows that

V̂^T P A X̂ = 0    (6.50)

and

V̂^T P A (X̂ - X) = 0 .    (6.51)
Based on these relations we now show that

V̂^T P V̂ = V^T P V - (X̂ - X)^T A^T P A (X̂ - X) ,    (6.52)

where

V̂ = A X̂ - L ,    (6.53)

V = A X - L .    (6.54)

We begin by considering

V̂^T P V̂ = (A X̂ - L)^T P (A X̂ - L) .

Writing V̂ = A(X̂ - X) + V and using Equation 6.51,

V̂^T P V̂ = V̂^T P V = V^T P V + (X̂ - X)^T A^T P V .    (6.55)

Next we consider

A^T P V = A^T P (V̂ - A(X̂ - X)) = -A^T P A (X̂ - X) ,    (6.56)

using Equation 6.49. Therefore

V̂^T P V̂ = V^T P V - (X̂ - X)^T A^T P A (X̂ - X) .    (6.57)
The next step is to prove that the value of the quadratic form

Y^T A Y = Trace(Y Y^T A) .    (6.58)

From the definition of the trace, Tr(Y^T A Y) = Y^T A Y, since
Y^T A Y is a scalar, and the trace is invariant under a cyclic
permutation of its factors, so that

Trace(Y Y^T A) = Y^T A Y .

Applying this to Equation 6.57,

V̂^T P V̂ = Trace(V V^T P) - Trace((X̂ - X)(X̂ - X)^T A^T P A) .    (6.59)

Taking expected values, and using Equation 6.11 (Σ_V = Σ_L) together
with

Σ_X̂ = E[(X̂ - X)(X̂ - X)^T] = σ₀² (A^T P A)^-1 ,    (6.61)

Equation 6.59 becomes
E[V̂^T P V̂] = Trace(E[V V^T] P) - Trace(E[(X̂ - X)(X̂ - X)^T] A^T P A)

           = σ₀² Trace(Σ_L Σ_L^-1) - σ₀² Trace((A^T P A)^-1 A^T P A)    (6.62)

           = σ₀² Trace(I_n) - σ₀² Trace(I_u)

           = σ₀² (n - u) .    (6.63)

Thus Equation 6.26 defines an unbiased estimator of σ₀², and from
Equation 6.44 we see that Equation 6.27 defines an unbiased estimator
of Σ_X̂.

6.6 SUMMARY
In this Chapter we have shown that for the linear mathematical model

A X - L = V

i)
ii)
iii)
iv)
v) the weight matrix is P = σ₀² Σ_L^-1 ,
vi)
vii) the least squares unbiased estimator σ̂₀² of the variance factor σ₀²
is

σ̂₀² = V̂^T P V̂ / (n - u) ,  where  P = σ₀² Σ_L^-1 .
7.

In Chapter 6 we concentrated on demonstrating the statistical
significance of the least squares point estimators under a variety of
assumptions, and assuming a linear mathematical model.

Nonlinear mathematical models occur far more frequently than do
linear models. There are three steps involved: linearization of the
mathematical model; derivation of the normal equations from the least
squares criterion; and derivation of expressions for the least squares
point estimators from the normal equations.

We denote the initial approximation to the solution vector by X⁰ and
the correction to it by X̂. We will call the sum of the two the total
solution vector

X̄ = X⁰ + X̂ .    (7.1)
7.1 LINEARIZING THE MATHEMATICAL MODEL

The total observable vector is written

    Ē = L + V .                                             (7.2)

Assume that X̄ and Ē have been chosen so that the mathematical model relating them is

    F(X̄, Ē) = 0 ,                                           (7.3)

a set of r equations in the u unknown parameters and n observables. The quantity (r − u) is the number of degrees of freedom. If the model can be put in the form

    F(X̄) = Ē                                               (7.4)

then the method is called the parametric method (also called the method of observation equations), and if the model can be put in the form

    F(Ē) = 0                                                (7.5)

then the method is called the condition method (also called the method of condition equations, the method of conditional observations, and the method of correlates).
Linearizing Equation 7.3 by a Taylor series expansion about the point (X⁰, L),

    F(X̄, Ē) = F(X⁰, L) + (∂F/∂X̄)|X⁰,L X + (∂F/∂Ē)|X⁰,L V = 0

or

    W + A X + B V = 0                                       (7.6)

where the misclosure vector and design matrices are

    W = F(X⁰, L)   (r×1) ,   A = ∂F/∂X̄|X⁰,L   (r×u) ,   B = ∂F/∂Ē|X⁰,L   (r×n) .

Similarly, linearizing Equation 7.4 about X⁰,

    F(X̄) − Ē = F(X⁰) + (∂F/∂X̄)|X⁰ X − (L + V) = 0

or

    W + A X − V = 0                                         (7.7)

where

    W = F(X⁰) − L   (n×1)    and    A = ∂F/∂X̄|X⁰   (n×u) .

And linearizing Equation 7.5 about L,

    F(Ē) = F(L) + (∂F/∂Ē)|L V = 0

or

    W + B V = 0                                             (7.8)

where

    W = F(L)   ((n−u)×1)    and    B = ∂F/∂Ē|L   ((n−u)×n) .
The parametric and condition methods are the classical methods of least squares estimation, and both are special cases of the combined method. Many problems can be solved by either method. The condition method, however, has two drawbacks, the chief one being that the unknown parameters do not appear in it at all: if a solution for the unknown parameters is desired, as is usually the case, then after the condition method solution is complete, further work must be done to obtain this solution. On the other hand the condition method requires the solution of only (n − u) equations, rather than the n equations of the parametric method. This consideration overrode the drawbacks in the days of hand computations, and most least squares work was done by the condition method. However, with the use of the digital computer the advantage of fewer equations has been erased, so now the parametric and, more recently, the combined methods are usually used.
These three methods are summarized in the following table.

                              Combined           Parametric          Condition
    mathematical model        F(X̄, Ē) = 0       F(X̄) − Ē = 0       F(Ē) = 0
    linearized model          W + AX + BV = 0    W + AX − V = 0      W + BV = 0
    number of equations       r                  n                   n − u
    number of observations    n                  n                   n
    number of unknowns        u                  u                   u
    relation to combined      —                  special case of     special case of
                                                 combined with       combined with
                                                 B = −I, r = n       A = 0, r = n − u
7.2 LINEARIZATION EXAMPLES

Example One. The mathematical model is the equation of a straight line

    ȳᵢ = m̄ x̄ᵢ + b̄ ,    i = 1, 2, ..., r

fitted to r points whose coordinates (xᵢ, yᵢ) are both measured, so that there are n = 2r observations and u = 2 unknown parameters:

    X̄ = X⁰ + X = [m⁰ ; b⁰] + [m ; b]

    Ē = L + V = [x₁ ; y₁ ; ... ; x_r ; y_r] + [v₁ ; v₂ ; ... ; v₂ᵣ] .

Each of the r equations of the model is

    fᵢ(X̄, Ē) = m̄ x̄ᵢ + b̄ − ȳᵢ = 0

which, when linearized about (X⁰, L), becomes

    A X + B V + W = 0

where the ith row of A (r×2) is

    [xᵢ   1] ,

the ith row of B (r×2r) has the elements

    [m⁰   −1]

in the columns corresponding to xᵢ and yᵢ (zeros elsewhere), and the misclosure elements are

    wᵢ = m⁰ xᵢ + b⁰ − yᵢ .

Example Two.
The mathematical model relates the three angles of a plane triangle with vertices at the known points A at (0, 0) and B at (x_B, 0) and the unknown point C at (x_C, y_C). The three interior angles (α₁, α₂, α₃) are observed, so that n = 3 and u = 2:

    X̄ = X⁰ + X = [x_C⁰ ; y_C⁰] + [x ; y]

    Ē = L + V = [α₁ ; α₂ ; α₃] + [v₁ ; v₂ ; v₃] .

The model equations, one per observed angle, are

    α₁ = arctan (y_C / x_C)
    α₂ = arctan (y_C / (x_B − x_C))
    α₃ = arctan (x_C / y_C) + arctan ((x_B − x_C) / y_C)

which, when linearized about (X⁰, L), become

    V = A X + W

where A is the 3×2 matrix of partial derivatives of the right-hand sides with respect to x_C and y_C, evaluated at (x_C⁰, y_C⁰); for example

    ∂α₁/∂x_C = −y_C⁰ / ((x_C⁰)² + (y_C⁰)²) ,    ∂α₁/∂y_C = x_C⁰ / ((x_C⁰)² + (y_C⁰)²) ,

and the misclosure elements are

    w₁ = arctan (y_C⁰ / x_C⁰) − α₁
    w₂ = arctan (y_C⁰ / (x_B − x_C⁰)) − α₂
    w₃ = arctan (x_C⁰ / y_C⁰) + arctan ((x_B − x_C⁰) / y_C⁰) − α₃ .

Alternatively, the single condition that the three angles of a plane triangle sum to 180° can be used, giving the condition method model

    B V + W = 0

where

    B = [1   1   1]    and    W = α₁ + α₂ + α₃ − 180° .

Note however that once this has been solved for V̂ we will still have no knowledge of X.
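The design matrices of Example One can be checked numerically against finite differences. In this sketch (all numerical values are invented for illustration) a single model equation fᵢ = m̄x̄ᵢ + b̄ − ȳᵢ is differentiated analytically, giving the rows of A and B, and the result is compared with numerical derivatives:

```python
import numpy as np

def f(m, b, x, y):
    # one equation of the combined model: f_i = m*x_i + b - y_i
    return m * x + b - y

m0, b0 = 2.0, 1.0          # initial approximation X0 (invented)
x1, y1 = 3.0, 6.5          # one observed point from L (invented)

# Analytic design-matrix rows for this equation (Example One):
A_row = np.array([x1, 1.0])      # [df/dm, df/db]
B_row = np.array([m0, -1.0])     # [df/dx, df/dy]
w = f(m0, b0, x1, y1)            # misclosure element W = F(X0, L)

# Finite-difference check of each partial derivative
h = 1e-6
num = np.array([
    (f(m0 + h, b0, x1, y1) - w) / h,   # df/dm
    (f(m0, b0 + h, x1, y1) - w) / h,   # df/db
    (f(m0, b0, x1 + h, y1) - w) / h,   # df/dx
    (f(m0, b0, x1, y1 + h) - w) / h,   # df/dy
])
assert np.allclose(num, np.concatenate([A_row, B_row]), atol=1e-5)
print(w)   # misclosure: 2*3 + 1 - 6.5 = 0.5
```

The same finite-difference check applies to the arctangent model of Example Two.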
7.3 DERIVATION OF THE NORMAL EQUATIONS

The least squares criterion is to minimize the quadratic form

    V̂ᵀ P V̂ = minimum                                       (7.9)

subject to the linearized mathematical model, which for the general case in this chapter is

    A X + B V + W = 0 .                                     (7.10)
To illustrate the method of Lagrange multipliers, consider a simple example. Suppose we want to minimize the function

    f(x, y) = a x² + b y²                                   (7.11)

subject to the constraint

    f₀(x, y) = a x + b y + c = 0 .                          (7.12)

The method is:

i) form the variation function

    φ(x, y) = a x² + b y² + k (a x + b y + c)

where k is an unknown constant called the Lagrange multiplier (or correlate);

ii) set the derivative of φ with respect to x to zero:

    ∂φ/∂x = 2 a x + k a = 0 ;

iii) set the derivative of φ with respect to y to zero:

    ∂φ/∂y = 2 b y + k b = 0 .

The simultaneous solution of these two equations with the constraint 7.12 is

    x = −c/(a + b) ,    y = −c/(a + b) ,    k = 2c/(a + b) .

The curves f(x, y) = constant form a family of ellipses; the minimum value of f corresponds to the particular ellipse which just touches the straight line, and the solution (x, y) is the point of tangency between the ellipse and the straight line, as shown in Figure 7.1.

Figure 7.1. The solution point is the point of tangency between the constraint line and an ellipse of the family f = constant.
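The closed-form solution of this Lagrange example is easily verified. A minimal sketch (the coefficients a, b, c are invented):

```python
# Minimize f(x, y) = a*x^2 + b*y^2 subject to a*x + b*y + c = 0.
# Closed-form solution from the text: x = y = -c/(a+b), k = 2c/(a+b).
a, b, c = 2.0, 3.0, -10.0   # invented coefficients

x = y = -c / (a + b)        # stationary point of the variation function
k = 2 * c / (a + b)         # Lagrange multiplier ("correlate")

# Check the stationarity conditions and the constraint:
assert abs(2 * a * x + k * a) < 1e-12   # d(phi)/dx = 0
assert abs(2 * b * y + k * b) < 1e-12   # d(phi)/dy = 0
assert abs(a * x + b * y + c) < 1e-12   # constraint satisfied
print(x, y, k)  # 2.0 2.0 -4.0
```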
Extending this method to the combined case, we want to minimize the function V̂ᵀPV̂ subject to the constraint

    A X̂ + B V̂ + W = 0 .                                    (7.13)

(Geometrically, V̂ᵀPV̂ = constant defines a family of hyperellipsoids, and the solution is the point at which the constraint is tangent to one of them, as in Figure 7.1.) The variation function is

    φ = V̂ᵀ P V̂ + 2 K̂ᵀ (A X̂ + B V̂ + W)

where K̂ is a vector of r Lagrange multipliers (correlates). Setting the derivatives of φ to zero,

    ∂φ/∂V = 2 V̂ᵀ P + 2 K̂ᵀ B = 0    or    P V̂ + Bᵀ K̂ = 0    (7.15)

and

    ∂φ/∂X = 2 K̂ᵀ A = 0    or    Aᵀ K̂ = 0 .                  (7.16)

We now want to find a simultaneous solution of the three equations

    A X̂ + B V̂ + W = 0
    P V̂ + Bᵀ K̂ = 0
    Aᵀ K̂ = 0

for X̂, V̂ and K̂. In hypermatrix form,

    [P   Bᵀ  0 ] [V̂]   [0]
    [B   0   A ] [K̂] + [W] = 0 .                            (7.17)
    [0   Aᵀ  0 ] [X̂]   [0]

This hypermatrix equation represents the normal equations for the combined method in their most expanded form. Setting B = −I gives the normal equations for the parametric method,

    [P   −I   0 ] [V̂]   [0]
    [−I   0   A ] [K̂] + [W] = 0 ,                           (7.18)
    [0    Aᵀ  0 ] [X̂]   [0]

and setting A = 0 (dropping X̂) gives those for the condition method,

    [P   Bᵀ] [V̂]   [0]
    [B   0 ] [K̂] + [W] = 0 .                                (7.19)

Note that in all three cases the hypercoefficient matrix has been constructed to be symmetric, with a nonsingular upper left element. This means that the solution can formally be written

    [V̂]      [P   Bᵀ  0 ]⁻¹ [0]
    [K̂] = −  [B   0   A ]   [W] .                           (7.20)
    [X̂]      [0   Aᵀ  0 ]   [0]
In this section we will derive more explicit expressions for the solutions for X̂, K̂ and V̂. We start with the hypermatrix Equation 7.17 and successively eliminate unknowns using the rules of matrix partitioning. This procedure is equivalent to the more familiar elimination and back-substitution technique which does not use hypermatrices. The hypermatrix system is of the form

    N Y + U = 0                                             (7.21)

which can be partitioned

    [N₁₁  N₁₂] [Y₁]   [U₁]
    [N₂₁  N₂₂] [Y₂] + [U₂] = 0 .                            (7.22)

From the first equation Y₁ = −N₁₁⁻¹(N₁₂ Y₂ + U₁); substituting into the second gives the reduced system

    (N₂₂ − N₂₁ N₁₁⁻¹ N₁₂) Y₂ + (U₂ − N₂₁ N₁₁⁻¹ U₁) = 0 .    (7.23, 7.24)

Applying this to Equation 7.17 (taking the nonsingular upper left element P as N₁₁) eliminates V̂ and leaves

    [−B P⁻¹ Bᵀ   A] [K̂]   [W]
    [ Aᵀ         0] [X̂] + [0] = 0 .                         (7.25)

Eliminating K̂ in the same way leaves

    Aᵀ (B P⁻¹ Bᵀ)⁻¹ A X̂ + Aᵀ (B P⁻¹ Bᵀ)⁻¹ W = 0            (7.26)

or

    X̂ = − (Aᵀ (B P⁻¹ Bᵀ)⁻¹ A)⁻¹ Aᵀ (B P⁻¹ Bᵀ)⁻¹ W .       (7.27a)

The first equation from hypermatrix Equation 7.25 is

    −B P⁻¹ Bᵀ K̂ + A X̂ + W = 0

from which

    K̂ = (B P⁻¹ Bᵀ)⁻¹ (A X̂ + W) .                           (7.27b)

The first equation from hypermatrix Equation 7.17 is

    P V̂ + Bᵀ K̂ = 0

from which

    V̂ = − P⁻¹ Bᵀ K̂ .                                       (7.27c)

Equations 7.27 are the expressions for the least squares estimators using the combined method.
For the parametric method B = −I, so that B P⁻¹ Bᵀ = P⁻¹, and therefore

    K̂ = P (A X̂ + W)                                        (7.28)

and

    V̂ = P⁻¹ K̂ = A X̂ + W .                                 (7.29)

These solutions for X̂ and V̂ must be added to the initial approximation X⁰ and measured values L to obtain the estimates of the total solution vector and of the observables:

    X̄ = X⁰ + X̂                                             (7.30)

    Ē = L + V̂ .                                             (7.31)
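Equations 7.27 can be exercised numerically. The sketch below (random matrices, purely for illustration) evaluates the combined-method estimators for the parametric special case B = −I and checks that they agree with the direct parametric solution and satisfy the linearized model:

```python
import numpy as np

rng = np.random.default_rng(1)
n, u = 6, 2
A = rng.normal(size=(n, u))              # design matrix (random, illustrative)
W = rng.normal(size=n)                   # misclosure vector
P = np.diag(rng.uniform(0.5, 2.0, n))    # weight matrix
B = -np.eye(n)                           # parametric special case: W + AX - V = 0

M = B @ np.linalg.inv(P) @ B.T           # B P^-1 B^T (= P^-1 here)
Minv = np.linalg.inv(M)
X_hat = -np.linalg.solve(A.T @ Minv @ A, A.T @ Minv @ W)   # (7.27a)
K_hat = Minv @ (A @ X_hat + W)                             # (7.27b)
V_hat = -np.linalg.inv(P) @ B.T @ K_hat                    # (7.27c)

# Direct parametric solution for comparison:
X_par = -np.linalg.solve(A.T @ P @ A, A.T @ P @ W)
assert np.allclose(X_hat, X_par)
assert np.allclose(V_hat, A @ X_hat + W)          # (7.29)
assert np.allclose(A @ X_hat + B @ V_hat + W, 0)  # linearized model satisfied
```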
Since the misclosure vector

    W = F(X⁰, L)

is a function of the observations only (X⁰ being a constant vector), and B = ∂F/∂Ē evaluated at (X⁰, L), it is obvious from the covariance law that

    Σ_W = B Σ_L Bᵀ .                                        (7.32)

Propagating covariances through Equation 7.27a then gives the covariance matrix of X̂,

    Σ_X̂ = (Aᵀ (B Σ_L Bᵀ)⁻¹ A)⁻¹ ,                          (7.33)

and through Equation 7.27b the covariance matrix of K̂,

    Σ_K̂ = σ₀² (B P⁻¹ Bᵀ)⁻¹ [ I − A (Aᵀ(BP⁻¹Bᵀ)⁻¹A)⁻¹ Aᵀ (BP⁻¹Bᵀ)⁻¹ ] .   (7.34)

From Equation 7.27c, V̂ = −P⁻¹BᵀK̂, so that

    Σ_V̂ = P⁻¹ Bᵀ Σ_K̂ B P⁻¹ .                              (7.35)

Setting P = σ₀² Σ_L⁻¹, these results can be summarized as

    Σ_X̂ = σ₀² (Aᵀ (B P⁻¹ Bᵀ)⁻¹ A)⁻¹                        (7.37)

which, for the parametric special case B = −I, reduces to the familiar

    Σ_X̂ = σ₀² (Aᵀ P A)⁻¹ .                                 (7.38)
8.

8.1 INTRODUCTION

In Chapter 7 we derived least squares point estimators for, among other things, the coordinates of points in a network containing a redundant set of observations among the points. Multivariate interval estimation involves these point estimates and their covariance matrices, and produces a confidence region for these points taken all together, taken in groups, or considered one at a time (for example, an error ellipse about a point in two dimensions, or an error ellipsoid for a point in three dimensions). We will develop confidence regions for the following quantities (assuming the observations to be normally distributed):

1) the variance factor σ₀²;

2) the ratio of two variance factors;

3) deviations of the solution vector X from its estimate X̂ when the variance factor is known, using the quadratic form (X̂−X)ᵀ Σ_X̂⁻¹ (X̂−X);

4) the same deviations when the variance factor must be estimated, using [(X̂−X)ᵀ Σ̂_X̂⁻¹ (X̂−X)]/u,

where X is the (u×1) solution vector.
8.2 EXAMINATION OF THE VARIANCE FACTOR

The estimated variance factor σ̂₀² is computed from

    σ̂₀² = V̂ᵀ P V̂ / ν .                                    (8-1)

Recall that the sum of the squares of n independent standardized normal random variables is chi-square distributed,

    Σᵢ (xᵢ/σᵢ)² ~ χ²(n) ,

and that for a sample variance s² computed from n values, (n−1)s²/σ² ~ χ²(n−1). Analogously, it can be shown that

    ν σ̂₀² / σ₀² = V̂ᵀ P V̂ / σ₀² ~ χ²(ν) .

The combined and parametric cases are used as the estimation technique; in the combined case of r equations the degrees of freedom are ν = r − u, while in the parametric case ν = n − u.
124
= Ct
while the associated confidence interval for a 2 is
0
\)~2
0
83
1)
2)
3)
t.
corresponding toX(\)),P 1
1a.
=2
and P2
x2p 1
and
x2p 2
(Appendix B2)
I u<
=2.
The relevance of the above relates to the choice of the weight matrix, that is, of

    P = σ₀² Σ_L⁻¹ .

If the hypothesized value of σ₀² falls outside the confidence interval 8-3, the covariance matrix Σ_L used to compute P may be incorrectly scaled. Note, however, that rejection could be due to phenomena other than the incorrect scale of the covariance matrix, that is

1) an incorrect mathematical model, or

2) blunders in the observations,

both of which reveal themselves in the residual vector. The above two items can also be treated as null hypotheses, of course. The test on σ̂₀² = V̂ᵀPV̂/ν can then be restated: the hypothesis is accepted if

    χ²_P₁  <  ν σ̂₀² / σ₀²  <  χ²_P₂ .
8.3 EXAMINATION OF THE RATIO OF TWO VARIANCE FACTORS

Consider two independent measurement sets: (a) residuals V̂₁ with an n₁×n₁ weight matrix P₁ and degrees of freedom ν₁, yielding the estimate (σ̂₀²)₁; and (b) residuals V̂₂ with an n₂×n₂ weight matrix P₂ and degrees of freedom ν₂, yielding (σ̂₀²)₂, where Σ_L₁ and Σ_L₂ are the corresponding covariance matrices. Since each νᵢ(σ̂₀²)ᵢ/(σ₀²)ᵢ ~ χ²(νᵢ), the ratio of the two chi-square variables divided by their degrees of freedom is F-distributed:

    [ (σ̂₀²)₁ / (σ₀²)₁ ] / [ (σ̂₀²)₂ / (σ₀²)₂ ] ~ F(ν₁, ν₂) .   (8-4)

The probability statement is

    Pr ( F_P₁ < [(σ̂₀²)₁/(σ₀²)₁] / [(σ̂₀²)₂/(σ₀²)₂] < F_P₂ ) = α    (8-5)

and the associated confidence interval for the ratio (σ₀²)₂/(σ₀²)₁ is

    F_P₁ (σ̂₀²)₂/(σ̂₀²)₁  <  (σ₀²)₂/(σ₀²)₁  <  F_P₂ (σ̂₀²)₂/(σ̂₀²)₁ .   (8-6)

If the null hypothesis is H₀: (σ₀²)₁ = (σ₀²)₂, it is rejected when the value 1 falls outside this interval.
8.4 EXAMINATION OF DEVIATIONS FROM THE ESTIMATED SOLUTION VECTOR X̂ (VARIANCE FACTOR KNOWN)

The quadratic form of the deviations X̂ − X is chi-square distributed,

    (X̂ − X)ᵀ Σ_X̂⁻¹ (X̂ − X) ~ χ²(u) ,                     (8-7)

where Σ_X̂ is known (that is, σ₀² is known); the quadratic form is simply the argument of the multivariate normal distribution of X̂. The probability statement is

    Pr ( (X̂ − X)ᵀ Σ_X̂⁻¹ (X̂ − X) < χ²_P ) = α

where χ²_P is taken from chi-square tables with u degrees of freedom. If, under the null hypothesis X = X_H, the computed quadratic form exceeds χ²_P, then the null hypothesis is rejected.
8.5 EXAMINATION OF DEVIATIONS FROM THE ESTIMATED SOLUTION VECTOR X̂ (VARIANCE FACTOR ESTIMATED)

We have

    Σ_X̂ = σ₀² Q_X̂    and    Σ̂_X̂ = σ̂₀² Q_X̂

where Q_X̂ is the cofactor matrix of the parameters; when σ₀² is not known it must be replaced by its estimate σ̂₀², and the chi-square statistic of Section 8.4 is no longer available. The appropriate statistic is developed below. The ratio of two chi-squared random variables divided by their respective degrees of freedom defines the F-statistic we need, namely

    [ (X̂−X)ᵀ Σ_X̂⁻¹ (X̂−X) / u ] / [ χ²(ν)/ν ] ~ [χ²(u)/u] / [χ²(ν)/ν] ~ F(u, ν) .

Since ν σ̂₀² / σ₀² ~ χ²(ν), the unknown σ₀² cancels, leaving

    (X̂−X)ᵀ Σ̂_X̂⁻¹ (X̂−X) / u ~ F(u, ν) .

The probability statement involving the above random variable is

    Pr ( 0 < (X̂−X)ᵀ Σ̂_X̂⁻¹ (X̂−X) / u < F_P ) = α          (8-8)

where F_P is taken from F tables (Appendix B-4), giving the confidence region

    0 ≤ (X̂−X)ᵀ Σ̂_X̂⁻¹ (X̂−X) / u < F_P .                  (8-9)

Under the null hypothesis X = X_H, if the computed value of the statistic exceeds F_P, the null hypothesis is rejected.                (8-10)
If the deviations are taken from the position described by the vector X̂, the boundary of the confidence region is

    (X − X̂)ᵀ Σ̂_X̂⁻¹ (X − X̂) = u F_P ,                    (8-11)

which for u = 2 becomes

    [x₁  x₂] [σ̂₁²   σ̂₁₂]⁻¹ [x₁]
             [σ̂₂₁   σ̂₂²]   [x₂]   =  2 F_P                 (8-12)

where (x₁, x₂) are the two components of the deviation, and for u = 3

    [x₁  x₂  x₃] [σ̂₁²   σ̂₁₂   σ̂₁₃]⁻¹ [x₁]
                 [σ̂₂₁   σ̂₂²   σ̂₂₃]   [x₂]   =  3 F_P .    (8-13)
                 [σ̂₃₁   σ̂₃₂   σ̂₃²]   [x₃]

Note that in the above two examples the equations will contain cross product terms, since the off-diagonal elements are non-zero. The cross product terms can be removed by rotating the coordinate axes to lie along the eigenvectors of the covariance matrix. For example, after performing the eigenvalue problem on the two-dimensional case above, the equation of the boundary of the confidence region becomes

    y₁²/λ₁ + y₂²/λ₂ = 2 F_P                                 (8-14)

where λ₁ and λ₂ are the eigenvalues of Σ̂_X̂, and (y₁, y₂) lie along the axes of the rotated coordinate system defined by the eigenvectors of Σ̂_X̂; this is the equation of an ellipse (the error ellipse).

To summarize the results for this confidence region: first the origin of the coordinate system was translated to the position given by the vector X̂, and then the axes were rotated through the angle which carries them into the principal axes of the error ellipse itself.
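The rotation to principal axes can be sketched with a standard eigenvalue routine. Here (the covariance values are invented) the semi-axes of the one-sigma error ellipse and its orientation are computed from a 2×2 covariance matrix:

```python
import numpy as np

# Estimated 2x2 covariance matrix of a point (values invented):
cov = np.array([[4.0, 1.5],
                [1.5, 2.0]])

# Eigen-decomposition removes the cross-product terms (Equation 8-14):
lam, vecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
semi_minor, semi_major = np.sqrt(lam)      # 1-sigma semi-axes of error ellipse
# Orientation of the major axis (eigenvector of the larger eigenvalue):
theta = np.degrees(np.arctan2(vecs[1, 1], vecs[0, 1]))

# In the rotated system (y1, y2) the region is y1^2/lam1 + y2^2/lam2 = const,
# an ellipse aligned with the eigenvectors.
assert semi_major > semi_minor > 0
assert np.isclose(lam[0] * lam[1], np.linalg.det(cov))
print(semi_major, semi_minor, theta)
```

Scaling both semi-axes by the square root of 2 F_P yields the boundary of the confidence region 8-12.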
Table 8-1. Summary of confidence regions.

1) Variance factor σ₀² (Section 8.2); known: hypothesized value of σ₀².
   Statistic: ν σ̂₀² / σ₀² ~ χ²(ν).
   Confidence region: ν σ̂₀² / χ²_P₂ < σ₀² < ν σ̂₀² / χ²_P₁.

2) Ratio of two variance factors (σ₀²)₂/(σ₀²)₁ (Section 8.3); known: hypothesized ratio.
   Statistic: [(σ̂₀²)₁/(σ₀²)₁] / [(σ̂₀²)₂/(σ₀²)₂] ~ F(ν₁, ν₂).
   Confidence region: F_P₁ (σ̂₀²)₂/(σ̂₀²)₁ ≤ (σ₀²)₂/(σ₀²)₁ ≤ F_P₂ (σ̂₀²)₂/(σ̂₀²)₁.

3) Deviations from the estimated solution vector X̂, variance factor known (Section 8.4).
   Statistic: (X̂−X)ᵀ Σ_X̂⁻¹ (X̂−X) ~ χ²(u).
   Confidence region: 0 ≤ (X̂−X)ᵀ Σ_X̂⁻¹ (X̂−X) < χ²_P.

4) Deviations from the estimated solution vector X̂, variance factor estimated (Section 8.5).
   Statistic: (X̂−X)ᵀ Σ̂_X̂⁻¹ (X̂−X) / u ~ F(u, ν).
   Confidence region: 0 ≤ (X̂−X)ᵀ Σ̂_X̂⁻¹ (X̂−X) / u < F_P.
9.
=o .
The additions to this model which can and have been made are
innumerable.
We will assume
that observations L have been made on satellites from one or more ground
stations by some means which we need not specify here.
These observations
are related both to the ground station coordinates and to the satellite
coordinates, which together make up the elements of X.
9.1
133
Then our
mathematical model is
F(x1 , x2,
1)
= 0
where
xl = xo1 + xl
x2 = xo2 + x2
= L
X~
and
X~
and
v =0
or
(9.2)
where the misclosure vector W = F(X~, X~, L) and the design matrices
, and B
= aF
134
W+ A X + B V
I
such that A
= [A~A 2 ]
multiplication.
'
X =[
~;l
=0
= minimum
(9.3)
(9.4)
The variation function is
av
to zero we have:
= 2VT
p +
2KT
=0
or
"
T "
P V + B K
=0
(9.5)
or
(9.6)
=0
or
(9.7)
135
The normal equations are
p
BT
0
0
A2
Al
AT
2
x2
AT
=0
(9.8)
'
=0
A2
Al
AT
2
AT
1
xl
w
+
=0
(9.9)
A~(BPlBT) 1A 2
AT(BPlBT)lA
2
1
AT(BPlBT)lA
1
2
AT(BPlBT)lA
1
1
AT (BPlBT)lW
2
x2
= 0. (9.10)
AT (BPlBT )lW
1
xl
N21
x2
u2
=0
Nl2
Nll
xl
ul
(9.11)
136
where
(9.12)
From the first of Equations 9.11
or
(9.13)
From the first of equations 9.9
or
(9.14)
F.rom the first of equations 9. 8
p
+ BT
K= 0
or
(9.15)
The estimators for the total solution vectors are given
by
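The elimination of one group of parameters from the partitioned normal equations 9.11 is a Schur-complement reduction, and it yields the same estimates as solving the full system. A sketch with invented matrices:

```python
import numpy as np

rng = np.random.default_rng(2)
u1, u2 = 2, 3
M = rng.normal(size=(u1 + u2, u1 + u2))
N = M @ M.T + np.eye(u1 + u2)       # symmetric positive definite "normal matrix"
U = rng.normal(size=u1 + u2)        # right-hand side

# Partition N and U conformably with X = [X1; X2]:
N11, N12 = N[:u1, :u1], N[:u1, u1:]
N21, N22 = N[u1:, :u1], N[u1:, u1:]
U1, U2 = U[:u1], U[u1:]

# Full solution of N X + U = 0:
X_full = -np.linalg.solve(N, U)

# Eliminate X2: (N11 - N12 N22^-1 N21) X1 + (U1 - N12 N22^-1 U2) = 0
S = N11 - N12 @ np.linalg.solve(N22, N21)
X1 = -np.linalg.solve(S, U1 - N12 @ np.linalg.solve(N22, U2))
X2 = -np.linalg.solve(N22, U2 + N21 @ X1)   # back-substitution

assert np.allclose(X1, X_full[:u1])
assert np.allclose(X2, X_full[u1:])
```

This is exactly the step used above to solve for the ground station coordinates X̂₁ without forming the full system explicitly.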
    X̄₁ = X₁⁰ + X̂₁    and    X̄₂ = X₂⁰ + X̂₂ .

9.2 ADDITIONAL OBSERVATIONS

So far we have considered a single set of observations in the model F(X̄, Ē) = 0. However, let us assume that we have two sets of observations from the same ground stations. The mathematical model becomes

    F₁(X̄, Ē₁) = 0
    F₂(X̄, Ē₂) = 0                                           (9.16)

where

    X̄ = X⁰ + X ,    Ē₁ = L₁ + V₁ ,    Ē₂ = L₂ + V₂ ,

and L₁ and L₂ have the weight matrices

    P₁ = σ₀² Σ_L₁⁻¹ ,    P₂ = σ₀² Σ_L₂⁻¹ .

Linearizing each part about (X⁰, L₁) and (X⁰, L₂), for example

    F₁(X̄, Ē₁) = F₁(X⁰, L₁) + (∂F₁/∂X̄) X + (∂F₁/∂Ē₁) V₁ = 0 ,

gives

    W₁ + A₁ X + B₁ V₁ = 0    and    W₂ + A₂ X + B₂ V₂ = 0 .  (9.17)

This mathematical model is equivalent to the single combined model

    W + A X + B V = 0

with

    W = [W₁ ; W₂] ,   A = [A₁ ; A₂] ,   B = [B₁  0 ; 0  B₂] ,   P = [P₁  0 ; 0  P₂] .   (9.18)

Applying the least squares criterion with correlates

    K̂ = [K̂₁ ; K̂₂]                                          (9.19)

gives

    P₁ V̂₁ + B₁ᵀ K̂₁ = 0                                     (9.20)

    P₂ V̂₂ + B₂ᵀ K̂₂ = 0                                     (9.21)

    A₁ᵀ K̂₁ + A₂ᵀ K̂₂ = 0 ,                                  (9.22)

and the normal equations

    [P₁   0    B₁ᵀ   0     0 ] [V̂₁]   [0 ]
    [0    P₂   0     B₂ᵀ   0 ] [V̂₂]   [0 ]
    [B₁   0    0     0     A₁] [K̂₁] + [W₁] = 0              (9.23)
    [0    B₂   0     0     A₂] [K̂₂]   [W₂]
    [0    0    A₁ᵀ   A₂ᵀ   0 ] [X̂ ]   [0 ]

which are a partitioned version of the normal equations for the combined method of Chapter 7. Eliminating V̂₁ from Equation 9.23,

    [P₂   0            B₂ᵀ   0 ] [V̂₂]   [0 ]
    [0    −B₁P₁⁻¹B₁ᵀ   0     A₁] [K̂₁] + [W₁] = 0 ,          (9.24)
    [B₂   0            0     A₂] [K̂₂]   [W₂]
    [0    A₁ᵀ          A₂ᵀ   0 ] [X̂ ]   [0 ]

and then eliminating V̂₂,

    [−B₁P₁⁻¹B₁ᵀ   0            A₁] [K̂₁]   [W₁]
    [0            −B₂P₂⁻¹B₂ᵀ   A₂] [K̂₂] + [W₂] = 0 .        (9.25)
    [A₁ᵀ          A₂ᵀ          0 ] [X̂ ]   [0 ]

Eliminating K̂₁, and writing Mᵢ = BᵢPᵢ⁻¹Bᵢᵀ, Nᵢ = AᵢᵀMᵢ⁻¹Aᵢ, Uᵢ = AᵢᵀMᵢ⁻¹Wᵢ,

    [−M₂    A₂] [K̂₂]   [W₂]
    [ A₂ᵀ   N₁] [X̂ ] + [U₁] = 0 .                           (9.26)

Eliminating K̂₂ from Equation 9.26,

    (N₁ + N₂) X̂ + (U₁ + U₂) = 0

or

    X̂ = −(N₁ + N₂)⁻¹ (U₁ + U₂) ;                            (9.27)

that is, the contributions of the two observation sets to the normal equations simply add. From the first equation of 9.26,

    K̂₂ = M₂⁻¹ (A₂ X̂ + W₂) .                                (9.28)

From the first equation of 9.25,

    K̂₁ = M₁⁻¹ (A₁ X̂ + W₁) .                                (9.29)

From the first equations of 9.23,

    V̂ᵢ = −Pᵢ⁻¹ Bᵢᵀ K̂ᵢ .                                    (9.30)

The estimates of the total solution vector and observables are

    X̄ = X⁰ + X̂                                             (9.31)

and

    Ēᵢ = Lᵢ + V̂ᵢ .                                          (9.32)
9.3

Suppose now that part of the mathematical model relates the parameters X̄ to the observables, while the remainder involves the parameters only. The mathematical model specified for X̄ and Ē is then

    F₁(X̄, Ē) = 0
    F₂(X̄) = 0                                               (9.33)

where

    X̄ = X⁰ + X ,    Ē = L + V ,    P = σ₀² Σ_L⁻¹ .

Linearizing about (X⁰, L),

    F₁(X̄, Ē) = F₁(X⁰, L) + (∂F₁/∂X̄) X + (∂F₁/∂Ē) V = 0
    F₂(X̄) = F₂(X⁰) + (∂F₂/∂X̄) X = 0

or

    W₁ + A₁ X + B V = 0    and    W₂ + A₂ X = 0              (9.34)

where the misclosure vectors are W₁ = F₁(X⁰, L) and W₂ = F₂(X⁰), and the design matrices are A₁ = ∂F₁/∂X̄, A₂ = ∂F₂/∂X̄ and B = ∂F₁/∂Ē. This is equivalent to the combined model

    W + A X + B V = 0

with W = [W₁ ; W₂], A = [A₁ ; A₂] and the second block row of B zero. Setting

    V̂ᵀ P V̂ = minimum                                       (9.35)

under the constraints 9.34, with correlates

    K̂ = [K̂₁ ; K̂₂] ,                                        (9.36)

gives

    P V̂ + Bᵀ K̂₁ = 0                                        (9.37)

and

    A₁ᵀ K̂₁ + A₂ᵀ K̂₂ = 0 ,                                  (9.38)

and the normal equations

    [P    Bᵀ    0     0 ] [V̂ ]   [0 ]
    [B    0     A₁    0 ] [K̂₁] + [W₁] = 0                   (9.39)
    [0    A₁ᵀ   0     A₂ᵀ] [X̂ ]   [0 ]
    [0    0     A₂    0 ] [K̂₂]   [W₂]

which are a partitioned version of the normal equations for the combined method of Chapter 7. Eliminating V̂ from Equation 9.39 gives (9.40), and eliminating K̂₁, with N₁ = A₁ᵀ(BP⁻¹Bᵀ)⁻¹A₁ and U₁ = A₁ᵀ(BP⁻¹Bᵀ)⁻¹W₁, leaves a system in X̂ and K̂₂ (9.41). Eliminating X̂ from Equation 9.41,

    −A₂ N₁⁻¹ A₂ᵀ K̂₂ + (W₂ − A₂ N₁⁻¹ U₁) = 0

or

    K̂₂ = (A₂ N₁⁻¹ A₂ᵀ)⁻¹ (W₂ − A₂ N₁⁻¹ U₁) .                (9.42)

Back-substitution then gives X̂ (9.43, 9.44), and from the first equation in 9.39 the residuals V̂ = −P⁻¹BᵀK̂₁ (9.45). The estimate of the total solution vector is

    X̄ = X⁰ + X̂ .                                            (9.46)
9.4

So far we have assumed that the mathematical model expresses exact relationships; particularly in new applications, where the relationships may not yet be fully known, this is not a valid assumption to make. For example, in the previous section we assumed that satellites follow an elliptical trajectory. In addition, preliminary estimates of the parameters, with associated variances, are often available from earlier work. One way of using this prior knowledge is to assert that the resulting least squares estimates for the unknown parameters must fall within the limits specified by these variances of the preliminary estimate; the preliminary estimates are treated as "pseudo-measurements". In practice this means that in our mathematical model

    F(X̄, Ē) = 0 ,

where

    X̄ = X⁰ + X    and    Ē = L + V ,

both X⁰ and L are treated as measured quantities, X⁰ with weight matrix P_X and L with weight matrix P. Linearizing gives

    W + A X + B V = 0                                       (9.48)

which can be regarded as a condition-method model in the combined "residual" vector [V ; X]:

    W + [B   A] [V ; X] = 0 .                               (9.50)

With the weight matrix diag(P, P_X), the normal equations (9.53) are a partitioned version of the normal equations for the condition method of Chapter 7. Eliminating V̂ gives (9.54, 9.55), and back-substitution gives X̂ (9.56) and V̂ (9.57). These expressions involve P_X⁻¹, and so fail when some parameters carry no prior weight (P_X singular); therefore we reformulate the normal equations (9.58 to 9.63) in such a way that P_X appears directly, leading to the solution

    X̂ = − (Aᵀ (B P⁻¹ Bᵀ)⁻¹ A + P_X)⁻¹ Aᵀ (B P⁻¹ Bᵀ)⁻¹ W ,

in which setting P_X = 0 recovers the ordinary combined-method solution 7.27a.
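Treating preliminary parameter estimates as pseudo-measurements can be sketched as follows (parametric case, all matrices invented for illustration): adding the a priori weight matrix P_X to the normal-equation matrix is equivalent to appending the pseudo-observation equations as extra rows of the design matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
n, u = 5, 2
A = rng.normal(size=(n, u))     # design matrix (invented)
W = rng.normal(size=n)          # misclosure vector (invented)
P = np.eye(n)                   # observation weights
PX = np.diag([4.0, 0.25])       # a priori weights on the parameters (invented)

# Weighted-parameter solution: a priori weight added to the normal matrix
X_wp = -np.linalg.solve(A.T @ P @ A + PX, A.T @ P @ W)

# Equivalent stacked formulation: append identity rows with weight PX
# and zero misclosure (the pseudo-observations X = 0)
A_s = np.vstack([A, np.eye(u)])
W_s = np.concatenate([W, np.zeros(u)])
P_s = np.block([[P, np.zeros((n, u))], [np.zeros((u, n)), PX]])
X_stack = -np.linalg.solve(A_s.T @ P_s @ A_s, A_s.T @ P_s @ W_s)

assert np.allclose(X_wp, X_stack)

# Setting PX = 0 recovers the ordinary parametric solution:
X_free = -np.linalg.solve(A.T @ P @ A, A.T @ P @ W)
assert not np.allclose(X_wp, X_free)
```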
10. STEP BY STEP LEAST SQUARES ESTIMATION

So far the least squares estimate has been computed in a single simultaneous solution using all the observations. In this Chapter we consider computing it in steps, as the observations become available. We will take it for granted that any such chopping scheme must yield the same final result as a simultaneous solution using all the observations.

Step by step least squares estimation is not a new concept [see for example Tobey 1930; Tienstra 1956; Schmid and Schmid 1965; and Kalman 1960]. It appears under a variety of names, such as sequential adjustment and phased adjustment. There are differences in detail between some of these methods, but basically they all involve the derivation of expressions for the current least squares estimate in terms of the previous estimate plus a "correction" term, using the rules of matrix partitioning.

In this Chapter we will not attempt to be exhaustive, but will derive a sequential expression for the solution vector X̂ following Krakiwsky [1968]. We then compare this result with the Kalman filter equations for the case where the unknown parameters are not time variable.
10.1

Assume the linearized parametric model

    A X − V + W = 0

with weight matrix P. Applying the least squares criterion as in Chapter 7 (and eliminating V̂ from the expanded normal equations) gives

    [−P⁻¹   A] [K̂]   [W]
    [ Aᵀ    0] [X̂] + [0] = 0 .                              (10-1)

Note that P, A, V and K̂ are partitioned conformably into the set of observations treated previously (subscript k−1) and the new set (subscript k):

    P = [P_k−1   0  ]     A = [A_k−1]     V = [V_k−1]     K̂ = [K̂_k−1]
        [0       P_k] ,       [A_k  ] ,       [V_k  ] ,        [K̂_k  ] .   (10-2)

We will denote by X̂_k−1 the estimate obtained when only the previous observations are used. It satisfies

    [−P_k−1⁻¹   A_k−1] [K̂_k−1]   [W_k−1]
    [ A_k−1ᵀ    0    ] [X̂_k−1] + [0    ] = 0                (10-3)

from which

    K̂_k−1 = P_k−1 (A_k−1 X̂_k−1 + W_k−1)                    (10-4a)

and

    X̂_k−1 = −N_k−1⁻¹ A_k−1ᵀ P_k−1 W_k−1                     (10-4b)

where

    N_k−1 = A_k−1ᵀ P_k−1 A_k−1 .                             (10-4d)

The normal equations using all the observations are

    [−P_k−1⁻¹   0       A_k−1] [K̂_k−1]   [W_k−1]
    [ 0        −P_k⁻¹   A_k  ] [K̂_k  ] + [W_k  ] = 0 .      (10-5)
    [ A_k−1ᵀ    A_kᵀ    0    ] [X̂_k  ]   [0    ]

Eliminating K̂_k−1 from 10-5, and using A_k−1ᵀ P_k−1 W_k−1 = −N_k−1 X̂_k−1 from 10-4b,

    [−P_k⁻¹   A_k  ] [K̂_k]   [W_k            ]
    [ A_kᵀ    N_k−1] [X̂_k] + [−N_k−1 X̂_k−1  ] = 0 .        (10-6, 10-7)

Therefore

    K̂_k = (P_k⁻¹ + A_k N_k−1⁻¹ A_kᵀ)⁻¹ (A_k X̂_k−1 + W_k)   (10-8)

and, from the second row,

    X̂_k = X̂_k−1 − N_k−1⁻¹ A_kᵀ K̂_k .                      (10-9)

From the second of equations 10-5, the residuals on the new observations are

    V̂_k = A_k X̂_k + W_k = P_k⁻¹ K̂_k .                     (10-10)

The updated normal matrix is N_k = N_k−1 + A_kᵀ P_k A_k, whose inverse (needed for the covariance matrix Σ_X̂ = σ₀² N_k⁻¹) can itself be updated sequentially:

    N_k⁻¹ = N_k−1⁻¹ − N_k−1⁻¹ A_kᵀ (P_k⁻¹ + A_k N_k−1⁻¹ A_kᵀ)⁻¹ A_k N_k−1⁻¹ ,   (10-11)

and multiplying out the partitioned quadratic form,

    (V̂ᵀPV̂)_k = V̂_k−1ᵀ P_k−1 V̂_k−1 + V̂_kᵀ P_k V̂_k .     (10-12)

We will now compare equations 10-8, 10-9 and 10-11 with the Kalman filter equations, as given for example by Sorenson [1970].
10.2

In the Kalman filter there are two time dependent factors: the actual value of the state-space vector is changing continuously, and new observed data are being accumulated continuously, from which new estimates of the new value of the state-space vector can be made. During the past ten years, the big news in optimal control has been the Kalman filter [Kalman 1960]. We will now review Kalman's equations (using optimal control notation) and then simplify the equations by dropping the time dependence and rewriting in our notation.

The time dependence of the state vector is expressed by the mathematical model

    x_k+1 = Φ_k+1,k x_k + w_k                               (10-13)

where x is the state-space vector (solution vector), Φ_k+1,k is the state transition matrix (plant model), and {w_k} is the plant white noise sequence (residual vector). The linearized mathematical model between the state vector and the measurement data is expressed by

    z_k = H_k x_k + v_k                                     (10-14)

where H_k is the measurement design matrix, {v_k} is the measurement white noise sequence with covariance R_k, and z_k is the measurement data. The filtering problem is stated as follows:

a) to find the least squares estimate x_k/k of the current state x_k using all data up to and including the current set z_k,

b) to do so recursively, without processing all the data sets z_k simultaneously.

The Kalman filter equations are (the assignment of equation numbers follows the standard prediction and update steps)

    x_k/k−1 = Φ_k,k−1 x_k−1/k−1                             (10-15)

    K_k = P_k/k−1 H_kᵀ (H_k P_k/k−1 H_kᵀ + R_k)⁻¹           (10-16)

    x_k/k = x_k/k−1 + K_k (z_k − H_k x_k/k−1)               (10-17)

    P_k/k−1 = Φ_k,k−1 P_k−1/k−1 Φ_k,k−1ᵀ + Q_k,k−1          (10-18)

    P_k/k = (I − K_k H_k) P_k/k−1                           (10-19)

where P is the covariance matrix of the estimate, Q is the plant noise covariance, and K_k is the gain matrix, leading to the current estimate x_k/k.

Dropping the time variability of the state vector means we can ignore equations 10-13, 10-15 and 10-18 above. The notation conversion is

    Kalman             These notes
    x_k/k              X̂_k
    x_k/k−1            X̂_k−1
    H_k                A_k
    P_k/k              N_k⁻¹
    P_k/k−1            N_k−1⁻¹
    v_k                V
    z_k                −W_k
    R_k                P_k⁻¹
    Φ, Q               no equivalent
    K_k (gain)         no direct equivalent (compare K̂_k)

With this substitution the remaining Kalman equations become

    X̂_k = X̂_k−1 − N_k−1⁻¹ A_kᵀ (P_k⁻¹ + A_k N_k−1⁻¹ A_kᵀ)⁻¹ (A_k X̂_k−1 + W_k)   (10-21)

    N_k⁻¹ = [I − N_k−1⁻¹ A_kᵀ (P_k⁻¹ + A_k N_k−1⁻¹ A_kᵀ)⁻¹ A_k] N_k−1⁻¹          (10-22)

which can be seen to be equivalent to equations 10-8, 10-9 and 10-11, although the definition of K_k is different in the two cases.
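The equivalence of the sequential solution (equations 10-8 and 10-9) and the batch solution using all observations at once can be checked numerically (parametric case, invented matrices):

```python
import numpy as np

rng = np.random.default_rng(4)
u = 2
A1 = rng.normal(size=(4, u)); W1 = rng.normal(size=4); P1 = np.eye(4)
A2 = rng.normal(size=(3, u)); W2 = rng.normal(size=3); P2 = np.eye(3)

# Solution from the first observation set alone (parametric method):
N1 = A1.T @ P1 @ A1
X1 = -np.linalg.solve(N1, A1.T @ P1 @ W1)

# Sequential update (10-8, 10-9): correct X1 for the new observation set
N1inv = np.linalg.inv(N1)
K2 = np.linalg.solve(np.linalg.inv(P2) + A2 @ N1inv @ A2.T, A2 @ X1 + W2)
X_seq = X1 - N1inv @ A2.T @ K2

# Batch solution using both observation sets simultaneously:
N = N1 + A2.T @ P2 @ A2
X_batch = -np.linalg.solve(N, A1.T @ P1 @ W1 + A2.T @ P2 @ W2)

assert np.allclose(X_seq, X_batch)
```

The agreement rests on the matrix inversion lemma, which is also what equation 10-11 expresses for the normal matrix inverse.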
REFERENCES

Carnahan, B., H.A. Luther and J.O. Wilkes (1969). Applied Numerical Methods. Wiley, New York.

Tobey, W.M. (1930). "The Differential Adjustment of Observations", Geodetic Survey Publication No. 27, Ottawa.

Wells, D.E. (1971). "Matrices", Department of Surveying Engineering Lecture Notes No. 15, University of New Brunswick.
APPENDIX A

NUMERICAL EXAMPLE*

A.1

An observer at an unknown position measures the directions to ten landmarks at known positions; coordinates are in kilometres and directions in degrees. The mathematical model is

    tan (αᵢ + z) = (yᵢ − y₀) / (xᵢ − x₀)

where (x₀, y₀) are the unknown observer's coordinates, (xᵢ, yᵢ) are the known landmark coordinates, αᵢ are the measured directions from observer to landmarks, and z is an unknown orientation angle between the measurement coordinate system and the coordinate system to which the landmark and observer's coordinates are referred.

Then Ē = L + V, where

    L = [26.7047 π/180 , ... , 168.6730 π/180]ᵀ    (in radians).

----
*Data taken from Bennet, J.E. and J.C. Hung (1970). "Application of Statistical Techniques to Landmark Navigation", Navigation, Vol. 17, No. 4, page 349, Winter.
The ten measured directions (degrees) are

    26.7047,  26.5795, 103.8340, 157.9978,  53.1393,
    68.2511, 135.0673,  50.3190, 120.8734, 168.6730.

[The landmark coordinate and standard deviation columns of the scanned data table are not legible and are omitted.]

Figure A-1. Illustrative example.
A.2

The residual vector is V = [v₁ , ... , v₁₀]ᵀ, and we will choose the initial approximation

    X⁰ = [x₀⁰ ; y₀⁰ ; z⁰] = [0 ; 0 ; 0]

so that X̄ = X̂. Finally Σ_L = σ² I, where σ² is the variance of a measured direction.

Using the combined form of the model,

    fᵢ(X̄, Ē) = tan (ᾱᵢ + z̄)(xᵢ − x̄₀) − (yᵢ − ȳ₀) = 0 ,

design matrices Aᵢ and Bᵢ involving secant terms would be required. It is simpler to rewrite the model in parametric form:

    αᵢ = arctan ((yᵢ − y₀)/(xᵢ − x₀)) − z

from which

    fᵢ(X̄) − L̄ᵢ = arctan ((yᵢ − y₀)/(xᵢ − x₀)) − z − αᵢ = 0 ,

so that, evaluated at X⁰ = 0,

    Aᵢ = [∂fᵢ/∂x₀   ∂fᵢ/∂y₀   ∂fᵢ/∂z] = [ yᵢ/(xᵢ² + yᵢ²)   −xᵢ/(xᵢ² + yᵢ²)   −1 ]

and

    Wᵢ = fᵢ(X⁰) − Lᵢ = arctan (yᵢ / xᵢ) − αᵢ .
A.3 Solution

The nonlinear mathematical model

    F(X̄) − Ē = 0

has been linearized about the initial approximation X⁰ and observed values L to give the linearized parametric model

    V = A X + W ,    Σ_L = σ² I    (n = 10 observations, u = 3 unknowns).

The matrices X⁰, L, A and W, and the resulting estimates V̂ and X̂, are listed in Tables A-1 and A-2. The quadratic form of the residuals gives the estimated variance factor

    σ̂₀² = V̂ᵀ P V̂ / (n − u) = V̂ᵀ P V̂ / 7 .
TABLE A-1. [Input matrices X⁰, L, A and W as printed by the adjustment program. The scanned printout is not legible and is omitted.]

TABLE A-2. [Output matrices V̂, X̂ and Σ̂_X̂ as printed by the adjustment program. The scanned printout is not legible and is omitted.]
c) The estimated variances and standard deviations of the solution are

    σ̂²_x₀ = 21 m²               σ̂_x₀ = 4.6 m
    σ̂²_y₀ = 53 m²               σ̂_y₀ = 7.3 m
    σ̂²_z  = 0.62 × 10⁻⁶ rad²    σ̂_z  = 2.6 arcminutes.

d) A 95% confidence interval for the variance factor (Section 8.2) is

    V̂ᵀPV̂ / χ²_.975  <  σ₀²  <  V̂ᵀPV̂ / χ²_.025

or, with V̂ᵀPV̂ = 12.79 and ν = 7 degrees of freedom (χ²_.975 = 16.01 and χ²_.025 = 1.69 from Appendix B-2),

    12.79 / 16.01  <  σ₀²  <  12.79 / 1.69 ,

that is, 0.80 < σ₀² < 7.57. Since this interval contains the hypothesized value σ₀² = 1, the hypothesis is not rejected.
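The chi-square percentiles used above (χ²_.025(7) ≈ 1.69 and χ²_.975(7) ≈ 16.01) can be reproduced, to Monte Carlo accuracy, with the standard library alone:

```python
import random

# Monte Carlo estimate of the 2.5th and 97.5th percentiles of the
# chi-square distribution with nu = 7 degrees of freedom.
random.seed(0)
nu, trials = 7, 200_000
samples = sorted(sum(random.gauss(0, 1) ** 2 for _ in range(nu))
                 for _ in range(trials))

lo = samples[int(0.025 * trials)]   # empirical 2.5th percentile
hi = samples[int(0.975 * trials)]   # empirical 97.5th percentile
print(round(lo, 2), round(hi, 2))   # close to the tabulated 1.69 and 16.01
```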
APPENDIX B

STATISTICAL TABLES
TABLE B-1. Cumulative standard normal distribution function Φ(z), for z from 0.00 to 3.49 in steps of 0.01. [The tabulated values are the standard normal distribution function; the scanned digits are not reliably legible and are omitted here.]
TABLE B-2. Values of χ²_P corresponding to P, for degrees of freedom ν from 1 to 120. [The scanned values are not reliably legible and are omitted here.]

Adapted with permission from Introduction to Statistical Analysis (2nd ed.) by W. J. Dixon and F. J. Massey, Jr., McGraw-Hill Book Company, Inc. Copyright 1957.
PERCENTILES OF THE χ² DISTRIBUTION. Percentage points of the χ²-distribution — values of χ² in terms of Q and ν, for ν from 1 to 100. [The scanned values are not reliably legible and are omitted here.]

From E. S. Pearson and H. O. Hartley (editors), Biometrika Tables for Statisticians, Vol. I. Cambridge Univ. Press, Cambridge, England, 1954 (with permission), for Q > 0.0005.
TABLE B-3. Percentiles t_P of Student's t distribution (t.60, t.70, t.80, t.90, t.95, t.975, t.99, t.995), for degrees of freedom from 1 to ∞. [The scanned values are not reliably legible and are omitted here.]

Adapted by permission from Introduction to Statistical Analysis (2nd ed.) by W. J. Dixon and F. J. Massey, Jr. Copyright 1957, McGraw-Hill Book Company, Inc. Entries originally from Table III of Statistical Tables by R. A. Fisher and F. Yates, 1938, Oliver and Boyd, Ltd., London.
81
TABLE
FP
Ii
~I
3
.4
.
1+11
3!1.86
..
0
"D
c
E
0
C1l
o
E
0
"(I
<I>
.......
C1l
"'<I>C1l
tn
<I>
"(I
..
s::
49.50
~~:!
I4
,)..h
~1
~()~
:L2!l
:3. :!:!
:L 18
:L 11
3.10
1s
a. 07
16
:3.0;)
17
18
:LO:l
) q7
20
21
22
23
24
25
26
27
28
29
30
40
60
120
~~!) i ~~:l
:!.4!1
2..16
>
'l
 .,.
)I"'
:!.40
2.36
:!.33
., 31
) )(~I

:!.27
2.271
2.24
., )}
)"'0 11'
2.1s i
2.21
2.18
..
:,3~~3:~:,
,,
. ...
3.01
2.78
2.62
2.51
~. :l:l
y)
:;:
2.r,o
:!.88
2.1!1
I
:! . ";,)
_,,l.l
'!
2.~)0
 ..!.!
a)
~. 2,1
I :;hi'
.j
J) !
__ .w :;IZ:
.11
:;~!
2.2~1
2.28
:!.16
:!.151
~.I 1
__ JO
'
) Qt' I
:;.I
~. 05 '
.01/
.,
oJ
__ o7j
2.0!i
2.06
__ oo
[Scanned page: continuation of the table of percentage points of the F distribution (upper 10% points, F.10(n1, n2)). The tabulated values are not legible in this reproduction.]
TABLE (Continued). F.05(n1, n2)
[Scanned pages: upper 5% points of the F distribution, for n1 (numerator) and n2 (denominator) degrees of freedom. The tabulated values are not legible in this reproduction.]
Adapted with permission from Biometrika Tables for Statisticians, Vol. I (2nd ed.), edited by E. S. Pearson and H. O. Hartley, Copyright 1958, Cambridge University Press.
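Where the scanned values are hard to read, the percentage points can be regenerated numerically from the F density. The following Python sketch (standard library only; the function names and the Simpson-rule/bisection approach are ours, not part of the original tables) recovers, for example, F.05(5, 10) and F.05(2, 10) to table accuracy:

```python
import math

def f_pdf(x, d1, d2):
    """Density of the F distribution with (d1, d2) degrees of freedom."""
    a, b = d1 / 2.0, d2 / 2.0
    log_beta = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return math.exp(a * math.log(d1 / d2) + (a - 1.0) * math.log(x)
                    - (a + b) * math.log1p(d1 * x / d2) - log_beta)

def f_cdf(x, d1, d2, n=2000):
    """P(F <= x) by Simpson's rule (adequate when d1 >= 2)."""
    h = x / n
    total = f_pdf(1e-12, d1, d2) + f_pdf(x, d1, d2)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f_pdf(i * h, d1, d2)
    return total * h / 3.0

def f_upper(alpha, d1, d2):
    """Upper 100*alpha percent point: the x with P(F > x) = alpha."""
    lo, hi = 1e-6, 100.0
    for _ in range(60):                       # bisect on the monotone CDF
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f_cdf(mid, d1, d2) < 1.0 - alpha else (lo, mid)
    return 0.5 * (lo + hi)

fcrit = f_upper(0.05, 5, 10)   # ~3.33, matching the table entry F.05(5, 10)
```

In practice a statistics library routine for the F percent-point function would be used instead; the sketch only shows that the tabled values are reproducible.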
TABLE (Continued). F.025(n1, n2)
[Scanned pages: upper 2.5% points of the F distribution (first entry F.025(1, 1) = 647.8). The tabulated values are not legible in this reproduction.]
TABLE (Continued). F.01(n1, n2)
[Scanned pages: upper 1% points of the F distribution. The tabulated values are not legible in this reproduction.]
APPENDIX C
PROPERTIES OF EXPECTED VALUES
Definitions:
Given the random variable x with probability density function φ(x),

    E [f(x)] = ∫ f(x) φ(x) dx            for x continuous
    E [f(x)] = Σ f(x) φ(x)               for x discrete.

Given the random variables x, y with joint probability density function φ(x, y),

    E [f(x,y)] = ∫∫ f(x,y) φ(x,y) dx dy      for x, y continuous
    E [f(x,y)] = Σ Σ f(x,y) φ(x,y)           for x, y discrete.

Properties (Theorems):
Given a constant k,

    E [k] = k
    E [k f(x)] = k E [f(x)]
    E [E[f(x)]] = E [f(x)]
    E [Σ_i f_i(x)] = Σ_i E [f_i(x)] .
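These theorems are easy to check numerically for a discrete distribution. The four-point probability function φ below is a made-up example, not one taken from the notes:

```python
# Hypothetical discrete distribution: phi(x) on four sample points.
xs  = [0, 1, 2, 3]
phi = [0.1, 0.2, 0.3, 0.4]          # probabilities, summing to 1

def E(f):
    """E[f(x)] = sum over x of f(x) * phi(x)  (the discrete definition)."""
    return sum(f(x) * p for x, p in zip(xs, phi))

k = 5.0
f = lambda x: x * x
g = lambda x: 2 * x + 1

assert abs(E(lambda x: k) - k) < 1e-12                        # E[k] = k
assert abs(E(lambda x: k * f(x)) - k * E(f)) < 1e-12          # E[k f(x)] = k E[f(x)]
assert abs(E(lambda x: f(x) + g(x)) - (E(f) + E(g))) < 1e-12  # E[sum f_i] = sum E[f_i]
```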
APPENDIX D
PROPERTIES OF MOMENT GENERATING FUNCTIONS
Definition:
Given the random variable x with probability density function φ(x), the moment generating function of x is

    M_x(t) = E [exp (xt)] .

Properties (Theorems):
Given a constant k,

    M_{x+k}(t) = exp (kt) M_x(t)
    M_{kx}(t) = M_x(kt)

and, for independent random variables x_i,

    M_{Σ x_i}(t) = Π_i M_{x_i}(t) .
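The first two properties can be verified directly from the discrete form of the definition; the three-point distribution below is a made-up example used only for the check:

```python
import math

# Hypothetical discrete distribution for a numerical check of the theorems.
xs  = [0.0, 1.0, 2.0]
phi = [0.5, 0.3, 0.2]

def mgf(values, t):
    """M(t) = E[exp(x t)], evaluated from the discrete definition."""
    return sum(p * math.exp(v * t) for v, p in zip(values, phi))

t, k = 0.7, 3.0
# M_{x+k}(t) = exp(kt) M_x(t)
assert abs(mgf([x + k for x in xs], t) - math.exp(k * t) * mgf(xs, t)) < 1e-9
# M_{kx}(t) = M_x(kt)
assert abs(mgf([k * x for x in xs], t) - mgf(xs, k * t)) < 1e-9
```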
APPENDIX E
PROPERTIES OF MATRIX TRACES*
Definitions:
Given a square matrix A, its trace is the sum of its diagonal elements,

    Trace A = Tr(A) = Σ_i a_ii .

Given a matrix F whose elements are functions of the elements of A, the derivative of its trace with respect to A is the matrix

    ∂Tr(F)/∂A = [ ∂Tr(F)/∂a_ij ] .

Properties (Theorems):
Given a constant k,

    Tr(kA) = k Tr(A) .

Given two matrices A and B conformable under both multiplications AB and BA,

    Tr(A B) = Tr(B A) .

Given two matrices A and B conformable under both multiplications AᵀB and ABᵀ,

    Tr(Aᵀ B) = Tr(A Bᵀ) .

From the above properties it is evident that similar matrices have the same trace, that is, for any nonsingular matrix R and any matrix A of the same order as R,

    Tr(R⁻¹ A R) = Tr(A) = Σ_i λ_i ,

where the λ_i are the eigenvalues of A.

For derivatives of the trace,

    ∂Tr(F)/∂Aᵀ = [ ∂Tr(F)/∂A ]ᵀ
    ∂Tr(A B)/∂A = ∂Tr(B A)/∂A = Bᵀ .
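All of these trace properties can be confirmed numerically; the short NumPy sketch below uses arbitrary random matrices as test data (the finite-difference check of ∂Tr(AB)/∂A = Bᵀ is our own addition):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
B = rng.normal(size=(4, 4))
R = rng.normal(size=(4, 4)) + 4.0 * np.eye(4)     # comfortably nonsingular

assert np.isclose(np.trace(3.0 * A), 3.0 * np.trace(A))             # Tr(kA) = k Tr(A)
assert np.isclose(np.trace(A @ B), np.trace(B @ A))                 # Tr(AB) = Tr(BA)
assert np.isclose(np.trace(A.T @ B), np.trace(A @ B.T))             # Tr(A'B) = Tr(AB')
assert np.isclose(np.trace(np.linalg.inv(R) @ A @ R), np.trace(A))  # similar matrices
assert np.isclose(np.trace(A), np.linalg.eigvals(A).sum().real)     # Tr(A) = sum of eigenvalues

# d Tr(AB)/dA = B^T, checked by a finite difference in element (1, 2);
# Tr(AB) is linear in A, so the difference quotient is exact up to rounding.
eps = 1e-6
A2 = A.copy(); A2[1, 2] += eps
assert abs((np.trace(A2 @ B) - np.trace(A @ B)) / eps - B.T[1, 2]) < 1e-5
```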