
To do this, break $M$ down into submatrices

$$ M = \begin{pmatrix} U_0 & V \\ V^T & W_0 \end{pmatrix} \qquad (E-11) $$

and likewise separate the vector $q$ in the same way:

$$ q = \begin{pmatrix} u \\ w \end{pmatrix} \qquad (E-12) $$

by writing $\{q_1 = u_1, \ldots, q_m = u_m\}$ and $\{q_{m+1} = w_1, \ldots, q_n = w_r\}$. Then

$$ Mq = \begin{pmatrix} U_0 & V \\ V^T & W_0 \end{pmatrix} \begin{pmatrix} u \\ w \end{pmatrix} \qquad (E-13) $$

and

$$ q^T M q = u^T U_0 u + u^T V w + w^T V^T u + w^T W_0 w \qquad (E-14) $$

so that $I_m$ becomes

$$ I_m = \exp\left(-\tfrac{1}{2} u^T U_0 u\right) \int \cdots \int \exp\left\{-\tfrac{1}{2}\left[\, w^T W_0 w + u^T V w + w^T V^T u \,\right]\right\} dw_1 \cdots dw_r \,. \qquad (E-15) $$

To prepare to integrate out $w$, first complete the square on $w$ by writing the exponent as

$$ [\;\cdot\;] = (w - \hat{w})^T W_0 (w - \hat{w}) + C \qquad (E-16) $$

and equate terms in (E-14) and (E-16) to find $\hat{w}$ and $C$:

$$ w^T W_0 w + u^T V w + w^T V^T u = w^T W_0 w - w^T W_0 \hat{w} - \hat{w}^T W_0 w + \hat{w}^T W_0 \hat{w} + C \qquad (E-17) $$

This requires (since it must be an identity in $w$):

$$ u^T V = -\hat{w}^T W_0 \qquad (E-18) $$
$$ V^T u = -W_0 \hat{w} \qquad (E-19) $$
$$ \hat{w}^T W_0 \hat{w} + C = 0 \qquad (E-20) $$

or,

$$ \hat{w} = -W_0^{-1} V^T u \qquad (E-21) $$
$$ C = -\left(u^T V W_0^{-1}\right) W_0 \left(W_0^{-1} V^T u\right) = -\,u^T V W_0^{-1} V^T u \,. \qquad (E-22) $$

Then $I_m$ becomes

$$ I_m = e^{-\frac{1}{2}\left(u^T U_0 u + C\right)} \int \cdots \int \exp\left[-\tfrac{1}{2}(w - \hat{w})^T W_0 (w - \hat{w})\right] dw_1 \cdots dw_r \,. \qquad (E-23) $$
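As a quick numerical sanity check (not part of the original text; NumPy, the dimensions, and the random seed are my own arbitrary choices), the completing-the-square identity (E-16) with the values of $\hat{w}$ and $C$ from (E-21) and (E-22) can be verified directly:

```python
# Check: w^T W0 w + u^T V w + w^T V^T u == (w - w_hat)^T W0 (w - w_hat) + C
# with w_hat = -W0^{-1} V^T u  (E-21) and C = -u^T V W0^{-1} V^T u  (E-22).
# All matrices and vectors below are arbitrary test values.
import numpy as np

rng = np.random.default_rng(0)
m, r = 2, 3
W0 = rng.normal(size=(r, r))
W0 = W0 @ W0.T + r * np.eye(r)              # symmetric positive definite
V = rng.normal(size=(m, r))
u = rng.normal(size=m)
w = rng.normal(size=r)

w_hat = -np.linalg.solve(W0, V.T @ u)       # (E-21)
C = -u @ V @ np.linalg.solve(W0, V.T @ u)   # (E-22)

lhs = w @ W0 @ w + u @ V @ w + w @ V.T @ u
rhs = (w - w_hat) @ W0 @ (w - w_hat) + C
print("identity holds:", np.isclose(lhs, rhs))
```

Since the identity must hold for every $w$, running this with any other seed checks the same algebra at a different point.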

But by (E-9) this integral is

$$ \frac{(2\pi)^{r/2}}{\sqrt{\det(W_0)}} \qquad (E-24) $$

and from (E-22),

$$ u^T U_0 u + C = u^T \left[\, U_0 - V W_0^{-1} V^T \,\right] u \,. \qquad (E-25) $$

Chap. E: MULTIVARIATE GAUSSIAN INTEGRALS

The general partial Gaussian integral is therefore

$$ I_m = \int \cdots \int \exp\left[-\tfrac{1}{2}\, q^T M q\right] dq_{m+1} \cdots dq_n = \frac{(2\pi)^{(n-m)/2}}{\sqrt{\det(W_0)}} \exp\left(-\tfrac{1}{2}\, u^T U u\right) \qquad (E-26) $$

where

$$ U \equiv U_0 - V W_0^{-1} V^T \qquad (E-27) $$

is a "renormalized" version of the first $(m \times m)$ block of the original matrix $M$.
This result has a simple intuitive meaning in application to probability theory. The original $(n \times 1)$ vector $q$ is composed of an $(m \times 1)$ vector $u$ of "interesting" quantities that we wish to estimate, and an $(r \times 1)$ vector $w$ of "uninteresting" quantities or "nuisance parameters" that we want to eliminate. Then $U_0$ represents the inverse covariance matrix in the subspace of the interesting quantities, $W_0$ is the corresponding matrix in the "uninteresting" subspace, and $V$ represents an "interaction", or correlation, between them. It is clear from (E-27) that if $V = 0$, then $U = U_0$, and the pdf's for $u$ and $w$ are independent. Our estimates of $u$ are then the same whether or not we integrate $w$ out of the problem. But if $V \neq 0$, then the renormalized matrix $U$ contains effects of the nuisance parameters. Two components, $u_1$ and $u_2$, that were uncorrelated in the original $M^{-1}$ may become correlated in $U^{-1}$ due to their common interactions (correlations) with the nuisance parameters $w$.

Inversion of a Block Form matrix. The matrix $U$ has another simple meaning, which we see when we try to invert the full matrix $M$. Given an $(n \times n)$ matrix in block form


$$ M = \begin{pmatrix} U_0 & V \\ X & W_0 \end{pmatrix} \qquad (E-28) $$

where $U_0$ is an $(m \times m)$ submatrix and $W_0$ is $(r \times r)$, with $m + r = n$, try to write $M^{-1}$ in the same block form:

$$ M^{-1} = \begin{pmatrix} A & B \\ C & D \end{pmatrix} \qquad (E-29) $$

Writing out the equation $M M^{-1} = 1$ in full, we have four relations of the form $U_0 A + V C = 1$, $U_0 B + V D = 0$, etc. If $U_0$ and $W_0$ are nonsingular, there is a unique solution for $A, B, C, D$, with the result

$$ M^{-1} = \begin{pmatrix} U^{-1} & -U_0^{-1} V W^{-1} \\ -W_0^{-1} X U^{-1} & W^{-1} \end{pmatrix} \qquad (E-30) $$

where

$$ U \equiv U_0 - V W_0^{-1} X \qquad (E-31) $$
$$ W \equiv W_0 - X U_0^{-1} V \qquad (E-32) $$

are "renormalized" forms of the diagonal blocks. Conversely, (E-30) can be verified by direct substitution into $M M^{-1} = 1$ or $M^{-1} M = 1$. If $M$ is symmetric, as it was above, then $X = V^T$.

Another useful and nonobvious relation is found by integrating $u$ out of (E-26). On the one hand, we have from (E-9)

$$ \int \cdots \int \exp\left(-\tfrac{1}{2}\, u^T U u\right) du_1 \cdots du_m = \frac{(2\pi)^{m/2}}{\sqrt{\det(U)}} \qquad (E-33) $$

but on the other hand, if we integrate $\{u_1 \cdots u_m\}$ out of (E-26), the final result must be the same as if we had integrated all the $\{q_1 \cdots q_n\}$ out of (E-9) directly; so (E-9), (E-26), and (E-33) yield

$$ \det(M) = \det(U)\,\det(W_0) \,. \qquad (E-34) $$

Therefore we can eliminate $W_0$ and write the general partial Gaussian integral as

$$ \int \cdots \int \exp\left[-\tfrac{1}{2}\, q^T M q\right] dq_{m+1} \cdots dq_n = \frac{(2\pi)^{n/2}}{\sqrt{\det(M)}} \; \frac{\sqrt{\det(U)}}{(2\pi)^{m/2}} \; \exp\left(-\tfrac{1}{2}\, u^T U u\right) \qquad (E-35) $$
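The determinant identity (E-34) is also easy to confirm numerically (my own addition; the symmetric test matrix, sizes, and seed are arbitrary choices):

```python
# Check det(M) = det(U) det(W0)  (E-34), with U = U0 - V W0^{-1} V^T
# the renormalized block of (E-27), for an arbitrary symmetric
# positive definite test matrix.
import numpy as np

rng = np.random.default_rng(2)
m, r = 2, 3
n = m + r
A = rng.normal(size=(n, n))
M = A @ A.T + n * np.eye(n)                 # symmetric positive definite
U0, V, W0 = M[:m, :m], M[:m, m:], M[m:, m:]

U = U0 - V @ np.linalg.solve(W0, V.T)       # (E-27)
detM = np.linalg.det(M)
detUW0 = np.linalg.det(U) * np.linalg.det(W0)
print("det identity holds:", np.isclose(detM, detUW0))
```

This is the same factorization that lets (E-26) be rewritten as (E-35) with $W_0$ eliminated.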
