
Definition of quadratic forms

Let {x1, x2, ..., xn} be n (non-random) variables. A quadratic form Q is, by definition, an expression such as:

Q = Σij aij xi xj

where the aij (the coefficients of the form) are real numbers. So a quadratic form is a second-degree, homogeneous (no constant term) polynomial in the xi.

Studying quadratic forms is made quite a bit easier by using matrix notation:
* The ordered set of variables {x1, x2, ..., xn} is considered as a vector x with coordinates (x1, x2, ..., xn). So we'll write: x' = (x1, x2, ..., xn), with " ' " denoting transposition, which simply means that the xi are written as a row (as opposed to as a column).
* A denotes the matrix whose general term is [aij].
* A quadratic form Q is then defined by the matrix equation:

Q = x'Ax

A is called the matrix of the quadratic form. It is easily shown that A may be assumed to be symmetric without loss of generality.

Quadratic forms are an important chapter of Linear Algebra. We will only retain the aspects of quadratic forms that are useful to the Statistician.
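The symmetrization step is easy to check numerically. The short sketch below (NumPy; the matrix and vector are arbitrary choices of ours, not from the text) verifies that replacing A by its symmetric part (A + A')/2 leaves the value of the form unchanged, which is why A may be assumed symmetric:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))      # an arbitrary, non-symmetric matrix
A_sym = (A + A.T) / 2.0          # its symmetric part
x = rng.normal(size=4)

# x'Ax is a scalar, so it equals its own transpose x'A'x;
# averaging the two shows Q is unchanged under symmetrization.
q1 = x @ A @ x
q2 = x @ A_sym @ x
assert np.isclose(q1, q2)
```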

Expectation of a quadratic form

Let:
* x be a random vector with mean µ and covariance matrix Σ,
* Q = x'Ax be a quadratic form.

We'll show that:

E[x'Ax] = tr(AΣ) + µ'Aµ

This result makes no assumption about the nature of the distribution of x, with the exception of the existence of the moments of the first two orders.
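The identity E[x'Ax] = tr(AΣ) + µ'Aµ can be illustrated by simulation. In the sketch below (NumPy; the mean vector, covariance matrix, and form matrix are arbitrary choices of ours, not from the text), a Monte Carlo average of the quadratic form is compared with the closed-form value:

```python
import numpy as np

rng = np.random.default_rng(42)

# Arbitrary mean, covariance and form matrix (our own choices).
mu = np.array([1.0, -2.0, 0.5])
L = rng.normal(size=(3, 3))
Sigma = L @ L.T + 3.0 * np.eye(3)        # positive definite covariance matrix
A = rng.normal(size=(3, 3))
A = (A + A.T) / 2.0                      # symmetric matrix of the quadratic form

# Monte Carlo estimate of E[x'Ax] for x ~ N(mu, Sigma).
x = rng.multivariate_normal(mu, Sigma, size=200_000)
vals = np.einsum('ni,ij,nj->n', x, A, x)
mc_mean = vals.mean()

# Closed form: E[x'Ax] = tr(A Sigma) + mu'A mu.
exact = np.trace(A @ Sigma) + mu @ A @ mu
print(mc_mean, exact)                    # the two values should be close
```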

Quadratic forms in a multivariate normal vector and the Chi-square distribution

Important parts of Statistics like:
* Multiple Linear Regression,
* Analysis of Variance,
extensively use the fact that if x is a multivariate normal vector, x ~ N(µ, Σ), then certain quadratic forms in x are Chi-square distributed.

Chi-square distribution

Let {X1, X2, ..., Xn} be n standard normal independent variables: Xi ~ N(0, 1) for all i. Recall that, by definition, the Chi-square distribution with n degrees of freedom is the distribution of the variable X² defined as:

X² = X1² + X2² + ... + Xn²

and is usually denoted: X² ~ χ²_n.

So the variable X² is defined as a very special quadratic form in the Xi in which:
* there is no cross-term (aij = 0 for i ≠ j),
* and all the aii are equal to 1.
The matrix of the form is then just the identity matrix of order n, denoted In.

Note that these n variables may conveniently be represented by the single standard multivariate normal variable: x ~ N(0, In).

Quadratic forms in a standard multivariate normal variable and the Chi-square distribution

It is then natural to ask whether there exist other quadratic forms in a spherical multinormal variable of unit variance that also have a Chi-square distribution. In other words, do matrices P exist such that:

x'Px ~ χ²  for x ~ N(0, In) ?

We'll show that a necessary and sufficient condition for x'Px to have a Chi-square distribution is that P be an idempotent matrix. Recall that a symmetric matrix P is said to be "idempotent" if P² = P. In addition, we'll show that the rank of P and the number of degrees of freedom of the distribution are then equal.
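The sufficiency direction can be illustrated numerically. In the sketch below (NumPy; the construction of P from a QR decomposition is our own, not the text's), P is a symmetric idempotent matrix of rank r, and the simulated x'Px for x ~ N(0, In) displays the mean r and variance 2r of a χ²_r variable:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 6, 3

# Build a symmetric idempotent matrix of rank r: P = U U' with U orthonormal.
U, _ = np.linalg.qr(rng.normal(size=(n, r)))
P = U @ U.T
assert np.allclose(P @ P, P)                 # P is idempotent
assert np.linalg.matrix_rank(P) == r

# Rows are independent draws from the spherical standard normal N(0, I_n).
x = rng.normal(size=(100_000, n))
q = np.einsum('ni,ij,nj->n', x, P, x)

# A chi-square with r degrees of freedom has mean r and variance 2r.
print(q.mean(), q.var())
```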

In other words:
* if x ~ N(0, In),
* then Q = x'Px ~ χ²_r if and only if:
  * P² = P,
  * rank(P) = r.
In fact, we'll show a bit more than that, as we'll consider variables ~ N(µ, I) that are not necessarily centered on the origin.

A symmetric idempotent matrix P of rank r is interpreted as the matrix of an orthogonal projection operator on an r-dimensional subspace. The projection of the vector x is Px. If P² = P, then:

x'Px = x'P²x = x'PPx = x'P'Px   (for P is symmetric)
     = (Px)'(Px)

and x'Px is therefore the squared length of the projection of x on the subspace.

In the illustration (not reproduced here):
* the vector x has a spherical multinormal distribution with unit variance,
* Px is the projection of x on the r-dimensional subspace,
* the squared length of Px is distributed as χ²_r.

The above result is then interpreted as follows: a quadratic form in a spherical, unit variance multinormal vector is Chi-square distributed with r degrees of freedom if and only if it is the squared length of the projection of x on some r-dimensional subspace.
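The chain of equalities x'Px = (Px)'(Px) holds for any symmetric idempotent P, whatever the vector x. A quick check (again with a QR-built projection matrix, our own construction):

```python
import numpy as np

rng = np.random.default_rng(7)
n, r = 5, 2

# Symmetric idempotent matrix of rank r: an orthogonal projection.
U, _ = np.linalg.qr(rng.normal(size=(n, r)))
P = U @ U.T

x = rng.normal(size=n)
Px = P @ x                       # projection of x on the r-dimensional subspace

# x'Px equals the squared length of the projection Px.
assert np.isclose(x @ P @ x, Px @ Px)
```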

Quadratic forms in a general multinormal variable and the Chi-square distribution

We then address the general issue of quadratic forms in a multivariate normal variable with an arbitrary covariance matrix: x ~ N(µ, Σ). We'll show that the quadratic form Q = x'Ax follows a (non central) Chi-square distribution with r degrees of freedom if and only if both the following conditions are satisfied:
1) AΣA = A,
2) rank(A) = r,
the value of the noncentrality parameter being µ'Aµ.

So, in the general case, the interpretation in terms of the distribution of the squared length of the projection of x on a subspace is lost, and A is not a projection matrix. This situation is encountered, for instance, when studying the Mahalanobis distance.

Independence of quadratic forms

Independence of two random variables is a condition that makes the life of the Statistician a lot easier. For example, many results pertaining to the sum and the ratio of two random variables explicitly assume that these variables are independent. In particular, the formal definition of Fisher's F distribution involves the ratio of two independent χ² variables. It is therefore natural to ask under what conditions two quadratic forms in a multivariate normal vector x are independent.

Independence of two linear forms

We'll first show that two linear forms A'x and B'x in a multinormal vector x with covariance matrix Σ are independent if and only if:

A'ΣB = 0

This result is important by itself but, in addition, it will be needed later when we address the issue of the independence of quadratic forms.
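The condition on linear forms comes from the fact that Cov(a'x, b'x) = a'Σb, and that for jointly normal variables zero covariance is equivalent to independence. A small simulation sketch (NumPy; the covariance matrix and the vectors a, b are arbitrary choices of ours, with b built so that a'Σb = 0):

```python
import numpy as np

rng = np.random.default_rng(3)

L = rng.normal(size=(3, 3))
Sigma = L @ L.T + np.eye(3)                  # arbitrary positive definite covariance

a = np.array([1.0, 0.0, 0.0])
# Choose b orthogonal to Sigma a, so that a' Sigma b = 0 by construction.
sa = Sigma @ a
b = np.array([-sa[1], sa[0], 0.0])
assert np.isclose(a @ Sigma @ b, 0.0)

# Cov(a'x, b'x) = a' Sigma b, so the sample covariance should be near zero.
x = rng.multivariate_normal(np.zeros(3), Sigma, size=200_000)
cov = np.cov(x @ a, x @ b)[0, 1]
print(cov)
```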

As it happens. The result is then known as Craig's Theorem. this condition becomes the simpler AB = 0. * And let x'Ax and x'Bx be two Chi-square distributed quadratic forms. * Then these two forms are independent if and only if A B=0 which is the same as before but with the assumption about the distributions of the quadratic forms removed. ). Craig's Theorem The above result explicitely assumes that the quadratic forms are Chi-square distributed. Cochran's Theorem The results about the nature of the distribution and the independence of quadratic forms in multinormal vectors as described here are a key element of a most important theorem in Statistics known as Cochran's Theorem. which is stated and demonstrated here. whose demonstration lies beyond the bounds of this Glossary. More generally : * Let x~Np(µ. * And let x'Ax and x'Bx be two quadratic forms. ). ----It is not assumed that the Chi-square distributions of the quadratic forms are central or have the same numbers of degrees of freedom.* Let x~Np(µ. this assumption is unnecessary and is introduced just as to make the demonstration easier. * Then these two forms are independent if and only if A B=0 In the special case where x is spherical. __________________________________________________________________ Tutorial 1 5 .

In the first section of this Tutorial, we calculate the expectation of a quadratic form with no assumption about the distribution of the variable. This section is independent of the remainder of the Tutorial.

We then establish a necessary and sufficient condition for a quadratic form Q = x'Px in a multivariate normal variable to be Chi-square distributed.
* We first address the special case of a spherical multinormal distribution with unit variance (identity covariance matrix). The demonstration is a bit more difficult, and the moment generating function (m.g.f.) of the quadratic form will be of central importance.
  * The "sufficient" condition states that if P is a projection matrix with rank r, then Q is Chi-square distributed with r degrees of freedom.
  * The "necessary" part states that if Q is Chi-square distributed with s degrees of freedom, then P has to be a projection matrix with rank s.
* We then go over the general case (arbitrary covariance matrix) by identifying a transformation that will turn the problem into the already solved "special case" problem, a common strategy when studying the multivariate normal distribution. Once the transformation has been identified, only a small additional effort will be needed to express the result as a function of Σ, the covariance matrix of x, as the brunt of the work has already been done when solving the special case. So the Table of Contents of the general case is quite short.

QUADRATIC FORMS IN MULTINORMAL VECTORS AND THE CHI-SQUARE DISTRIBUTION

Expectation of a quadratic form
-------------------------------
Multivariate normal distribution with identity covariance matrix
  A sufficient condition for a quadratic form in a multivariate normal vector with identity covariance matrix to be Chi-square distributed
  The condition is also necessary
    First form of the m.g.f. of the quadratic form
    Second form of the m.g.f. of the quadratic form
    Final result

    Rank of P
    Eigenvalues of P
    P is idempotent
    Noncentrality parameter
General case : multivariate normal distribution with arbitrary covariance matrix

TUTORIAL

______________________________________________________________________

Tutorial 2

We now address the issue of the independence of two quadratic forms in a multivariate normal vector.

We first solve the simpler problem of the independence of two linear forms. This result will anyway be needed in the remainder of the Tutorial.

We then move on to quadratic forms and establish a necessary and sufficient condition for two such forms to be independent. The reader won't be surprised to see us first solve the case of quadratic forms in a spherically symmetric, unit variance multinormal vector. The general case (arbitrary covariance matrix) will then be solved by identifying a transformation that turns the general case into the already solved special case.

Note that we always consider quadratic forms with Chi-square distributions. As it turns out, this assumption is unnecessary and is introduced only to make the demonstration easier. The condition remains valid without it and then bears the name of Craig's Theorem, which is very difficult to demonstrate.

INDEPENDENCE OF QUADRATIC FORMS

Independence of two linear forms in a multivariate normal vector
Independence of two Chi-square distributed quadratic forms in a standard multinormal vector
  The condition is necessary
  The condition is sufficient
  The final result
Independence of two Chi-square distributed quadratic forms in a multinormal vector with arbitrary covariance matrix

TUTORIAL

___________________________________________________

Related readings :
Chi-square distribution
Multivariate normal distribution
Cochran's Theorem
Projection matrix
