Orthogonal Polynomials
… The percent conversion response surface and the contour plot of the fitted model are shown in panels a and b of Figure 7.18. We find the maximum percent conversion at about …°C and 20% concentration.

In many response surface problems the experimenter is interested in predicting the response y or estimating the mean response at points in the process variable space. The response surface plots in Figure 7.19 provide a graphical display of these quantities. Typically, the variance of the prediction is of interest, because this is a direct measure of the likely error associated with the point estimate produced by the model. Recall that the variance of the estimate of the mean response at the point x_0 is given by

\mathrm{Var}[\hat{y}(\mathbf{x}_0)] = \sigma^2 \mathbf{x}_0'(\mathbf{X}'\mathbf{X})^{-1}\mathbf{x}_0

Plots of \sqrt{\mathrm{Var}[\hat{y}(\mathbf{x}_0)]}, with \sigma^2 estimated by the residual mean square MS_Res = 5.89 for this model, for all values of x_0 in the region of experimentation are presented in panels a and b of Figure 7.19. Both the response surface in Figure 7.19a and the contour plot of constant \sqrt{\mathrm{Var}[\hat{y}(\mathbf{x}_0)]} in Figure 7.19b show that \sqrt{\mathrm{Var}[\hat{y}(\mathbf{x}_0)]} is the same for all points x_0 that are the same distance from the center of the design. This is a result of the spacing of the axial runs in the central composite design at 1.414 units from the origin (in the coded variables), and is a design property called rotatability. This is a very important property for a second-order response surface design, and is discussed in detail in the references given on RSM.

7.5 ORTHOGONAL POLYNOMIALS

We have noted that in fitting polynomial models in one variable, even if nonessential ill-conditioning is removed by centering, we may still have high levels of multicollinearity. Some of these difficulties can be eliminated by using orthogonal polynomials to fit the model.

Suppose that the model is

y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2 + \cdots + \beta_k x_i^k + \varepsilon_i, \quad i = 1, 2, \ldots, n    (7.8)

Generally the columns of the X matrix will not be orthogonal. Furthermore, if we increase the order of the polynomial by adding a term \beta_{k+1} x^{k+1}, we must recompute (X'X)^{-1}, and the estimates of the lower order parameters \hat{\beta}_0, \hat{\beta}_1, \ldots, \hat{\beta}_k will change.

Now suppose that we fit the model

y_i = \alpha_0 P_0(x_i) + \alpha_1 P_1(x_i) + \alpha_2 P_2(x_i) + \cdots + \alpha_k P_k(x_i) + \varepsilon_i, \quad i = 1, 2, \ldots, n    (7.9)

where P_u(x_i) is a uth-order orthogonal polynomial defined such that

\sum_{i=1}^{n} P_r(x_i) P_s(x_i) = 0, \quad r \neq s, \quad r, s = 0, 1, \ldots, k

P_0(x_i) = 1

Then the model becomes y = X\alpha + \varepsilon, where the X matrix is

\mathbf{X} =
\begin{bmatrix}
P_0(x_1) & P_1(x_1) & \cdots & P_k(x_1) \\
P_0(x_2) & P_1(x_2) & \cdots & P_k(x_2) \\
\vdots & \vdots & & \vdots \\
P_0(x_n) & P_1(x_n) & \cdots & P_k(x_n)
\end{bmatrix}

Since this matrix has orthogonal columns, the X'X matrix is

\mathbf{X}'\mathbf{X} =
\begin{bmatrix}
\sum_{i=1}^{n} P_0^2(x_i) & 0 & \cdots & 0 \\
0 & \sum_{i=1}^{n} P_1^2(x_i) & \cdots & 0 \\
\vdots & \vdots & & \vdots \\
0 & 0 & \cdots & \sum_{i=1}^{n} P_k^2(x_i)
\end{bmatrix}

The least-squares estimators of \alpha are found from (X'X)^{-1}X'y as

\hat{\alpha}_j = \frac{\sum_{i=1}^{n} P_j(x_i) y_i}{\sum_{i=1}^{n} P_j^2(x_i)}, \quad j = 0, 1, \ldots, k    (7.10)

Since P_0(x_i) is a polynomial of degree zero, we can set P_0(x_i) = 1, and consequently \hat{\alpha}_0 = \bar{y}.

The residual sum of squares is

SS_{Res}(k) = SS_T - \sum_{j=1}^{k} \hat{\alpha}_j \left[ \sum_{i=1}^{n} P_j(x_i) y_i \right]    (7.11)

The regression sum of squares for any model parameter does not depend on the other parameters in the model. This regression sum of squares is

SS_R(\alpha_j) = \hat{\alpha}_j \sum_{i=1}^{n} P_j(x_i) y_i    (7.12)

If we wish to assess the significance of the highest order term, we should test H_0: \alpha_k = 0 [this is equivalent to testing H_0: \beta_k = 0 in Eq. (7.8)]; we would use

F_0 = \frac{SS_R(\alpha_k)}{SS_{Res}(k)/(n - k - 1)}

as the F statistic. Furthermore, note that if the order of the model is changed from k to k + r, only the r new coefficients must be computed.
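To make the two consequences above concrete, here is a minimal Python sketch (not part of the text). The data are made up, and the orthogonal basis is built with a QR factorization of the raw polynomial columns rather than the tabulated equally spaced polynomials introduced below; it is a convenient stand-in that illustrates the same orthogonality property. The checks confirm that X'X is diagonal, so each coefficient in Eq. (7.10) is computed independently, and that raising the order of the fit leaves the lower-order estimates unchanged.

```python
import numpy as np

# Equally spaced levels of x and an arbitrary (made-up) response.
x = np.arange(1.0, 11.0)                     # 10 equally spaced levels
rng = np.random.default_rng(0)
y = 5.0 + 2.0 * x - 0.3 * x**2 + rng.normal(scale=0.5, size=x.size)

def orthogonal_basis(x, degree):
    """Columns P_0(x), ..., P_degree(x), mutually orthogonal over the design points.
    Built here by QR-factoring the raw Vandermonde matrix [1, x, x^2, ...]."""
    V = np.vander(x, degree + 1, increasing=True)
    Q, _ = np.linalg.qr(V)                   # Q has orthonormal columns spanning the same space
    return Q

def fit(x, y, degree):
    X = orthogonal_basis(x, degree)
    XtX = X.T @ X                            # diagonal, since the columns are orthogonal
    alpha_hat = (X.T @ y) / np.diag(XtX)     # Eq. (7.10): elementwise, no matrix inversion needed
    return alpha_hat, XtX

alpha2, XtX2 = fit(x, y, degree=2)
alpha3, XtX3 = fit(x, y, degree=3)

# X'X is diagonal, so the normal equations decouple.
assert np.allclose(XtX2, np.diag(np.diag(XtX2)))

# Raising the order from 2 to 3 leaves the first three coefficients unchanged.
assert np.allclose(alpha2, alpha3[:3])
print("alpha_hat (quadratic):", alpha2)
print("alpha_hat (cubic):    ", alpha3)
```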
Because of the orthogonality property of the polynomials, the coefficients \hat{\alpha}_0, \hat{\alpha}_1, \ldots, \hat{\alpha}_k already in the model do not change when higher-order terms are added. Thus, sequential fitting of the model is computationally easy.

The orthogonal polynomials P_j(x_i) are easily constructed for the case where the levels of x are equally spaced. The first five orthogonal polynomials are

P_0(x_i) = 1

P_1(x_i) = \lambda_1 \left( \frac{x_i - \bar{x}}{d} \right)

P_2(x_i) = \lambda_2 \left[ \left( \frac{x_i - \bar{x}}{d} \right)^2 - \frac{n^2 - 1}{12} \right]

P_3(x_i) = \lambda_3 \left[ \left( \frac{x_i - \bar{x}}{d} \right)^3 - \left( \frac{x_i - \bar{x}}{d} \right) \frac{3n^2 - 7}{20} \right]

P_4(x_i) = \lambda_4 \left[ \left( \frac{x_i - \bar{x}}{d} \right)^4 - \left( \frac{x_i - \bar{x}}{d} \right)^2 \frac{3n^2 - 13}{14} + \frac{3(n^2 - 1)(n^2 - 9)}{560} \right]

where d is the spacing between the levels of x and the {\lambda_j} are constants chosen so that the polynomials will have integer values. A brief table of the numerical values of these orthogonal polynomials is given in Appendix Table A.5. More extensive tables are found in DeLury [1960] and Pearson and Hartley [1966]. Orthogonal polynomials can also be constructed and used in cases where the x's are not equally spaced. A survey of methods for generating orthogonal polynomials is in Seber [1977, Ch. 8].

Example 7.5 Orthogonal Polynomials

An operations research analyst has developed a computer simulation model of a single item inventory system. He has experimented with the simulation model to investigate the effect of various reorder quantities on the average annual cost of the inventory. The data are in Table 7.11.

TABLE 7.11 Inventory Simulation Output for Example 7.5

Reorder Quantity, x_i    Average Annual Cost, y_i
 50                      $335
 75                       326
100                       316
125                       313
150                       311
175                       314
200                       318
225                       328
250                       337
275                       345

Since we know that average annual inventory cost is a convex function of the reorder quantity, we suspect that a second-order polynomial is the highest order model that must be considered. Therefore, we will fit

y_i = \alpha_0 P_0(x_i) + \alpha_1 P_1(x_i) + \alpha_2 P_2(x_i) + \varepsilon_i, \quad i = 1, 2, \ldots, 10

The coefficients of the orthogonal polynomials P_0(x_i), P_1(x_i), and P_2(x_i), obtained from Appendix Table A.5, are shown in Table 7.12.

TABLE 7.12 Coefficients of Orthogonal Polynomials for Example 7.5

 i    P_0(x_i)   P_1(x_i)   P_2(x_i)
 1        1         -9          6
 2        1         -7          2
 3        1         -5         -1
 4        1         -3         -3
 5        1         -1         -4
 6        1          1         -4
 7        1          3         -3
 8        1          5         -1
 9        1          7          2
10        1          9          6

\sum_{i=1}^{10} P_0^2(x_i) = 10, \quad \sum_{i=1}^{10} P_1^2(x_i) = 330, \quad \sum_{i=1}^{10} P_2^2(x_i) = 132

Thus,

\mathbf{X}'\mathbf{X} =
\begin{bmatrix}
\sum P_0^2(x_i) & 0 & 0 \\
0 & \sum P_1^2(x_i) & 0 \\
0 & 0 & \sum P_2^2(x_i)
\end{bmatrix}
=
\begin{bmatrix}
10 & 0 & 0 \\
0 & 330 & 0 \\
0 & 0 & 132
\end{bmatrix}

\mathbf{X}'\mathbf{y} =
\begin{bmatrix}
\sum P_0(x_i) y_i \\
\sum P_1(x_i) y_i \\
\sum P_2(x_i) y_i
\end{bmatrix}
=
\begin{bmatrix}
3243 \\
245 \\
369
\end{bmatrix}

and

\hat{\boldsymbol{\alpha}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{y} =
\begin{bmatrix}
1/10 & 0 & 0 \\
0 & 1/330 & 0 \\
0 & 0 & 1/132
\end{bmatrix}
\begin{bmatrix}
3243 \\
245 \\
369
\end{bmatrix}
=
\begin{bmatrix}
324.3000 \\
0.7424 \\
2.7955
\end{bmatrix}

The fitted model is

\hat{y} = 324.30 + 0.7424 P_1(x) + 2.7955 P_2(x)

The regression sum of squares is

SS_R(\alpha_1, \alpha_2) = \hat{\alpha}_1 \sum_{i=1}^{10} P_1(x_i) y_i + \hat{\alpha}_2 \sum_{i=1}^{10} P_2(x_i) y_i = 0.7424(245) + 2.7955(369) = 181.89 + 1031.54 = 1213.43

The analysis of variance is shown in Table 7.13. Both the linear and quadratic terms contribute significantly to the model. Since these terms account for most of the variation in the data, we tentatively adopt the quadratic model, subject to a satisfactory residual analysis.

TABLE 7.13 Analysis of Variance for the Quadratic Model in Example 7.5

Source of Variation     Sum of Squares   Degrees of Freedom   Mean Square      F_0
Regression                  1213.43              2               606.72     159.24
  Linear, alpha_1           (181.89)             1
  Quadratic, alpha_2       (1031.54)             1
Residual                      26.67              7                 3.81
Total                       1240.10              9
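The arithmetic in Example 7.5 is easy to reproduce. The following Python sketch (an illustration, not part of the original example) regenerates the Table 7.12 coefficients from the equally spaced formulas with \lambda_1 = 2 and \lambda_2 = 1/2, then computes \hat{\alpha}, the regression sum of squares, and the residual mean square that appear in Table 7.13.

```python
import numpy as np

# Table 7.11: reorder quantity x and average annual cost y
x = np.array([50, 75, 100, 125, 150, 175, 200, 225, 250, 275], dtype=float)
y = np.array([335, 326, 316, 313, 311, 314, 318, 328, 337, 345], dtype=float)

# Table 7.12 coefficients, generated from the equally spaced formulas with
# x_bar = 162.5, d = 25, lambda_1 = 2, lambda_2 = 1/2 (integer values result).
u = (x - x.mean()) / 25.0                     # (x - x_bar)/d
P0 = np.ones_like(x)
P1 = 2.0 * u                                  # -9, -7, ..., 7, 9
P2 = 0.5 * (u**2 - (10**2 - 1) / 12.0)        # 6, 2, -1, -3, -4, -4, -3, -1, 2, 6

X = np.column_stack([P0, P1, P2])
diag_xtx = np.sum(X**2, axis=0)               # diagonal of X'X: 10, 330, 132
Xty = X.T @ y                                 # 3243, 245, 369
alpha_hat = Xty / diag_xtx                    # Eq. (7.10): 324.30, 0.7424, 2.7955

ss_reg = np.sum(alpha_hat[1:] * Xty[1:])      # SS_R(alpha_1, alpha_2) = 1213.43
ss_total = np.sum((y - y.mean())**2)          # 1240.10
ss_res = ss_total - ss_reg                    # 26.67 on 7 degrees of freedom

print("alpha_hat =", alpha_hat)
print("SS_R = %.2f, SS_Res = %.2f, MS_Res = %.2f" % (ss_reg, ss_res, ss_res / 7))
```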
We may obtain a fitted equation in terms of the original regressor x by substituting for P_j(x) as follows:

\hat{y} = 324.30 + 0.7424 P_1(x) + 2.7955 P_2(x)
        = 324.30 + 0.7424 \left[ 2 \left( \frac{x - 162.5}{25} \right) \right] + 2.7955 \left\{ \frac{1}{2} \left[ \left( \frac{x - 162.5}{25} \right)^2 - \frac{(10)^2 - 1}{12} \right] \right\}
        = 312.7686 + 0.0594(x - 162.5) + 0.0022(x - 162.5)^2

This form of the model should be reported to the user.

PROBLEMS

7.1 Consider the values of x shown below:

x = 1.00, 1.70, 1.25, 1.20, 1.45, 1.85, 1.60, 1.50, 1.95, 2.00

Suppose that we wish to fit a second-order model using these levels for the single regressor variable x. Calculate the correlation between x and x^2. Do you see any potential difficulties in fitting the model?
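As a pointer for Problem 7.1 (this sketch is an aside, not part of the original problem set), the correlation it asks for can be computed directly; comparing it with the correlation after centering x illustrates the nonessential ill-conditioning discussed at the start of Section 7.5.

```python
import numpy as np

x = np.array([1.00, 1.70, 1.25, 1.20, 1.45, 1.85, 1.60, 1.50, 1.95, 2.00])

# Correlation between the raw regressors x and x^2: over this narrow positive
# range, expect a value very close to 1, signalling severe multicollinearity
# in a second-order fit.
r_raw = np.corrcoef(x, x**2)[0, 1]

# Correlation after centering x; expect a much smaller magnitude, since
# centering removes the nonessential part of the ill-conditioning.
xc = x - x.mean()
r_centered = np.corrcoef(xc, xc**2)[0, 1]

print("corr(x, x^2)                 = %.4f" % r_raw)
print("corr(x - xbar, (x - xbar)^2) = %.4f" % r_centered)
```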
