ANOVA Table for Multiple Regression

For multiple regression there is an ANOVA F test which tests the hypothesis that all of the regression coefficients (except the intercept) are zero. Here is the general form of the ANOVA table:

Source | df      | SS                 | MS                | F
Model  | p       | SSM = Σ(ŷᵢ − ȳ)²   | MSM = SSM/p       | F = MSM/MSE
Error  | n−p−1   | SSE = Σ(yᵢ − ŷᵢ)²  | MSE = SSE/(n−p−1) |
Total  | n−1     | SST = Σ(yᵢ − ȳ)²   |                   |

The sums of squares represent sources of variation: SST = SSM + SSE. Note that MSE = s², the estimate of the error variance.

H₀: β₁ = β₂ = … = βₚ = 0 (none of the explanatory variables are predictors of the response variable)
Hₐ: at least one βⱼ ≠ 0 (at least one is a predictor)

A common error in the use of multiple regression is to assume that all of the regression coefficients are statistically different from zero whenever the F statistic has a small P-value.

ANOVA F Test: In the multiple regression model, the hypotheses
H₀: β₁ = β₂ = … = βₚ = 0  vs.  Hₐ: at least one βⱼ ≠ 0
are tested by the ANOVA F statistic
F = MSM / MSE,
P-value = P(X > F), where X ~ F(p, n−p−1).

Squared Multiple Correlation R²

THE SQUARED MULTIPLE CORRELATION
The statistic
R² = SSM/SST = Σ(ŷᵢ − ȳ)² / Σ(yᵢ − ȳ)²
is the proportion of the variation of the response variable y that is explained by the explanatory variables x₁, x₂, …, xₚ in a multiple linear regression.

R = √R² is called the multiple correlation coefficient: the correlation between the observations yᵢ and the predicted values ŷᵢ.

R² adjusted:
R²adj = 1 − MSE/MST = 1 − [SSE/(n−p−1)] / [SST/(n−1)]
R²adj no longer gives the fraction of variability accounted for by the model; it can even be negative.

Assumptions and Conditions
1. Linearity Assumption. Check the Straight Enough Condition: scatterplots of y against each of the predictors should look reasonably straight.
2. Independence Assumption. The errors must be independent of each other. Check the Randomization Condition.
3. Equal Variance Assumption. The variability of the errors must be about the same everywhere.
4. Normality Assumption. The errors must follow a Normal model.

Ex. Predicting GPA of seventh-graders. Refer to the education data for 78 seventh-grade students.
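The ANOVA table, F statistic, R², and adjusted R² above can be sketched numerically. This is a minimal illustration on synthetic data (the variable names and numbers here are made up, not taken from the notes); the P-value would come from comparing F to the F(p, n−p−1) distribution.

```python
# Sketch of the multiple-regression ANOVA F test, R^2, and adjusted R^2
# on synthetic data (all data here is illustrative, not from the notes).
import numpy as np

rng = np.random.default_rng(0)
n, p = 78, 2                         # n cases, p explanatory variables
X = rng.normal(size=(n, p))
y = 1.0 + 0.5 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(scale=0.8, size=n)

Xd = np.column_stack([np.ones(n), X])          # add intercept column
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)  # least-squares fit
yhat = Xd @ beta

ybar = y.mean()
SSM = np.sum((yhat - ybar) ** 2)     # model sum of squares
SSE = np.sum((y - yhat) ** 2)        # error sum of squares
SST = np.sum((y - ybar) ** 2)        # total: SST = SSM + SSE

DFM, DFE = p, n - p - 1
F = (SSM / DFM) / (SSE / DFE)        # compare to F(p, n-p-1) for the P-value

R2 = SSM / SST
R2_adj = 1 - (SSE / DFE) / (SST / (n - 1))
print(round(F, 2), round(R2, 3), round(R2_adj, 3))
```

Note that R²adj is always at most R², since R²adj = 1 − (1 − R²)(n−1)/(n−p−1).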
We view GPA as the response variable. IQ, gender, and self-concept are the explanatory variables.

[Data table: OBS, GPA, IQ, Gender, and Self-concept for the first 20 of the 78 students]

(a) Find the correlation between GPA and each of the explanatory variables. What percent of the total variation in student GPAs can be explained by the straight-line relationship for each of the explanatory variables?

(b) The importance of IQ in explaining GPA is not surprising. The purpose of the study is to assess the influence of self-concept on GPA. So we will include IQ in the regression model and ask, "How much does self-concept contribute to explaining GPA after the effect of IQ on GPA is taken into account?" Give a model that can answer this question.

GPA = β₀ + β₁ IQ + β₂ SC + ε

(c) Run the model and report the fitted regression equation. What percent of the variation in GPA is explained by the explanatory variables in your model?

(d) Translate the question of interest into appropriate null and alternative hypotheses about the model parameters. Give the value of the test statistic and P-value.

H₀: β₂ = 0 vs. Hₐ: β₂ ≠ 0. P = 0.002 < 0.05 ⇒ reject H₀ and conclude self-concept contributes.

[Residual plots for gpa: normal probability plot of the residuals, residuals versus the fitted values, histogram of the residuals, residuals versus the order of the data]
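For part (a), with a single predictor the percent of variation explained by the straight-line relationship is just r², the squared correlation. A sketch with a hypothetical stand-in for the first few rows of the table (the full 78-student data set is not reproduced here):

```python
# Correlation of GPA with one predictor and the percent of variation
# explained.  The eight (gpa, iq) pairs below are hypothetical stand-ins,
# not the actual study data.
import numpy as np

gpa = np.array([5.5, 7.2, 7.9, 8.3, 4.6, 7.5, 8.9, 7.6])
iq  = np.array([89, 104, 111, 107, 100, 107, 114, 115])

r = np.corrcoef(gpa, iq)[0, 1]     # correlation between GPA and IQ
pct_explained = 100 * r ** 2       # percent of variation explained (r^2)
print(round(r, 3), round(pct_explained, 1))
```

The same computation would be repeated for each explanatory variable.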
Regression Analysis: gpa versus iq

The regression equation is
gpa = −3.56 + 0.101 iq

Predictor   Coef      SE Coef   T      P
Constant   −3.557     1.552    −2.29   0.025
iq          0.10102   0.01414   7.14   0.000

R-Sq = 40.2%   R-Sq(adj) = 39.4%

R² = 0.402, R = √0.402 = 0.634. 40.2% of the variation in GPA can be explained by this regression line.

[Residual plots for gpa versus iq: normal probability plot, residuals versus the fitted values, histogram of the residuals, residuals versus the order of the data]

Regression Analysis: gpa versus ncept

The regression equation is
gpa = 2.23 + 0.0917 ncept

Predictor   Coef      SE Coef   T      P
Constant    2.2259    0.9505    2.34   0.022
ncept       0.09165   0.01631   5.62   0.000

R-Sq = 29.4%   R-Sq(adj) = 28.4%

R = √0.294 = 0.542. 29.4% of the variation is explained by this regression line.

[Residual plots for gpa versus ncept: normal probability plot, residuals versus the fitted values, histogram of the residuals, residuals versus the order of the data]

Regression Analysis: gpa versus iq, ncept

The regression equation is
gpa = −3.88 + 0.0772 iq + 0.0513 ncept

Predictor   Coef      SE Coef   T      P
Constant   −3.882     1.472    −2.64   0.010
iq          0.07720   0.01539   5.02   0.000
ncept       0.05125   0.01633   3.14   0.002

S = 1.54715   R-Sq = 47.1%   R-Sq(adj) = 45.7%

Analysis of Variance

Source           DF   SS        MS       F       P
Regression        2   159.902   79.951   33.40   0.000
Residual Error   75   179.525    2.394
Total            77   339.427

Predicted GPA = −3.88 + 0.0772 IQ + 0.0513 SC. R² = 47.1% of the variation is explained by IQ and SC together, so SC added 47.1% − 40.2% ≈ 7%.

General Linear Models. A model that may be written in the form
Y = β₀ + β₁X₁ + β₂X₂ + … + βₚXₚ + ε
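The quantities in the Minitab output follow directly from the printed sums of squares; a quick arithmetic check:

```python
# Checking the arithmetic in the output above: the F statistic, R^2,
# and the extra variation attributed to self-concept all follow from
# the sums of squares in the ANOVA table.
SSM, SSE, SST = 159.902, 179.525, 339.427
DFM, DFE = 2, 75

F = (SSM / DFM) / (SSE / DFE)       # 79.951 / 2.394, about 33.4
R2 = SSM / SST                      # about 0.471
sc_added = R2 - 0.402               # R^2 gain over the IQ-only model
print(round(F, 1), round(100 * R2, 1), round(100 * sc_added, 1))
```

The R² gain of about 7 percentage points is what the output summarizes as "SC added ≈ 7%."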
and is linear in the β's is referred to as a General Linear Model (GLM). The X's cannot be a linear combination of the other X's.

E.g. if X₃ = X₁ + X₂, then
Y = β₀ + β₁X₁ + β₂X₂ + β₃X₃ + ε
  = β₀ + β₁X₁ + β₂X₂ + β₃(X₁ + X₂) + ε
  = β₀ + (β₁ + β₃)X₁ + (β₂ + β₃)X₂ + ε,
so β₁, β₂, β₃ cannot be estimated separately. (The X's may themselves be nonlinear functions of other variables, e.g. X₂ = X₁²; the model only needs to be linear in the β's.)

If our model is not a GLM, use transformations. E.g. Y = β₀ e^(β₁X) ε is not linear in the β's, but taking logs gives
ln Y = ln β₀ + β₁X + ln ε,
which is a GLM in Y* = ln Y with intercept β₀* = ln β₀ and error ε* = ln ε.
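The log-transformation trick above can be sketched numerically: generate data from the multiplicative model, fit the transformed (linear) model by least squares, and back-transform the intercept. All data and parameter values here are synthetic.

```python
# Sketch: Y = b0 * exp(b1 * X) * eps is not linear in the betas, but
# ln Y = ln b0 + b1 X + ln eps is, so fit it by ordinary least squares.
# Synthetic data with multiplicative lognormal noise.
import numpy as np

rng = np.random.default_rng(1)
b0, b1 = 2.0, 0.5                                  # true parameters
x = np.linspace(0, 4, 50)
y = b0 * np.exp(b1 * x) * np.exp(rng.normal(scale=0.05, size=x.size))

# Fit the transformed model ln y = ln b0 + b1 x by least squares.
slope, intercept = np.polyfit(x, np.log(y), 1)
b0_hat, b1_hat = np.exp(intercept), slope          # back-transform
print(round(b0_hat, 2), round(b1_hat, 2))
```

The back-transformed estimates should land close to the true β₀ = 2.0 and β₁ = 0.5.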
