
of the regression of the first residual on the second. Notice, however, that neither of the coefficients of time in the separate trend analyses is an estimate of the β₃ shift parameter.¹¹

Multiple Regression Coefficients

Under the conditions shown in the next section, the normal equations solve for b = (X'X)⁻¹X'y. The residuals from the LS regression may then be expressed as

e = y − Xb = y − X(X'X)⁻¹X'y = My    (3.17)

where M = I − X(X'X)⁻¹X'
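As a quick numerical check, the residual-maker matrix M can be built directly from a small simulated data set (a sketch using numpy; the data are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = rng.normal(size=n)

# Normal-equation solution b = (X'X)^(-1) X'y
b = np.linalg.solve(X.T @ X, X.T @ y)

# Residual-maker matrix M = I - X(X'X)^(-1)X'
M = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)

e = y - X @ b                    # residuals from the fitted regression
assert np.allclose(e, M @ y)     # Eq. (3.17): e = My
assert np.allclose(M, M.T)       # M is symmetric
assert np.allclose(M @ M, M)     # ... and idempotent
assert np.allclose(M @ X, 0)     # MX = 0
assert np.allclose(M @ e, e)     # Me = e
```

The assertions confirm the properties claimed in the text: M is symmetric and idempotent, annihilates X, and leaves the residual vector unchanged.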

It is readily seen that M is a symmetric, idempotent matrix. It also has the properties that MX = 0 and Me = e. Now write the general regression in partitioned form as

y = X*b(2) + x₂b₂ + e

In this partitioning x₂ is the n × 1 vector of observations on X₂, with coefficient b₂, and X* is the n × (k − 1) matrix of all the other variables (including the column of ones), with coefficient vector b(2).¹² The normal equations for this setup are

X*'X* b(2) + X*'x₂ b₂ = X*'y
x₂'X* b(2) + x₂'x₂ b₂ = x₂'y

We wish to solve for b₂ and interpret the result in terms of a simple regression slope. The solution is¹³

b₂ = (x₂'M*x₂)⁻¹(x₂'M*y)    (3.18)

where M* = I − X*(X*'X*)⁻¹X*'

Now by analogy with Eq. (3.17) it follows that

M*y is the vector of residuals when y is regressed on X*

and M*x₂ is the vector of residuals when x₂ is regressed on X*

Regressing the first vector on the second gives a slope coefficient, which, using the symmetry and idempotency of M*, gives the b₂ coefficient already defined in Eq. (3.18). This general result has already been illustrated for the three-variable case.
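This result (the Frisch–Waugh theorem) is easy to verify numerically: the coefficient on x₂ from the full regression equals the slope from regressing the two residual vectors on each other. A minimal sketch with simulated data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
Xstar = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # X*, incl. constant
x2 = rng.normal(size=n)                                         # variable of interest
y = Xstar @ [1.0, 2.0, -1.0] + 0.5 * x2 + rng.normal(size=n)

# Full regression: b2 is the coefficient on the last column
X = np.column_stack([Xstar, x2])
b_full = np.linalg.lstsq(X, y, rcond=None)[0]

# Partialled-out version: residuals of y and x2 on X*, then a simple slope
Mstar = np.eye(n) - Xstar @ np.linalg.solve(Xstar.T @ Xstar, Xstar.T)
ey, ex2 = Mstar @ y, Mstar @ x2
b2_fwl = (ex2 @ ey) / (ex2 @ ex2)   # Eq. (3.18): (x2' M* x2)^(-1) x2' M* y

assert np.allclose(b_full[-1], b2_fwl)
```

Both routes give the same b₂, which is exactly the equivalence the text establishes.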

A simpler and more elegant way of proving the same result is as follows. Write the partitioned regression as

y = X*b(2) + x₂b₂ + e

¹² Note that this is a different use of the star subscript than in an earlier section, where it was used to indicate data in deviation form.

¹³ See Appendix 3.2.

CHAPTER 3: The k-Variable Linear Equation

The more the X's look alike, the more imprecise is the attempt to estimate their relative effects. This situation is referred to as multicollinearity or collinearity. With perfect or exact collinearity the standard errors go to infinity. Exact collinearity means that the columns of X are linearly dependent, and so the LS vector cannot be estimated.
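The effect is easy to see numerically: with an exactly dependent column, X'X is singular and the normal equations have no unique solution; with a nearly dependent column, the coefficient standard errors blow up. A sketch with invented data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
x1 = rng.normal(size=n)

# Exact collinearity: the third column is 2*x1, so X has deficient rank
X_exact = np.column_stack([np.ones(n), x1, 2 * x1])
assert np.linalg.matrix_rank(X_exact) < X_exact.shape[1]  # X'X is singular

def coef_std_errors(X, y):
    # Conventional LS standard errors: sqrt of the diagonal of s^2 (X'X)^(-1)
    n, k = X.shape
    b = np.linalg.solve(X.T @ X, X.T @ y)
    e = y - X @ b
    s2 = e @ e / (n - k)
    return np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))

y = 1 + x1 + rng.normal(size=n)
x2_indep = rng.normal(size=n)                   # unrelated second regressor
x2_close = 2 * x1 + 0.01 * rng.normal(size=n)   # almost a copy of x1

se_indep = coef_std_errors(np.column_stack([np.ones(n), x1, x2_indep]), y)
se_close = coef_std_errors(np.column_stack([np.ones(n), x1, x2_close]), y)
assert se_close[1] > 10 * se_indep[1]   # x1's standard error explodes
```

As the two columns become more alike, the standard error on x1 grows without bound, illustrating the limiting case described above.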

3.4.3 Estimation of σ²

The disturbance variance σ² is unknown. It is reasonable to base an estimate on the residual sum of squares from the fitted regression. Following Eq. (3.17), we write e = My = M(Xβ + u) = Mu, since MX = 0.

Thus, E(e'e) = E(u'M'Mu) = E(u'Mu)

Utilizing the fact that the trace of a scalar is the scalar, we write

E(u'Mu) = E[tr(u'Mu)]
= E[tr(uu'M)]
= σ² tr(M)    (since E(uu') = σ²I)
= σ² tr I − σ² tr[X(X'X)⁻¹X']
= σ² tr I − σ² tr[(X'X)⁻¹(X'X)]
= σ²(n − k)

Thus,

s² = e'e / (n − k)

defines an unbiased estimator of σ². The square root s is the standard deviation of the Y values about the regression plane. It is often referred to as the standard error of estimate or the standard error of the regression (SER).
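Unbiasedness of s² = e'e/(n − k) can be checked by simulation; the sketch below sets a known σ² (here 4.0, an invented value) and averages s² over repeated draws of the disturbances:

```python
import numpy as np

rng = np.random.default_rng(3)
n, k, sigma2 = 30, 4, 4.0
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta = np.array([1.0, 2.0, -1.0, 0.5])

s2_draws = []
for _ in range(5000):
    u = rng.normal(scale=np.sqrt(sigma2), size=n)   # disturbances with variance sigma^2
    y = X @ beta + u
    b = np.linalg.solve(X.T @ X, X.T @ y)
    e = y - X @ b
    s2_draws.append(e @ e / (n - k))                # s^2 = e'e / (n - k)

# The average of s^2 recovers sigma^2; dividing by n instead of n - k
# would be biased downward by the factor (n - k)/n.
assert abs(np.mean(s2_draws) - sigma2) < 0.1
```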

This is the fundamental least-squares theorem. It states that, conditional on the assumptions made, no other linear, unbiased estimator of the β coefficients can have smaller sampling variances than those of the least-squares estimator, given in Eq. (3.25). We prove a more general result relating to any linear combination of the β coefficients. Let c denote an arbitrary k-element vector of known constants, and define a scalar quantity μ as

μ = c'β

If we choose c' = [0 1 0 ⋯ 0], then μ = β₂. Thus, we can pick out any single element in β. If we choose

c' = [1 X_{2,n+1} ⋯ X_{k,n+1}]

then μ = E(Y_{n+1})
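Both choices of c are easy to illustrate: a unit vector picks out a single estimated coefficient, while a row of regressor values for a hypothetical new observation gives the point estimate of E(Y_{n+1}). A sketch with invented numbers:

```python
import numpy as np

rng = np.random.default_rng(4)
n, k = 40, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = X @ [1.0, 2.0, -1.0] + rng.normal(size=n)
b = np.linalg.solve(X.T @ X, X.T @ y)

# c' = [0 1 0] picks out the second coefficient: c'b = b2
c = np.array([0.0, 1.0, 0.0])
assert np.isclose(c @ b, b[1])

# c' = [1 X_{2,n+1} X_{3,n+1}] gives the estimate of E(Y_{n+1});
# the regressor values here are hypothetical
x_new = np.array([1.0, 0.3, -0.8])
y_hat = x_new @ b   # point estimate of E(Y_{n+1})
```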
