
In a beginning statistics course, the computational formulas for inference in regression settings are most often simply given to the students. Some attempt is made to illustrate why the components of each formula make sense, but the derivations are beyond the scope of the course.

For advanced students, however, these formulas become applications of the expected value theorems studied earlier in the year. To derive the regression inference equations, students must remember that $\operatorname{Var}(kX) = k^2 \operatorname{Var}(X)$ and that, when $X$ and $Y$ are independent, $\operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y)$. (For independent $X$ and $Y$, $\operatorname{Var}(XY) = \operatorname{Var}(X)\operatorname{Var}(Y)$ only when both means are zero; in general, $\operatorname{Var}(XY) = \operatorname{Var}(X)\operatorname{Var}(Y) + \operatorname{Var}(X)\,[E(Y)]^2 + \operatorname{Var}(Y)\,[E(X)]^2$.) Finally,
$$\operatorname{Var}(\bar{X}_n) = \frac{\operatorname{Var}(X)}{n}.$$
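These theorems can be illustrated numerically. The sketch below (assuming NumPy is available; the distributions, sample sizes, and tolerances are arbitrary choices for this illustration) checks each one on simulated data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Two independent normal samples with known variances 4 and 9.
x = rng.normal(0.0, 2.0, n)
y = rng.normal(0.0, 3.0, n)

# Var(kX) = k^2 Var(X): holds exactly for any sample, not just in expectation.
k = 5.0
print(np.isclose(np.var(k * x), k**2 * np.var(x)))          # True

# Var(X + Y) = Var(X) + Var(Y) for independent X and Y (approximate).
print(abs(np.var(x + y) - (np.var(x) + np.var(y))) < 0.5)   # True

# Var(X̄_n) = Var(X)/n: the variance of many sample means of size m.
m = 100
means = x.reshape(-1, m).mean(axis=1)   # 2000 sample means of size m
print(abs(np.var(means) - 4.0 / m) < 0.01)                  # True
```

The first identity is exact for any data set; the other two hold up to simulation error, which is why the checks use loose tolerances.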

In addition, the Modeling Assumptions for Regression are:

1. There is a normally distributed subpopulation of responses for each value of the explanatory variable, and these subpopulations all have a common variance. So, $y \mid x \sim N(\mu_{y|x}, \sigma_e)$.

2. The means of the subpopulations fall on a straight-line function of the explanatory variable. This means that $\mu_{y|x} = \alpha + \beta x$ and that $\hat{y} = a + bx$ estimates the mean response for a given value of the explanatory variable. These two assumptions combine as $Y = \alpha + \beta X + \varepsilon$ with $\varepsilon \sim N(0, \sigma_e)$.

3. The selection of an observation from any of the subpopulations is independent of the selection of any other observation. The values of the explanatory variable are assumed to be fixed. This fixed (and known) value for the independent variable is essential for developing the formulae.

The key to understanding the various standard errors for regression is to realize that the variation of interest comes from the distribution of $y$ around $\mu_{y|x}$. This variation is $\varepsilon \sim N(0, \sigma_e)$.

Now, if we let $X_i = x_i - \bar{x}$ and $Y_i = y_i - \bar{y}$, then
$$b = \frac{\sum_i X_i Y_i}{\sum_i X_i^2}.$$
All of the regression inference formulas below are built from this expression for $b$.

To see that this is true, consider $\hat{y} = \bar{y} + b(x - \bar{x})$. In this form, we have a one-variable problem. Since we know all the individual values of $x$ and $y$, and, consequently, the means $\bar{x}$ and $\bar{y}$, we can use first-semester calculus to solve for $b$. Define
$$S = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 = \sum_{i=1}^{n} \left( y_i - \left( \bar{y} + b(x_i - \bar{x}) \right) \right)^2 .$$
Now, let $X_i = x_i - \bar{x}$ and $Y_i = y_i - \bar{y}$, so $S = \sum_{i=1}^{n} (Y_i - bX_i)^2$ and
$$\frac{dS}{db} = \sum_{i=1}^{n} 2 (Y_i - bX_i)(-X_i).$$
If $\dfrac{dS}{db} = 0$, then $\sum_{i=1}^{n} \left( -X_i Y_i + bX_i^2 \right) = 0$. Solving for $b$, we find
$$b \sum_{i=1}^{n} X_i^2 = \sum_{i=1}^{n} X_i Y_i \qquad\text{and}\qquad b = \frac{\sum_i X_i Y_i}{\sum_i X_i^2}.$$
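The centered-sums formula for $b$ is easy to verify numerically. A minimal sketch, assuming NumPy is available (the five data points are made up for this illustration):

```python
import numpy as np

# Small illustrative data set (made up for this sketch).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 3.0, 5.0, 4.0, 6.0])

# Centered variables X_i = x_i - x̄ and Y_i = y_i - ȳ.
X = x - x.mean()
Y = y - y.mean()

# b = ΣX_iY_i / ΣX_i², the least-squares slope from the derivation.
b = (X * Y).sum() / (X ** 2).sum()
a = y.mean() - b * x.mean()      # intercept, since ŷ = ȳ + b(x - x̄)
print(b, a)                      # ≈ 0.9 and 1.3

# Cross-check against NumPy's own least-squares fit.
slope, intercept = np.polyfit(x, y, 1)
print(np.isclose(b, slope), np.isclose(a, intercept))   # True True
```

The agreement with `np.polyfit` confirms that minimizing $S$ by calculus produces the same slope as the standard least-squares routine.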

To compute a confidence interval for $\beta$, we need to determine the variance of $b$, using the expected value theorems. Since $b = \dfrac{\sum_i X_i Y_i}{\sum_i X_i^2}$, we compute $\operatorname{Var}(b) = \operatorname{Var}\!\left( \dfrac{\sum_i X_i Y_i}{\sum_i X_i^2} \right)$. Since the values of $X$ are assumed to be fixed, $\sum_i X_i^2$ in the denominator is a constant. So,
$$\operatorname{Var}\!\left( \frac{\sum_i X_i Y_i}{\sum_i X_i^2} \right) = \frac{1}{\left( \sum_i X_i^2 \right)^2} \operatorname{Var}\!\left( \sum_i X_i Y_i \right),$$
and $\operatorname{Var}\!\left( \sum_i X_i Y_i \right) = \operatorname{Var}(X_1 Y_1 + X_2 Y_2 + \cdots + X_n Y_n)$. The $X$s are constants and we are interested in the variation of $Y_i$ for the given $X_i$, which is the common variance $\sigma_e^2$. So,
$$\operatorname{Var}(X_1 Y_1 + X_2 Y_2 + \cdots + X_n Y_n) = X_1^2 \operatorname{Var}(Y_1) + X_2^2 \operatorname{Var}(Y_2) + \cdots + X_n^2 \operatorname{Var}(Y_n) = \sigma_e^2 \sum_i X_i^2 .$$
Putting it all together, we find
$$\operatorname{Var}(b) = \frac{\sigma_e^2 \sum_i X_i^2}{\left( \sum_i X_i^2 \right)^2} = \frac{\sigma_e^2}{\sum_i X_i^2}.$$
This is often written as
$$\operatorname{Var}(b) = \frac{\sigma_e^2}{\sum_i (x_i - \bar{x})^2}.$$

So, the standard error for the slope in regression can be estimated by
$$s_b = \frac{s_e}{\sqrt{\sum_i (x_i - \bar{x})^2}} \qquad\text{or}\qquad s_b = \frac{s_e}{\sqrt{n-1}\; s_x}.$$
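Both forms of $s_b$ can be checked numerically against a library fit. A sketch assuming NumPy and SciPy are available (the data are made up for this illustration):

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 3.0, 5.0, 4.0, 6.0])
n = len(x)

# Fit, then residual standard error s_e with n - 2 degrees of freedom.
res = stats.linregress(x, y)
resid = y - (res.intercept + res.slope * x)
s_e = np.sqrt((resid ** 2).sum() / (n - 2))

Sxx = ((x - x.mean()) ** 2).sum()
s_b1 = s_e / np.sqrt(Sxx)                      # s_e / sqrt(Σ(x_i - x̄)²)
s_b2 = s_e / (np.sqrt(n - 1) * x.std(ddof=1))  # s_e / (sqrt(n-1) · s_x)

print(np.isclose(s_b1, s_b2))          # True
print(np.isclose(s_b1, res.stderr))    # True
```

The two forms agree because $\sum_i (x_i - \bar{x})^2 = (n-1)\,s_x^2$, and both match the slope standard error that `scipy.stats.linregress` reports.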

Confidence intervals for a predicted mean can now be obtained. The standard error can be determined by computing $\operatorname{Var}(\hat{y})$. We know that $\hat{y} = \bar{y} + b(x - \bar{x})$, so, as before, using the expected value theorems, we find
$$\operatorname{Var}(b) = \frac{\sigma_e^2}{\sum_i (x_i - \bar{x})^2} \qquad\text{and}\qquad \operatorname{Var}(\bar{y}) = \operatorname{Var}\!\left( \frac{\sum_i y_i}{n} \right) = \frac{n \sigma_e^2}{n^2} = \frac{\sigma_e^2}{n}.$$
So,
$$\operatorname{Var}(\hat{y}) = \operatorname{Var}\!\left( \bar{y} + b(x - \bar{x}) \right) = \frac{\sigma_e^2}{n} + \frac{\sigma_e^2 (x - \bar{x})^2}{\sum_i (x_i - \bar{x})^2}.$$
The standard error for predicting a mean response for a given value of $x$ can be estimated by
$$s_{\hat{y}} = s_e \sqrt{ \frac{1}{n} + \frac{(x - \bar{x})^2}{\sum_i (x_i - \bar{x})^2} } .$$
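This standard error is easy to compute directly from the centered sums. A minimal sketch, assuming NumPy is available (the data and the evaluation point $x_0 = 4$ are made up for this illustration):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 3.0, 5.0, 4.0, 6.0])
n = len(x)

# Least-squares fit via the centered-sums formula derived above.
X, Y = x - x.mean(), y - y.mean()
b = (X * Y).sum() / (X ** 2).sum()
resid = Y - b * X                    # same residuals as y - (a + b·x)
s_e = np.sqrt((resid ** 2).sum() / (n - 2))

# Standard error of the estimated mean response at x0.
x0 = 4.0
s_yhat = s_e * np.sqrt(1 / n + (x0 - x.mean()) ** 2 / (X ** 2).sum())
print(round(s_yhat, 4))              # 0.4359
```

Note that $s_{\hat{y}}$ grows as $x_0$ moves away from $\bar{x}$: the $(x - \bar{x})^2$ term is zero at the mean of the $x$ values and dominates far from it.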

The variance of the intercept $a$ can be estimated using the previous formula for the standard error of $\hat{y}$. Since $\hat{y} = a + bx$, the variance of $a$ is the variance of $\hat{y}$ when $x = 0$. So,
$$s_a = s_e \sqrt{ \frac{1}{n} + \frac{\bar{x}^2}{\sum_i (x_i - \bar{x})^2} } .$$
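Substituting $x = 0$ into the predicted-mean formula can be verified against a library fit. A sketch assuming NumPy and SciPy are available (SciPy ≥ 1.7 is assumed for the `intercept_stderr` attribute; the data are made up for this illustration):

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 3.0, 5.0, 4.0, 6.0])
n = len(x)

res = stats.linregress(x, y)
resid = y - (res.intercept + res.slope * x)
s_e = np.sqrt((resid ** 2).sum() / (n - 2))

# s_a is the standard error of ŷ evaluated at x = 0.
Sxx = ((x - x.mean()) ** 2).sum()
s_a = s_e * np.sqrt(1 / n + x.mean() ** 2 / Sxx)

# SciPy >= 1.7 reports the same quantity as intercept_stderr.
print(np.isclose(s_a, res.intercept_stderr))   # True
```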

Finally, to predict a $y$-value, $y_p$, for a given $x$, we need to consider two independent errors. We know that $y$ is normally distributed around $\mu_{y|x}$, so $y \mid x \sim N(\mu_{y|x}, \sigma_e)$. Given $\mu_{y|x}$, we can estimate our error in predicting $y$. But, as we have just seen, there is also variation in our estimate of $\mu_{y|x}$. First, we estimate $\mu_{y|x}$, taking into account its own variation, and then we use that estimate in predicting $y$. So
$$\operatorname{Var}(y_p) = \frac{\sigma_e^2}{n} + \frac{\sigma_e^2 (x - \bar{x})^2}{\sum_i (x_i - \bar{x})^2} + \sigma_e^2 = \sigma_e^2 \left( 1 + \frac{1}{n} + \frac{(x - \bar{x})^2}{\sum_i (x_i - \bar{x})^2} \right),$$
and the standard error can be estimated by
$$s_{y_p} = s_e \sqrt{ 1 + \frac{1}{n} + \frac{(x - \bar{x})^2}{\sum_i (x_i - \bar{x})^2} } .$$
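The "two independent errors" structure means the prediction standard error and the mean-response standard error add in quadrature with $s_e$. A sketch assuming NumPy is available (the data and the point $x_0 = 4$ are made up for this illustration):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 3.0, 5.0, 4.0, 6.0])
n = len(x)

X, Y = x - x.mean(), y - y.mean()
b = (X * Y).sum() / (X ** 2).sum()
s_e = np.sqrt(((Y - b * X) ** 2).sum() / (n - 2))
Sxx = (X ** 2).sum()

x0 = 4.0
s_yhat = s_e * np.sqrt(1 / n + (x0 - x.mean()) ** 2 / Sxx)    # mean response
s_yp = s_e * np.sqrt(1 + 1 / n + (x0 - x.mean()) ** 2 / Sxx)  # new observation

# The two independent error sources add in quadrature:
print(np.isclose(s_yp ** 2, s_e ** 2 + s_yhat ** 2))   # True
print(s_yp > s_yhat)                                   # True
```

The prediction standard error is always larger than the mean-response standard error, since it carries the extra $\sigma_e^2$ term for the scatter of an individual observation.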

In summary:

Standard error for the slope:
$$s_b = \frac{s_e}{\sqrt{n-1}\; s_x}$$

Standard error for a predicted mean:
$$s_{\hat{y}} = s_e \sqrt{ \frac{1}{n} + \frac{(x - \bar{x})^2}{\sum_i (x_i - \bar{x})^2} }$$

Standard error for the intercept:
$$s_a = s_e \sqrt{ \frac{1}{n} + \frac{\bar{x}^2}{\sum_i (x_i - \bar{x})^2} }$$

Standard error for a predicted value:
$$s_{y_p} = s_e \sqrt{ 1 + \frac{1}{n} + \frac{(x - \bar{x})^2}{\sum_i (x_i - \bar{x})^2} }$$

Reference: Kennedy, John B., and Adam M. Neville, Basic Statistical Methods for Engineers and Scientists, 3rd ed., Harper and Row, 1986.
