27-02-2020
Session: 12a
Multiple Regression
General form of multiple regression model is as
follows:

Y = β0 + β1X1 + β2X2 + β3X3 + . . . + βkXk + e

which is estimated by the following equation:

Y = a + b1X1 + b2X2 + b3X3 + . . . + bkXk
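As an illustration, the estimated equation can be obtained by ordinary least squares. This is a minimal sketch on synthetic data with two predictors; the data and coefficient values are hypothetical, not from the slides.

```python
import numpy as np

# Hypothetical data: y depends on two predictors X1 and X2 plus noise.
rng = np.random.default_rng(0)
n = 50
X = rng.normal(size=(n, 2))                        # predictors X1, X2
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=n)

# Design matrix with a leading column of ones for the intercept a.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)       # [a, b1, b2]
a, b1, b2 = coef
```

With little noise, the estimates land close to the true values a = 1.0, b1 = 2.0, b2 = -0.5 used to generate the data.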
Multiple Regression
Sample regression output:

Multiple R       0.97210
R²               0.94498
Adjusted R²      0.93276
Standard Error   0.85974

ANALYSIS OF VARIANCE (df, Sum of Squares, Mean Square)
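Summary statistics of this kind can be computed from an OLS fit. A sketch on synthetic data (the numbers will not match the output above, whose dataset the slide does not give):

```python
import numpy as np

# Hypothetical data with three predictors.
rng = np.random.default_rng(5)
n, k = 30, 3
X = rng.normal(size=(n, k))
y = 1 + X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.5, size=n)

A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ coef
ss_res = resid @ resid                       # residual sum of squares
ss_tot = (y - y.mean()) @ (y - y.mean())     # total sum of squares

r2 = 1 - ss_res / ss_tot                     # R²
multiple_r = np.sqrt(r2)                     # Multiple R
std_error = np.sqrt(ss_res / (n - k - 1))    # standard error of estimate
```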
Significance Testing
H0 : R²pop = 0
H0 : β1 = β2 = β3 = . . . = βk = 0

F = (SSreg / k) / (SSres / (n - k - 1)) = (R² / k) / ((1 - R²) / (n - k - 1))
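As a worked illustration, the F statistic can be computed directly from R². The R² below is taken from the sample output above; k = 2 and n = 12 are hypothetical values, since the slide does not state them.

```python
# R² from the sample output; k and n are assumed for illustration.
r2 = 0.94498
k, n = 2, 12
F = (r2 / k) / ((1 - r2) / (n - k - 1))   # overall F statistic, df = (k, n-k-1)
```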
Significance Testing
Each partial regression coefficient b is tested with:

t = b / SEb
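A sketch of this t test on synthetic data, computing SE_b from the usual OLS variance formula Var(b) = s²(AᵀA)⁻¹ with s² = SSres / (n - k - 1). The data and coefficients are hypothetical; here X2 has no true effect, so its t value should be small.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 40, 2
X = rng.normal(size=(n, k))
y = 0.5 + 1.5 * X[:, 0] + rng.normal(scale=0.3, size=n)   # X2 is irrelevant

A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ coef
s2 = resid @ resid / (n - k - 1)                     # residual variance s²
se = np.sqrt(s2 * np.diag(np.linalg.inv(A.T @ A)))   # SE of each coefficient
t = coef / se                                        # one t value per coefficient
```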
Stepwise Regression
The purpose of stepwise regression is to select, from a
large number of predictor variables, a small subset of
variables that account for most of the variation in the
dependent or criterion variable.
– In this procedure, predictor variables enter or are removed from
the regression equation one at a time.
– It has three main approaches: forward inclusion, backward
elimination, and the stepwise solution.
Stepwise Regression
Forward inclusion. Initially, there are no predictor variables in the
regression equation. Predictor variables are entered one at a time,
only if they meet certain criteria specified in terms of the F ratio.
The order in which variables are included is based on their
contribution to the explained variance.
Backward elimination. Initially, all predictor variables are
included in the regression equation. Predictors are then removed
one at a time based on the F ratio for removal.
Stepwise solution. Forward inclusion is combined with the removal of
predictors that no longer meet the specified criterion at each step.
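The forward-inclusion idea can be sketched as follows. This toy version adds, at each step, the predictor that most reduces the residual sum of squares and stops when the improvement falls below a threshold; real implementations use an F-to-enter criterion rather than this raw-RSS rule, and the threshold here is an arbitrary illustrative choice.

```python
import numpy as np

def rss(A, y):
    """Residual sum of squares of the OLS fit of y on columns of A."""
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ coef
    return r @ r

def forward_select(X, y, min_improvement=1.0):
    """Greedy forward inclusion; returns indices of chosen predictors in entry order."""
    n, p = X.shape
    chosen, remaining = [], list(range(p))
    current = np.ones((n, 1))                 # start from the intercept-only model
    best_rss = rss(current, y)
    while remaining:
        # Try each remaining predictor and keep the one with the lowest RSS.
        scores = [(rss(np.column_stack([current, X[:, [j]]]), y), j)
                  for j in remaining]
        new_rss, j = min(scores)
        if best_rss - new_rss < min_improvement:
            break                             # no candidate improves enough
        chosen.append(j)
        remaining.remove(j)
        current = np.column_stack([current, X[:, [j]]])
        best_rss = new_rss
    return chosen
```

On data where only two of five predictors matter, the procedure picks those two, entering the stronger contributor first.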
Caution about R²
The value of R² can be “artificially” increased simply by
adding explanatory variables to the regression model.
– When comparing two regression models with the same dependent
variable ‘y’ but differing numbers of explanatory variables, the
model with the higher R² value is not necessarily the better one.
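This caution can be demonstrated directly: in-sample R² never decreases when a predictor is added, even a pure-noise one. A sketch on hypothetical synthetic data:

```python
import numpy as np

def r_squared(A, y):
    """In-sample R² of the OLS fit of y on columns of A."""
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(4)
n = 30
x = rng.normal(size=n)
y = 2 * x + rng.normal(size=n)

A1 = np.column_stack([np.ones(n), x])            # intercept + one real predictor
A2 = np.column_stack([A1, rng.normal(size=n)])   # add an irrelevant noise column
r2_small, r2_big = r_squared(A1, y), r_squared(A2, y)
```

The larger model's R² is at least as high, even though the added column carries no information about y.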
Adjusted R²
For comparing two regression models, it is advisable to
compute the adjusted R²:

Adjusted R² = 1 − (1 − R²)(N − 1) / (N − K − 1)

Where
• K is the number of independent variables in the model, excluding
the constant.
• N is the number of points in your data sample.
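A worked illustration of this formula. R² is taken from the sample regression output above; K = 2 and N = 12 are hypothetical (the slide does not state them), though they closely reproduce the Adjusted R² of 0.93276 shown in that output.

```python
# R² from the sample output; K and N are assumed for illustration.
r2, K, N = 0.94498, 2, 12
adj_r2 = 1 - (1 - r2) * (N - 1) / (N - K - 1)   # ≈ 0.9328
```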
Residuals
[Plot: residuals plotted against time]
Multicollinearity
It arises when intercorrelations among the predictors are
very high.
• A few problems due to multicollinearity:
– Partial regression coefficients may not be estimated precisely.
Standard errors are likely to be high.
– The magnitudes, as well as the signs, of partial regression
coefficients may change from sample to sample.
– It becomes difficult to assess the relative importance of independent
variables in explaining the variation in the dependent variable.
– Predictor variables may be incorrectly included in or removed from
a stepwise regression.
Multicollinearity: Correction
• A simple procedure for adjusting for multicollinearity consists of
using only one of the variables in a highly correlated set of
variables.
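This simple correction can be sketched as a screen on pairwise correlations, keeping only one variable from each highly correlated set. The 0.9 cutoff is an arbitrary illustrative choice, not a rule from the slides.

```python
import numpy as np

def drop_collinear(X, threshold=0.9):
    """Return indices of columns to keep, skipping any column whose
    correlation with an already-kept column exceeds the threshold."""
    corr = np.corrcoef(X, rowvar=False)
    keep = []
    for j in range(X.shape[1]):
        if all(abs(corr[j, i]) < threshold for i in keep):
            keep.append(j)
    return keep
```

With three predictors where the second nearly duplicates the first, only the first and third are retained.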
THANK YOU
3/3/2020
Factor Analysis