27-02-2020

BUSINESS RESEARCH METHOD

Session: 12a

Multiple Regression
The general form of the multiple regression model is as follows:

Y = β0 + β1X1 + β2X2 + β3X3 + ... + βkXk + e

which is estimated by the following equation:

Ŷ = a + b1X1 + b2X2 + b3X3 + ... + bkXk

• As before, the coefficient a represents the intercept, but the b's are now partial regression coefficients.
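
A minimal sketch of such an estimation in Python is shown below. The data are randomly generated placeholders (not the slides' example data), and the choice of library (statsmodels) is an assumption made for illustration only.

# Minimal sketch: estimating Y = a + b1*X1 + b2*X2 by ordinary least squares.
# The data below are synthetic placeholders, not the example used in these slides.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 12
X = rng.normal(size=(n, 2))                                  # two predictor variables
y = 0.3 + 0.5 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=0.5, size=n)

model = sm.OLS(y, sm.add_constant(X)).fit()                  # add_constant supplies the intercept a
print(model.params)                                          # [a, b1, b2]: intercept and partial regression coefficients
print(model.rsquared, model.rsquared_adj, model.fvalue)      # R2, adjusted R2, overall F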


Multiple Regression
Multiple R        0.97210
R²                0.94498
Adjusted R²       0.93276
Standard Error    0.85974

ANALYSIS OF VARIANCE
              df    Sum of Squares    Mean Square
Regression     2         114.26425       57.13213
Residual       9           6.65241        0.73916

F = 77.29364    Significance of F = 0.0000

VARIABLES IN THE EQUATION
Variable        b         SEb       Beta (β)     T       Significance of T
IMPORTANCE    0.28865    0.08608    0.31382    3.353         0.0085
DURATION      0.48108    0.05895    0.76363    8.160         0.0000
(Constant)    0.33732    0.56736               0.595         0.5668

Significance Testing

H0: R²pop = 0

This is equivalent to the following null hypothesis:

H0: β1 = β2 = β3 = ... = βk = 0

The overall test can be conducted by using an F statistic:

F = (SSreg / k) / (SSres / (n - k - 1))
  = (R² / k) / ((1 - R²) / (n - k - 1))

which has an F distribution with k and (n - k - 1) degrees of freedom.
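
As a quick check, the F value in the output shown earlier can be recomputed from this formula using R² = 0.94498 and k = 2; n = 12 is inferred from the residual degrees of freedom (9 = n - k - 1). This is our arithmetic, not part of the slides.

# Sketch: overall F statistic recomputed from R-squared, k, and n of the earlier output.
R2, k, n = 0.94498, 2, 12
F = (R2 / k) / ((1 - R2) / (n - k - 1))
print(F)   # roughly 77.3, matching "F = 77.29364" up to the rounding of R2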


Significance Testing

Testing for the significance of the βi's can be done in a manner similar to that in the bivariate case by using t tests. The significance of the partial coefficient for importance attached to weather may be tested by the following equation:

t = b / SEb

which has a t distribution with n - k - 1 degrees of freedom.
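
As an illustration (our sketch, not part of the slides), the T value reported for the IMPORTANCE coefficient in the earlier output follows directly from this formula:

# Sketch: t statistic for the IMPORTANCE coefficient (b = 0.28865, SEb = 0.08608).
b, se_b = 0.28865, 0.08608
t = b / se_b
print(round(t, 3))   # about 3.353, matching the T column, with n - k - 1 = 9 degrees of freedom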

Relative Importance of Predictors


 Statistical significance.
 Square of the simple correlation coefficient.
 Square of the partial correlation coefficient.
 Measures based on standardized coefficients or beta weights (a brief sketch follows this list).
 Stepwise regression.
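
One of these measures, the standardized (beta) coefficient, can be obtained by rescaling the unstandardized slope by the sample standard deviations. The function below is a minimal sketch; its name and inputs are illustrative, not from the slides.

# Sketch: beta weight for predictor j is b_j scaled by sd(X_j) / sd(Y).
import numpy as np

def beta_weights(b, X, y):
    # b: unstandardized slopes (one per column of X); X: n-by-k predictor array; y: criterion vector
    return np.asarray(b) * X.std(axis=0, ddof=1) / y.std(ddof=1)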


Stepwise Regression
 The purpose of stepwise regression is to select, from a large number of predictor variables, a small subset of variables that accounts for most of the variation in the dependent or criterion variable.
– In this procedure, predictor variables enter or are removed from the regression equation one at a time.
– There are several approaches: forward inclusion, backward elimination, and the stepwise solution.

Stepwise Regression
 Forward inclusion. Initially, there are no predictor variables in the regression equation. Predictor variables are entered one at a time, only if they meet certain criteria specified in terms of the F ratio. The order in which the variables are included is based on their contribution to the explained variance (see the sketch after this list).
 Backward elimination. Initially, all the predictor variables are included in the regression equation. Predictors are then removed one at a time based on the F ratio for removal.
 Stepwise solution. Forward inclusion is combined with the removal of predictors that no longer meet the specified criterion at each step.
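
A minimal sketch of forward inclusion is given below, assuming NumPy arrays X (candidate predictors) and y (criterion). The F-to-enter threshold of 4.0 is a common illustrative choice, not a value taken from the slides.

# Sketch: forward inclusion driven by a partial F-to-enter criterion.
import numpy as np

def ols_ssres(X, y):
    # Residual sum of squares for an OLS fit with an intercept (intercept-only if X has no columns).
    Xd = np.column_stack([np.ones(len(y)), X]) if X.size else np.ones((len(y), 1))
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return float(resid @ resid)

def forward_inclusion(X, y, f_to_enter=4.0):
    n, p = X.shape
    selected, remaining = [], list(range(p))
    while remaining:
        ss_current = ols_ssres(X[:, selected], y)
        best = None
        for j in remaining:
            trial = selected + [j]
            ss_trial = ols_ssres(X[:, trial], y)
            df_res = n - len(trial) - 1                           # residual df with intercept
            f_j = (ss_current - ss_trial) / (ss_trial / df_res)   # partial F for adding predictor j
            if best is None or f_j > best[1]:
                best = (j, f_j)
        if best[1] < f_to_enter:                                  # stop when no candidate meets F-to-enter
            break
        selected.append(best[0])
        remaining.remove(best[0])
    return selected

Backward elimination and the stepwise solution differ only in starting from the full predictor set and in re-testing already-entered variables for removal at each step.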


Caution about R²
 The value of R² can be “artificially” increased simply by adding explanatory variables to the regression model.
– When comparing two regression models with the same dependent variable y but a differing number of explanatory variables, the model with the higher R² value is not necessarily the better one.

 Adjusted R². R², the coefficient of multiple determination, is adjusted for the number of independent variables and the sample size to account for diminishing returns. After the first few variables, the additional independent variables do not make much of a contribution.

Adjusted R²
 For comparing two regression models, it is advisable to compute the adjusted R²:

Adjusted R² = 1 - (1 - R²)(N - 1) / (N - K - 1)

Where
• K is the number of independent variables in the model, excluding
the constant.
• N is the number of points in your data sample.
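
As a check (our arithmetic, not part of the slides), applying this formula to the earlier output with R² = 0.94498, K = 2, and N = 12 reproduces the reported adjusted R²:

# Sketch: adjusted R-squared recomputed from the earlier regression output.
R2, K, N = 0.94498, 2, 12
adj_R2 = 1 - (1 - R2) * (N - 1) / (N - K - 1)
print(round(adj_R2, 5))   # about 0.93275, matching the reported Adjusted R2 of 0.93276 up to rounding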


Residual Plot: Linear Relationship between Residuals & Time (Autocorrelation)

[Figure: residuals plotted against time; the residuals trend linearly with time, indicating autocorrelation. Axes: Residuals (vertical), Time (horizontal).]

Multicollinearity
 It arises when the intercorrelations among the predictors are very high (see the sketch below).
• Problems due to multicollinearity:
– Partial regression coefficients may not be estimated precisely; standard errors are likely to be high.
– The magnitudes, as well as the signs, of the partial regression coefficients may change from sample to sample.
– It becomes difficult to assess the relative importance of the independent variables in explaining the variation in the dependent variable.
– Predictor variables may be incorrectly included in or removed from the regression equation in stepwise regression.
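
A quick way to see this in data is to inspect the intercorrelation matrix of the predictors. The sketch below uses synthetic data in which two predictors are deliberately made nearly collinear; all names and values are illustrative.

# Sketch: flagging highly intercorrelated predictors with synthetic data.
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(size=50)
x2 = x1 + rng.normal(scale=0.1, size=50)   # nearly collinear with x1
x3 = rng.normal(size=50)
X = np.column_stack([x1, x2, x3])

corr = np.corrcoef(X, rowvar=False)        # 3 x 3 intercorrelation matrix of the predictors
print(np.round(corr, 2))                   # |r| close to 1 between x1 and x2 signals multicollinearity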


Multicollinearity: Correction
• A simple procedure for adjusting for multicollinearity consists of using only one of the variables in a highly correlated set of variables.

• Alternatively, the set of independent variables can be transformed into a new set of predictors that are mutually independent by using techniques such as principal components analysis (a bare-bones sketch follows).

• More specialized techniques, such as stepwise regression, can also be used.
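
The sketch below illustrates only the core of the principal components idea: center the predictors and rotate them onto orthogonal components, then regress the criterion on the component scores instead of the original variables. It is a simplified illustration, not a full principal components analysis with component selection.

# Sketch: replacing correlated predictors with mutually uncorrelated component scores.
import numpy as np

def principal_component_scores(X):
    # X: n-by-k array of predictors; returns n-by-k scores whose columns are uncorrelated
    Xc = X - X.mean(axis=0)                          # center each predictor
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt.T                                 # project onto the principal axes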


THANK YOU

3/3/2020

Business Research Method

Prof. Ravi Shekhar Kumar

XLRI- Xavier School of Management, Jamshedpur


ravishekhar@xlri.ac.in
Session: 12b + 13a

Factor Analysis
