
1. Describe the causes/consequences of: multicollinearity in regression

Multicollinearity in regression is a statistical phenomenon that occurs when two or more independent variables in a regression model are highly correlated. This can lead to a variety of issues, including:

Causes:

a) Highly correlated independent variables

b) Interaction effects

c) Measurement errors

d) High dimensionality

Consequences:

a) Difficulty in interpreting coefficients

b) Increased variance of coefficient estimates

c) Reduced precision of prediction

d) Masking the true relationship between the independent and dependent variables

e) Poor model fit
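For example, multicollinearity is commonly checked with variance inflation factors (VIFs). The sketch below uses statsmodels' variance_inflation_factor on made-up data (the variable names x1, x2, x3 and the sample size are purely illustrative); a VIF well above roughly 5-10 is a common rule of thumb for a problematic predictor.

import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.tools.tools import add_constant

# Hypothetical predictors; x2 is deliberately close to a multiple of x1,
# so its VIF (and x1's) should come out large.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = 2 * x1 + rng.normal(scale=0.1, size=200)   # nearly collinear with x1
x3 = rng.normal(size=200)
X = add_constant(pd.DataFrame({"x1": x1, "x2": x2, "x3": x3}))

# VIF for each predictor (skipping the constant); values above ~5-10
# are a common rule of thumb for problematic multicollinearity.
for i, name in enumerate(X.columns):
    if name != "const":
        print(name, variance_inflation_factor(X.values, i))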

2. Describe the causes/consequences of: heteroscedasticity in regression

Heteroscedasticity in regression occurs when the variance of the errors (the residuals) in a
regression model is not constant across the range of the independent variables. This can result in a
variety of issues, including:

Causes:

a) Non-linear relationships

b) Outliers

c) Heterogeneous population

d) Error variance that scales with the level of a predictor or the response (e.g., larger incomes vary more)

Consequences:

a) Incorrect inference

b) Increased variability of predictions

c) Reduced goodness of fit

d) Incorrect specification of the error distribution

e) Inefficient coefficient estimates (OLS coefficients remain unbiased, but their standard errors are biased)
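A common formal check for heteroscedasticity is the Breusch-Pagan test. Below is a minimal sketch using statsmodels' het_breuschpagan on synthetic data in which the error spread is deliberately made to grow with the predictor; the variable names and numbers are illustrative assumptions.

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

# Hypothetical data where the error spread grows with x,
# i.e. heteroscedasticity is built in on purpose.
rng = np.random.default_rng(1)
x = rng.uniform(1, 10, size=300)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5 * x)   # error variance grows with x

X = sm.add_constant(x)
results = sm.OLS(y, X).fit()

# Breusch-Pagan regresses the squared residuals on the predictors;
# a small p-value suggests non-constant error variance.
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(results.resid, results.model.exog)
print("LM p-value:", lm_pvalue)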

3. Describe the causes/consequences of autocorrelation in regression

Autocorrelation in regression occurs when the residuals (errors) in a regression model are not
independent and identically distributed, but are instead correlated over time or across observations.
This can result in a variety of issues, including:

Causes:

a) Time series data

b) Omitted variables or trends whose effects persist over time

c) Misspecification of the model

Consequences:

a) Incorrect inference

b) Reduced goodness of fit

c) Increased variability of predictions

d) Incorrect specification of the error distribution

e) Inefficient estimation of coefficients
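A standard first check for autocorrelation is the Durbin-Watson statistic. The sketch below uses statsmodels' durbin_watson on synthetic data with AR(1) errors; the variable names and the 0.8 autocorrelation coefficient are illustrative assumptions.

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

# Hypothetical time series whose errors follow an AR(1) process,
# so the residuals are positively autocorrelated by construction.
rng = np.random.default_rng(2)
n = 300
t = np.arange(n)
e = np.zeros(n)
for i in range(1, n):
    e[i] = 0.8 * e[i - 1] + rng.normal()
y = 0.5 + 0.3 * t + e

results = sm.OLS(y, sm.add_constant(t)).fit()

# Durbin-Watson is about 2 for independent residuals; values well below 2
# indicate positive autocorrelation, well above 2 negative autocorrelation.
print("Durbin-Watson:", durbin_watson(results.resid))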

4. Describe the causes/consequences of homoscedasticity in regression

Homoscedasticity in regression refers to a situation where the variance of the errors in a regression model is constant across the range of the independent variables.

Causes:

a) Linear relationship

b) Independence of residuals

c) Equal variance

d) Normal distribution of residuals

e) Lack of outliers

Consequences:

a) Valid statistical inference (reliable standard errors, confidence intervals, and hypothesis tests)

b) Good model performance (OLS coefficient estimates are efficient and predictions are stable)
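Homoscedasticity can also be checked directly, for instance with the Goldfeld-Quandt test. The sketch below uses statsmodels' het_goldfeldquandt on synthetic, deliberately homoscedastic data; a large p-value is consistent with constant error variance. Names and numbers are illustrative assumptions.

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_goldfeldquandt

# Hypothetical data with constant error variance, so the test
# should not reject homoscedasticity (a large p-value is expected).
rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=300)
y = 2.0 + 1.5 * x + rng.normal(scale=1.0, size=300)

X = sm.add_constant(x)

# Goldfeld-Quandt sorts the sample by x (column index 1) and compares
# residual variances in the low and high halves; similar variances
# support homoscedasticity.
f_stat, p_value, _ = het_goldfeldquandt(y, X, idx=1)
print("GQ F statistic:", f_stat, "p-value:", p_value)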

5. What does the assumption of heteroscedasticity in a regression model imply?

The assumption of heteroscedasticity in a regression model implies that the variance of the
errors in the model is not constant across the range of the independent variables.
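In practice, when heteroscedasticity is suspected, a common remedy is to keep the OLS coefficients but report heteroscedasticity-consistent (robust) standard errors. A minimal sketch with statsmodels on synthetic data (names and scales are assumptions):

import numpy as np
import statsmodels.api as sm

# Hypothetical heteroscedastic data: same coefficient estimates,
# but HC3 robust standard errors are used for inference.
rng = np.random.default_rng(4)
x = rng.uniform(1, 10, size=300)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5 * x)

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                     # classical standard errors
robust = sm.OLS(y, X).fit(cov_type="HC3")    # robust standard errors

print("classical SEs:", ols.bse)
print("robust SEs:   ", robust.bse)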

6. What does the assumption of multicollinearity in a regression model imply?

The assumption of multicollinearity in a regression model implies that there is a strong linear
relationship between two or more of the independent variables in the model.
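A quick first screen for such relationships is the pairwise correlation matrix of the predictors (the VIFs shown under question 1 additionally catch collinearity involving several variables at once). A minimal sketch with made-up data:

import numpy as np
import pandas as pd

# Hypothetical predictors; x2 is built to be strongly correlated with x1,
# so the correlation matrix should flag that pair.
rng = np.random.default_rng(5)
x1 = rng.normal(size=200)
df = pd.DataFrame({
    "x1": x1,
    "x2": 3 * x1 + rng.normal(scale=0.2, size=200),  # strongly correlated with x1
    "x3": rng.normal(size=200),
})
print(df.corr().round(2))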

7. What does the assumption of autocorrelation in a regression model imply?

The assumption of autocorrelation in a regression model implies that the errors in the model are
not independent and identically distributed, but are instead correlated over time or across
observations.
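When residuals are autocorrelated, one common response is to keep the OLS fit but use HAC (Newey-West) standard errors for inference. A minimal sketch with statsmodels on synthetic AR(1)-error data; the lag length of 5 is an arbitrary illustrative choice.

import numpy as np
import statsmodels.api as sm

# Hypothetical trend-plus-AR(1)-error series; HAC (Newey-West) standard
# errors remain valid when residuals are autocorrelated.
rng = np.random.default_rng(6)
n = 300
t = np.arange(n)
e = np.zeros(n)
for i in range(1, n):
    e[i] = 0.8 * e[i - 1] + rng.normal()
y = 0.5 + 0.3 * t + e

X = sm.add_constant(t)
hac = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 5})
print("HAC standard errors:", hac.bse)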

8. What does the assumption of homoscedasticity in a regression model imply?

The assumption of homoscedasticity in a regression model implies that the variance of the
errors in the model is constant across the range of the independent variables.
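A simple informal check of this assumption is to compare the residual spread across ranges of the fitted values; roughly equal spread supports homoscedasticity. A minimal sketch on synthetic data (all names and numbers are illustrative assumptions):

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical homoscedastic data: the residual standard deviation
# should be roughly the same in every fitted-value quartile.
rng = np.random.default_rng(7)
x = rng.uniform(0, 10, size=400)
y = 2.0 + 1.5 * x + rng.normal(scale=1.0, size=400)

results = sm.OLS(y, sm.add_constant(x)).fit()
df = pd.DataFrame({"fitted": results.fittedvalues, "resid": results.resid})
df["quartile"] = pd.qcut(df["fitted"], 4, labels=False)
print(df.groupby("quartile")["resid"].std())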
