
1.1 Question 1
a) Estimate a multiple regression equation to explain the variation in employee salaries
at Beta Technologies by the variables Age, Prior Experience, Beta Experience and
Education

Regression Statistics

Multiple R                     .957a
R Square                       .915
Adjusted R Square              .913
Std. Error of the Estimate     8904.470
Observations                   204
a. Predictors: (Constant), Education, Beta Experience, Prior Experience, Age

Table 1.1: Model Summary

ANOVAa

             df     SS                    MS                  F         Sig.
Regression   4      170010221255.122      42502555313.780     536.042   .000b
Residual     199    15778626195.859       79289578.874
Total        203    185788847450.981
a. Dependent Variable: Annual Salary
b. Predictors: (Constant), Education, Beta Experience, Prior Experience, Age

Table 1.2: ANOVA

Coefficientsa

                    B           Std. Error   t Stat    P-Value     95.0% CI Lower Bound   95.0% CI Upper Bound
Intercept           1642.064    2730.426     .601      .548        -3742.217              7026.345
Age                 -144.372    62.065       -2.326    .021        -266.762               -21.982
Prior Experience    3126.812    133.784      23.372    5.76E-59    2862.996               3390.629
Beta Experience     2642.345    105.422      25.065    1.75E-63    2434.458               2850.232
Education           7566.422    379.314      19.948    2.35E-49    6818.432               8314.412
a. Dependent Variable: Annual Salary

Table 1.3: Coefficients


A multiple regression is used to create a predictive equation for the dependent variable from several independent variables. Based on the provided data, the regression equation used in this report is as follows:

Multiple regression equation

A  = Age
PE = Prior Experience
BE = Beta Experience
E  = Education

Y = β0 + β1(A) + β2(PE) + β3(BE) + β4(E)

Y = 1642.06 - 144.37(A) + 3126.81(PE) + 2642.35(BE) + 7566.42(E)
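As a quick check of the fitted equation, the short Python sketch below applies these coefficients to a hypothetical employee profile; the input values are illustrative only and not taken from the data set.

```python
# Minimal sketch: applying the fitted equation from Table 1.3.
def predict_salary(age, prior_experience, beta_experience, education):
    """Predicted annual salary from the estimated coefficients."""
    return (1642.06
            - 144.37 * age
            + 3126.81 * prior_experience
            + 2642.35 * beta_experience
            + 7566.42 * education)

# Hypothetical employee: values chosen purely for illustration.
print(predict_salary(age=35, prior_experience=5, beta_experience=8, education=4))
```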

The multiple correlation coefficient (R) is a measure of the strength of the linear relationship between the independent variables and the dependent variable. In this case, referring to Table 1.1, the R value of .957 indicates a strong positive relationship between the variables. The R Square value of .915 suggests that the independent variables in the model account for approximately 91.5% of the variance in the dependent variable. This indicates a high level of explanatory power.

The Adjusted R Square value takes into account the number of independent variables
and the sample size. It penalizes the addition of unnecessary variables that do not contribute
significantly to the model. With an Adjusted R Square value of .913, we can conclude that the
independent variables explain a significant portion of the variance in the dependent variable
while considering the complexity of the model.

The Std. Error of the Estimate is an estimate of the standard deviation of the residuals,
which are the prediction errors in the model. It represents the average amount by which the
observed values deviate from the predicted values. In this case, the value of 8904.470
suggests that, on average, the predicted values may deviate from the actual values by
approximately 8904.470 units.
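The Table 1.1 statistics follow directly from the ANOVA sums of squares in Table 1.2; a brief sketch, assuming 204 observations and 4 predictors:

```python
import math

ss_regression = 170010221255.122   # from Table 1.2
ss_residual = 15778626195.859
ss_total = ss_regression + ss_residual
n, k = 204, 4                      # observations, predictors

r_square = ss_regression / ss_total                          # ≈ .915
adj_r_square = 1 - (1 - r_square) * (n - 1) / (n - k - 1)    # ≈ .913
std_error_estimate = math.sqrt(ss_residual / (n - k - 1))    # ≈ 8904.470
print(r_square, adj_r_square, std_error_estimate)
```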

The change statistics describe how the model changes when a block of predictors is added. Because all four predictors were entered in a single block, the R Square Change of .915 equals the R Square and represents the proportion of variance explained by these predictors. The F Change statistic tests the significance of this block; the large F Change value of 536.042 indicates that adding the independent variables significantly improves the model's fit.

The Sig. F Change value of .000 indicates that the change in the model is statistically
significant. Overall, the regression model appears to have a strong fit, with a high R Square
value, significant F Change, and a relatively low standard error of the estimate. These results
suggest that the independent variables in the model have a substantial impact on explaining
the variation in the dependent variable.

The ANOVA table (Table 1.2) summarizes the results of the analysis of variance for the regression model predicting the dependent variable "Annual Salary" from the predictors "Education," "Beta Experience," "Prior Experience," and "Age." The "Regression" row in the
table indicates the amount of variance explained by the regression model. In this case, the
regression model explains a substantial portion of the variation in the dependent variable, as
evidenced by the high sum of squares value of 170,010,221,255.122 and the significant F
statistic of 536.042. This suggests that the predictors collectively contribute significantly to
predicting the annual salary.
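The F statistic can be reconstructed from the mean squares in Table 1.2; the sketch below also obtains the reported Sig. value as the right-tail probability of the F(4, 199) distribution.

```python
from scipy import stats

ms_regression = 170010221255.122 / 4    # SS_regression / df_regression
ms_residual = 15778626195.859 / 199     # SS_residual / df_residual
f_stat = ms_regression / ms_residual    # ≈ 536.042

# Right-tail probability under F(4, 199); effectively zero, reported as .000.
print(f_stat, stats.f.sf(f_stat, 4, 199))
```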
The "Residual" row represents the unexplained variation or the prediction errors
remaining after accounting for the regression model. The sum of squares for the residuals is
15,778,626,195.859, indicating the amount of variation in the dependent variable that the
regression model was unable to explain. The mean square value of 79,289,578.874
represents the average unexplained variation per degree of freedom.
The "Total" row provides information about the total variation in the dependent variable.
The sum of squares for the total variation is 185,788,847,450.981, reflecting the overall
variability in annual salaries. The total degrees of freedom of 203 equal the number of observations minus one (204 - 1).
Overall, the ANOVA table indicates that the regression model is statistically significant
in explaining the annual salary. The predictors included in the model collectively account for a
substantial amount of the variance in the dependent variable. However, there is still
unexplained variation, as reflected by the residuals. These results provide insights into the
performance and predictive power of the regression model for predicting annual salaries
based on the given set of predictors.
b) Interpret the result of the regression analysis

The "Coefficients" table presents the estimated coefficients for the predictors in the
regression model predicting the dependent variable "Annual Salary." The constant term,
representing the estimated value of the dependent variable when all predictors are zero, is
1642.064. However, it is not statistically significant, meaning that it does not have a
significant impact on the annual salary.

The predictor "Age" has a negative coefficient of -144.372, indicating that as age
decreases, the annual salary is estimated to decrease by 144.372 units. This coefficient is
statistically significant, with a t-value of -2.326 and a p-value of 0.021. The 95% confidence
interval for the Age coefficient ranges from -266.762 to -21.982, suggesting that the effect of
age on annual salary is likely to fall within this range.
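The confidence bounds in Table 1.3 follow from each coefficient, its standard error, and the residual degrees of freedom; a sketch for the Age coefficient:

```python
from scipy import stats

b, se, df_residual = -144.372, 62.065, 199
t_crit = stats.t.ppf(0.975, df_residual)        # ≈ 1.972 for a 95% interval

lower, upper = b - t_crit * se, b + t_crit * se
print(lower, upper)                             # ≈ -266.76, -21.98
```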
The predictor "Prior Experience" has a positive coefficient of 3126.812, indicating that
as prior experience increases, the annual salary is estimated to increase by 3126.812 units.
This coefficient is highly statistically significant (p < 0.001), with a t-value of 23.372. The 95%
confidence interval for the Prior Experience coefficient ranges from 2862.996 to 3390.629,
indicating a high degree of confidence in the estimate.
The predictor "Beta Experience" also has a positive coefficient of 2642.345,
indicating that an increase in Beta Experience is associated with an increase in annual
salary by 2642.345 units. This coefficient is highly statistically significant (p < 0.001), with a t-
value of 25.065. The 95% confidence interval for the Beta Experience coefficient ranges
from 2434.458 to 2850.232, providing a range of plausible values for the effect.
Lastly, the predictor "Education" has a positive coefficient of 7566.422, suggesting that each additional unit of education is estimated to increase the annual salary by 7566.422 units. This coefficient is highly statistically significant (p < 0.001), with a t value of 19.948. The 95% confidence interval for the Education coefficient ranges from 6818.432 to 8314.412, indicating a high level of confidence in the estimate.
In summary, the coefficients indicate the direction and magnitude of the relationship between each predictor and the annual salary. Age has a much smaller t statistic than the other predictors, suggesting a relatively weaker contribution than Prior Experience, Beta Experience, and Education. Nevertheless, all predictors show statistically significant relationships with the dependent variable, indicating their importance in explaining the variation in annual salary.
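For completeness, Tables 1.1 to 1.3 could be reproduced with a fitting routine along the lines of the sketch below; the file name and column labels are assumptions about how the Beta Technologies data are stored.

```python
import pandas as pd
import statsmodels.api as sm

# Assumed file and column names; adjust to the actual data set.
df = pd.read_csv("beta_salaries.csv")
X = sm.add_constant(df[["Age", "Prior Experience", "Beta Experience", "Education"]])
y = df["Annual Salary"]

model = sm.OLS(y, X).fit()
print(model.summary())             # R Square, ANOVA F, coefficients, t and p values
print(model.conf_int(alpha=0.05))  # 95% confidence intervals for B
```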
1.2 Question 2

Model Summary

Model   R       R Square   Adjusted R Square   Std. Error of the Estimate
1       .134a   .018       -.011               1.450

Change Statistics
R Square Change   F Change   df1   df2   Sig. F Change
.018              .623       1     34    .436

a. Predictors: (Constant), Inspection Expenditures

Table 2.1: Model Summary

The "Model Summary" table provides an overview of the regression model's performance in
predicting the dependent variable using the predictor "Inspection Expenditures." The
correlation coefficient (R) of 0.134 indicates a positive but weak relationship between the
predicted values and the actual values of the dependent variable. This suggests that the
predictor "Inspection Expenditures" has a limited influence on the variation in the dependent
variable.
The coefficient of determination (R Square) is 0.018, indicating that only 1.8% of the
variance in the dependent variable can be explained by the predictor "Inspection
Expenditures." This implies that the predictor has a relatively weak impact on the overall
variability of the dependent variable.
The adjusted R-squared value, which takes into account the number of predictors and the sample size, is -0.011. A negative adjusted R-squared indicates that the model explains less variance than would be expected by chance once the penalty for including the predictor is applied. Therefore, the predictor "Inspection Expenditures" does not provide meaningful explanatory power in relation to the dependent variable.
The standard error of the estimate is 1.450, representing the average amount of error
or variability in predicting the dependent variable using the regression model. A higher
standard error suggests a greater level of uncertainty in the model's predictions.
The change statistics indicate the impact of adding the predictor "Inspection Expenditures" to the model. The R Square Change of 0.018 represents the increase in the R-squared value due to the inclusion of the predictor. However, the associated F Change statistic of 0.623, with a Sig. F Change of .436, shows that this increase is not statistically significant.
In conclusion, the "Model Summary" indicates that the predictor "Inspection
Expenditures" has a weak and limited relationship with the dependent variable. The model's
ability to explain the variation in the dependent variable is quite low, and the inclusion of the
predictor does not significantly improve the model's fit. Therefore, caution should be
exercised when relying solely on this predictor to make predictions about the dependent
variable.
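The negative adjusted R-squared follows directly from the adjustment formula; a brief check, assuming 36 observations (df2 = 34 with one predictor):

```python
r_square = 0.018
n, k = 36, 1   # observations, predictors (df2 = n - k - 1 = 34)

adj_r_square = 1 - (1 - r_square) * (n - 1) / (n - k - 1)
print(adj_r_square)   # ≈ -0.011
```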

ANOVAa

Model           Sum of Squares   df   Mean Square   F      Sig.
1  Regression   1.308            1    1.308         .623   .436b
   Residual     71.442           34   2.101
   Total        72.750           35
a. Dependent Variable: Motors Returned
b. Predictors: (Constant), Inspection Expenditures

Table 2.2: ANOVA

The ANOVA table provides information on the analysis of variance for the regression model
predicting the dependent variable "Motors Returned" using the predictor "Inspection
Expenditures."
The "Regression" section of the table shows that the regression model explains a sum
of squares (SS) of 1.308. This indicates the amount of variance in the dependent variable that
can be accounted for by the predictor "Inspection Expenditures." The model has 1 degree of
freedom (df), which represents the number of predictors included in the model. The mean
square (MS), calculated by dividing the sum of squares by the degrees of freedom, is 1.308.
The F statistic, which compares the explained variance to the unexplained variance, is 0.623.
However, the associated significance level (Sig.) of 0.436 suggests that the regression
component of the model is not statistically significant. This means that the predictor
"Inspection Expenditures" does not have a significant impact on the variation in the dependent
variable "Motors Returned."
The "Residual" section of the table provides information about the unexplained
variance or error in the model. The sum of squares for the residual is 71.442, indicating the
amount of variance in the dependent variable that is not accounted for by the predictor
"Inspection Expenditures." The residual has 34 degrees of freedom, which is calculated by
subtracting the number of predictors from the total sample size. The mean square for the
residual is 2.101, calculated by dividing the sum of squares by the degrees of freedom. This
represents the average amount of unexplained variance per degree of freedom.
The "Total" section of the table represents the overall variance in the dependent
variable. The total sum of squares is 72.750, which is the sum of the regression sum of squares
and the residual sum of squares. It captures the total variability in the dependent variable.
In summary, the ANOVA table suggests that the regression model with the predictor
"Inspection Expenditures" does not significantly explain the variation in the dependent variable
"Motors Returned." The relatively small sum of squares in the regression component and the
non-significant F statistic indicate that the predictor does not have a substantial impact on the
dependent variable. The majority of the variation in the dependent variable remains
unexplained, as shown by the relatively high sum of squares in the residual section.
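The F test in this ANOVA table can likewise be reconstructed from the sums of squares, with the p-value taken from the F(1, 34) distribution:

```python
from scipy import stats

ss_regression, df_regression = 1.308, 1
ss_residual, df_residual = 71.442, 34

ms_regression = ss_regression / df_regression   # 1.308
ms_residual = ss_residual / df_residual         # ≈ 2.101
f_stat = ms_regression / ms_residual            # ≈ 0.623

# Right-tail probability under F(1, 34) reproduces the Sig. value of .436.
print(f_stat, stats.f.sf(f_stat, df_regression, df_residual))
```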

Coefficientsa

                             Unstandardized B   Std. Error   Standardized Beta   t        Sig.
1  (Constant)                66.476             1.157                            57.447   .000
   Inspection Expenditures   -1.512E-5          .000          -.134              -.789    .436
a. Dependent Variable: Motors Returned

Table 2.3: Coefficients

The "Coefficients" table provides information about the regression coefficients for the
model predicting the dependent variable "Motors Returned" using the predictor "Inspection
Expenditures." The first row of the table represents the intercept term or the constant in the
regression equation. The coefficient for the constant is 66.476, which indicates the expected
value of the dependent variable when the predictor "Inspection Expenditures" is zero. The
standard error associated with the constant is 1.157, providing an estimate of the uncertainty
in the coefficient.

The second row of the table represents the coefficient for the predictor "Inspection Expenditures." The unstandardized coefficient for this predictor is -1.512E-5, suggesting that, for every one-unit increase in "Inspection Expenditures," the expected change in the dependent variable "Motors Returned" is -1.512E-5. The standard error for this coefficient is displayed as .000 because it is rounded in the output; relative to the size of the coefficient itself, the estimate is not precise, as the t statistic discussed below shows.
The standardized coefficient, also known as the beta coefficient, provides a measure
of the relative importance of the predictor in relation to other predictors in the model. In this
case, the standardized coefficient for "Inspection Expenditures" is -0.134. This suggests that
a one-standard deviation increase in "Inspection Expenditures" is associated with a negative
change in the dependent variable, "Motors Returned," equivalent to 0.134 standard deviations.
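A sketch of how both the unstandardized and standardized coefficients could be reproduced; the file name and column labels are assumptions about the underlying data.

```python
import pandas as pd
import statsmodels.api as sm

# Assumed file and column names; adjust to the actual data set.
df = pd.read_csv("motor_inspections.csv")
x, y = df["Inspection Expenditures"], df["Motors Returned"]

# Unstandardized coefficients (B column): constant and slope.
fit = sm.OLS(y, sm.add_constant(x)).fit()
print(fit.params, fit.tvalues, fit.pvalues)

# Standardized (Beta) coefficient: refit on z-scored variables;
# the slope should match the reported -.134.
zx, zy = (x - x.mean()) / x.std(), (y - y.mean()) / y.std()
print(sm.OLS(zy, sm.add_constant(zx)).fit().params)
```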

The t-value for the coefficient of "Inspection Expenditures" is -0.789, indicating that the
coefficient is not statistically significant at the conventional significance level (e.g., alpha =
0.05). This suggests that the relationship between "Inspection Expenditures" and "Motors
Returned" in the model may be due to random chance rather than a true association.

The associated significance level (Sig.) of 0.436 confirms that the coefficient for
"Inspection Expenditures" is not statistically significant. This implies that the predictor does
not have a significant impact on the dependent variable "Motors Returned" in this regression
model.

In conclusion, the "Coefficients" table indicates that the predictor "Inspection


Expenditures" does not have a statistically significant relationship with the dependent variable
"Motors Returned" in the regression model. The coefficient is not significantly different from
zero, and the associated t-value and significance level suggest that any observed relationship
between the predictor and the dependent variable may be due to chance rather than a
meaningful association.
