
1.

> Cutting_tool <- read.csv(file.choose())
> attach(Cutting_tool)
> lm(y ~ x1 + x2, data = Cutting_tool)

Coefficients:
(Intercept)           x1           x2
    49.9515       0.1380      -0.1156

> x2.f <- factor(x2)
> dummies <- model.matrix(~ x2.f)
> dummies

(Intercept) x2.f416
1 1 0
2 1 0
3 1 0
4 1 0
5 1 0
6 1 0
7 1 0
8 1 0
9 1 0
10 1 0
11 1 1
12 1 1
13 1 1
14 1 1
15 1 1
16 1 1
17 1 1
18 1 1
19 1 1
20 1 1
attr(,"assign")
[1] 0 1
attr(,"contrasts")
attr(,"contrasts")$`x2.f`
[1] "contr.treatment"

> library(car)
> m1 <- lm(y ~ x1 + dummies[, 2], data = Cutting_tool)
> vif(m1)

          x1 dummies[, 2]
    1.000417     1.000417

Since both VIF values are less than 10, there is no multicollinearity.
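
The same threshold check can be written directly in R (a short sketch using the vif() output above):

> # compare each VIF against the usual cut-off of 10
> vif(m1) < 10
> all(vif(m1) < 10)   # TRUE means no predictor exceeds the cut-off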
2.

> m2 <- lm(y ~ x1 + dummies[, 2] + x1 * dummies[, 2], data = Cutting_tool)
> summary(m2)

Call:
lm(formula = y ~ x1 + dummies[, 2] + x1 * dummies[, 2], data = Cutting_tool)

Residuals:
     Min       1Q   Median       3Q      Max
-0.87625 -0.47176 -0.05946  0.30171  1.77653

Coefficients:
                 Estimate Std. Error t value Pr(>|t|)
(Intercept)      11.50294    2.80421   4.102 0.000833 ***
x1                0.15293    0.01187  12.885 7.29e-10 ***
dummies[, 2]     -4.04413    4.50653  -0.897 0.382814
x1:dummies[, 2]  -0.03887    0.01912  -2.033 0.059001 .
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.7134 on 16 degrees of freedom
Multiple R-squared: 0.9919, Adjusted R-squared: 0.9904
F-statistic: 652.1 on 3 and 16 DF, p-value: < 2.2e-16

H0: β1 = 0
H1: β1 ≠ 0
p-value = 7.29e-10 < 0.05

Therefore reject H0 at the 5% significance level.

Therefore β1 is significant at the 5% significance level.

H0: β2 = 0
H1: β2 ≠ 0
p-value = 0.382814 > 0.05

Therefore do not reject H0 at the 5% significance level.

Therefore β2 is not significant at the 5% significance level.

H0: β3 = 0
H1: β3 ≠ 0
p-value = 0.059001 > 0.05

Therefore do not reject H0 at the 5% significance level.

Therefore β3 is not significant at the 5% significance level.
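
The three p-values used above can also be extracted programmatically rather than read off the printout (a sketch; column 4 of the coefficient matrix returned by summary() is Pr(>|t|)):

> # Pr(>|t|) column of the coefficient table, compared with alpha = 0.05
> pvals <- summary(m2)$coefficients[, 4]
> pvals
> pvals < 0.05   # TRUE = reject H0 for that coefficient at the 5% level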


3.

Since the interaction term x1:z1 is not significant, it is dropped and the model is refitted without it.

> m3 <- lm(y ~ x1 + dummies[, 2], data = Cutting_tool)
> summary(m3)

Call:
lm(formula = y ~ x1 + dummies[, 2], data = Cutting_tool)

Residuals:
    Min      1Q  Median      3Q     Max
-1.0028 -0.5968 -0.1582  0.4997  1.6455

Coefficients:
              Estimate Std. Error t value Pr(>|t|)
(Intercept)   15.02965    2.39759   6.269 8.49e-06 ***
x1             0.13795    0.01013  13.622 1.41e-10 ***
dummies[, 2] -13.18243    0.34725 -37.962  < 2e-16 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.7763 on 17 degrees of freedom
Multiple R-squared: 0.9898, Adjusted R-squared: 0.9886
F-statistic: 824.2 on 2 and 17 DF, p-value: < 2.2e-16
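
As a cross-check that dropping the interaction is justified, a partial F-test comparing the reduced and full models can be run (a sketch; for a single dropped term its p-value equals the 0.059001 t-test p-value reported for x1:dummies[, 2] above):

> # partial F-test: does the interaction term improve the fit over m3?
> anova(m3, m2)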

H0: β1 = 0
H1: β1 ≠ 0
p-value = 1.41e-10 < 0.05

Therefore reject H0 at the 5% significance level.

Therefore β1 is significant at the 5% significance level.

H0: β2 = 0
H1: β2 ≠ 0
p-value < 2e-16 < 0.05

Therefore reject H0 at the 5% significance level.

Therefore β2 is significant at the 5% significance level.

H0: β1 = β2 = 0 (the model is not significant)

H1: at least one βi ≠ 0, i = 1, 2 (the model is significant)

p-value < 2.2e-16 < 0.05

Therefore reject H0 at the 5% significance level.

Therefore the model is significant at the 5% significance level.

Goodness of fit

R² = 98.98%
98.98% of the variation in y is explained by the fitted model.

Fitted model
ŷ = 15.02965 + 0.13795 x1 - 13.18243 z1,
where z1 is the dummy variable dummies[, 2] (z1 = 1 when x2 = 416, and 0 otherwise).
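
A short sketch of how the fitted equation can be used for prediction; the values x1 = 200 and z1 = 1 are illustrative only, not taken from the data:

> # hand-computed prediction from the estimated coefficients of m3
> b <- coef(m3)             # (Intercept), x1, dummies[, 2]
> x1_new <- 200; z1_new <- 1
> y_hat <- b[1] + b[2] * x1_new + b[3] * z1_new
> y_hat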

4.

H0: ρ = 0, the residuals are uncorrelated

H1: ρ > 0, the residuals are positively autocorrelated (the one-sided alternative used by dwtest() by default)

> library(lmtest)
> dwtest(y ~ x1 + dummies[, 2], data = Cutting_tool)

data: y ~ x1 + dummies[, 2]
DW = 1.5434, p-value = 0.08481
alternative hypothesis: true autocorrelation is greater than 0

p-value = 0.08481 > 0.05

Therefore do not reject H0 at the 5% significance level.

Therefore there is no significant positive autocorrelation in the residuals at the 5% significance level.
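
If the two-sided alternative ρ ≠ 0 is wanted instead of dwtest()'s default one-sided test, it can be requested explicitly (a sketch):

> # two-sided Durbin-Watson test for autocorrelation in either direction
> dwtest(y ~ x1 + dummies[, 2], data = Cutting_tool, alternative = "two.sided")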

> plot(m3$fitted.values, m3$residuals)
Since the residuals show a random scatter around zero with no systematic pattern, the constant-variance assumption is satisfied.

> qqnorm(m3$residuals)
> qqline(m3$residuals, col = "red")
Since the points lie close to the reference line, the residuals are approximately normally distributed.
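
The visual check can be supplemented with a formal test of normality, for example the Shapiro-Wilk test from base R (a sketch):

> # H0: the residuals are normally distributed; a large p-value supports H0
> shapiro.test(m3$residuals)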
