Explain in detail:
SST
SSR
SSE
SST
The sum of squares total, denoted SST, is the sum of the squared differences between the observed dependent variable and its mean. You can think of this as the dispersion of the observed values around the mean, much like the variance in descriptive statistics. The total SST tells you how much variation there is in the dependent variable:

SST = Σ(yi - ȳ)²

A diagram (like the regression line above) is optional and can supply a visual representation of what you're calculating. SST is a measure of the total variability of the dataset.
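As a concrete sketch of the definition above (pure Python, with a small made-up sample), SST can be computed directly as the sum of squared deviations from the mean:

```python
# Sum of squares total: squared deviations of y from its mean.
# The sample values below are made up for illustration only.
y = [3.0, 5.0, 7.0, 9.0, 11.0]

mean_y = sum(y) / len(y)                   # mean of the dependent variable
sst = sum((yi - mean_y) ** 2 for yi in y)  # SST = Σ(yi - ȳ)²

print(mean_y)  # 7.0
print(sst)     # 40.0
```

Here the deviations are -4, -2, 0, 2, 4, so their squares sum to 40.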
SSR
The second term is the sum of squares due to regression, or SSR. It is the sum of the squared differences between the predicted value and the mean of the dependent variable. Think of it as a measure that describes how well our line fits the data. If this value of SSR is equal to the sum of squares total, it means our regression model captures all the observed variability and is perfect. Once again, we have to mention that another common notation is ESS, or explained sum of squares.

SSR = Σ(ŷi - ȳ)² = SST - SSE

Regression sum of squares is interpreted as the amount of variation in the dependent variable that the model explains. Error sum of squares is obtained by first computing the mean lifetime of each battery type. For each battery of a specified type, the mean is subtracted from each individual battery's lifetime and then squared. The sum of these squared terms for all battery types equals the SSE. SSE is a measure of sampling error.
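The battery description above amounts to a within-group sum of squares. A minimal sketch, assuming two hypothetical battery types with made-up lifetimes:

```python
# Within-group SSE: for each battery type, subtract that type's mean
# lifetime from each battery's lifetime, square, and sum across all types.
# The types and lifetimes below are invented for illustration.
lifetimes = {
    "type_A": [100.0, 102.0, 98.0],
    "type_B": [110.0, 114.0],
}

sse = 0.0
for batteries in lifetimes.values():
    group_mean = sum(batteries) / len(batteries)
    sse += sum((x - group_mean) ** 2 for x in batteries)

print(sse)  # 16.0
```

For type_A the mean is 100 (squared deviations 0 + 4 + 4 = 8) and for type_B the mean is 112 (4 + 4 = 8), giving an SSE of 16.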
SSE
The third term is the sum of squares error, or SSE. The error is the difference between the observed value and the predicted value:

SSE = Σ(yi - ŷi)²

We usually want to minimize the error. The smaller the error, the better the estimation power of the regression. Finally, I should add that it is also known as the residual sum of squares.
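Putting the three quantities together: for an ordinary least-squares line, SST decomposes exactly into SSR + SSE. A sketch with made-up data, fitting the line via the standard closed-form slope and intercept:

```python
# Verify SST = SSR + SSE for a simple least-squares line.
# The x/y data below are made up for illustration.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]

mean_x = sum(x) / len(x)
mean_y = sum(y) / len(y)

# Ordinary least-squares slope and intercept.
slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
intercept = mean_y - slope * mean_x
y_hat = [intercept + slope * xi for xi in x]  # predicted values

sst = sum((yi - mean_y) ** 2 for yi in y)              # total
ssr = sum((yh - mean_y) ** 2 for yh in y_hat)          # regression (explained)
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # error (residual)

print(round(sst, 6), round(ssr + sse, 6))  # 6.0 6.0
```

This identity holds only for least-squares fits with an intercept; for other fitting methods SSR + SSE can differ from SST.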