
Objective

Build a predictive model, reduce multicollinearity (with an adjusted R-squared of 42%), and design a promotional strategy.

Go to View > Principal Components and get the output.

Use the eigenvalue approach – take the components with eigenvalues greater than 1. This reduces the 21 variables to 6 components, so there will be only 6
promotional strategies. As long as the eigenvalue is greater than 1, the component's contribution is considered significant.
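
A minimal sketch of this eigenvalue (Kaiser) cut-off, assuming the 21 promotional variables sit in a pandas DataFrame df (a hypothetical name) holding the 400 observations:

import numpy as np
import pandas as pd

def kept_components(df: pd.DataFrame, cutoff: float = 1.0) -> int:
    # Count components whose eigenvalue of the correlation matrix exceeds the cutoff.
    corr = df.corr().to_numpy()                             # 21 x 21 correlation matrix of the X's
    eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]   # sorted largest to smallest
    return int((eigenvalues > cutoff).sum())                 # for this data set: 6 of 21

# kept_components(df)   # expected to return 6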

When you rebuild the model, consider only these 6 components (combining PCA with regression gives you PCR, principal components regression).

There are 6 components.
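
A minimal PCR sketch with scikit-learn, assuming a predictor matrix X (400 x 21) and a response y are available (both hypothetical names); standardizing first makes PCA work on the correlation structure, matching the Eigenanalysis of the Correlation Matrix output below.

from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Keep the 6 components with eigenvalue > 1 and regress the response on their scores.
pcr = make_pipeline(StandardScaler(), PCA(n_components=6), LinearRegression())
# pcr.fit(X, y)       # X: 400 x 21 promotional variables, y: response (hypothetical)
# pcr.score(X, y)     # R-squared of the rebuilt model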

Principal Components Analysis


n = 400

Eigenanalysis of the Correlation Matrix

Component   Eigenvalue   Proportion   Cumulative

        1       4.7276       0.2251       0.2251
        2       3.5495       0.1690       0.3941
        3       2.4162       0.1151       0.5092
        4       2.3757       0.1131       0.6223
        5       2.0085       0.0956       0.7180
        6       1.6168       0.0770       0.7950
        7       0.7860       0.0374       0.8324
        8       0.7179       0.0342       0.8666
        9       0.5592       0.0266       0.8932
       10       0.3938       0.0188       0.9120
       11       0.3506       0.0167       0.9287
       12       0.3376       0.0161       0.9447
       13       0.2666       0.0127       0.9574
       14       0.2453       0.0117       0.9691
       15       0.1671       0.0080       0.9771
       16       0.1398       0.0067       0.9837
       17       0.0945       0.0045       0.9882
       18       0.0810       0.0039       0.9921
       19       0.0742       0.0035       0.9956
       20       0.0533       0.0025       0.9981
       21       0.0389       0.0019       1.0000

Stop as soon as you reach an eigenvalue of about 1 – i.e. keep components down to 1.6168 (component 6). The proportions show a decreasing trend (22% > 16% > 11% > 9%); if he is close to 9 he will purchase it. The model works under the assumption of normality, and with 6 components about 79% of the variance can be explained.
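
The table above can be reproduced from the correlation matrix itself; a rough sketch (assuming a DataFrame df holding X1 to X21):

import numpy as np

def eigen_table(corr: np.ndarray) -> None:
    vals = np.sort(np.linalg.eigvalsh(corr))[::-1]   # eigenvalues, descending
    proportion = vals / vals.sum()                    # e.g. 4.7276 / 21 = 0.2251
    cumulative = np.cumsum(proportion)
    for i, (v, p, c) in enumerate(zip(vals, proportion, cumulative), start=1):
        print(f"{i:2d}  {v:7.4f}  {p:6.4f}  {c:6.4f}")

# eigen_table(df.corr().to_numpy())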

Eigenvectors (component loadings)

PC1 PC2 PC3 PC4 PC5 PC6 PC7


X1 0.013 -0.306 0.145 -0.431 0.029 0.047 -0.310
X2 0.025 -0.302 0.149 -0.433 0.015 0.041 -0.301
X3 0.037 -0.301 0.118 -0.376 -0.020 0.001 0.175
X4 0.016 -0.262 0.066 -0.275 -0.035 0.048 0.618
X5 0.410 0.024 -0.028 0.006 0.039 0.034 -0.251
X6 0.414 0.032 -0.013 0.007 0.011 0.023 -0.239
X7 0.429 0.027 -0.039 0.008 0.017 0.042 -0.103
X8 0.430 0.011 -0.030 -0.001 0.021 0.019 -0.029
X9 0.401 0.006 -0.012 0.001 -0.040 0.002 0.265
X10 0.351 -0.005 -0.001 0.010 -0.025 -0.051 0.431
X11 -0.013 -0.159 -0.534 -0.054 0.145 -0.001 -0.009
X12 -0.026 -0.168 -0.563 -0.033 0.092 -0.001 -0.026
X13 -0.026 -0.135 -0.528 -0.089 0.080 -0.040 0.012
X14 0.023 0.021 0.026 -0.074 0.079 -0.695 -0.015
X15 0.023 0.031 0.039 -0.062 0.104 -0.691 -0.017
X16 -0.012 -0.255 0.125 0.219 0.491 0.042 0.047
X17 0.007 -0.260 0.141 0.232 0.486 0.049 0.023
X18 0.031 -0.201 0.129 0.235 0.366 0.001 -0.016
X19 0.017 -0.382 0.019 0.281 -0.329 -0.070 -0.050
X20 0.022 -0.376 0.014 0.301 -0.324 -0.079 -0.053
X21 0.015 -0.348 0.003 0.248 -0.339 -0.097 -0.054

PC8 PC9 PC10 PC11 PC12 PC13 PC14


X1 -0.226 -0.020 0.140 0.138 0.066 0.044 -0.027
X2 -0.249 -0.036 0.127 0.151 0.057 0.002 -0.065
X3 0.169 0.191 -0.519 -0.527 -0.294 -0.079 0.103
X4 0.499 -0.082 0.313 0.282 0.172 0.027 -0.023
X5 0.320 -0.024 0.009 -0.074 -0.014 0.275 -0.295
X6 0.324 -0.015 0.011 -0.071 -0.008 0.285 -0.219
X7 0.122 -0.014 -0.013 0.037 0.067 -0.280 0.329
X8 -0.025 -0.049 0.035 0.089 0.048 -0.411 0.353
X9 -0.352 -0.009 -0.016 0.051 -0.013 -0.085 0.060
X10 -0.500 0.052 -0.015 -0.043 -0.067 0.305 -0.298
X11 0.041 -0.046 -0.058 0.300 -0.500 -0.106 -0.063
X12 -0.033 0.030 -0.021 0.105 -0.130 0.096 -0.009
X13 -0.064 0.071 0.092 -0.432 0.639 0.021 0.044
X14 0.058 -0.010 -0.511 0.373 0.301 -0.006 -0.088
X15 0.016 -0.044 0.540 -0.324 -0.307 -0.033 0.091
X16 -0.039 -0.378 -0.021 -0.060 0.051 -0.014 -0.018
X17 -0.012 -0.285 -0.106 -0.081 -0.003 0.091 0.081
X18 0.026 0.843 0.129 0.137 0.031 -0.042 -0.022
X19 0.007 -0.055 0.018 -0.048 0.023 -0.281 -0.289
X20 0.010 -0.037 0.022 -0.057 0.010 -0.257 -0.258
X21 -0.013 0.016 -0.023 0.101 -0.033 0.556 0.587

PC15 PC16 PC17 PC18 PC19 PC20 PC21


X1 -0.002 -0.023 -0.192 0.623 -0.273 0.034 -0.051
X2 -0.022 0.010 0.155 -0.608 0.309 -0.032 0.073
X3 0.043 -0.005 0.061 0.001 -0.027 -0.002 -0.002
X4 0.019 -0.001 -0.037 -0.015 0.003 0.010 0.008
X5 -0.028 0.105 0.076 -0.120 -0.202 0.641 -0.095
X6 0.053 0.071 -0.009 0.081 0.093 -0.707 0.083
X7 -0.080 -0.343 -0.086 0.244 0.604 0.195 0.007
X8 -0.005 -0.144 0.045 -0.266 -0.622 -0.155 0.026
X9 0.254 0.724 0.017 0.098 0.152 0.040 -0.073
X10 -0.198 -0.450 -0.027 -0.012 -0.033 -0.012 0.043
X11 -0.516 0.163 0.075 0.054 0.017 -0.044 0.003
X12 0.728 -0.240 -0.110 -0.056 0.001 0.039 -0.004
X13 -0.243 0.107 0.023 -0.003 -0.007 -0.023 0.001
X14 -0.001 0.002 0.002 0.010 -0.007 0.001 0.004
X15 0.034 0.008 -0.022 -0.002 0.027 0.014 -0.014
X16 0.107 -0.078 0.650 0.185 0.010 -0.017 0.001
X17 -0.082 0.084 -0.675 -0.189 0.013 0.003 0.001
X18 -0.004 0.025 0.051 0.001 0.000 -0.008 -0.013
X19 -0.005 -0.072 -0.041 -0.038 0.040 -0.097 -0.686
X20 0.031 0.043 -0.052 0.063 -0.017 0.091 0.705
X21 -0.089 0.031 0.117 -0.015 -0.030 0.017 -0.026

Go to + now; if you check, you will see the 6 components added as new columns (the component scores).
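
As a sketch of what those added columns contain (again assuming the hypothetical DataFrame df of X1 to X21): each component score is the standardized data multiplied by that component's eigenvector, so signs may be flipped relative to the software output.

import numpy as np
import pandas as pd

def component_scores(df: pd.DataFrame, n_components: int = 6) -> pd.DataFrame:
    Z = (df - df.mean()) / df.std()                    # standardize the X's
    vals, vecs = np.linalg.eigh(df.corr().to_numpy())  # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:n_components]      # indices of the top components
    scores = Z.to_numpy() @ vecs[:, order]             # 400 x 6 matrix of PC scores
    return pd.DataFrame(scores, columns=[f"PC{i+1}" for i in range(n_components)])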


Eliminate PC6 and rebuild the model.

Now check for collinearity and the other diagnostics: you will see that it (the VIF) is 1, so the components are not dependent
on each other.
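
A sketch of that check with statsmodels, assuming scores is the 400 x 6 score DataFrame from the sketch above (hypothetical); because principal components are orthogonal, each VIF comes out at (or very near) 1.

from statsmodels.stats.outliers_influence import variance_inflation_factor

def vifs(scores):
    X = scores.to_numpy()
    return {col: variance_inflation_factor(X, i) for i, col in enumerate(scores.columns)}

# vifs(scores)   # every value should be approximately 1.0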

Looking at the eigenvectors, X5, X6, X7, X8 and X9 all load heavily on the same component – they all depend on something common, like financials.
Check for such overloadings in the eigenvectors, and do this for each component. Then give a promotional
strategy for each component based on which variables are overloaded on it in the eigenvector.
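
A small sketch of this overloading check, assuming loadings is a DataFrame with rows X1 to X21 and columns PC1 to PC6 (the retained eigenvectors); the 0.30 threshold is an illustrative choice, not from the notes.

import pandas as pd

def heavy_loadings(loadings: pd.DataFrame, threshold: float = 0.30) -> dict:
    # For each retained component, list the variables with the largest absolute loadings.
    return {
        pc: loadings.index[loadings[pc].abs() >= threshold].tolist()
        for pc in loadings.columns
    }

# heavy_loadings(loadings)
# e.g. PC1 -> ['X5', 'X6', 'X7', 'X8', 'X9', 'X10'] (the 'financials' group noted above)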
