
Factor Analysis

PLG 701
Lecture 3
WHAT IS FACTOR ANALYSIS?
Factor analysis examines the interrelationships among a large
number of variables and then attempts to explain them in
terms of their common underlying dimensions.
 
These common underlying dimensions are referred to as
factors.
 
Factor analysis is a summarization and data reduction
technique that does not have independent and dependent
variables; it is an interdependence technique in which all
variables are considered simultaneously.

Hair et al. (2006)


What is FA?
• FA is a statistical technique used to identify a
relatively small number of factors that can be
used to represent relationship among sets of
many interrelated variables
WHAT IS FACTOR ANALYSIS?
Statistical techniques for identifying
interrelationships between items with the
goal of identifying items that group or cluster
together.
What is FA?
• The basic assumption of FA is that underlying
dimensions or factors can be used to explain
complex phenomena. Observed correlations
between variables result from their sharing of
these factors. For example, correlation
between test scores might be attributable to
such shared factors as general intelligence,
abstract reasoning skills, and reading
comprehension.
The Purpose of Factor Analysis
• The purpose of factor analysis is to discover
simple patterns in the relationships among the
variables. In particular, it seeks to discover
whether the observed variables can be explained
largely or entirely in terms of a much smaller
number of variables called factors.
Factor Analysis....(cont.)
• Many statistical methods are used to study the relation
between independent and dependent variables.
• Factor analysis is different; it is used to study the patterns of
relationship among many dependent variables, with the goal
of discovering something about the nature of the
independent variables that affect them, even though those
independent variables were not measured directly.
• Thus answers obtained by factor analysis are necessarily
more hypothetical and tentative than is true when
independent variables are observed directly. The inferred
independent variables are called factors.
Factor Analysis....(cont.)
Analysis of Interdependence
(Data Reduction / Identification of Structure)

Is the analysis exploratory or confirmatory?
• Confirmatory → Structural Equation Modeling
• Exploratory → Grouping target: variables or cases?
  – Variables → R-type factor analysis
  – Cases → Q-type factor analysis or cluster analysis
History
The method of factor analysis originated with
– C. Spearman
Further developments were contributed by
– G.H. Thomson
– L.L. Thurstone
– J. B. Carroll
– K. Joreskog
Designing a Factor Analysis

There are two approaches to calculating the correlation matrix
that determine the type of factor analysis performed:

R-type factor analysis: the input data matrix is computed from
correlations between variables.

Q-type factor analysis: the input data matrix is computed from
correlations between individual respondents.

Variables in factor analysis are generally metric.
Q factor analysis
If the data available involve a relatively small number
of persons with many measurements on these persons
it is possible to undertake a Q factor analysis to cluster
persons

(Figure: data matrix with N persons and n variates. Q factor analysis
clusters the N persons; ordinary (R-type) factor analysis groups the
n variates. N = number of persons, n = number of variates.)
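The R-type vs Q-type distinction amounts to which direction of the data matrix is correlated. A small NumPy sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))            # N = 20 persons (rows), n = 5 variates (columns)

R_type = np.corrcoef(X, rowvar=False)   # 5 x 5: correlations between variables
Q_type = np.corrcoef(X, rowvar=True)    # 20 x 20: correlations between persons
```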
Steps in a Factor Analysis
• First, the correlation matrix for all variables is computed.
Variables that do not appear to be related to other variables can
be identified. The appropriateness of the factor model can also
be evaluated.
• Second step, factor extraction - the number of factors necessary
to represent the data and the method of calculating them –
must be determined.
• The third step, rotation, focuses on transforming the factors to
make them more interpretable.
• At the fourth step, scores for each factor can be computed for
each case. These scores can then be used in a variety of other
analyses.
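The four steps can be sketched with NumPy; the data, sizes, and the regression-method scoring in step 4 are illustrative assumptions, not the lecture's worked example:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))                    # 100 cases, 6 variables

# Step 1: correlation matrix for all variables
R = np.corrcoef(X, rowvar=False)

# Step 2: extraction -- eigendecomposition of R (principal components),
# keeping components with eigenvalue > 1 (latent root criterion)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]                # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
k = int(np.sum(eigvals > 1.0))
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])

# Step 3: rotation would now transform `loadings` (omitted in this sketch)

# Step 4: factor scores for each case (regression method on z-scores)
Z = (X - X.mean(axis=0)) / X.std(axis=0)
scores = Z @ np.linalg.solve(R, loadings)
```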
General Steps to FA (Hair et al., 2006)

• Step 1: Selecting and Measuring a set of variables


in a given domain
• Step 2: Data screening in order to prepare the
correlation matrix
• Step 3: Factor Extraction
• Step 4: Factor Rotation to increase
interpretability
• Step 5: Interpretation
• Further Steps: Validation and Reliability of the
measures
Rules of Thumb 3–1

Factor Analysis Design


 Factor analysis is performed most often only on metric variables,
although specialized methods exist for the use of dummy
variables. A small number of “dummy variables” can be included
in a set of metric variables that are factor analyzed.
 If a study is being designed to reveal factor structure, strive to
have at least five variables for each proposed factor.
 For sample size:
o the sample must have more observations than variables.
o the minimum absolute sample size should be 50
observations.
 Maximize the number of observations per variable, with a
minimum of five and hopefully at least ten observations per
variable.
Rules of Thumb 3–2

Testing Assumptions of Factor Analysis


 There must be a strong conceptual foundation to support the
assumption that a structure does exist before the factor
analysis is performed.
 A statistically significant Bartlett’s test of sphericity (sig. < .05)
indicates that sufficient correlations exist among the variables to
proceed.
 Measure of Sampling Adequacy (MSA) values must exceed .50
for both the overall test and each individual variable.
Variables with values less than .50 should be omitted from the
factor analysis one at a time, with the smallest one being
omitted each time.
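These two diagnostics can be computed from the correlation matrix alone. A minimal NumPy/SciPy sketch (the function names are mine, not SPSS's):

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(R, N):
    """Bartlett's test of sphericity for a p x p correlation matrix R
    computed from N observations; returns (statistic, p-value)."""
    p = R.shape[0]
    stat = -(N - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    return stat, chi2.sf(stat, df)

def kmo_overall(R):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    inv = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d                      # anti-image (partial) correlations
    off = ~np.eye(R.shape[0], dtype=bool)
    r2 = np.sum(R[off] ** 2)
    p2 = np.sum(partial[off] ** 2)
    return r2 / (r2 + p2)
```

A significant Bartlett result (p < .05) together with KMO ≥ .50 supports proceeding with the factor analysis.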
• A principal component factor analysis requires:
– The variables included must be metric level or dichotomous
(dummy-coded) nominal level
– The sample size must be greater than 50 (preferably 100)
– The ratio of cases to variables must be 5 to 1 or larger
– The correlation matrix for the variables must contain 2 or more
correlations of 0.30 or greater
– Variables with measures of sampling adequacy less than 0.50
must be removed
– The overall measure of sampling adequacy is 0.50 or higher
– The Bartlett test of sphericity is statistically significant.
• The first phase of a principal component analysis is devoted to
verifying that we meet these requirements. If we do not meet these
requirements, factor analysis is not appropriate.
• The second phase of a principal component factor analysis
focuses on deriving a factor model, or pattern of relationships
between variables and components, that satisfies the following
requirements:
– The derived components explain 50% or more of the variance in each of
the variables, i.e. have a communality greater than 0.50
– None of the variables have loadings, or correlations, of 0.40 or higher for
more than one component, i.e. do not have complex structure
– None of the components has only one variable in it

• To meet these requirements, we remove problematic variables


from the analysis and repeat the principal component analysis.
Number of Factors?

• A Priori Criterion.
• Latent Root Criterion.
• Percentage of Variance.
• Scree Test Criterion.
Eigenvalue Plot for Scree Test Criterion
Number of Factors Extracted
• Eigenvalue > 1.0
• Scree plot
Rotation of Factors

• Factor rotation = the reference axes of the
factors are turned about the origin until some
other position has been reached. Since unrotated
factor solutions extract factors in order of the
variance they account for, with each subsequent
factor accounting for less variance, the ultimate
effect of rotating the factor matrix is to
redistribute the variance from earlier factors to
later ones and achieve a simpler, theoretically
more meaningful factor pattern.
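A minimal varimax-style rotation sketch; the iterative SVD algorithm below is the standard textbook formulation, not code from the lecture:

```python
import numpy as np

def varimax(L, gamma=1.0, max_iter=100, tol=1e-6):
    """Varimax rotation of a p x k loading matrix L.
    Returns the rotated loadings; the rotation is orthogonal, so each
    variable's communality (row sum of squares) is preserved."""
    p, k = L.shape
    Rm = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        Lr = L @ Rm
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - (gamma / p) * Lr @ np.diag((Lr ** 2).sum(axis=0)))
        )
        Rm = u @ vt                   # orthogonal rotation matrix
        d_new = s.sum()
        if d_new < d * (1 + tol):     # stop when the criterion stabilizes
            break
        d = d_new
    return L @ Rm
```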
Rules of Thumb 3–3

Choosing Factor Models and Number of Factors


• Although both component and common factor analysis models yield similar
results in common research settings (30 or more variables or communalities
of .60 for most variables):
 the component analysis model is most appropriate when data reduction is
paramount.
 the common factor model is best in well-specified theoretical applications.
• Any decision on the number of factors to be retained should be based on
several considerations:
 use of several stopping criteria to determine the initial number of factors to retain.

 Factors with eigenvalues greater than 1.0.


 A pre-determined number of factors based on research objectives and/or prior
research.
 Enough factors to meet a specified percentage of variance explained, usually 60%
or higher.
 Factors shown by the scree test to have substantial amounts of common variance
(i.e., factors before inflection point).
 More factors when there is heterogeneity among sample subgroups.
• Consideration of several alternative solutions (one more and one less factor
than the initial solution) to ensure the best structure is identified.
Two Rotational Approaches:

1. Orthogonal = axes are maintained at 90 degrees.

2. Oblique = axes are not maintained at 90 degrees.
Orthogonal Factor Rotation
(Figure: variables V1–V5 plotted against unrotated Factors I and II on
axes from -1.0 to +1.0; the rotated Factor I and Factor II axes pass
closer to the variable clusters while remaining at 90 degrees.)
Oblique Factor Rotation
(Figure: the same variables V1–V5 with both orthogonal and oblique
rotated axes shown; the oblique Factor I and Factor II axes pass through
the variable clusters without remaining at 90 degrees.)
Rules of Thumb 3–4

Choosing Factor Rotation Methods

 Orthogonal rotation methods:


o are the most widely used rotational methods.
o are the preferred method when the research goal is
data reduction to either a smaller number of variables or
a set of uncorrelated measures for subsequent use in
other multivariate techniques.

 Oblique rotation methods:


o best suited to the goal of obtaining several theoretically
meaningful factors or constructs because,
realistically, very few constructs in the “real world” are
uncorrelated.
Guidelines for Identifying Significant
Factor Loadings Based on Sample Size

Factor Loading    Sample Size Needed for Significance*
     .30                 350
     .35                 250
     .40                 200
     .45                 150
     .50                 120
     .55                 100
     .60                  85
     .65                  70
     .70                  60
     .75                  50

*Significance is based on a .05 significance level (α), a power level of 80
percent, and standard errors assumed to be twice those of conventional
correlation coefficients.
Rules of Thumb 3–5

Assessing Factor Loadings


• While factor loadings of +.30 to +.40 are minimally acceptable,
values greater than + .50 are considered necessary for practical
significance.

• To be considered significant:
o A smaller loading is needed given either a larger sample size, or
a larger number of variables being analyzed.
o A larger loading is needed given a factor solution with a larger
number of factors, especially in evaluating the loadings on later
factors.

• Statistical tests of significance for factor loadings are generally


very conservative and should be considered only as starting points
needed for including a variable for further consideration.
Factor Loadings Interpretation
 Factor loading is the correlation of the variable/item and the factor;
the squared loading = amount of the variable’s total variance accounted
for by the factor
 Minimum factor loading for significance: 0.30 (≈ 10% of variance
accounted for by the factor)
 The larger the absolute size, the more important the loading in
interpreting the factor matrix
 Extremely large factor loadings (.80 and above) are not typical
 Therefore the emphasis is on practical significance
 These guidelines are applicable when N ≥ 100
Rules of Thumb 3–6

Interpreting The Factors


 An optimal structure exists when all variables have high loadings
only on a single factor.

 Variables that cross-load (load highly on two or more factors) are


usually deleted unless theoretically justified or the objective is
strictly data reduction.

 Variables should generally have communalities of greater than .50


to be retained in the analysis.

 Respecification of a factor analysis can include options such as:


o deleting a variable(s),
o changing rotation methods, and/or
o increasing or decreasing the number of factors.
Factor Analysis – SPSS

Goodness of Measures
HATCO Data

KMO and Bartlett's Test
Kaiser-Meyer-Olkin Measure of Sampling Adequacy        .446
Bartlett's Test of Sphericity    Approx. Chi-Square  567.541
                                 df                       21
                                 Sig.                   .000

The low KMO value indicates that there might not be a sufficient number
of significant correlations to conduct factor analysis. Typically any
correlation less than 0.3 is insignificant.
Correlation Matrix

                     Delivery  Price   Price       Manufacturer           Salesforce  Product
                     Speed     Level   Flexibility Image        Service   Image       Quality
Delivery Speed        1.000    -.349    .509        .050         .612      .077       -.483
Price Level           -.349    1.000   -.487        .272         .513      .186        .470
Price Flexibility      .509    -.487   1.000       -.116         .067     -.034       -.448
Manufacturer Image     .050     .272   -.116       1.000         .299      .788        .200
Service                .612     .513    .067        .299        1.000      .241       -.055
Salesforce Image       .077     .186   -.034        .788         .241     1.000        .177
Product Quality       -.483     .470   -.448        .200        -.055      .177       1.000

The above indicates that there might be some variables that should not
be included in the factor analysis.
HATCO Data
Anti-image Matrices (values rounded to three decimals)

Anti-image Covariance
                      DS      PL      PF      MI      Sv      SI      PQ
Delivery Speed       .028    .028    .002    .015   -.025   -.006   -.002
Price Level          .028    .032    .022    .014   -.026   -.005   -.020
Price Flexibility    .002    .022    .608    .044   -.011   -.040    .086
Manufacturer Image   .015    .014    .044    .347   -.015   -.275   -.018
Service             -.025   -.026   -.011   -.015    .023    .005    .010
Salesforce Image    -.006   -.005   -.040   -.275    .005    .371   -.044
Product Quality     -.002   -.020    .086   -.018    .010   -.044    .623

Anti-image Correlation
                      DS      PL      PF      MI      Sv      SI      PQ
Delivery Speed       .344a   .957    .018    .149   -.978   -.060   -.016
Price Level          .957    .330a   .155    .134   -.975   -.045   -.141
Price Flexibility    .018    .155    .913a   .095   -.091   -.085    .140
Manufacturer Image   .149    .134    .095    .558a  -.173   -.766   -.039
Service             -.978   -.975   -.091   -.173    .288a   .052    .088
Salesforce Image    -.060   -.045   -.085   -.766    .052    .552a  -.092
Product Quality     -.016   -.141    .140   -.039    .088   -.092    .927a
a. Measures of Sampling Adequacy (MSA)

MSA < 0.5 indicates variables that should not be included in the factor
analysis. Delete one at a time – pick the lowest value (here Service, X5,
MSA = .288) and repeat the analysis excluding X5.
Before deleting X5
Total Variance Explained

           Initial Eigenvalues          Extraction Sums of           Rotation Sums of
                                        Squared Loadings             Squared Loadings
Component  Total  % of Var  Cum. %      Total  % of Var  Cum. %      Total  % of Var  Cum. %
1          2.526  36.082     36.082     2.526  36.082     36.082     2.379  33.984    33.984
2          2.120  30.291     66.374     2.120  30.291     66.374     1.827  26.098    60.082
3          1.181  16.873     83.246     1.181  16.873     83.246     1.622  23.165    83.246
4           .541   7.731     90.977
5           .418   5.972     96.949
6           .204   2.920     99.869
7           .009    .131    100.000
Extraction Method: Principal Component Analysis.
Communality
• Amount of shared, or common, variance among the variables
• General guideline: should be above 0.5

Communalities
                         Initial  Extraction
x1 Delivery Speed         1.000      .884
x2 Price Level            1.000      .895
x3 Price Flexibility      1.000      .649
x4 Manufacturer Image     1.000      .885
x5 Service                1.000      .995
x6 Salesforce Image       1.000      .901
x7 Product Quality        1.000      .618
Extraction Method: Principal Component Analysis.
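A variable's communality is just the sum of its squared loadings across the retained components; a tiny sketch with made-up loadings, not the HATCO values:

```python
import numpy as np

# Illustrative loadings for two variables on two retained components
loadings = np.array([[ .75,  .10],
                     [-.70,  .27]])
communality = (loadings ** 2).sum(axis=1)   # per-variable shared variance
print(communality)   # [0.5725 0.5629]
```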
Component Matrix

Unrotated: Component Matrix(a)
                       Component
                       1        2        3
x1 Delivery Speed     -.528     .752     .202
x2 Price Level         .792     .093     .508
x3 Price Flexibility  -.692     .374    -.173
x4 Manufacturer Image  .564     .602    -.452
x5 Service             .186     .779     .595
x6 Salesforce Image    .492     .604    -.542
x7 Product Quality     .739    -.270    -.005
Extraction Method: Principal Component Analysis.
a. 3 components extracted.

Rotated: Rotated Component Matrix(a)
                       Component
                       1        2        3
x1 Delivery Speed     -.752     .071     .560
x2 Price Level         .754     .108     .561
x3 Price Flexibility  -.806     .006     .010
x4 Manufacturer Image  .117     .921     .153
x5 Service            -.062     .176     .980
x6 Salesforce Image    .034     .945     .077
x7 Product Quality     .760     .193    -.064
Extraction Method: Principal Component Analysis.
Rotation Method: Varimax with Kaiser Normalization.
a. Rotation converged in 5 iterations.
HATCO Data
KMO and Bartlett's Test
Kaiser-Meyer-Olkin Measure of Sampling Adequacy        .665
Bartlett's Test of Sphericity    Approx. Chi-Square  205.965
                                 df                       15
                                 Sig.                   .000

The MSA overall and for each individual variable are sufficiently large.
We can now begin to interpret the results of the factor analysis.

Anti-image Matrices (after deleting X5; values rounded to three decimals)

Anti-image Covariance
                      DS      PL      PF      MI      SI      PQ
Delivery Speed       .629    .048   -.210   -.047   -.022    .208
Price Level          .048    .650    .190   -.077    .013   -.162
Price Flexibility   -.210    .190    .613    .038   -.039    .092
Manufacturer Image  -.047   -.077    .038    .358   -.281   -.012
Salesforce Image    -.022    .013   -.039   -.281    .372   -.047
Product Quality      .208   -.162    .092   -.012   -.047    .627

Anti-image Correlation
                      DS      PL      PF      MI      SI      PQ
Delivery Speed       .721a   .074   -.338   -.098   -.045    .331
Price Level          .074    .787a   .301   -.160    .026   -.253
Price Flexibility   -.338    .301    .748a   .081   -.081    .149
Manufacturer Image  -.098   -.160    .081    .542a  -.769   -.024
Salesforce Image    -.045    .026   -.081   -.769    .532a  -.097
Product Quality      .331   -.253    .149   -.024   -.097    .779a
a. Measures of Sampling Adequacy (MSA)
Communality
• Amount of shared, or common, variance among the variables
• General guideline: should be above 0.5

Communalities
                        Initial  Extraction
Delivery Speed           1.000      .658
Price Level              1.000      .580
Price Flexibility        1.000      .646
Manufacturer Image       1.000      .882
Salesforce Image         1.000      .872
Product Quality          1.000      .616
Extraction Method: Principal Component Analysis.
HATCO Data
Total Variance Explained

           Extraction Sums of Squared Loadings    Rotation Sums of Squared Loadings
Component  Total   % of Variance  Cumulative %    Total   % of Variance  Cumulative %
1          2.513   41.892         41.892          2.370   39.497         39.497
2          1.740   28.992         70.883          1.883   31.386         70.883
Extraction Method: Principal Component Analysis.

• These represent the percentage of variance in X1, X2, X3, X4, X6, and
X7 captured by the two factors. Cumulatively the two factors capture 71%
of the variance, which is quite high.
• When reporting factor results you also need to include the percentage
of variance explained beneath each factor.
• The next issue: which items belong to which factor? This assignment
must be unique, i.e. one item can load onto one and only one factor.
HATCO Data
Component Matrix(a)
                       Component
                       1        2
Delivery Speed        -.627     .514
Price Level            .759    -.068
Price Flexibility     -.730     .337
Manufacturer Image     .494     .798
Salesforce Image       .425     .832
Product Quality        .767    -.168
Extraction Method: Principal Component Analysis.
a. 2 components extracted.

• The purpose is to assign each item uniquely to only one factor.
• We do this by looking at the factor loadings, which represent the
correlation between the item and the factor.
• When an item has significant (> 0.3) loadings on more than one factor,
we have the problem of cross-loadings.
• The above pattern of cross-loadings, especially for X1, X3, X4, and X6,
indicates that these 4 items cannot be uniquely assigned to either one of
the two factors.
• To get a clearer picture, we do a rotation.
HATCO Data
Rotated Component Matrix(a)
                       Component
                       1        2
Delivery Speed        -.787     .194
Price Level            .714     .266
Price Flexibility     -.804    -.011
Manufacturer Image     .102     .933
Salesforce Image       .025     .934
Product Quality        .764     .179
Extraction Method: Principal Component Analysis.
Rotation Method: Varimax with Kaiser Normalization.
a. Rotation converged in 3 iterations.

• After varimax rotation, we can see a clearer pattern of assignment
with minimal cross-loading problems.
• Factor 1 consists of X1, X2, X3, and X7, which refer to physical
quality attributes; thus it can be labeled Physical Quality Perceptions.
• Factor 2 consists of X4 and X6, both related to image; thus we can
label it Image.
• The next stage in the analysis is to test whether the 4 items for
physical quality perceptions are reliable, and similarly for the 2
image items.
HATCO Data
Rotated Component Matrix(a)
                       Component
                       1        2
Price Flexibility     -.804    -.011
Delivery Speed        -.787     .194
Product Quality        .764     .179
Price Level            .714     .266
Salesforce Image       .025     .934
Manufacturer Image     .102     .933
Extraction Method: Principal Component Analysis.
Rotation Method: Varimax with Kaiser Normalization.
a. Rotation converged in 3 iterations.

Sorted by size of loadings within each factor; obtained by using OPTIONS.

Note: X3 (Price Flexibility) and X1 (Delivery Speed) have negative
loadings. This implies that they have to be reverse scored (using
recoding) before X1, X2, X3, and X7 can be combined to form a scale.
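In SPSS this recoding is done with RECODE; the same arithmetic can be sketched in NumPy (the 1–7 scale here is an assumption for illustration, not necessarily the HATCO scale):

```python
import numpy as np

def reverse_score(item, low=1, high=7):
    """Reverse-score an item measured on a low..high scale:
    the recoded value is (high + low) - original value."""
    return (high + low) - np.asarray(item)

x1 = np.array([1, 4, 7])
print(reverse_score(x1))   # [7 4 1]
```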
Oblique Rotation
Pattern Matrixa

Component
1 2
Delivery Speed -.803 .248
Price Level .704 .219
Price Flexibility -.808 .043
Manufacturer Image .052 .931
Salesforce Image -.026 .937
Product Quality .759 .129
Extraction Method: Principal Component Analysis.
Rotation Method: Oblimin with Kaiser Normalization.
a. Rotation converged in 4 iterations.
Validation – Split Sample
Rotated Component Matrix a

Component
1 2
Delivery Speed .807 .087
Price Level -.700 .220
Price Flexibility .801 .034
Manufacturer Image -.122 .929
Salesforce Image -.040 .938
Product Quality -.707 .305
Extraction Method: Principal Component Analysis.
Rotation Method: Varimax with Kaiser Normalization.
a. Rotation converged in 3 iterations.

Rotated Component Matrix a

Component
1 2
Delivery Speed -.766 .334
Price Level .740 .300
Price Flexibility -.822 -.090
Manufacturer Image .081 .935
Salesforce Image .017 .934
Product Quality .809 .044
Extraction Method: Principal Component Analysis.
Rotation Method: Varimax with Kaiser Normalization.
a. Rotation converged in 3 iterations.
Sample Table 1
                            Component
Items                       1         2        Communality
x1 Delivery Speed          -0.787     0.194      0.658
x2 Price Level              0.714     0.266      0.580
x3 Price Flexibility       -0.804    -0.011      0.646
x4 Manufacturer Image       0.102     0.933      0.882
x6 Salesforce Image         0.025     0.934      0.872
x7 Product Quality          0.764     0.179      0.616
Eigenvalue                  2.370     1.883
Variance (70.883)          39.497    31.386
Explanation
• A principal component factor analysis with varimax
rotation was done to validate whether the respondents
perceived the three constructs to be distinct. The results
showed a two-factor solution with eigenvalues greater than
1.0, and the total variance explained was 70.88%. The KMO
measure of sampling adequacy was 0.665, indicating
sufficient intercorrelations, while the Bartlett’s Test of
Sphericity was significant (χ² = 205.965, p < 0.01). The
criteria used by Igbaria et al. (1995) were used to identify
and interpret factors: each item should load 0.50 or greater
on one factor and 0.35 or lower on the other factor. Table 1
shows the result of the factor analysis.
Reliability Analysis
When
• Before forming a composite index to become a
variable from a number of items
Command
Analyze → Scale → Reliability Analysis (with options for Statistics:
item, scale, scale if item deleted)
Interpretation
• alpha value greater than 0.7 is good (differs
from author to author); more than 0.5 is
acceptable; delete some items if necessary
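Cronbach's alpha itself is simple to compute; a hedged sketch from the standard formula α = k/(k−1) · (1 − Σ item variances / variance of the total score), with made-up data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an items matrix (rows = cases, columns = items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars / total_var)
```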
Reliability - SPSS
HATCO Data

Cronbach's α if the item is deleted: used to identify which items should
be excluded from the scale.

Item-Total Statistics
                       Scale Mean if  Scale Variance if  Corrected Item-    Cronbach's Alpha
                       Item Deleted   Item Deleted       Total Correlation  if Item Deleted
x1 Delivery Speed         28.058         11.382              .030               .325
x2 Price Level            29.209         10.897              .135               .251
x3 Price Flexibility      23.679         13.487             -.198               .477
x4 Manufacturer Image     26.325          8.987              .460               .029
x5 Service                28.657          9.896              .620               .050
x6 Salesforce Image       28.908         10.282              .509               .097
x7 Product Quality        24.602         12.068             -.108               .451

Reliability Statistics
Cronbach's Alpha   N of Items
     .291               7

The Cronbach's α is very low (< 0.6).
HATCO Data
Using the result of the factor analysis we can run reliability for the
two factors.

Item-Total Statistics
                       Scale Mean if  Scale Variance if  Corrected Item-    Cronbach's Alpha
                       Item Deleted   Item Deleted       Total Correlation  if Item Deleted
x1 Delivery Speed         17.229          4.060             -.236              -.667(a)
x2 Price Level            18.380          4.054             -.195              -.786(a)
x3 Price Flexibility      12.850          4.344             -.298              -.464(a)
x7 Product Quality        13.773          4.243             -.338              -.302(a)
a. The value is negative due to a negative average covariance among items.
This violates reliability model assumptions. You may want to check item
codings.

Reliability Statistics
Cronbach's Alpha(a)   N of Items
     -.898                 4
a. The value is negative due to a negative average covariance among items.
This violates reliability model assumptions. You may want to check item
codings.

The negative Cronbach's α is due to the negative loadings for X1 and X3
identified earlier in the factor analysis.
HATCO Data
After reverse scoring the negatively loaded variables X1 and X3, the
Cronbach's α has improved tremendously:

Reliability Statistics
Cronbach's Alpha   N of Items
     .810               4

Note: corrected item-total correlations should be 0.3 and above.

Item-Total Statistics
                       Scale Mean if  Scale Variance if  Corrected Item-    Cronbach's Alpha
                       Item Deleted   Item Deleted       Total Correlation  if Item Deleted
X1R Delivery Speed        15.820         10.592              .769               .695
x2 Price Level            19.941         13.532              .453               .835
X3R Price Flexibility     15.820         10.592              .769               .695
x7 Product Quality        15.334         10.613              .564               .805
HATCO Data
Reliability Statistics
Cronbach's Alpha   N of Items
     .846               2

Item-Total Statistics
                       Scale Mean if  Scale Variance if  Corrected Item-    Cronbach's Alpha
                       Item Deleted   Item Deleted       Total Correlation  if Item Deleted
x4 Manufacturer Image      2.665           .594               .788               .(a)
x6 Salesforce Image        5.248          1.280               .788               .(a)
a. The value is negative due to a negative average covariance among items.
This violates reliability model assumptions. You may want to check item
codings.

Now we can form two variables to represent our independent variables,
i.e. Physical Quality Perceptions (X1, X2, X3, X7, with X1 and X3 reverse
scored) and Image (X4, X6), using Compute MEANS.
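The Compute MEANS step can be sketched in Python with pandas; the column names follow the lecture's labels (X1R, X3R = reverse-scored X1, X3), but the tiny data frame here is illustrative:

```python
import pandas as pd

# Made-up scores for two respondents; X1R and X3R are already reverse scored
df = pd.DataFrame({
    "X1R": [5, 6], "X2": [4, 5], "X3R": [5, 5], "X7": [6, 4],
    "X4":  [3, 4], "X6": [4, 4],
})

# Composite = row mean of the items assigned to each factor
df["PhysicalQuality"] = df[["X1R", "X2", "X3R", "X7"]].mean(axis=1)
df["Image"] = df[["X4", "X6"]].mean(axis=1)
```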
Assumptions in Factor Analysis
1. The most basic assumption is that the set of variables analyzed
are related.
 
• Variables must be interrelated in some way since factor analysis
seeks the underlying common dimensions among the variables. If the
variables are not related, then each variable will be its own factor.
 
Example: if you had 20 unrelated variables, you would have 20 different
factors. When the variables are unrelated, factor analysis has no common
dimensions with which to create factors. Thus, some underlying structure
or relationship among the variables must exist.

• The variables should not correlate too highly (>0.9): that makes it
difficult to determine the unique contribution of the variables to a factor
(multicollinearity).  

2. The sample should be homogeneous with respect to some underlying
factor structure.
Assumptions in Factor Analysis
3. Factor analysis assumes the use of metric data.
 
• Metric variables are assumed, although dummy
variables may be used (coded 0-1).
 
• Factor analysis does not require multivariate
normality.  Multivariate normality is necessary if
the researcher wishes to apply statistical tests for
significance of factors.
Assumptions in Factor Analysis
4. The data matrix must have sufficient correlations to
justify the use of factor analysis.
 
• Rule of thumb: a substantial number of correlations
greater than .30 are needed.
 
• Tests of appropriateness: anti-image correlation
matrix of partial correlations, the Bartlett test of
sphericity, and the measure of sampling adequacy
(MSA greater than .50).
MSA
MSA Comment
0.80 and above Meritorious
0.70 – 0.80 Middling
0.60 – 0.70 Mediocre
0.50 – 0.60 Miserable
Below 0.50 Unacceptable
Other Issues
Sample size of 100 or more

There are 1101 subjects in the sample who have valid


data for all questions and which will be included in the
factor analysis.  This requirement is met.

Ratio of subjects to variables should be 5 to 1

There are 1101 subjects and 10 variables in the analysis


for a ratio of cases to variables of 110 to 1.  This
requirement is met.
