Factor Analysis

Principal ideas discussed

Key information:

Non-Experimental Methods: Factor Analysis & Structural Validity

1. Consider the purpose of a factor analysis and the importance of structural validity
2. Understand the steps in the principal components approach to extracting factors
3. Consider the emphasis in component analysis versus that of common factor analysis
4. Understand one's decision regarding the number of factors that should be extracted
5. Note the purpose of various approaches to rotating factors
6. Consider how to obtain factor scores

Pew Data Example - How many dimensions?

Factor Analyses

Generally executed to explain the relationships among a large number of variables in terms of a much smaller number of constructs. Recall that the essential nature of executing any statistical procedure is to summarize a large amount of information so that we can interpret complexity in simple terms.
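The reduction described here (many correlated variables summarized by far fewer constructs) can be sketched numerically. The data below are simulated for illustration only and are not the Pew data:

```python
import numpy as np

# Simulated data (NOT the Pew data): 200 respondents answering 7 items
# that are driven by just 2 underlying constructs.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))          # 2 constructs per respondent
mixing = rng.normal(size=(2, 7))            # how constructs show up in items
X = latent @ mixing + 0.3 * rng.normal(size=(200, 7))

R = np.corrcoef(X, rowvar=False)            # 7 x 7 inter-item correlations
eigvals = np.linalg.eigvalsh(R)[::-1]       # component variances, largest first

# Two components reproduce most of the variance shared among the 7 items:
share = eigvals[:2].sum() / eigvals.sum()
```

Because only two constructs drive the seven items, the first two components carry most of the total variance, which is exactly the summarization the slide describes.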

Start: Sum the column correlations to obtain a simple sum of each column.

Correlations:

         Q.1      Q.2a     Q.2b     Q.2c     Q.11b    Q.11c    Q.11d
Q.1      1        .363**   .385**   .284**   .103**   .049*    .024
Q.2a     .363**   1        .281**   .284**   .110**   .088**   .028
Q.2b     .385**   .281**   1        .433**   .033     .009     -.034
Q.2c     .284**   .284**   .433**   1        .036     -.033    -.068**
Q.11b    .103**   .110**   .033     .036     1        .340**   .303**
Q.11c    .049*    .088**   .009     -.033    .340**   1        .430**
Q.11d    .024     .028     -.034    -.068**  .303**   .430**   1
Total    ?        ?        ?        ?        ?        ?        ?
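The column-summing step can be carried out directly on the correlation table. The matrix below is the Pew table transcribed from the slide with the significance stars dropped:

```python
import numpy as np

# Pew correlation matrix transcribed from the slide;
# item order: Q1, Q2a, Q2b, Q2c, Q11b, Q11c, Q11d.
R = np.array([
    [1.000,  .363,  .385,  .284,  .103,  .049,  .024],
    [ .363, 1.000,  .281,  .284,  .110,  .088,  .028],
    [ .385,  .281, 1.000,  .433,  .033,  .009, -.034],
    [ .284,  .284,  .433, 1.000,  .036, -.033, -.068],
    [ .103,  .110,  .033,  .036, 1.000,  .340,  .303],
    [ .049,  .088,  .009, -.033,  .340, 1.000,  .430],
    [ .024,  .028, -.034, -.068,  .303,  .430, 1.000],
])

col_sums = R.sum(axis=0)                 # simple sum of each column
root = np.sqrt((col_sums ** 2).sum())    # root of the sum of squared sums
trial_vector = col_sums / root           # first trial characteristic vector
```

The filled-in "Total" row is just `col_sums`; normalizing it by `root` gives the first trial vector used in the iteration that follows.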

Q1. Generally, how would you say things are these days in your life – would you say that you are very happy, pretty happy, or not too happy?
Q.2a – Your family life – Are you satisfied or dissatisfied? Would you say you are VERY (dis)satisfied or SOMEWHAT (dis)satisfied?
Q.2b – Your personal financial situation – Are you satisfied or dissatisfied? Would you say you are VERY (dis)satisfied or SOMEWHAT (dis)satisfied?
Q.2c – Your present housing situation – Are you satisfied or dissatisfied? Would you say you are VERY (dis)satisfied or SOMEWHAT (dis)satisfied?
Q.11b – Being able to live comfortably in retirement – Is this extremely important for you, very important, somewhat important, or not too important for you?
Q.11c – Being able to pay for your children's college education – Is this extremely important for you, very important, somewhat important, or not too important for you?
Q.11d – Being able to leave an inheritance for your children – Is this extremely important for you, very important, somewhat important, or not too important for you?


Next: Normalize the column simple sums by dividing by the square root of their sums of squares. This produces a "trial" characteristic root and the first trial vector.

Next: Produce another trial vector (iterative comparison) – the previous trial vector is multiplied by the original matrix, and the resulting column sums are again divided by the square root of their sum of squares (RootSS).

Next: Jumping to the 13th iteration to examine for convergence of the characteristic vector… convergence not yet acceptable.

Next: Jumping to the 27th iteration to examine for convergence of the characteristic vector… convergence acceptable. Each element of the converged characteristic vector is then multiplied by the square root of the eigenvalue to obtain the 1st principal component.

[The slides here show the numeric worksheets – the trial vectors through the 27th iteration, their sums of squares, the characteristic root, and the 1st component's eigenvalue – which are not recoverable from this extraction.]
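The iteration these slides walk through (multiply the trial vector through the matrix, renormalize, repeat until convergence) is the classical power method. A minimal sketch, using a small made-up correlation matrix rather than the slides' numbers:

```python
import numpy as np

def extract_component(R, tol=1e-6, max_iter=100):
    """Iteratively extract one principal component from a correlation
    matrix R, mirroring the slides' procedure (a power iteration)."""
    v = R.sum(axis=0)                       # simple column sums
    v = v / np.sqrt((v ** 2).sum())         # normalize: first trial vector
    for _ in range(max_iter):
        new = R @ v                         # trial vector through the matrix
        root = np.sqrt((new ** 2).sum())    # root of the sum of squares
        new = new / root                    # next trial vector
        if np.max(np.abs(new - v)) < tol:   # converged?
            v = new
            break
        v = new
    eigenvalue = v @ R @ v                  # characteristic root at convergence
    loadings = v * np.sqrt(eigenvalue)      # element-wise x sqrt(eigenvalue)
    return eigenvalue, loadings

# Small made-up correlation matrix, just to exercise the routine:
R = np.array([[1.0, 0.5, 0.4],
              [0.5, 1.0, 0.3],
              [0.4, 0.3, 1.0]])
eigenvalue, loadings = extract_component(R)
```

At convergence the eigenvalue matches the largest eigenvalue a direct eigendecomposition would give, and the squared loadings sum to that eigenvalue, just as on the slides.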

1. Sum the columns to establish a simple sum of inter-item correlations in each column (i.e., with respect to each item)
2. Square each column's simple sum and then sum those squares to obtain a "Sum of (Column-Sum) Squares"
3. Compute the square root of the "Sum of (Column-Sum) Squares" and divide each column's simple sum of inter-item correlations by that computed square root (this is known as the characteristic root)
4. The results of these divisions are identified as the elements in the first trial characteristic vector
5. Multiply the original correlation matrix by the characteristic vector and derive a simple sum for each of the resulting columns
6. Square each column's simple sum and then sum those squares to obtain a "Sum of (Column-Sum) Squares"
7. Compute the square root of the "Sum of (Column-Sum) Squares" and divide each column's simple sum (of inter-item correlations) by that computed square root
8. The results of these divisions are identified as the elements in the NEW characteristic vector (the new trial characteristic vector)
9. Return to step 5 and continue until the vector "converges"

Next: compute the cross-products of the first principal component's loadings.

Next: Subtract the 1st principal component's cross-products of loadings from the original correlation matrix. This residual is now the "original" matrix for the 2nd component; utilize its characteristic vector to iteratively seek characteristic vector convergence.

Begin again with the 2nd component: the same procedure is followed as with the 1st component – multiply the new "original" correlation matrix by the characteristic vector, and the results of the divisions are identified as the elements in the 2nd component's first characteristic vector (the first trial characteristic vector for the 2nd component). Then subtract from that matrix and start over.

Next: The matrix is REFLECTED after identifying a "split" in the matrix – the intersecting "splits" are reflected. (The slides show the results of the reflection and the resulting difference matrix for this example.)

[The numeric matrices on these slides are not recoverable from this extraction.]
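The subtraction and reflection steps can be sketched as follows. The correlation matrix, the first-component loadings, and the choice of which items to reflect are all hypothetical, chosen only to show the arithmetic:

```python
import numpy as np

# Hypothetical numbers (not from the slides): 4 items, with the
# 1st component's loadings already obtained.
R = np.array([[1.00, 0.50, 0.40, 0.20],
              [0.50, 1.00, 0.35, 0.20],
              [0.40, 0.35, 1.00, 0.15],
              [0.20, 0.20, 0.15, 1.00]])
loadings1 = np.array([0.70, 0.60, 0.50, 0.30])

# Cross-products of the loadings: entry (i, j) = loading_i * loading_j.
cross = np.outer(loadings1, loadings1)

# Residual matrix = the new "original" matrix for the 2nd component.
R1 = R - cross

# Reflection: flip the signs of a chosen block of items (here items 3-4,
# an arbitrary illustrative "split") so the intersecting blocks reverse sign.
signs = np.array([1, 1, -1, -1])
R1_reflected = signs[:, None] * R1 * signs[None, :]
```

Reflection only changes signs of off-diagonal blocks, so the residual matrix stays symmetric and its diagonal is untouched; the same power iteration can then be run on `R1_reflected`.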

After establishing a new "convergent" characteristic vector, multiply the converged characteristic vector by the square root of the eigenvalue to obtain the 2nd component's factor loadings (i.e., before re-reflection). The 2nd component's loadings then need to be re-reflected to account for the earlier reflection of the residual matrix. This iterative procedure can continue until all variance is accounted for by the components; the number of components will never exceed the number of variables.

Conceptualizing Principal Component Analysis

1. Components are "real" factors that are derived directly from the investigation's correlation matrix; they are descriptive
2. The first component is calculated to obtain a mathematical description of all that an investigation's K variables hold in common – conceptually like trying to identify a single regression line that predicts K different dimension scores from one predictor
3. The amount of variance accounted for by the 1st principal component is the amount of variance that is common to all the variables – the amount of "variance accounted for" in any principal component analysis will always be largest in the first extracted principal component
4. In subtracting this "variance accounted for" from the original correlation matrix, only residuals that are "not accounted for" by that first principal component are left over; thus, the 2nd principal component will be orthogonal to the 1st component
5. In a factor matrix, the loadings of a factor indicate the correlation between the item and the factor; thus the square of the loading indicates the amount of variance in the item that is accounted for by the factor
6. With respect to each item's row of factor loadings, the sum of the squared loadings is equivalent to the amount of variance in that item accounted for by all the extracted factors – this is sometimes reported as h²
7. The eigenvalue of an extracted factor, divided by the number of items, gives the amount of variance in all the variables that is accounted for by that factor
8. The sum of the eigenvalues of all the extracted factors, divided by the number of matrix items, gives the amount of variance in the full matrix that is accounted for by all the extracted factors

How many factors should be extracted?

1. Eigenvalue > 1 (e.g., the SPSS default criterion) – recall that the variance accounted for by a factor = eigenvalue / # items
2. Substantive expectations about the # of factors
3. Cattell's Scree Plot (where does the mountain meet the plane?)

Why Rotate the Factors? As was noted in computing the principal components:

1. The first principal component seeks to explain that which all the items hold in common (a vector of communality) and as such accounts for the most variance of any of the factors
2. The other components are bipolar composites that are each orthogonal to all the other vectors
3. These conditions make interpretability no better than simply trying to explain the results without reducing the size of the dimensions in the model
4. If all one cares about is knowing how many factors are required to account for most of the variance, principal component analysis does the work fine… but interpretability suffers
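The eigenvalue and communality bookkeeping described here can be verified on a small made-up loading matrix (the values are illustrative, not from the Pew example):

```python
import numpy as np

# Hypothetical factor-loading matrix: 4 items (rows) x 2 factors (columns).
L = np.array([[0.8, 0.1],
              [0.7, 0.2],
              [0.1, 0.6],
              [0.2, 0.7]])

eigenvalues = (L ** 2).sum(axis=0)   # each factor's eigenvalue (column sums)
h2 = (L ** 2).sum(axis=1)            # each item's communality (h^2, row sums)

n_items = L.shape[0]
var_by_factor = eigenvalues / n_items       # variance accounted per factor
var_total = eigenvalues.sum() / n_items     # variance accounted by all factors
```

Because the eigenvalue sums and the communality sums are both just sums of squared loadings taken along different axes, the total variance accounted for equals the mean communality.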

Rotation of Factors

Rotation of factors provides a procedure that maximizes the interpretability of the factors. If one considers different variables as occupying vectors in psychometric space (a Cartesian/Euclidean perspective), then one begins to see that the axes in such space can be ROTATED in an infinite number of ways around the variable vectors while the vectors remain fixed in location. (The slides illustrate this with conceptual axes drawn through fixed variable vectors.)

Orthogonal Rotation – uncorrelated, independent dimensions. Varimax rotation: the factors' conceptual axes are perpendicular and are NOT related.

Oblique Rotation – correlated, dependent dimensions. Oblimin rotation: the factors' conceptual axes are non-perpendicular and ARE related.

Recall the Pew Data Example (questions Q1, Q.2a–Q.2c, and Q.11b–Q.11d above).
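One standard way to compute an orthogonal varimax rotation is Kaiser's criterion solved with an SVD update; the sketch below is a generic implementation, and the unrotated loading matrix is made up:

```python
import numpy as np

def varimax(L, gamma=1.0, max_iter=100, tol=1e-8):
    """Varimax rotation of a loading matrix (Kaiser's criterion),
    computed with the usual SVD-based update."""
    p, k = L.shape
    rot = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        Lr = L @ rot
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - (gamma / p) * Lr @ np.diag((Lr ** 2).sum(axis=0)))
        )
        rot = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):        # criterion stopped improving
            break
        d = d_new
    return L @ rot

# Made-up unrotated loadings: one general and one bipolar factor.
L = np.array([[0.60, 0.60],
              [0.65, 0.55],
              [0.60, -0.60],
              [0.55, -0.65]])
L_rot = varimax(L)
```

Because the rotation matrix is orthogonal, each item's communality is unchanged: only how the explained variance is apportioned across factors changes, which is exactly why rotation improves interpretability without altering fit.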

Principal Component – Varimax: Rotation to Simple Structure

PEW Data Example: after varimax rotation, the four present-circumstance items (Happy at present; Satisfaction with family, finance, and housing) load on one factor (PresHap) and the three future-importance items (Retire comfort important, Child education important, Child inheritance important) load on the other (FutHop), with loadings in the .657–.786 range on the owning factor and 0.000 cross-loadings. (The exact item-by-item pairing of the loadings is not recoverable from this extraction.)

Simple structure (replicable, interpretive parsimony):

1. Each row should contain at least one zero loading
2. The minimum # of zero loadings in each factor = the # of factors
3. Every factor pair contains both zero and significant loadings
4. A large # of zero loadings on each factor
5. Only a few simultaneous significant loadings across factors

Factor Techniques (Inter- vs. Intra-Individual)

The inter-individual data matrix is organized as participants (Participant 1 … Participant i) by variables (Var 1 … Var k), and the same matrix can be transposed to variables by participants. The intra-individual layout instead takes a single participant measured on the k variables across occasions (Prt 1.t1, Prt 1.t2, … Prt 1.ti).
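The simple-structure criteria can be checked mechanically against a rotated loading matrix; the matrix and the .10 "near zero" cutoff below are illustrative assumptions, not the slides' values:

```python
import numpy as np

# Hypothetical rotated loading matrix (7 items x 2 factors); treat any
# |loading| below .10 as a "zero" loading for the checks.
L = np.array([[0.77, 0.02],
              [0.71, 0.04],
              [0.75, 0.03],
              [0.72, 0.05],
              [0.01, 0.69],
              [0.03, 0.79],
              [0.02, 0.66]])
near_zero = np.abs(L) < 0.10

# Criterion 1: every item's row contains at least one (near-)zero loading.
each_row_has_zero = bool(near_zero.any(axis=1).all())

# Criterion 2: each factor carries at least as many zero loadings as there
# are factors.
enough_zeros_per_factor = bool((near_zero.sum(axis=0) >= L.shape[1]).all())
```

A matrix that passes such checks reads off cleanly: each item belongs to one factor, which is the "interpretive parsimony" the slide names.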
