
FACTOR ANALYSIS


Factor analysis is a collection of methods used to examine how underlying constructs influence the responses on a number of measured variables. There are two basic types of factor analysis: exploratory and confirmatory. Exploratory factor analysis (EFA) attempts to discover the nature of the constructs influencing a set of responses. Confirmatory factor analysis (CFA) tests whether a specified set of constructs influences responses in a predicted way. SPSS can only perform EFA. CFA requires a program that can perform structural equation modeling, such as LISREL or AMOS.

The primary objectives of an EFA are to determine the number of factors influencing a set of measures and the strength of the relationship between each factor and each observed measure. To perform an EFA, you first identify the set of variables you want to analyze. SPSS then examines the correlations among those variables to identify groups of variables that tend to vary together. Each of these groups will be associated with a factor (although a single variable can be part of several groups and several factors). You will also receive a set of factor loadings, which tell you how strongly each variable is related to each factor. The loadings also allow you to calculate factor scores for each participant by multiplying the response on each variable by the corresponding factor loading. Once you identify the construct underlying a factor, you can use the factor scores to tell you how much of that construct each participant possesses.
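The same loadings-and-scores logic can be sketched outside SPSS. The snippet below is a minimal illustration in Python using scikit-learn's FactorAnalysis on fabricated data; the numbers of participants, items, and factors are assumptions made purely for the example, and the scoring method is whatever the library uses rather than the simple multiply-and-sum described above.

import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Fabricated data: 200 participants x 6 measured items.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X_std = StandardScaler().fit_transform(X)   # standardize so loadings are comparable

fa = FactorAnalysis(n_components=2)         # extract two factors
scores = fa.fit_transform(X_std)            # factor scores, one row per participant
loadings = fa.components_.T                 # item-by-factor loading matrix

print(np.round(loadings, 2))                # how strongly each item relates to each factor
print(np.round(scores[:5], 2))              # construct estimates for the first five participants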

Some common uses of EFA are to:

Identify the nature of the constructs underlying responses in a specific content area.
Determine what sets of items "hang together" in a questionnaire.
Demonstrate the dimensionality of a measurement scale. Researchers often wish to develop scales that respond to a single characteristic.
Determine what features are most important when classifying a group of items.
Generate "factor scores" representing values of the underlying constructs for use in other analyses.
Create a set of uncorrelated factor scores from a set of highly collinear predictor variables.
Use a small set of factor scores to represent the variability contained in a larger set of variables. This is often referred to as data reduction.

It is important to note that EFA does not produce any statistical tests. It therefore cannot provide concrete evidence that a particular structure exists in your data; it can only suggest what patterns may be present. If you want to actually test whether a particular structure exists in your data, you should use CFA, which does allow you to test whether your proposed structure accounts for a significant amount of the variability in your items.

EFA is strongly related to another procedure called principal components analysis (PCA). The two have basically the same purpose: to identify a set of underlying constructs that can account for the variability in a set of variables. However, PCA is based on a different statistical model and produces slightly different results than EFA. EFA tends to produce better results when you want to identify a set of latent factors that underlie the responses on a set of measures, whereas PCA works better when you want to perform data reduction. Although SPSS says that it performs "factor analysis," statistically it actually performs PCA. The differences are slight enough that you will generally not need to be concerned about them - you can use the results from a PCA for all of the same things that you would use the results of an EFA. However, if you want to identify latent constructs, you should be aware that you might get slightly better results from a statistical package that can actually perform EFA, such as SAS, AMOS, or LISREL.
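To see the distinction concretely, the sketch below fits both models to the same fabricated data with scikit-learn; the two sets of weights will usually be similar but not identical because the underlying statistical models differ. The data and the choice of two components are illustrative assumptions.

import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis

# Fabricated data: 250 cases x 6 items.
X = np.random.default_rng(3).normal(size=(250, 6))

pca_weights = PCA(n_components=2).fit(X).components_.T              # principal component weights
efa_loadings = FactorAnalysis(n_components=2).fit(X).components_.T  # common-factor loadings

print(np.round(pca_weights, 2))
print(np.round(efa_loadings, 2))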

Factor analyses require a substantial number of subjects to generate reliable results. As a general rule, the minimum sample size should be the larger of 100 or 5 times the number of items in your factor analysis. Though you can still conduct a factor analysis with fewer subjects, the results will not be very stable.
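Expressed as a quick calculation (the item count below is hypothetical):

# Minimum sample size: the larger of 100 or 5 participants per item.
n_items = 25
min_n = max(100, 5 * n_items)   # -> 125 for a 25-item analysis
print(min_n)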

To perform an EFA in SPSS:

Choose Analyze, then Data Reduction, then Factor.
Move the variables you want to include in your factor analysis to the Variables box.
If you want to restrict the factor analysis to those cases that have a particular value on a variable, you can put that variable in the Selection Variable box and then click Value to tell SPSS which value you want the included cases to have.
Click the Extraction button to indicate how many factors you want to extract from your items. The maximum number of factors you can extract is equal to the number of items in your analysis, although you will typically want to examine a much smaller number. There are several ways to choose how many factors to examine. First, you may want to look for a specific number of factors for theoretical reasons. Second, you can choose to keep factors that have eigenvalues over 1. A factor with an eigenvalue of 1 accounts for the amount of variability present in a single item, so factors that account for less variability than this will likely not be very meaningful. A final method is to create a scree plot, in which you graph the amount of variability that each factor accounts for, in descending order. You then keep all of the factors that occur prior to the last major drop in the amount of variance accounted for. If you wish to use this method, you should run the factor analysis twice: once to generate the scree plot, and a second time where you specify exactly how many factors you want to examine. (A sketch of the eigenvalue and scree-plot criteria appears after these steps.)
Click the Rotation button to select a rotation method. Though you do not need to rotate your solution, a rotation typically gives you more interpretable factors by locating solutions with more extreme factor loadings. There are two broad classes of rotations: orthogonal and oblique. If you choose an orthogonal rotation, your resulting factors will all be uncorrelated with each other. If you choose an oblique rotation, you allow your factors to be correlated. Which you should choose depends on your purpose for performing the factor analysis, as well as your beliefs about the constructs that underlie responses to your items. If you think that the underlying constructs are independent, or if you are specifically trying to get a set of uncorrelated factor scores, then you should choose an orthogonal rotation. If you think that the underlying constructs may be correlated, then you should choose an oblique rotation. Varimax is the most popular orthogonal rotation, whereas Direct Oblimin is the most popular oblique rotation (a rotation example also appears in the sketches following these steps). If you decide to perform a rotation on your solution, you usually ignore the parts of the output that deal with the initial (unrotated) solution, since the rotated solution will generally provide more interpretable results. If you want to use Direct Oblimin rotation, you will also need to specify the parameter delta. This parameter influences the extent to which your final factors will be correlated: negative values lead to lower correlations, whereas positive values lead to higher correlations. You should not choose a value over .8, or else the high correlations will make it very difficult to differentiate the factors.
If you want SPSS to save the factor scores as variables in your data set, then you can click the Scores button and check the box next to Save as variables.
Click the OK button when you are ready for SPSS to perform the analysis.
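The eigenvalue-over-1 rule and the scree plot described in the Extraction step can be previewed outside SPSS. The sketch below is only an illustration on fabricated data: it computes the eigenvalues of the items' correlation matrix, counts how many exceed 1, and draws a scree plot.

import numpy as np
import matplotlib.pyplot as plt

# Fabricated data: 300 cases x 8 items.
X = np.random.default_rng(1).normal(size=(300, 8))

R = np.corrcoef(X, rowvar=False)                 # item correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]   # eigenvalues, largest first

print("Eigenvalues:", np.round(eigvals, 2))
print("Factors with eigenvalue > 1:", int(np.sum(eigvals > 1)))

# Scree plot: keep the factors that occur before the last major drop.
plt.plot(range(1, len(eigvals) + 1), eigvals, marker="o")
plt.xlabel("Factor number")
plt.ylabel("Eigenvalue")
plt.title("Scree plot")
plt.show()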
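Similarly, the effect of a varimax rotation can be compared against the unrotated solution. This sketch assumes a scikit-learn version (0.24 or later) whose FactorAnalysis accepts a rotation argument; the data are again fabricated, so the rotated loadings here will not show the clean structure you would hope to see with real questionnaire items.

import numpy as np
from sklearn.decomposition import FactorAnalysis

# Fabricated data: 300 cases x 6 items.
X = np.random.default_rng(2).normal(size=(300, 6))

unrotated = FactorAnalysis(n_components=2).fit(X).components_.T
rotated = FactorAnalysis(n_components=2, rotation="varimax").fit(X).components_.T

print(np.round(unrotated, 2))   # loadings from the initial solution
print(np.round(rotated, 2))     # loadings after varimax rotation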

The output from a factor analysis will vary depending on the type of rotation you chose. Both orthogonal and oblique rotations will contain the following sections.

Communalities. The communality of a given item is the proportion of its variance that can be accounted for by your factors. In the first column you'll see that the communality for the initial extraction is always 1. This is because the full set of factors is specifically designed to account for the variability in the full set of items. The second column provides the communalities under the final set of factors that you decided to extract (the sketch following this list shows how these relate to the factor loadings).
Total Variance Explained. Provides you with the eigenvalues and the amount of variance explained by each factor in both the initial and the rotated solutions. If you requested a Scree plot, this information will be presented in a graph following the table.
Component Matrix. Presents the factor loadings for the initial solution. Factor loadings can be interpreted as standardized regression coefficients predicting the measures from the factors. Factor loadings less than .3 are considered weak, loadings between .3 and .6 are considered moderate, and loadings greater than .6 are considered large.
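For an orthogonal solution, the communalities reported in the table above are simply the sum of each item's squared loadings across the retained factors. The loading values below are made up purely to illustrate the calculation.

import numpy as np

loadings = np.array([[0.78, 0.10],
                     [0.71, 0.22],
                     [0.15, 0.69]])            # hypothetical items x factors
communalities = (loadings ** 2).sum(axis=1)    # proportion of each item's variance explained
print(np.round(communalities, 2))              # -> 0.62, 0.55, 0.50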

Factor analyses using an orthogonal rotation will include the following sections.

Rotated Component Matrix. Provides the factor loadings for the orthogonal rotation. The rotated factor loadings can be interpreted in the same way as the unrotated factor loadings.
Component Transformation Matrix. Provides the correlations between the factors in the original and in the rotated solutions.

Factor analyses using an oblique rotation will include the following sections.

Pattern Matrix. Provides the factor loadings for the oblique rotation. The rotated factor loadings can be interpreted in the same way as the unrotated factor loadings.
Structure Matrix. Holds the correlations between the factors and each of the items. This will not look the same as the pattern matrix because the factors themselves can be correlated. This means that an item can have a factor loading of zero for one factor but still be correlated with that factor, simply because it loads on other factors that are correlated with the first factor.
Component Correlation Matrix. Provides you with the correlations among your rotated factors.

After you obtain the factor loadings, you will want to come up with a theoretical interpretation of each of your factors. You define a factor by considering the possible constructs that could be responsible for the observed pattern of positive and negative loadings. You should examine the items that have the largest loadings and consider what they have in common. To ease interpretation, you have the option of multiplying all of the loadings for a given factor by -1. This essentially reverses the scale of the factor, allowing you, for example, to turn an "unfriendliness" factor into a "friendliness" factor.
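In practice, reversing a factor amounts to multiplying one column of the loading matrix (and, if you saved them, the corresponding column of factor scores) by -1. The array names and values below are illustrative only.

import numpy as np

loadings = np.array([[-0.72, 0.10],
                     [-0.65, 0.05],
                     [ 0.12, 0.81]])                       # hypothetical items x factors
scores = np.random.default_rng(4).normal(size=(5, 2))      # saved factor scores (fake)

loadings[:, 0] *= -1    # reverse the first factor's loadings
scores[:, 0] *= -1      # keep the saved factor scores consistent with the flip
print(np.round(loadings, 2))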
