
ANALYSIS OF VARIANCE (ANOVA) (Pt-I)


One-way between-subjects ANOVA

A one-way between-subjects ANOVA allows you to determine if there is a relationship between a categorical independent variable (IV) and a continuous dependent variable (DV), where each subject is only in one level of the IV. To determine whether there is a relationship between the IV and the DV, a one-way between-subjects ANOVA tests whether the means of all of the groups are the same. If there are any differences among the means, we know that the value of the DV depends on the value of the IV. The IV in an ANOVA is referred to as a factor, and the different groups composing the IV are referred to as the levels of the factor. A one-way ANOVA is also sometimes called a single factor ANOVA.

A one-way ANOVA with two groups is analogous to an independent-samples t-test. The p-values of the two tests will be the same, and the F statistic from the ANOVA will be equal to the square of the t statistic from the t-test.
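If you prefer working in a syntax window, this equivalence is easy to verify. The following is a minimal sketch in SPSS syntax, assuming a hypothetical dataset with a continuous variable named score and a grouping variable named group coded 1 and 2 (both names are placeholders):

  * Hypothetical variables: score (continuous DV), group (IV coded 1 and 2).
  * Independent-samples t-test comparing the two groups.
  T-TEST GROUPS=group(1 2)
    /VARIABLES=score.
  * The same comparison run as a one-way ANOVA.
  UNIANOVA score BY group
    /DESIGN=group.

Running both commands on the same data, the two tables should report the same p-value, with the F statistic equal to the square of the t statistic.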

To perform a one-way between-subjects ANOVA in SPSS
Choose Analyze, then General Linear Model, then Univariate.
Move the DV to the Dependent Variable box.
Move the IV to the Fixed Factor(s) box.
Click the OK button.

The output from this analysis will contain the following sections.
Between-Subjects Factors. Lists how many subjects are in each level of your factor.
Tests of Between-Subjects Effects. The row next to the name of your factor reports a test of whether there is a significant relationship between your IV and the DV. A significant F statistic means that at least two group means are different from each other, indicating the presence of a relationship.

You can ask SPSS to provide you with the means within each level of your between-subjects factor by clicking the Options button in the variable selection window and moving the factor to the Display Means For box. This will add a section to your output titled Estimated Marginal Means containing a table with a row for each level of your factor. The values within each row provide the mean, the standard error of the mean, and the boundaries for a 95% confidence interval around the mean for observations within that cell.
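The same request can be made in syntax. A minimal sketch, using the placeholder names dv for the dependent variable and iv for the between-subjects factor:

  * Hypothetical variables: dv (continuous DV), iv (between-subjects factor).
  * EMMEANS requests the Estimated Marginal Means table for the factor.
  UNIANOVA dv BY iv
    /EMMEANS=TABLES(iv)
    /DESIGN=iv.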

Post-hoc analyses for one-way between-subjects ANOVA. A significant F statistic tells you that at least two of your means are different from each other, but it does not tell you where the differences lie. Researchers therefore commonly perform post-hoc analyses following a significant ANOVA to help them understand the nature of the relationship between the IV and the DV. The most commonly reported post-hoc procedures include the LSD (least significant difference), Bonferroni, Tukey, and SNK (Student-Newman-Keuls) tests. Although the LSD test is the most liberal of these, simulations have demonstrated that using LSD post-hoc analyses will not substantially increase your experiment-wide error rate as long as you only perform them after you have already obtained a significant F statistic from the ANOVA. We therefore recommend this method, since it is the most likely to detect any differences among your groups.

To perform post-hoc analyses in SPSS
Repeat the steps necessary for a one-way ANOVA, but do not press the OK button at the end.
Click the Post-Hoc button.
Move the IV to the Post-Hoc Tests for box.
Check the boxes next to the post-hoc tests you want to perform.
Click the Continue button.
Click the OK button.
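In syntax, post-hoc tests are requested with the POSTHOC subcommand. A minimal sketch with the placeholder names dv and iv, requesting the four tests discussed above:

  * Hypothetical variables: dv (continuous DV), iv (between-subjects factor).
  * LSD, Bonferroni, and Tukey produce the Multiple Comparisons table.
  * SNK and Tukey produce the Homogeneous Subsets table.
  UNIANOVA dv BY iv
    /POSTHOC=iv(LSD BONFERRONI TUKEY SNK)
    /DESIGN=iv.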

Requesting a post-hoc test will add one or both of the following sections to your ANOVA output.
Multiple Comparisons. This section is produced by LSD, Tukey, and Bonferroni tests. It reports the difference between every possible pair of factor levels and tests whether each is significant. It also includes the boundaries for a 95% confidence interval around the size of each difference.
Homogeneous Subsets. This section is produced by SNK and Tukey tests. It reports a number of subsets of your factor levels. The means of the factor levels within each subset are not significantly different from each other. This means that the means of two factor levels differ significantly only if those levels do not appear together in any of the same subsets.

Multifactor between-subjects ANOVA
Sometimes you want to examine more than one factor in the same experiment. Although you could analyze the effect of each factor separately, testing them together in the same analysis allows you to look at two additional things. First, it lets you determine the independent influence of each of the factors on the DV, controlling for the other IVs in the model. The test of each IV in a multifactor ANOVA is based solely on the part of the DV that it can predict that is not predicted by any of the other IVs.

Second, including multiple IVs in the same model allows you to test for interactions among your factors. The presence of an interaction between two variables means that the effect of the first IV on the DV depends on the level of the second IV. An interaction between three variables means that the nature of the two-way interaction between the first two variables depends on the level of a third variable. It is possible to have an interaction between any number of variables. However, researchers rarely examine interactions containing more than three variables because they are difficult to interpret and require large sample sizes to detect.

Note that to obtain a valid test of a given interaction effect your model must also include all lower-order main effects and interactions. This means that the model has to include terms representing all of the main effects of the IVs involved in the interaction, as well as all the possible interactions between those IVs. So, if you want to test a 3-way interaction between variables A, B, and C, the model must include the main effects for those variables, as well as the AxB, AxC, and the BxC interactions.
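In SPSS syntax this requirement shows up directly in the DESIGN subcommand. As a sketch, for three hypothetical factors a, b, and c and a placeholder dependent variable dv, a model that tests the three-way interaction would look like this:

  * Hypothetical variables: dv (continuous DV); a, b, c (between-subjects factors).
  * To test the a*b*c interaction, the design also lists every main effect
  * and every two-way interaction among a, b, and c.
  UNIANOVA dv BY a b c
    /DESIGN=a b c a*b a*c b*c a*b*c.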

To perform a multifactor ANOVA in SPSS
Choose Analyze, then General Linear Model, then Univariate.
Move the DV to the Dependent Variable box.
Move all of your IVs to the Fixed Factor(s) box.
By default SPSS will include all possible interactions between your categorical IVs. If this is not the model you want, you will need to define the model by hand using the following steps.
o Click the Model button.
o Click the radio button next to Custom.
o Add all of your main effects to the model by clicking all of the IVs in the box labeled Factors and covariates, setting the pull-down menu to Main effects, and clicking the arrow button.
o Add each of the interaction terms to your model. You can do this one at a time by selecting the variables included in the interaction in the box labeled Factors and covariates, setting the pull-down menu to Interaction, and clicking the arrow button for each of your interactions. You can also use the setting on the pull-down menu to tell SPSS to add all possible 2-way, 3-way, 4-way, or 5-way interactions that can be made between the selected variables to your model.
o Click the Continue button.
Click the Options button and move each independent variable and all interaction terms to the Display means for box.
Click the Continue button.
Click the OK button.
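The menu steps above correspond to syntax along the following lines. This is a sketch for two hypothetical factors a and b and a placeholder dependent variable dv, requesting the full factorial model together with estimated marginal means for each effect (clicking Paste rather than OK produces comparable syntax for your own variables):

  * Hypothetical variables: dv (continuous DV); a and b (between-subjects factors).
  * Full factorial model with estimated marginal means for each main effect
  * and for the a*b interaction.
  UNIANOVA dv BY a b
    /EMMEANS=TABLES(a)
    /EMMEANS=TABLES(b)
    /EMMEANS=TABLES(a*b)
    /DESIGN=a b a*b.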

The output of this analysis will contain the following sections.
Between-Subjects Factors. Lists how many subjects are in each level of each of your factors.
Tests of Between-Subjects Effects. The row next to the name of each factor or interaction reports a test of whether there is a significant relationship between that effect and the DV, independent of the other effects in the model.
You can ask SPSS to provide you with the means within the levels of your main effects or your interactions by clicking the Options button in the variable selection window and moving the appropriate term to the Display Means For box. This will add a section to your output titled Estimated Marginal Means containing a table for each main effect or interaction in your model. The table will contain a row for each cell within the effect. The values within each row provide the mean, standard error of the mean, and the boundaries for a 95% confidence interval around the mean for observations within that cell.

