Testing Multivariate Assumptions
The multivariate statistical techniques that we will cover in this class require one or more of the following assumptions about the data: normality of the metric variables, homoscedastic relationships between the dependent variable and the metric and nonmetric independent variables, linear relationships between the metric variables, and absence of correlated prediction errors.
Multivariate analysis requires that the assumptions be tested twice: first, for the separate variables as we are preparing to do the analysis, and second, for the multivariate model variate, which acts collectively for the variables in the analysis and thus must meet the same assumptions as individual variables. In this section, we will examine the tests that we normally perform prior to computing the multivariate statistic. Since the pattern of prediction errors cannot be examined without computing the multivariate statistic, we will defer that discussion until we examine each of the specific techniques.
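As a minimal sketch of the first round of testing, the normality of each individual metric variable can be screened before the multivariate statistic is computed. The data and variable names below are hypothetical; the Shapiro-Wilk test from scipy is one common choice among several normality tests.

```python
# Per-variable normality screening prior to a multivariate analysis.
# Data are simulated for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
normal_var = rng.normal(loc=50, scale=10, size=200)  # roughly normal
skewed_var = rng.exponential(scale=10, size=200)     # strongly right-skewed

for name, x in [("normal_var", normal_var), ("skewed_var", skewed_var)]:
    w, p = stats.shapiro(x)  # Shapiro-Wilk test of normality
    verdict = "no evidence of non-normality" if p > 0.05 else "violates normality"
    print(f"{name}: W={w:.3f}, p={p:.4f} -> {verdict}")
```

A variable that fails this screen becomes a candidate for the transformations discussed next.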
If the data fails to meet the assumptions required by the analysis, we can attempt to correct the problem with a transformation of the variable. There are two classes of transformations that we attempt: for violations of normality and homoscedasticity, we transform the individual metric variable to an inverse, logarithmic, or squared form; for violations of linearity, we either apply a power transformation, e.g., raising the data to a squared or square-root power, or we add an additional polynomial variable that contains a power term.
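The two classes of transformations can be sketched as follows. The variable `x` and its simulated distribution are hypothetical stand-ins for a metric variable that has failed an assumption test.

```python
# Sketch of the two transformation classes described above.
# x is a hypothetical positive, right-skewed metric variable.
import numpy as np

rng = np.random.default_rng(0)
x = rng.lognormal(mean=2.0, sigma=0.8, size=100)

# Class 1: transform the metric variable itself
# (remedies for normality and homoscedasticity violations).
x_log = np.log(x)   # logarithmic form
x_inv = 1.0 / x     # inverse form
x_sq = x ** 2       # squared form

# Class 2: remedies for nonlinearity -- either a power transformation
# of the variable, or an added polynomial term alongside the original.
x_sqrt = np.sqrt(x)                     # square-root power transformation
X_poly = np.column_stack([x, x ** 2])   # original variable plus squared term
```

Note that the logarithmic and square-root forms require strictly positive data; variables with zero or negative values would first need a constant added to shift them into the positive range.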
Transforming variables is a trial and error process. We do the transformation and then see if it has corrected the problem with the data. It is not usually possible to be certain in advance that the transformation will correct the problem; sometimes it only reduces the degree of the violation. Even when the transformation might decrease the violation of the assumption, we might opt not to include it in the analysis because of the increased complexity it adds to the interpretation and discussion of the results.
It often happens that one transformation solves multiple problems. For example, skewed variables can produce violations of normality and homoscedasticity. No matter which test of assumptions identified the violation, our only remedy is a transformation of the metric variable to reduce the skewness.
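The point above can be illustrated numerically: a single log transform of a skewed variable can sharply reduce skewness, easing both the normality and the homoscedasticity problems at once. The data here are simulated for illustration.

```python
# One transformation addressing the root cause (skewness) of multiple
# assumption violations. Data are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.lognormal(mean=3.0, sigma=1.0, size=500)  # heavily right-skewed

skew_before = stats.skew(x)
skew_after = stats.skew(np.log(x))  # log transform pulls in the long tail
print(f"skewness before: {skew_before:.2f}, after log transform: {skew_after:.2f}")
```

The transformed variable's skewness is close to zero, which is why a single remedy can clear several assumption tests simultaneously.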