Collinearity, or multicollinearity, refers to the existence of strong linear relationships among the predictor variables, meaning that one predictor variable can be near-linearly predicted from the others. When there is no linear relationship among the predictor variables at all, they are said to be orthogonal. Usually, the lack of orthogonality among the predictor variables is not serious enough to affect the analysis or the ability of the full set of predictors to predict the response variables. In other words, the lack of orthogonality does not diminish the usefulness of the model, at least within the sample data used to find the regression coefficients. However, this condition can produce ambiguous results, associated with unstable estimated regression coefficients, and it affects the calculations associated with individual predictors. Instability in the estimated coefficients can be indicated by large changes in the estimates when a variable is added or deleted, or when a data point is altered or dropped. When dealing with collinearity, principal component analysis (PCA) is one of the most common methods for reducing it [25]. In contrast to grouping methods such as cluster analysis, PCA is a one-sample technique that uses an orthogonal transformation to obtain a set of values of linearly uncorrelated variables called principal components, whose number can be equal to or less than the original number of predictor variables.
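As a minimal sketch of the idea above, the following Python/NumPy code builds two nearly collinear predictors, applies PCA via the singular value decomposition of the mean-centered data matrix, and checks that the resulting principal component scores are linearly uncorrelated. The data and variable names here are illustrative assumptions, not taken from the source.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Illustrative data: x2 is a near-copy of x1, so x1 and x2 are collinear.
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

# PCA as an orthogonal transformation: SVD of the mean-centered matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T  # principal component scores, one column per component

# The component scores are linearly uncorrelated: off-diagonal
# correlations are numerically zero.
corr = np.corrcoef(scores, rowvar=False)
off_diag = corr[~np.eye(3, dtype=bool)]
print(np.allclose(off_diag, 0.0, atol=1e-8))  # True
```

Note that the number of retained components can be chosen to be smaller than the original number of predictors (e.g., by dropping components with near-zero singular values), which is how PCA reduces collinearity in practice.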
