Detecting Multicollinearity in Regression Analysis

2020, American Journal of Applied Mathematics and Statistics, 1,406 citations

Abstract

Multicollinearity occurs when a multiple linear regression analysis includes several variables that are significantly correlated not only with the dependent variable but also with each other. Multicollinearity can make some of the significant variables under study appear statistically insignificant. This paper discusses three primary techniques for detecting multicollinearity, using questionnaire survey data on customer satisfaction. The first two techniques are the correlation coefficients and the variance inflation factor; the third is the eigenvalue method. It is observed that product attractiveness is a more rational cause of customer satisfaction than the other predictors. Furthermore, advanced regression procedures such as principal components regression, weighted regression, and ridge regression can be used when multicollinearity is present.
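
The following is a minimal sketch of the three diagnostics named in the abstract (correlation coefficients, variance inflation factor, and the eigenvalue method), run on synthetic data. The predictor names, data-generating setup, and cutoff values (VIF > 10, condition index > 30) are illustrative assumptions, not the paper's questionnaire variables or exact thresholds.

```python
# Illustrative multicollinearity diagnostics on synthetic data (not the paper's dataset).
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 200

# Synthetic predictors with deliberate collinearity between x1 and x2.
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=n)   # strongly correlated with x1
x3 = rng.normal(size=n)
X = pd.DataFrame({"x1": x1, "x2": x2, "x3": x3})

# 1) Pairwise correlation coefficients: values near |1| flag collinear pairs.
print(X.corr().round(2))

# 2) Variance inflation factor: VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from
#    regressing predictor j on the remaining predictors. A common rule of thumb
#    treats VIF > 10 (sometimes > 5) as problematic.
def vif(df):
    out = {}
    for col in df.columns:
        y = df[col].to_numpy()
        Z = np.column_stack([np.ones(len(df)), df.drop(columns=col).to_numpy()])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1 - resid.var() / y.var()
        out[col] = 1.0 / (1.0 - r2)
    return pd.Series(out)

print(vif(X).round(2))

# 3) Eigenvalue method: eigenvalues of the predictor correlation matrix near zero,
#    or a large condition index sqrt(lambda_max / lambda_min), indicate
#    near-linear dependence among predictors.
eigvals = np.linalg.eigvalsh(X.corr().to_numpy())
condition_index = np.sqrt(eigvals.max() / eigvals.min())
print("eigenvalues:", np.round(eigvals, 3))
print("condition index:", round(condition_index, 1))
```

In this sketch the x1/x2 pair produces a high pairwise correlation, inflated VIF values for both variables, and one eigenvalue close to zero, so all three diagnostics point to the same near-linear dependence.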

Keywords

Multicollinearity, Variance inflation factor, Statistics, Mathematics, Regression analysis, Linear regression, Principal component regression, Variables, Regression, Collinearity, Econometrics

Publication Info

Year: 2020
Type: Article
Volume: 8
Issue: 2
Pages: 39-42
Citations: 1,406
Access: Closed

Citation Metrics

OpenAlex: 1,406 citations

Cite This

Noora Shrestha (2020). Detecting Multicollinearity in Regression Analysis. American Journal of Applied Mathematics and Statistics, 8(2), 39-42. https://doi.org/10.12691/ajams-8-2-1

Identifiers

DOI: 10.12691/ajams-8-2-1