Multicollinearity is a statistical phenomenon that occurs when two or more independent variables in a regression model are highly correlated with each other. In other words, multicollinearity indicates a strong linear relationship among the predictor variables. This can make it difficult to interpret the individual effects of each predictor on the dependent variable, because their effects may be confounded or exaggerated.
Reasons for Testing Multicollinearity
The primary reasons for conducting tests of multicollinearity in regression analysis are:
- Impact on Model Interpretation
- Inflated Standard Errors
- Unstable Estimates
- Reduced Model Performance
- Difficulty in Variable Selection
- Violation of Assumptions
Checking for multicollinearity is crucial for building reliable regression models that accurately capture the relationships between variables and provide meaningful insights for decision-making.
Dataset
After completing the data cleaning process, here are the first 5 rows of our dataset:
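A quick way to preview those rows is `DataFrame.head()`. The file name and columns below are placeholders, since the original dataset is not shown; only the `location` column is taken from the article.

```python
import pandas as pd

# Hypothetical stand-in for the cleaned dataset; in practice you would load it, e.g.:
# df = pd.read_csv("cleaned_data.csv")
df = pd.DataFrame({
    "location":  ["Lekki", "Ikeja", "Lekki", "Yaba", "Surulere"],  # column named in the article
    "size_sqft": [1200, 850, 1500, 950, 1100],                     # assumed numeric predictor
    "price":     [250000, 180000, 320000, 210000, 240000],         # assumed target
})

print(df.head())  # first 5 rows
```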
Imported Packages
To conduct the multicollinearity test, the following libraries were imported to facilitate data analysis and statistical computations.
Feature Engineering
It is well known that multicollinearity detection methods rely on numerical data. These methods calculate correlation coefficients or variance inflation factors (VIFs) between predictor variables, which requires numerical inputs. If categorical variables are not encoded, multicollinearity checks cannot be accurately performed.
In our dataset, the location column is a categorical column containing 849 unique values:
For this reason, we encode the categorical column in our dataset using the frequency encoding method.
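Frequency encoding replaces each category with how often it occurs, turning the column into a numeric one. A minimal sketch on a toy `location` column (the real column has 849 unique values):

```python
import pandas as pd

# Toy sample; location names are hypothetical
df = pd.DataFrame({"location": ["Lekki", "Ikeja", "Lekki", "Yaba", "Lekki"]})

# Frequency encoding: map each category to its relative frequency
freq = df["location"].value_counts(normalize=True)
df["location"] = df["location"].map(freq)

print(df["location"].tolist())  # [0.6, 0.2, 0.6, 0.2, 0.6]
```

Using relative frequencies keeps the encoded values on a common 0-1 scale regardless of dataset size.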
Correlation Analysis
To work only with the predictor variables, we drop the target vector.
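Dropping the target leaves a frame of predictors for the correlation step. The column name `price` below is an assumption standing in for the actual target:

```python
import pandas as pd

# Illustrative frame; "price" stands in for the actual target column
df = pd.DataFrame({
    "size_sqft": [1200, 850, 1500],
    "bedrooms":  [3, 2, 4],
    "price":     [250000, 180000, 320000],
})

X = df.drop(columns=["price"])  # predictors only
print(list(X.columns))  # ['size_sqft', 'bedrooms']
```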
Correlation measures the strength and direction of the linear relationship between two variables, helping to identify multicollinearity issues and select the most relevant predictors.
Assessing Multicollinearity with Heatmap Visualization
Using a heatmap is an effective visual tool to assess multicollinearity by displaying correlation coefficients between variables.
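A correlation heatmap can be produced with `seaborn.heatmap` on the predictors' correlation matrix. The random data below is a placeholder for the real predictor frame `X`:

```python
import pandas as pd
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs anywhere
import matplotlib.pyplot as plt
import seaborn as sns

# Placeholder predictors; any numeric DataFrame X works here
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(100, 3)),
                 columns=["size_sqft", "bedrooms", "bathrooms"])

corr = X.corr()  # pairwise Pearson correlations
sns.heatmap(corr, annot=True, cmap="coolwarm", vmin=-1, vmax=1)
plt.title("Correlation heatmap of predictors")
plt.tight_layout()
plt.savefig("heatmap.png")
```

Cells near +1 or -1 off the diagonal flag pairs of predictors that may be collinear.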
Multicollinearity Efficiency: Insights from an OLS Model Summary
To check whether our correlation analysis is sufficient for model building, we examine the ordinary least squares (OLS) model summary.
Model Summary
Our summary shows that the condition number is large, implying that there might be strong multicollinearity or other numerical problems.
Therefore, to confirm whether the large condition number is a result of multicollinearity, we apply the Variance Inflation Factor (VIF) method.
Variance Inflation Factor Result
VIF Decision Key
- VIF < 2: Minimal multicollinearity; no action needed.
- 2 ≤ VIF < 5: Moderate multicollinearity; consider further investigation or data transformation.
- 5 ≤ VIF < 10: High multicollinearity; problematic, requires attention (e.g., variable selection, data transformation).
- VIF ≥ 10: Severe multicollinearity; critical issue, immediate action needed (e.g., variable removal, data restructuring).
The VIF results suggest that there is no multicollinearity issue among the predictor variables in our regression model, since the VIF values are all below the threshold.
Test Observation
From our analysis, we noticed a large condition number of 1.21e+05, indicating the potential presence of strong multicollinearity or other numerical problems within our regression model. To confirm this, we conducted a Variance Inflation Factor (VIF) analysis.
Conclusion
Based on the VIF analysis, we can conclude that multicollinearity is not a significant issue in our regression model. The large condition number observed is likely due to numerical factors other than multicollinearity. Therefore, we can proceed with confidence in the validity of our regression analysis results.