Regression – Understanding Multicollinearity in OLS

bias · econometrics · multicollinearity · regression · time-series

I am reading Greene's textbook Econometric Analysis where he says that, if there's multicollinearity, then:

  • Small changes in data lead to large swings in parameter estimates.
  • Coefficients have high standard errors even though they're jointly significant.
  • Coefficients have the "wrong" sign or implausible magnitudes.

I have three questions:

  • What are the consequences for the unbiasedness and consistency of the OLS estimators in the presence of multicollinearity?
  • Is the efficiency of the estimators reduced in the presence of multicollinearity?
  • Do Greene's points hold (albeit to a lesser extent) for slightly correlated independent variables? For example, would all three points hold (to a small extent) if the correlation between the regressors is $\rho = 0.1$?

Best Answer

Re your 1st question: Collinearity does not make the estimators biased or inconsistent; it just makes them subject to the problems Greene lists (see @whuber's comments for clarification).
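A small Monte Carlo sketch can illustrate this point (the DGP here — two regressors with correlation $\rho$, $y = 2x_1 + 3x_2 + \varepsilon$, and the specific sample sizes — is my own assumed setup, not from Greene): the OLS estimates stay centered on the true coefficients even under high collinearity, but their sampling variability balloons.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 50, 2000
true_beta = np.array([2.0, 3.0])  # assumed true coefficients

def simulate(rho):
    """Draw `reps` samples with regressor correlation `rho`; return OLS estimates."""
    L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
    estimates = np.empty((reps, 2))
    for r in range(reps):
        X = rng.standard_normal((n, 2)) @ L.T      # regressors with correlation rho
        y = X @ true_beta + rng.standard_normal(n)  # unit-variance noise
        estimates[r] = np.linalg.lstsq(X, y, rcond=None)[0]
    return estimates

for rho in (0.0, 0.95):
    est = simulate(rho)
    # Mean stays near (2, 3) in both cases (no bias); the spread is
    # much larger when rho = 0.95 (inflated standard errors).
    print(f"rho={rho}: mean={est.mean(axis=0).round(2)}, sd={est.std(axis=0).round(2)}")
```

In both runs the average estimate is close to the true $(2, 3)$; only the dispersion differs, which is exactly the "unbiased but imprecise" behavior described above.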

Re your 3rd question: High collinearity can exist even with moderate pairwise correlations; e.g., if we have 9 iid variables and a 10th that is the sum of the other 9, no pairwise correlation will be high, yet there is perfect collinearity.

Collinearity is a property of sets of independent variables, not just pairs of them.