Given the following regression:
$y_i = \alpha + \beta_1 x_{1i} + \beta_2 x_{2i} + \epsilon_i$
If $x_1$ and $x_2$ are exogenous with respect to $y_i$, but are endogenous to each other (i.e., determined simultaneously), what effect does this have on the $\beta$s?
For example, assuming that education and wages are exogenous to health, but we know they themselves are both caused by ability:
$health_i = \alpha + \beta_1 education_{i} + \beta_2 wages_{i} + \text{other variables} + \epsilon_i$
Note: I know this may not be a great example, and this is not a question that I am posing for my research (I am not studying the effect of wages and education on health) but just the best example I could think of to illustrate my question.
Best Answer
You will very often have explanatory variables that are related to each other. As far as I know, we reserve the term endogeneity for covariates that are correlated with the error term of the regression and I wouldn't use the term in a situation like yours.
Depending on what you are trying to accomplish with your regression analysis, the consequences of related explanatory variables range from no problem at all to quite problematic.
Let's focus on two aims of a regression analysis:

1. predicting $y$, and
2. estimating the partial effects of the individual regressors.
In the first case, collinearity is no problem. In the latter case, however, you may run into trouble, depending on how strong the relationship between $x_1$ and $x_2$ is. The reason is that the regression model cannot distinguish the effect that $x_1$ has on $y$ from the effect that $x_2$ has. This shows up in larger standard errors for the coefficients, which is the regression's way of saying: I'm not sure whether I should link $y$'s variation to $x_1$ or to $x_2$, because they always move together. Ultimately this leads to insignificant coefficients, and you will not be able to say anything (as) useful about the partial effect of $x_1$.
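A quick simulation makes the standard-error inflation concrete. This is my own illustrative sketch (the variable names and coefficient values are made up, not from the question): the classical OLS standard error of $\hat\beta_1$ grows as the correlation $\rho$ between the two regressors approaches 1, roughly by the factor $1/\sqrt{1-\rho^2}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

def se_beta1(rho):
    # Draw (x1, x2) jointly normal with correlation rho.
    cov = [[1.0, rho], [rho, 1.0]]
    x1, x2 = rng.multivariate_normal([0, 0], cov, size=n).T
    X = np.column_stack([np.ones(n), x1, x2])
    # True model: both regressors matter equally (values are arbitrary).
    y = 1.0 + 0.5 * x1 + 0.5 * x2 + rng.standard_normal(n)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - X.shape[1])
    # Classical OLS covariance: sigma^2 * (X'X)^{-1}.
    var_beta = sigma2 * np.linalg.inv(X.T @ X)
    return np.sqrt(var_beta[1, 1])

for rho in (0.0, 0.5, 0.9, 0.99):
    print(f"rho = {rho}: se(beta_1) ~ {se_beta1(rho):.4f}")
```

The point estimates stay unbiased throughout; only their precision deteriorates, which is exactly the "can't tell them apart" problem described above.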
You can measure the severity of multicollinearity with the variance inflation factor (VIF).
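For concreteness, here is a minimal VIF computation (my own sketch, not part of the original answer): $\mathrm{VIF}_j = 1/(1 - R_j^2)$, where $R_j^2$ comes from regressing the $j$-th regressor on all the others.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    out = []
    for j in range(k):
        # Regress column j on an intercept plus the remaining columns.
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
        resid = X[:, j] - others @ coef
        r2 = 1.0 - resid @ resid / np.sum((X[:, j] - X[:, j].mean()) ** 2)
        out.append(1.0 / (1.0 - r2))
    return out

# Two regressors with correlation ~0.9 (constructed, for illustration).
rng = np.random.default_rng(1)
x1 = rng.standard_normal(500)
x2 = 0.9 * x1 + np.sqrt(1 - 0.81) * rng.standard_normal(500)
print(vif(np.column_stack([x1, x2])))  # each VIF roughly 1/(1-0.9^2) ~ 5.3
```

A common rule of thumb treats VIFs above 5 or 10 as a sign that collinearity is inflating the standard errors enough to worry about.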