Regression Analysis – Gauss-Markov Assumption Violations in Simple OLS

Tags: gauss-markov-theorem, regression

I am creating a simple linear model with the following form:
$$ y_i/x_i = \alpha + \beta x_i + u_i $$

The response variable has a different name than $y/x$, but it is essentially the response normalized by $x$, as shown above.
Since the response by definition includes part of the predictor, can I regress this response on $x$ without introducing bias or inefficiency? If not, could you help me with the intuition?

Best Answer

Your model is equivalent to: $$y_i=\alpha x_i+ \beta x_i^2+u_i x_i .$$

Define $\varepsilon_i = u_i x_i$, so that the model can be rewritten as: $$y_i = \alpha x_i + \beta x_i^2 + \varepsilon_i .$$

So the two differences with a usual linear regression model are:

  • you don't have a constant term,
  • the errors $\varepsilon_i$ are proportional to the covariates $x_i$.
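The second point can be seen directly in a simulation (a NumPy sketch of my own, not from the original answer): if the $u_i$ are homoskedastic with variance $\sigma^2$, then conditional on $x_i$ the error $\varepsilon_i = u_i x_i$ has variance $x_i^2 \sigma^2$, so observations with larger $|x_i|$ are noisier.

```python
import numpy as np

# Sketch: with homoskedastic u_i, the transformed errors eps_i = u_i * x_i
# have conditional variance x_i^2 * sigma^2, i.e. heteroskedasticity.
rng = np.random.default_rng(0)
sigma = 1.0
n_rep = 200_000  # many draws per x value to estimate the variance

for x in (0.5, 2.0):
    u = rng.normal(0.0, sigma, size=n_rep)  # homoskedastic original errors
    eps = u * x                             # eps_i = u_i * x_i
    print(x, eps.var())                     # close to x**2 * sigma**2
```

The printed variances scale like $x^2$, which is exactly the failure of the constant-variance condition for a fixed design.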

I don't think the absence of a constant term is a big deal (maybe I'm wrong...), but the other point is.

For the Gauss-Markov theorem to hold, you want the $\varepsilon_i$ to have:

  • zero mean,
  • constant variance,
  • zero covariance across observations.

In order to have that, you need assumptions on your $x_i$ (which you don't need for the Gauss-Markov theorem in a classical linear regression). Such an assumption could be that the $x_i$ are independent from the $u_i$ and have constant mean and variance. Then the $\varepsilon_i$ will satisfy the three conditions.
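To illustrate the favorable case (again a simulation sketch of my own, with assumed distributions): when the $x_i$ are i.i.d. and independent from homoskedastic $u_i$, the products $\varepsilon_i = u_i x_i$ have zero mean, a common unconditional variance $\sigma_u^2\,\mathbb{E}[x_i^2]$, and no correlation across observations.

```python
import numpy as np

# Sketch: x_i i.i.d. and independent of u_i  =>  eps_i = u_i * x_i satisfies
# the three Gauss-Markov conditions unconditionally.
rng = np.random.default_rng(1)
n = 500_000
x = rng.normal(3.0, 1.0, size=n)   # i.i.d. covariates, E[x^2] = 9 + 1 = 10
u = rng.normal(0.0, 1.0, size=n)   # independent homoskedastic errors
eps = u * x

print(eps.mean())                            # close to 0
print(eps.var())                             # close to sigma_u^2 * E[x^2] = 10
print(np.corrcoef(eps[:-1], eps[1:])[0, 1])  # lag-1 correlation close to 0
```

Note the constant variance here is unconditional; conditional on the realized $x_i$, the variance still varies, which is why the fixed-design case below fails.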

So, as Nick Cox commented, the answer is: it depends. If you have a fixed design (the $x_i$ are non-random constants), then no (the $\varepsilon_i$ won't have constant variance). If you suspect the $u_i$ are not independent from the $x_i$, then no again. But if the $x_i$ are i.i.d. and independent from the $u_i$, then I think yes.
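In the favorable case, OLS on the transformed model (no intercept, regressors $x_i$ and $x_i^2$) still recovers $\alpha$ and $\beta$, since $\mathbb{E}[\varepsilon_i \mid x_i] = 0$. A sketch with made-up parameter values ($\alpha = 1.5$, $\beta = -0.7$ are my choices, not from the question):

```python
import numpy as np

# Sketch: fit y_i = alpha*x_i + beta*x_i^2 + u_i*x_i by OLS without a
# constant term, using the normal equations via np.linalg.lstsq.
rng = np.random.default_rng(2)
n = 100_000
alpha, beta = 1.5, -0.7            # assumed true parameters
x = rng.uniform(1.0, 4.0, size=n)  # i.i.d. covariates, independent of u
u = rng.normal(0.0, 0.5, size=n)
y = alpha * x + beta * x**2 + u * x  # the question's model, multiplied by x

X = np.column_stack([x, x**2])       # design matrix, no constant column
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)                          # close to [1.5, -0.7]
```

Equivalently, one could regress $y_i/x_i$ on a constant and $x_i$ directly, which is the question's original formulation; the two fits weight observations differently when the errors are heteroskedastic.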
