Solved – Is the derivation of OLS different if we do not have an intercept

Tags: least-squares, regression, self-study

We have an equation with $\alpha = 0$, so the simple linear regression is: $$ Y_i=\beta X_i+u_i$$ The question was: derive the OLS estimator of $\beta$ and comment on its variance. In this situation, should I have derived the OLS estimator the same way as for a regression with an intercept, obtaining $$ \hat\beta =\frac{\operatorname{Cov}(X,Y)}{\operatorname{Var}(X)}, $$ or is it something different that I should have done?

Best Answer

That is not correct if there is no intercept. Without an intercept the OLS estimate would be:

$$ \beta = \frac{\operatorname{E}[XY]}{\operatorname{E}[X^2]}$$

In the case of a finite sample, your estimate would be:

$$ \hat{\beta} = \frac{\sum_{i=1}^n x_i y_i }{\sum_{i=1}^n x_i^2} $$

As for its variance: under homoskedastic errors with $\operatorname{Var}(u_i)=\sigma^2$, conditioning on the regressors gives $\operatorname{Var}(\hat{\beta}) = \sigma^2 / \sum_{i=1}^n x_i^2$, so the estimator is more precise the larger $\sum_i x_i^2$ is.
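A minimal numerical sketch of this formula (using NumPy and simulated data with a true slope of 2.0; the variable names and simulation setup are my own, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(scale=0.5, size=100)  # true beta = 2.0, no intercept

# No-intercept OLS slope: sum(x_i * y_i) / sum(x_i^2)
beta_hat = np.sum(x * y) / np.sum(x ** 2)

# Cross-check against the least-squares solver on the single-column design matrix
beta_lstsq = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]
print(beta_hat, beta_lstsq)
```

Both computations solve the same least-squares problem, so the two numbers agree to machine precision, and with this much data both land close to the true slope.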

The general case (skip over this if you don't know matrix algebra yet)

If you know matrix algebra, all these are special cases. Minimizing the sum of squares can be written as:

$$ \text{minimize (over $\mathbf{b}$) } \quad\left( \mathbf{y} - X \mathbf{b}\right)'\left( \mathbf{y} - X \mathbf{b}\right) $$

Which has the solution:

$$ \hat{\mathbf{b}} = (X'X)^{-1} X'\mathbf{y}$$
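The matrix formula can be sketched the same way; here is a small simulation (setup and names are illustrative) with an intercept column plus one regressor. Note that in practice one solves the normal equations $X'X\,\mathbf{b} = X'\mathbf{y}$ rather than forming the inverse explicitly:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept column + regressor
b_true = np.array([1.5, -0.7])
y = X @ b_true + rng.normal(scale=0.3, size=n)

# Closed-form OLS: b_hat = (X'X)^{-1} X'y, computed via a linear solve
b_hat = np.linalg.solve(X.T @ X, X.T @ y)

# The no-intercept case is the special case where X is a single column
print(b_hat)
```

Dropping the intercept just means dropping the column of ones from $X$, which recovers the $\sum x_i y_i / \sum x_i^2$ formula above.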

The algebra behind that can be found (among numerous places) in my answer here.
