Proof that the least squares regression line through the origin satisfies $\sum_i x_ie_i=0$

linear regression, regression

Prove that for the fitted least squares regression line through the origin, $\sum_i x_ie_i=0$.

I am having trouble understanding how to prove this. Would someone be able to explain how to go about it?

I know that for simple linear regression through the origin, the estimator $\hat{\beta}$ reduces to:
$$\hat{\beta}=\frac{\sum_i y_ix_i}{\sum_i x_i^2}=\sum_i\frac{x_i}{\sum_j x_j^2}y_i$$

Best Answer

Notice that if $f(b)=\sum_i (y_i-b x_i)^2$, then

$$f'(b)=-2\sum_i x_i (y_i-b x_i)$$

When $f$ is at its minimum, i.e. at $b=\hat{\beta}$, you have $f'(\hat{\beta})=0$. Since the residuals are $e_i=y_i-\hat{\beta}x_i$, this is exactly $\sum_i x_i e_i=0$.
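As a quick numerical sanity check, here is a minimal sketch in Python with NumPy, using made-up data (the variable names are just for illustration). It fits the through-the-origin slope with the formula from the question and verifies that $\sum_i x_i e_i$ is zero up to floating-point error:

```python
import numpy as np

# Made-up data: y is roughly proportional to x, plus noise.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.5 * x + rng.normal(scale=0.3, size=100)

# Least squares slope for regression through the origin:
# beta_hat = sum(x_i * y_i) / sum(x_i^2)
beta_hat = np.sum(x * y) / np.sum(x ** 2)

# Residuals e_i = y_i - beta_hat * x_i
e = y - beta_hat * x

print(np.sum(x * e))  # ~0, up to floating-point error
```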

Notice also that, for the same reason (differentiate the function being minimized with respect to the slope), the relation $\sum_i x_ie_i=0$ also holds for the "usual" linear regression with a constant term; differentiating with respect to the intercept additionally gives $\sum_i e_i=0$.
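The same kind of check can be sketched for the regression with a constant term (again with hypothetical data), this time solving the least squares problem with NumPy rather than explicit formulas; both $\sum_i e_i$ and $\sum_i x_i e_i$ come out as zero up to rounding:

```python
import numpy as np

# Made-up data with a nonzero intercept.
rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = 1.0 + 2.5 * x + rng.normal(scale=0.3, size=100)

# Design matrix with a column of ones for the intercept.
X = np.column_stack([np.ones_like(x), x])
a_hat, b_hat = np.linalg.lstsq(X, y, rcond=None)[0]

e = y - (a_hat + b_hat * x)

print(np.sum(e))      # ~0: from differentiating w.r.t. the intercept
print(np.sum(x * e))  # ~0: from differentiating w.r.t. the slope
```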