Simple linear regression (sum of residuals and predictor)


Show explicitly that the following identity holds in simple linear regression:

$$
\sum_{i=1}^n r_i \hat{\mu}_i = 0,
$$

with residuals $r_i = y_i - \hat{\mu}_i$ and fitted values $\hat{\mu}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$.

I tried the following steps, but I somehow end up back at the same expression I started from.

\begin{align}
& \sum_{i=1}^n r_i \hat{\mu}_i\\[10pt]
= {} & \sum_{i=1}^n (y_i - \hat{\mu}_i)(\hat{\beta}_0+\hat{\beta}_1 x_i)\\[10pt]
= {} & \sum_{i=1}^n (y_i - \hat{\beta}_0-\hat{\beta}_1 x_i)(\hat{\beta}_0+\hat{\beta}_1 x_i)\\[10pt]
= {} & \sum_{i=1}^n(\hat{\beta}_0 y_i+\hat{\beta}_1 x_i y_i-\hat{\beta}_0^2-\hat{\beta}_0\hat{\beta}_1 x_i-\hat{\beta}_0\hat{\beta}_1 x_i -\hat{\beta}_1^2 x_i^2)\\[10pt]
= {} & \sum_{i=1}^n (\hat{\beta}_0 y_i+\hat{\beta}_1 x_i y_i- (\hat{\beta}_0+\hat{\beta}_1 x_i)^2)\\[10pt]
= {} & \sum_{i=1}^n(y_i(\hat{\beta}_0+\hat{\beta}_1 x_i)- (\hat{\beta}_0+\hat{\beta}_1 x_i)^2) \\[10pt]
= {} & \sum_{i=1}^n(\hat{\beta}_0+\hat{\beta}_1 x_i)(y_i-\hat{\beta}_0-\hat{\beta}_1 x_i) \\[10pt]
= {} & \sum_{i=1}^n\hat{\mu}_i r_i
\end{align}

Any idea how to proceed?

Best Answer

After the third equality, use the linearity of the sum and note that $\hat{\beta}_0$ and $\hat{\beta}_1$ do not depend on the index $i$. That way, you should get:

$$ \sum_{i=1}^n r_i \hat{\mu}_i = \hat{\beta}_0 n \bar{y} + \hat{\beta}_1 \sum_{i=1}^n x_i y_i - n \hat{\beta}_0^2 - 2 \hat{\beta}_0 \hat{\beta}_1 n \bar{x} - \hat{\beta}_1^2 \sum_{i=1}^n x_i^2. $$

Now use the fact that $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$ and remember the normal equations, in particular the second one:

$$ \hat{\beta}_0 n \bar{x} + \hat{\beta}_1 \sum_{i=1}^n x_i^2 - \sum_{i=1}^n x_i y_i = 0. $$

The result now follows directly.
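In case it helps, the final step can be spelled out explicitly. Solving the second normal equation for $\hat{\beta}_1 \sum_i x_i^2$ and substituting, then collecting terms:

$$
\begin{aligned}
\sum_{i=1}^n r_i \hat{\mu}_i
&= \hat{\beta}_0 n \bar{y} + \hat{\beta}_1 \sum_{i=1}^n x_i y_i - n \hat{\beta}_0^2 - 2 \hat{\beta}_0 \hat{\beta}_1 n \bar{x} - \hat{\beta}_1 \Bigl(\sum_{i=1}^n x_i y_i - \hat{\beta}_0 n \bar{x}\Bigr) \\
&= \hat{\beta}_0 n \bar{y} - n \hat{\beta}_0^2 - \hat{\beta}_0 \hat{\beta}_1 n \bar{x} \\
&= n \hat{\beta}_0 \bigl(\bar{y} - \hat{\beta}_0 - \hat{\beta}_1 \bar{x}\bigr) = 0,
\end{aligned}
$$

where the last expression vanishes because $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$.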
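If you want a quick sanity check alongside the algebra, the identity is easy to verify numerically. A minimal sketch using synthetic data (the seed and coefficients are arbitrary, not from the question):

```python
import numpy as np

# Synthetic data: y = 2 + 3x + noise (arbitrary true coefficients).
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 3.0 * x + rng.normal(size=50)

# OLS estimates for simple linear regression:
# beta1 = sample covariance / sample variance, beta0 = ybar - beta1 * xbar.
xbar, ybar = x.mean(), y.mean()
beta1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
beta0 = ybar - beta1 * xbar

mu_hat = beta0 + beta1 * x   # fitted values
r = y - mu_hat               # residuals

# Both orthogonality identities hold up to floating-point error:
print(np.sum(r * mu_hat))    # essentially zero
print(np.sum(r))             # essentially zero
```

The same check works with any dataset, since the identity is a consequence of the normal equations rather than of the data.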