[Math] Simple linear regression: prove the variables are uncorrelated

correlation, covariance, linear algebra, regression

I am working on the following problem:

In a problem of simple linear regression, $$Y = \hat\beta_0 + \hat\beta_1 \bar x,$$ show that the random variables $\hat\beta_1$ and $Y$ are uncorrelated (all the $\beta$s have hats).

I have come up with a solution, but I am not convinced it is logically correct. Any alternative approaches, suggestions, or corrections would be helpful.

My solution goes through the following steps:

  1. Express $\hat\beta_0 = Y - \hat\beta_1 x$.
  2. Take the variance of both sides: $Var(\hat\beta_0) = Var(Y - \hat\beta_1 x)$.
  3. Use the known variance formula for $\hat\beta_0$ on the left side, and expand the right side so that a $Cov(Y, \hat\beta_1)$ term appears.
  4. Simplify and show that the covariance is $0$; the correlation is therefore $0$ as well (see the sketch of the covariance computation after this list).
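
For concreteness, here is a sketch of how the covariance term in step 3 can be evaluated directly, under the standard assumptions (not stated in the original problem) of fixed design points $x_1,\dots,x_n$ and uncorrelated errors with common variance $\sigma^2$, and reading the $Y$ above as the sample mean $\bar Y$. Writing $S_{xx} = \sum_i (x_i - \bar x)^2$, the slope estimator is $$\hat\beta_1 = \frac{\sum_i (x_i - \bar x)(Y_i - \bar Y)}{S_{xx}} = \sum_i \frac{x_i - \bar x}{S_{xx}}\, Y_i,$$ so $$Cov\!\left(\bar Y, \hat\beta_1\right) = Cov\!\left(\frac{1}{n}\sum_j Y_j,\ \sum_i \frac{x_i - \bar x}{S_{xx}}\, Y_i\right) = \frac{\sigma^2}{n\, S_{xx}} \sum_i (x_i - \bar x) = 0.$$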

Best Answer

This question does not make sense as stated. What are $x,Y$? Vectors? Is $x$ fixed? Is there an error term?

For example, let $$\left[ \begin{array}{c} y_1 \\ y_2 \end{array} \right] = \beta_1 \left[ \begin{array}{c} 1 \\ 2 \end{array} \right] + \left[ \begin{array}{c} e_1 \\ e_2 \end{array} \right]$$

Then we know that $$\hat{\beta}_1 = (X^\top X)^{-1} X^\top Y = \frac{y_1}{5} + \frac{2y_2}{5}.$$

Can $\hat\beta_1$ and $Y$ really be uncorrelated?
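
For instance, if one additionally assumes that $e_1$ and $e_2$ are uncorrelated with common variance $\sigma^2$ (an assumption the question never states), then $$Cov\!\left(y_1, \hat\beta_1\right) = Cov\!\left(y_1, \tfrac{1}{5}y_1 + \tfrac{2}{5}y_2\right) = \frac{\sigma^2}{5} \neq 0,$$ so the claimed result cannot hold without saying precisely what $Y$ and $x$ are.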