Independence of SSE from the regression coefficients

linear-regression, statistical-inference

I'm studying a simple linear regression model $y_i=\beta_0+\beta_1x_i+\epsilon_i$, where the $\epsilon_i$ are i.i.d. normal, with least squares fit $\hat{y}_i=b_0+b_1x_i$.
$$SSE=\sum_i(y_i-\hat{y}_i)^2$$
Apparently SSE is independent of both $b_0$ and $b_1$. How can I prove this?
I found a proof here that SSE is uncorrelated with $b_0$ and $b_1$, but for this to imply independence they would need to be jointly normally distributed; how can I prove the joint normality?

Best Answer

The answer you linked to shows that the residual $Y-X\hat \beta$ is independent of $\hat \beta$ by relying on the joint normality of $(Y-X\hat \beta,\hat \beta)$.

Note that $Y-X\hat \beta = (I-H)Y = (I-H)\epsilon$, where $H = X(X^\top X)^{-1}X^\top$ is the hat matrix and $(I-H)X\beta = 0$, and that $\hat \beta = (X^\top X)^{-1}X^\top Y = \beta + (X^\top X)^{-1}X^\top \epsilon$. Hence both $Y-X\hat \beta$ and $\hat \beta$ are affine transformations of $\epsilon$.
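Stacking the two expressions above makes the joint affine structure explicit: the pair is a single affine function of $\epsilon$,
$$
\begin{pmatrix} Y-X\hat \beta \\ \hat \beta \end{pmatrix}
=
\begin{pmatrix} 0 \\ \beta \end{pmatrix}
+
\begin{pmatrix} I-H \\ (X^\top X)^{-1}X^\top \end{pmatrix}\epsilon .
$$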

Since $\epsilon$ has a multivariate normal distribution and affine transformations of a multivariate normal vector are again normal, the joint distribution of $(Y-X\hat \beta,\hat \beta)$ is multivariate normal. Combined with the zero covariance shown in the linked answer, this gives independence of $Y-X\hat \beta$ and $\hat \beta$; in particular $SSE = \|Y-X\hat \beta\|^2$ is a function of the residual alone, so it is independent of $(b_0, b_1)$.
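If you want to see this numerically, here is a minimal simulation sketch (the sample size, true coefficients, noise scale, and number of replications are arbitrary choices, not from the question): it refits the model on repeated samples from a fixed design and checks that the sample correlations between SSE and each coefficient are close to zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 50, 20_000
x = rng.uniform(0, 10, size=n)           # fixed design points
X = np.column_stack([np.ones(n), x])     # design matrix with intercept
beta = np.array([1.0, 2.0])              # arbitrary "true" coefficients

b0s, b1s, sses = [], [], []
for _ in range(reps):
    y = X @ beta + rng.normal(scale=1.5, size=n)   # i.i.d. normal errors
    b, *_ = np.linalg.lstsq(X, y, rcond=None)      # least squares fit (b0, b1)
    resid = y - X @ b
    b0s.append(b[0]); b1s.append(b[1]); sses.append(resid @ resid)

# Sample correlations across replications; both should be near zero.
print(np.corrcoef(sses, b0s)[0, 1], np.corrcoef(sses, b1s)[0, 1])
```

Zero correlation alone would not prove independence, but together with the joint normality argument above it is exactly the missing piece the question asks about.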
