Simple Linear Regression – Are Estimates of Intercept and Slope Independent?


Consider a linear model

$y_i= \alpha + \beta x_i + \epsilon_i$

and estimates $\hat{\alpha}$ and $\hat{\beta}$ of the intercept and slope obtained by ordinary least squares. This mathematical statistics reference states, in the proof of one of its theorems, that $\hat{\alpha}$ and $\hat{\beta}$ are independent.

I'm not sure I understand why. Since

$\hat{\alpha}=\bar{y}-\hat{\beta} \bar{x}$

doesn't this mean that $\hat{\alpha}$ and $\hat{\beta}$ are correlated? I'm probably missing something obvious here.

Best Answer

Go to the following sub-page of the same site:

https://onlinecourses.science.psu.edu/stat414/node/278

There you will see more clearly that they specify the simple linear regression model with the regressor centered on its sample mean, which explains why they subsequently say that $\hat \alpha$ and $\hat \beta$ are independent.
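
Concretely, the centered parametrization is (the asterisk on the intercept is just notation introduced here to distinguish it from the $\alpha$ above)

$$y_i = \alpha^{*} + \beta(x_i - \bar x) + \epsilon_i, \qquad \alpha^{*} = \alpha + \beta \bar x,$$

for which the OLS intercept estimate is simply $\hat \alpha^{*} = \bar y$.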

For the case when the coefficients are estimated with a regressor that is not centered, their covariance is

$$\text{Cov}(\hat \alpha,\hat \beta) = -\frac{\sigma^2 \bar x}{S_{xx}}, \qquad S_{xx} = \sum_i (x_i-\bar x)^2 $$
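
For completeness, this expression follows in one line from $\hat \alpha = \bar y - \hat \beta \bar x$ together with two standard facts: $\text{Cov}(\bar y, \hat \beta) = 0$ (because the weights $(x_i - \bar x)/S_{xx}$ that define $\hat \beta$ sum to zero) and $\text{Var}(\hat \beta) = \sigma^2 / S_{xx}$:

$$\text{Cov}(\hat \alpha, \hat \beta) = \text{Cov}(\bar y - \hat \beta \bar x,\, \hat \beta) = \text{Cov}(\bar y, \hat \beta) - \bar x\, \text{Var}(\hat \beta) = -\frac{\sigma^2 \bar x}{S_{xx}}.$$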

So if we use a regressor centered on $\bar x$, call it $\tilde x$, the covariance expression above involves the sample mean of the centered regressor, $\bar{\tilde x}$, which is zero, and so the covariance is zero as well. The coefficient estimators are then uncorrelated, and since under normal errors they are jointly normal, uncorrelatedness implies independence.
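
As a numerical check, here is a minimal Monte Carlo sketch (the sample size, true coefficients, and noise level are arbitrary choices, not from the original post): it refits the model on many fresh noise draws and compares the empirical covariance of $(\hat \alpha, \hat \beta)$ under the raw and centered regressor with the formula above.

```python
import numpy as np

rng = np.random.default_rng(0)
n, alpha, beta, sigma = 50, 2.0, 3.0, 1.0   # hypothetical true parameters
x = rng.uniform(5, 10, size=n)              # raw regressor, mean well away from 0
x_centered = x - x.mean()

def simulate(reg, n_sims=10_000):
    """Refit OLS on fresh noise draws; return (alpha_hat, beta_hat) samples."""
    estimates = np.empty((n_sims, 2))
    for s in range(n_sims):
        y = alpha + beta * x + rng.normal(0.0, sigma, size=n)
        slope, intercept = np.polyfit(reg, y, deg=1)  # degree-1 fit = simple OLS
        estimates[s] = (intercept, slope)
    return estimates

for label, reg in [("raw x", x), ("centered x", x_centered)]:
    a_hat, b_hat = simulate(reg).T
    print(f"{label}: empirical Cov(alpha_hat, beta_hat) = {np.cov(a_hat, b_hat)[0, 1]:+.5f}")

# Theoretical covariance for the raw regressor: -sigma^2 * xbar / Sxx
S_xx = ((x - x.mean()) ** 2).sum()
print(f"theory (raw x): {-sigma**2 * x.mean() / S_xx:+.5f}")
```

With the raw regressor the empirical covariance should come out clearly negative and close to the theoretical value; with the centered regressor it should be approximately zero.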

This post contains more on the OLS algebra of simple linear regression.
