Solved – Proof for the standard error of parameters in linear regression

regression, standard deviation, standard error

In the book Introduction to Statistical Learning, the authors describe the relationship between a predictor $X$ and a response $Y$ via linear regression as: $$ Y = \beta_{0} + \beta_{1}X+\epsilon$$
Here, $\beta_0$ is the intercept term and $\beta_1$ is the slope. $\epsilon$ is the error term.
By minimizing the least squares criterion, the values of $\hat{\beta_0}$ and $\hat{\beta_1}$ are found to be:
$$\hat\beta_1 = \frac {\sum_{i=1}^{n}{(x_i-\bar{x})(y_i-\bar{y})}}{\sum_{i=1}^{n}(x_i-\bar{x})^2}$$

$$\hat{\beta_0}=\bar{y}-\hat{\beta_1}\bar{x}$$
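As a quick numerical sanity check of these two formulas (the data below is simulated and the "true" coefficients are arbitrary), one can compare them against numpy's built-in least-squares fit:

```python
import numpy as np

# Simulated data with arbitrary true coefficients, purely for illustration
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=100)

# Closed-form OLS estimates from the formulas above
beta1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0_hat = y.mean() - beta1_hat * x.mean()

# np.polyfit with degree 1 returns [slope, intercept]; both should match
slope, intercept = np.polyfit(x, y, 1)
print(beta1_hat, slope)
print(beta0_hat, intercept)
```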
Finding this wasn't very hard. Next, the authors find the standard error (SE) of these parameters. They do so by first providing the following: $$Var(\hat\mu)=SE(\hat\mu)^2=\frac{\sigma^2}{n}$$
That is, $SE = \frac {\sigma}{\sqrt{n}}$ (where $\sigma$ is the standard deviation of each of the realizations $y_i$ of $Y$).
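(One way to see this fact, treating the $y_i$ as uncorrelated random variables with common variance $\sigma^2$, is
$$Var(\bar{y}) = Var\Big(\frac{1}{n}\sum_{i=1}^n y_i\Big) = \frac{1}{n^2}\sum_{i=1}^n Var(y_i) = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n}.)$$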
Next, the authors give the standard errors of both the parameters:
$$SE(\hat\beta_0)^2=\sigma^2 \Big[ \frac{1}{n}+\frac{\bar{x}^2}{\sum_{i=1}^n(x_i-\bar{x})^2}\Big]$$
$$SE(\hat\beta_1)^2=\frac{\sigma^2}{\sum_{i=1}^n(x_i-\bar{x})^2}$$
Where $\sigma^2=Var(\epsilon)$.
There seems to be no connection between the formulas found for the parameters and their standard errors. To find the standard errors, we need the variances of both parameters. But how can we find the variance of a formula? For a given input it always produces a single, exact value.

Best Answer

Note $Var(\hat{\beta}_0) = Var(\bar{y} - \hat{\beta}_1\bar{x}) = Var(\bar{y}) + \bar{x}^2Var(\hat{\beta}_1) - 2\bar{x}\,Cov(\bar{y},\hat{\beta}_1)$. Try to show that the covariance term is 0.
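A sketch of the key step (treating the $x_i$ as fixed and the $y_i$ as uncorrelated with variance $\sigma^2$): $\hat\beta_1$ is itself a linear combination of the random variables $y_i$,
$$\hat\beta_1 = \sum_{i=1}^n k_i y_i, \qquad k_i = \frac{x_i-\bar{x}}{\sum_{j=1}^n (x_j-\bar{x})^2},$$
which follows because $\sum_i (x_i-\bar{x})(y_i-\bar{y}) = \sum_i (x_i-\bar{x})y_i$ (the $\bar{y}$ term drops out since $\sum_i (x_i-\bar{x})=0$). That is why a "formula" has a variance: it is a function of the random $y_i$. Then
$$Var(\hat\beta_1) = \sum_{i=1}^n k_i^2\, Var(y_i) = \sigma^2 \sum_{i=1}^n k_i^2 = \frac{\sigma^2}{\sum_{i=1}^n (x_i-\bar{x})^2},$$
and the same expansion gives $Cov(\bar{y},\hat\beta_1) = \frac{\sigma^2}{n}\sum_{i=1}^n k_i = 0$, since $\sum_i k_i \propto \sum_i (x_i-\bar{x}) = 0$.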

The fact that $Var(\hat{\mu}) = \dfrac{\sigma^2}{n}$ (although I'm not a fan of the notation they used here) is used in the calculation: it gives $Var(\bar{y}) = \dfrac{\sigma^2}{n}$.
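Putting the pieces together (same assumptions as in the sketch above), the zero covariance and $Var(\bar{y}) = \sigma^2/n$ give
$$Var(\hat\beta_0) = \frac{\sigma^2}{n} + \bar{x}^2\,\frac{\sigma^2}{\sum_{i=1}^n (x_i-\bar{x})^2} = \sigma^2\Big[\frac{1}{n} + \frac{\bar{x}^2}{\sum_{i=1}^n (x_i-\bar{x})^2}\Big],$$
which is exactly the $SE(\hat\beta_0)^2$ formula quoted in the question, just as the $Var(\hat\beta_1)$ expression above matches the quoted $SE(\hat\beta_1)^2$.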