[Math] Variance of Coefficients in a Simple Linear Regression

regression, statistics

I have a linear regression model $y_i=\beta_0+\beta_1x_i+\epsilon_i$, where $\hat{\beta}_0$ and $\hat{\beta}_1$ are the (normally distributed, unbiased) least squares estimators of $\beta_0$ and $\beta_1$, and the errors $\epsilon_i$ are independent and normal with mean $0$ and variance $\sigma^2$. I need to show that

$$\operatorname{Var}\left(\hat{\beta_0}\right)=\frac{\sigma^2\sum_{i=1}^nx_i^2}{n\sum_{i=1}^n\left(x_i-\bar{x}\right)^2}$$

$$\operatorname{Var}\left(\hat{\beta_1}\right)=\frac{\sigma^2}{\sum_{i=1}^n\left(x_i-\bar{x}\right)^2}$$

and

$$\operatorname{cov}\left(\hat{\beta_0},\hat{\beta_1}\right)=\frac{-\sigma^2\sum_{i=1}^nx_i}{n\sum_{i=1}^n\left(x_i-\bar{x}\right)^2}$$

Can anyone help me out?

Thanks.

Best Answer

From the least squares estimation method, we know that $$\hat{\beta}=(X'X)^{-1}X'Y$$ and that $\hat{\beta}$ is an unbiased estimator of $\beta$, i.e., $E[\hat{\beta}]=\beta$. Moreover, the linear model $$Y=X\beta+u$$ carries the assumption that $$Y\sim N(X\beta,\sigma^2 I),$$ or equivalently that $u\sim N(0,\sigma^2 I)$.

Based on the above, we can prove all three results simultaneously by computing the variance-covariance matrix of $\hat{\beta}$, which is $$\operatorname{Var}(\hat{\beta})=\left( \begin{array}{cc} \operatorname{Var}(\hat{\beta}_0) & \operatorname{Cov}(\hat{\beta}_0,\hat{\beta}_1) \\ \operatorname{Cov}(\hat{\beta}_0,\hat{\beta}_1) & \operatorname{Var}(\hat{\beta}_1) \end{array} \right).$$
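To make the matrix notation concrete, for the simple model in the question the design matrix and the cross-product matrix are

$$X=\left( \begin{array}{cc} 1 & x_1 \\ \vdots & \vdots \\ 1 & x_n \end{array} \right),\qquad X'X=\left( \begin{array}{cc} n & \sum_{i=1}^n x_i \\ \sum_{i=1}^n x_i & \sum_{i=1}^n x_i^2 \end{array} \right).$$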

By the definition of the variance-covariance matrix we have $$\operatorname{Var}(\hat{\beta})=E\left[(\hat{\beta}-\beta)(\hat{\beta}-\beta)'\right].$$ Substituting $Y=X\beta+u$ into $\hat{\beta}=(X'X)^{-1}X'Y$ gives $$\hat{\beta}=(X'X)^{-1}X'(X\beta+u)=(X'X)^{-1}X'X\beta+(X'X)^{-1}X'u=\beta+(X'X)^{-1}X'u,$$ so $\hat{\beta}-\beta=(X'X)^{-1}X'u$ and $$\begin{align*}\operatorname{Var}(\hat{\beta})&=E\left[(X'X)^{-1}X'u\,u'X(X'X)^{-1}\right]=(X'X)^{-1}X'\,E[uu']\,X(X'X)^{-1}.\end{align*}$$ But, since $E[u]=0$, we have that $E[uu']=\operatorname{Var}(u)=\sigma^2 I$, and by substituting in the above equation we find that $$\begin{align*}\operatorname{Var}(\hat{\beta})&=(X'X)^{-1}X'\,\sigma^2 I\,X(X'X)^{-1}=\sigma^2(X'X)^{-1}X'X(X'X)^{-1}=\sigma^2(X'X)^{-1}.\end{align*}$$ Now, since $$(X'X)^{-1}=\left( \begin{array}{cc} \frac{\sum x_i^2}{n\sum (x_i-\bar{x})^2} & \frac{-\sum x_i}{n\sum (x_i-\bar{x})^2} \\ \frac{-\sum x_i}{n\sum (x_i-\bar{x})^2} & \frac{1}{\sum (x_i-\bar{x})^2} \end{array} \right)$$ (invert the $2\times 2$ matrix $X'X$ above, using $\det(X'X)=n\sum x_i^2-\left(\sum x_i\right)^2=n\sum(x_i-\bar{x})^2$), you have the result that: $$\begin{align*} \operatorname{Var}(\hat{\beta})&=\left( \begin{array}{cc} \operatorname{Var}(\hat{\beta}_0) & \operatorname{Cov}(\hat{\beta}_0,\hat{\beta}_1) \\ \operatorname{Cov}(\hat{\beta}_0,\hat{\beta}_1) & \operatorname{Var}(\hat{\beta}_1) \end{array} \right)=\sigma^2\left(X'X\right)^{-1}=\\&\phantom{kl}\\&=\left( \begin{array}{cc} \frac{\sigma^2 \sum x_i^2}{n\sum (x_i-\bar{x})^2} & \frac{-\sigma^2 \sum x_i}{n\sum (x_i-\bar{x})^2} \\ \frac{-\sigma^2 \sum x_i}{n\sum (x_i-\bar{x})^2} & \frac{\sigma^2}{\sum (x_i-\bar{x})^2} \end{array} \right). \end{align*}$$
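If you want to convince yourself numerically, here is a minimal sanity check in Python with NumPy: it builds $\sigma^2(X'X)^{-1}$ directly and compares it entrywise against the three closed-form expressions. The sample size, the $x_i$, and $\sigma^2$ are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)   # arbitrary regressor values
sigma2 = 2.5             # arbitrary error variance

# Design matrix with an intercept column, and sigma^2 * (X'X)^{-1}
X = np.column_stack([np.ones(n), x])
V = sigma2 * np.linalg.inv(X.T @ X)

# Closed-form expressions from the question
Sxx = np.sum((x - x.mean()) ** 2)          # sum of squared deviations
var_b0 = sigma2 * np.sum(x**2) / (n * Sxx)
var_b1 = sigma2 / Sxx
cov_b0b1 = -sigma2 * np.sum(x) / (n * Sxx)

# The two matrices agree entrywise
assert np.allclose(V, [[var_b0, cov_b0b1], [cov_b0b1, var_b1]])
print(V)
```

The `assert` passes because $\det(X'X)=n\sum x_i^2-\left(\sum x_i\right)^2=n\sum(x_i-\bar{x})^2$, which is exactly the common denominator in all three formulas.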