Solved – How to prove $\hat{\beta}_0$ has minimum variance among all unbiased linear estimators: Simple Linear Regression

linear-regression, self-study

Under the simple linear regression model $Y_i = \beta_0 + \beta_1 X_i + \epsilon_i$, the ordinary least squares estimators $\hat{\beta}_0$ and $\hat{\beta}_1$ have minimum variance among all unbiased linear estimators.

To prove that the OLS estimator $\hat{\beta}_1 = \sum k_i y_i$ has minimum variance, we start by setting $\tilde{\beta}_1 = \sum c_i y_i$ and show that the variance of $\tilde{\beta}_1$ can only be larger than that of $\hat{\beta}_1$ if $c_i \neq k_i$.
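(For concreteness, a sketch of that step, assuming the standard weights $k_i = (x_i - \bar{x})/\sum_j (x_j - \bar{x})^2$; adjust if your text defines them differently. Writing $c_i = k_i + d_i$, unbiasedness forces $\sum c_i = 0$ and $\sum c_i x_i = 1$; since the $k_i$ satisfy the same two constraints, $\sum d_i = 0$ and $\sum d_i x_i = 0$, which gives $\sum k_i d_i = 0$, and hence

$\mathrm{Var}(\tilde{\beta}_1) = \sigma^2\sum c_i^2 = \sigma^2\sum k_i^2 + \sigma^2\sum d_i^2 \geq \mathrm{Var}(\hat{\beta}_1),$

with equality only when every $d_i = 0$.)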

I am now trying to prove that $\hat{\beta}_0$ has minimum variance among all unbiased linear estimators, and I am told the proof starts similarly.

I know that the OLS estimator is $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1\bar{x}$.

How do I start the proof: by constructing another linear estimator $\tilde{\beta}_0$? Would $\tilde{\beta}_0 = c\bar{y} - \hat{\beta}_1\bar{x}$ be such a linear estimator?

Best Answer

You can start by expressing $\hat{\beta}_0$ as a linear combination of the $y_i$, just as $\hat{\beta}_1 = \sum k_i y_i$:

$\hat{\beta}_0 = \bar{y} - \hat{\beta}_1\bar{x} = \frac{1}{N}\sum y_i - \bar{x}\sum k_i y_i = \sum\left(\frac{1}{N} - k_i\bar{x}\right)y_i = \sum l_i y_i,$
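where $l_i = \frac{1}{N} - k_i\bar{x}$. From here the argument runs exactly as for $\hat{\beta}_1$ (a minimal sketch, assuming uncorrelated errors with common variance $\sigma^2$): take any linear estimator $\tilde{\beta}_0 = \sum c_i y_i$ and write $c_i = l_i + d_i$. Unbiasedness requires $E[\tilde{\beta}_0] = \beta_0\sum c_i + \beta_1\sum c_i x_i = \beta_0$ for all $\beta_0, \beta_1$, i.e. $\sum c_i = 1$ and $\sum c_i x_i = 0$. The $l_i$ satisfy the same two constraints (using $\sum k_i = 0$ and $\sum k_i x_i = 1$), so $\sum d_i = 0$ and $\sum d_i x_i = 0$, and therefore $\sum l_i d_i = 0$. Consequently

$\mathrm{Var}(\tilde{\beta}_0) = \sigma^2\sum c_i^2 = \sigma^2\sum l_i^2 + \sigma^2\sum d_i^2 \geq \sigma^2\sum l_i^2 = \mathrm{Var}(\hat{\beta}_0),$

with equality if and only if $c_i = l_i$ for all $i$.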