Minimum variance linear unbiased estimator of $\beta_1$

linear-model, regression, self-study

I am studying Applied Linear Statistical Models (Kutner et al., 2005). They present a proof that $b_1$ is the minimum variance linear unbiased estimator of $\beta_1$, as follows:

For the regression model $Y_i=\beta_0+\beta_1X_i+\epsilon_i$, the estimator $b_1$

$b_1 = \dfrac{\sum(X_i-\bar{X})(Y_i-\bar{Y})}{\sum(X_i-\bar{X})^2}$

has the minimum variance among all linear unbiased estimators of the
form:

$\hat{\beta}_1=\sum c_iY_i$

where the $c_i$ are arbitrary constants.
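To make the claim concrete, here is a minimal numerical sketch (my own toy data, not from the book) showing that $b_1$ is itself of this linear form, with weights $k_i=(X_i-\bar{X})/\sum_j(X_j-\bar{X})^2$:

```python
# Sketch: the OLS slope b1 equals sum(k_i * Y_i) with
# k_i = (X_i - Xbar) / sum_j (X_j - Xbar)^2.  All data values are made up.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = 2.0 + 0.5 * X + rng.normal(size=X.size)   # hypothetical beta0 = 2, beta1 = 0.5

k = (X - X.mean()) / np.sum((X - X.mean()) ** 2)

b1_formula = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
b1_linear = np.sum(k * Y)
print(b1_formula, b1_linear)   # identical up to floating-point rounding
```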

Since $\hat{\beta}_1$ is required to be unbiased, the following must
hold:

$E\left[\hat{\beta}_1\right] = E\left[\sum c_iY_i\right]=\sum c_iE\left[Y_i\right]=\beta_1$

Substituting $E[Y_i]=\beta_0+\beta_1X_i$ gives

$E\left[\hat{\beta}_1\right]=\sum{c_i(\beta_0+\beta_1X_i)} = \beta_0\sum c_i +\beta_1\sum c_iX_i=\beta_1$

For the last equation to hold, the $c_i$ must follow the restrictions:

$\sum c_i =0$ and $\sum c_iX_i=1$

The proof continues, but my question is about the last line. I can see that the equation holds when $\sum c_i =0$ and $\sum c_iX_i=1$, but I am not sure how to show that this is the unique solution. I should be able to reach a conclusion based on the number of variables and the number of linearly independent equations, but I don't see how. Should I treat $\beta_0$ and $\beta_1$ as constants? If so, does summing over $i$ make it a single equation? How should I approach this problem?
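(For what it's worth, the restrictions are clearly sufficient, and the OLS weights $k_i=(X_i-\bar{X})/\sum_j(X_j-\bar{X})^2$ satisfy both of them; a quick check on the same made-up $X$ values as above:)

```python
# The OLS weights satisfy sum(k) = 0 and sum(k*X) = 1, so
# E[b1] = beta0*sum(k) + beta1*sum(k*X) = beta1 for any beta0, beta1.
import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
k = (X - X.mean()) / np.sum((X - X.mean()) ** 2)

print(np.isclose(np.sum(k), 0.0))      # True: sum c_i = 0
print(np.isclose(np.sum(k * X), 1.0))  # True: sum c_i X_i = 1
```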

Best Answer

So we know it is sufficient for the estimator to be unbiased that $\sum c_i=0$ and $\sum c_iX_i=1$.

We want to show that these conditions are necessary:

Obviously, if one of these holds and the other doesn't, the estimator won't be unbiased: if $\sum c_i=\gamma\neq0$ but $\sum c_iX_i=1$, then $E(\hat\beta_1)=\beta_0\gamma+\beta_1\neq\beta_1$ unless $\beta_0\gamma=0$; and if $\sum c_i=0$ but $\sum c_iX_i=\delta\neq1$, then $E(\hat\beta_1)=\beta_1\delta\neq\beta_1$ unless $\beta_1=0$.

Now suppose both fail: $\sum c_i=\gamma\neq0$ and $\sum c_iX_i=\delta\neq1$. Then

$E(\hat\beta_1)=\beta_0\gamma+\beta_1\delta\not=\beta_1$ in general.

The reason is that the constants $c_i$ cannot depend on the parameters: the estimator has to work for every possible value $(\beta_0,\beta_1)$ may take on, and requiring $\beta_0\gamma+\beta_1\delta=\beta_1$ for all $(\beta_0,\beta_1)$ forces $\gamma=0$ and $\delta=1$.
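A quick simulation sketch (parameter values and weights chosen arbitrarily by me) illustrates the point: the weights fix $\gamma=\sum c_i$ and $\delta=\sum c_iX_i$ once and for all, and the average of $\sum c_iY_i$ over many samples lands at $\beta_0\gamma+\beta_1\delta$, which can only equal $\beta_1$ for every $(\beta_0,\beta_1)$ when $\gamma=0$ and $\delta=1$.

```python
# Simulation sketch: for fixed weights c with gamma = sum(c), delta = sum(c*X),
# the mean of sum(c*Y) over many replications is beta0*gamma + beta1*delta,
# which misses beta1 unless gamma = 0 and delta = 1.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
beta0, beta1, sigma = 2.0, 0.5, 1.0           # arbitrary "true" values

c = np.array([0.3, -0.1, 0.0, 0.2, 0.1])      # weights violating both restrictions
gamma, delta = np.sum(c), np.sum(c * X)       # gamma = 0.5, delta = 1.4

estimates = [
    np.sum(c * (beta0 + beta1 * X + rng.normal(scale=sigma, size=X.size)))
    for _ in range(100_000)
]
print(np.mean(estimates))                     # about 1.7
print(beta0 * gamma + beta1 * delta)          # 1.7, far from beta1 = 0.5
```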