Gauss-Markov theorem [BLUE uniqueness proof]

least squares · linear regression · statistics

The Gauss-Markov theorem states that $b=b_{OLS}$ is the BLUE (best linear unbiased) estimator for $\beta$, i.e. it is the unique linear unbiased estimator $b$ of $\beta$ such that $$\mathrm{Var}[\tilde b\vert X]-\mathrm{Var}[b\vert X]\succeq0\quad\text{for every linear unbiased estimator $\tilde b$ of $\beta$},\tag{1}$$ where $\succeq0$ denotes positive semidefiniteness.
First, we prove $\mathrm{(1)}$ for $b = b_{OLS} = Ay$, with $A=(X'X)^{-1}X'$.

Pick a linear unbiased estimator $\tilde b=Cy$ of $\beta$.
We have $$\beta=\mathrm{E}[\tilde b\vert X]=\mathrm{E}[CX\beta+C\varepsilon\vert X]=CX\beta,$$ using $\mathrm{E}[\varepsilon\vert X]=0$.
At this point the proof concludes that $CX=I_k$. I don't see why: $\beta$ here is a fixed vector in $\mathbb{R}^k$, so I'd say we only get $\beta\in\mathrm{Ker}(CX-I_k)$. In any case, granting that $CX=I_k$, the rest of the proof is clear, since we obtain:

$$\mathrm{Var}[\tilde b\vert X]-\mathrm{Var}[b\vert X] = \sigma^2DD'\succeq0$$ where $D=C-A$.
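(For completeness, here is a sketch of the computation behind that display, in the notation above: since $DX=CX-AX=I_k-I_k=0$, the cross terms vanish.)

```latex
\[
\mathrm{Var}[\tilde b\vert X]
  = \sigma^2 CC'
  = \sigma^2 (A+D)(A+D)'
  = \sigma^2\bigl(AA' + AD' + DA' + DD'\bigr)
  = \sigma^2 (X'X)^{-1} + \sigma^2 DD',
\]
% because AD' = (X'X)^{-1}X'D' = (X'X)^{-1}(DX)' = 0, likewise DA' = 0,
% and AA' = (X'X)^{-1}X'X(X'X)^{-1} = (X'X)^{-1}.
% Subtracting \mathrm{Var}[b\vert X] = \sigma^2 (X'X)^{-1} leaves \sigma^2 DD'.
```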

Then we need to prove that $b=b_{OLS}$ is the unique linear unbiased estimator that fulfills $\mathrm{(1)}$; I can't find any proof of that, though. Any help would be really appreciated.

Best Answer

Unbiasedness is required for every vector $\beta \in \Bbb R^k$, so the relation $\beta=CX\beta$ must hold for all $\beta$, which forces $CX=I_k$.
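One way to make this explicit is to test $CX\beta=\beta$ on the standard basis vectors $e_1,\dots,e_k$ of $\mathbb{R}^k$:

```latex
\[
CX = \bigl(\,CXe_1 \;\cdots\; CXe_k\,\bigr)
   = \bigl(\,e_1 \;\cdots\; e_k\,\bigr)
   = I_k,
\]
% since CXe_j is the j-th column of CX, and unbiasedness at \beta = e_j
% gives CXe_j = e_j for each j = 1, \dots, k.
```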

For the uniqueness part, suppose there exists another linear unbiased estimator $b_*=Cy$ that satisfies $\mathrm{Var}[\tilde b\vert X]-\mathrm{Var}[b_*\vert X]\succeq0$ for every linear unbiased estimator $\tilde b$ of $\beta$. Choosing $\tilde b=b_{OLS}$ gives $$\mathrm{Var}[b_{OLS}\vert X]-\mathrm{Var}[b_*\vert X]\succeq0,$$ while the first part of the theorem, applied with $\tilde b=b_*$, gives $$\mathrm{Var}[b_{*}\vert X]-\mathrm{Var}[b_{OLS}\vert X]\succeq0.$$ The difference is therefore both $\succeq0$ and $\preceq0$, so $\mathrm{Var}[b_*\vert X]=\mathrm{Var}[b_{OLS}\vert X]$. By the computation in the question this means $\sigma^2DD'=0$ with $D=C-A$, hence $D=0$, so $b_*=Ay=b_{OLS}$.
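The structure of the proof can be checked numerically. The sketch below (my own illustration, not part of the proof) builds a random design $X$, forms $A=(X'X)^{-1}X'$, then constructs an alternative linear unbiased estimator $\tilde b = Cy$ with $C=A+D$ and $DX=0$, and verifies that the variance difference equals $\sigma^2DD'$ and is positive semidefinite.

```python
# Numerical sanity check of the Gauss-Markov argument (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
n, k, sigma2 = 50, 3, 2.0
X = rng.standard_normal((n, k))

A = np.linalg.inv(X.T @ X) @ X.T           # b_OLS = A y

# Build D with D X = 0 by projecting random rows onto the orthogonal
# complement of the column space of X.
P = X @ np.linalg.inv(X.T @ X) @ X.T       # projection onto col(X)
D = rng.standard_normal((k, n)) @ (np.eye(n) - P)
C = A + D                                   # tilde_b = C y

assert np.allclose(C @ X, np.eye(k))        # CX = I_k: still unbiased

var_ols   = sigma2 * np.linalg.inv(X.T @ X)  # Var[b_OLS | X]
var_tilde = sigma2 * (C @ C.T)               # Var[tilde_b | X] = sigma^2 CC'

diff = var_tilde - var_ols
assert np.allclose(diff, sigma2 * D @ D.T)  # difference is sigma^2 DD'
print(np.linalg.eigvalsh(diff).min() >= -1e-8)  # PSD up to round-off
```

If one instead sets `D = 0`, the two variances coincide, matching the uniqueness argument: equality of variances forces $DD'=0$ and hence $C=A$.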