Regression – is there any linear unbiased estimator of E(y|x) with a lower variance than the OLS solution?

linear regression, probability, regression, statistics

In linear regression, the Gauss–Markov theorem states that the OLS estimator $\hat{\beta}$ of $\beta$ has the minimum variance among all linear unbiased estimators.
So I was wondering whether this also applies to the estimator of $E(y \mid x)$.
In other words, does $\hat{y} = X\hat{\beta}$ have the minimum variance among all linear unbiased estimators of $E(y \mid x)$?
And how can one prove this?

Any suggestions are appreciated. Thanks!

Best Answer

If $E[Y \mid X = x]$ is linear in the parameters, say $\beta_0 + \beta_1 x$, then $\hat{\beta}_0 + \hat{\beta}_1 x$ is still unbiased and has minimal variance among linear unbiased estimators. The reason is that $\hat{y}(x) = x^\top \hat{\beta}$ is a linear combination $c^\top \hat{\beta}$ of the OLS coefficients (with $c = x$), and the Gauss–Markov theorem applies to every such combination: for any other linear unbiased estimator $a^\top y$ of $c^\top \beta$, we have $\operatorname{Var}(a^\top y) \ge \operatorname{Var}(c^\top \hat{\beta})$. However, if $E[Y \mid X = x] = g(\beta; x)$ is nonlinear, then $g(\hat{\beta}; x)$ is in general biased (by Jensen's inequality when $g$ is strictly convex or concave in $\beta$), and thus is clearly not a minimum-variance unbiased estimator.
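
As a minimal simulation sketch of the linear case (not part of the original answer; the design matrix, coefficients, evaluation point `x0`, and weight matrix `W` below are all arbitrary illustrative choices): both the OLS fitted value and a weighted-least-squares fitted value with arbitrary weights are linear unbiased estimators of $E[y \mid x_0] = x_0^\top \beta$, but under homoskedastic errors the OLS version should show the smaller Monte Carlo variance, as Gauss–Markov predicts.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_sims = 50, 20_000

# Fixed design with an intercept; true coefficients and evaluation point.
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])
beta = np.array([1.0, 2.0])
x0 = np.array([1.0, 5.0])                 # estimate E[y | x = x0] = x0' beta

# Arbitrary ("wrong") weights, held fixed across simulations.
W = np.diag(rng.uniform(0.5, 2.0, n))

# Each estimator is a' y for a coefficient vector a with X' a = x0
# (which is exactly the unbiasedness constraint for estimating x0' beta).
a_ols = x0 @ np.linalg.solve(X.T @ X, X.T)          # x0' (X'X)^{-1} X'
a_wls = x0 @ np.linalg.solve(X.T @ W @ X, X.T @ W)  # x0' (X'WX)^{-1} X'W

est_ols = np.empty(n_sims)
est_wls = np.empty(n_sims)
for i in range(n_sims):
    y = X @ beta + rng.normal(0.0, 1.0, n)  # homoskedastic errors
    est_ols[i] = a_ols @ y
    est_wls[i] = a_wls @ y

print("true E[y | x0]:      ", x0 @ beta)
print("OLS  mean / variance:", est_ols.mean(), est_ols.var())
print("WLS  mean / variance:", est_wls.mean(), est_wls.var())
```

Both Monte Carlo means should sit near the true value $x_0^\top \beta = 11$, while the OLS variance comes out no larger than the WLS variance: since $\operatorname{Var}(a^\top y) = \sigma^2 \lVert a \rVert^2$ here, Gauss–Markov says OLS minimizes $\lVert a \rVert^2$ over all $a$ with $X^\top a = x_0$.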
