Solved – GLS estimator with lower variance than OLS for a sum of parameters in a linear model under Gauss-Markov conditions

generalized-least-squares, least-squares, multiple-regression

I have a model $$Y=\beta_0 + \beta_1 x_1 + \beta_2x_2 +\epsilon$$

I would like the minimum-variance unbiased estimate of $\gamma=\beta_1 + \beta_2$. Assuming the Gauss-Markov conditions hold, but $x_1$ and $x_2$ are correlated, is there a more efficient way to estimate $\gamma$ than running OLS and summing the estimates of $\beta_1$ and $\beta_2$?

Given that $\operatorname{var}(\hat{\gamma})=\operatorname{var}(\hat{\beta}_1)+\operatorname{var}(\hat{\beta}_2)+2\operatorname{cov}(\hat{\beta}_1,\hat{\beta}_2)$, I'm specifically wondering whether there is a GLS estimator that reduces $\operatorname{cov}(\hat{\beta}_1,\hat{\beta}_2)$ faster than it increases the variances of the individual estimates.
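To make the quantity concrete, here is a minimal numpy sketch that computes $\operatorname{var}(\hat{\gamma})$ from the estimated OLS covariance matrix; the regressor correlation, sample size, and true coefficients are all assumed values for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Correlated regressors and homoskedastic errors (Gauss-Markov holds);
# correlation strength, n, and true coefficients are assumed values.
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)  # true gamma = 2 + 3 = 5

X = np.column_stack([np.ones(n), x1, x2])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Estimated OLS covariance matrix: sigma^2 * (X'X)^{-1}
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - X.shape[1])
Omega_hat = sigma2_hat * np.linalg.inv(X.T @ X)

# var(gamma_hat) = var(b1) + var(b2) + 2 cov(b1, b2)
gamma_hat = beta_hat[1] + beta_hat[2]
var_gamma = Omega_hat[1, 1] + Omega_hat[2, 2] + 2 * Omega_hat[1, 2]
print(f"gamma_hat = {gamma_hat:.3f}, se = {np.sqrt(var_gamma):.3f}")
```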

Best Answer

The Gauss-Markov theorem states that the covariance matrix of any linear unbiased estimator $\tilde{\beta} \ne \hat{\beta}_{OLS}$ exceeds that of $\hat{\beta}_{OLS}$ by a positive semidefinite matrix. Label the OLS covariance matrix $\Omega$ and the positive semidefinite difference $D$. Writing $\gamma = w'\beta$ with weight vector $w = (0, 1, 1)'$, the variance of the OLS estimate of $\gamma$ is $w' \Omega w$. For any other linear unbiased estimator, the variance of the corresponding combination is

$\sigma^2_\gamma = w' (\Omega + D) w = w' \Omega w + w' D w \geq w' \Omega w,$

since $D$ is positive semidefinite, so $w' D w \geq 0$. Therefore, the sum of the OLS parameter estimates is the minimum-variance linear unbiased estimator of the true sum of the parameters.

In fact, this applies to any weighted sum $w'\hat{\beta}$ of the parameter estimates, not just the unweighted sum $\hat{\beta}_1 + \hat{\beta}_2$.
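One way to see this concretely is to reparameterize: substituting $\beta_1 = \gamma - \beta_2$ gives $Y = \beta_0 + \gamma x_1 + \beta_2(x_2 - x_1) + \epsilon$, so $\gamma$ appears as a single coefficient. A minimal sketch (continuing the simulated data above; the variable names are assumptions) confirming that direct OLS estimation of $\gamma$ reproduces the summed estimate and its standard error:

```python
# Reparameterized design: y = b0 + gamma*x1 + b2*(x2 - x1) + eps, so gamma
# is itself an OLS coefficient; its estimate and standard error match the
# summed-coefficient computation above (up to floating-point error).
Z = np.column_stack([np.ones(n), x1, x2 - x1])
theta_hat, *_ = np.linalg.lstsq(Z, y, rcond=None)

resid_z = y - Z @ theta_hat
s2 = resid_z @ resid_z / (n - Z.shape[1])
cov_z = s2 * np.linalg.inv(Z.T @ Z)

print(theta_hat[1], np.sqrt(cov_z[1, 1]))  # same as gamma_hat, se above
assert np.isclose(theta_hat[1], gamma_hat)
```

The reparameterized design spans the same column space as the original, so the residuals, and hence the estimated error variance, are unchanged; this is another way of seeing why no reweighting can improve on the OLS sum.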
