Solved – Restricted OLS has less variance than OLS

blue, efficiency, least squares, regression

According to the Gauss–Markov theorem, ordinary least squares (OLS) is the best linear unbiased estimator (BLUE). How, then, can restricted OLS have less variance?
Please tell me the reason.

Best Answer

The main intuition is that the restricted OLS estimator is generally biased, so there is a tradeoff between bias and variance: you reduce variance, but you allow bias.

  • An example: suppose you want to estimate the average height of the people in your state. You may have learned that if you have a random sample (say, of 2000 individuals), a "reasonable" estimator is the sample average, which is unbiased. But suppose you have "prophecy" skills and know for sure that the population average is 175 cm. The estimator that always answers 175 cm then has zero variance, which is lower than the variance of any estimator you can come up with. But unless you are a really good prophet (or you cheated with the data), it is likely to be biased, and its MSE is then pure squared bias.

  • A more explicit answer to your question comes from directly comparing the variances of the restricted and unrestricted estimators:

$Var(\hat\beta_u) = \sigma^2(X'X)^{-1}$

$Var(\hat\beta_c) = \sigma^2(X'X)^{-1} - \sigma^2(X'X)^{-1}R'\left(R(X'X)^{-1}R'\right)^{-1}R(X'X)^{-1}$

The subtracted term is positive semidefinite, so the variance of the restricted estimator is always weakly smaller than the variance of the unrestricted estimator, whether or not the restrictions are true. What imposing true restrictions buys you is that the restricted estimator also stays unbiased; a numerical sanity check is sketched below.
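To make this concrete, here is a minimal numerical sketch in Python (the sample size, design matrix, noise level, and the single restriction used here are all hypothetical choices, not anything from the question). It builds both covariance matrices from the formulas above and confirms that their difference is positive semidefinite:

```python
import numpy as np

# Minimal sketch: hypothetical design with n = 200 observations, k = 3
# regressors, and one linear restriction beta_1 + beta_2 = 1.
rng = np.random.default_rng(0)
n, k, sigma = 200, 3, 2.0
X = rng.normal(size=(n, k))
R = np.array([[0.0, 1.0, 1.0]])   # restriction matrix (1 x k)

XtX_inv = np.linalg.inv(X.T @ X)

# Theoretical covariance of the unrestricted OLS estimator
V_u = sigma**2 * XtX_inv

# Term subtracted off to get the restricted estimator's covariance
M = XtX_inv @ R.T @ np.linalg.inv(R @ XtX_inv @ R.T) @ R @ XtX_inv
V_c = V_u - sigma**2 * M

# The difference Var(b_u) - Var(b_c) = sigma^2 * M must be positive
# semidefinite: all eigenvalues >= 0 (up to floating-point noise).
eigvals = np.linalg.eigvalsh(sigma**2 * M)
print("eigenvalues of Var(b_u) - Var(b_c):", eigvals)
assert np.all(eigvals >= -1e-10)
```

Note that the check never uses the right-hand side $r$: the variance reduction depends only on $X$, $R$, and $\sigma^2$, which is exactly why it holds whether or not the restriction is true.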

  • Advanced: there is an early test built around exactly this comparison; it asks for the condition under which

$MSE(\hat\beta_u) - MSE(\hat\beta_c)$ is positive semidefinite [$\hat\beta_u$ is the unconstrained, $\hat\beta_c$ the constrained estimator].

The condition is met iff $\lambda < 1/2$, where

$\lambda = \frac{1}{2\sigma^2}(R\beta - r)'\left(R(X'X)^{-1}R'\right)^{-1}(R\beta - r)$

and $R\beta = r$ encodes the constraints you impose. A simulation sketch of this criterion follows.
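Here is a Monte Carlo sketch of that MSE criterion, again with hypothetical numbers: the imposed restriction is deliberately slightly false, so $\lambda$ is small but nonzero. It computes $\lambda$ and then checks empirically whether $MSE(\hat\beta_u) - MSE(\hat\beta_c)$ is positive semidefinite:

```python
import numpy as np

# Monte Carlo sketch with hypothetical numbers throughout: impose a
# mildly false restriction, compute the noncentrality lambda, and check
# empirically whether MSE(b_u) - MSE(b_c) is positive semidefinite.
rng = np.random.default_rng(1)
n, sigma, reps = 100, 1.0, 20000
beta = np.array([1.0, 0.55, 0.5])   # true coefficients
R = np.array([[0.0, 1.0, 1.0]])
r = np.array([1.0])                 # imposed: beta_1 + beta_2 = 1
                                    # (truth is 1.05, so mildly false)
X = rng.normal(size=(n, 3))         # design held fixed across replications
A = np.linalg.inv(X.T @ X)          # (X'X)^{-1}
W = np.linalg.inv(R @ A @ R.T)

dev = R @ beta - r
lam = float(dev @ W @ dev) / (2 * sigma**2)
print(f"lambda = {lam:.3f}  (restricted estimator wins iff lambda < 1/2)")

err_u = np.empty((reps, 3))
err_c = np.empty((reps, 3))
for i in range(reps):
    y = X @ beta + sigma * rng.normal(size=n)
    b_u = A @ X.T @ y                           # unrestricted OLS
    b_c = b_u - A @ R.T @ W @ (R @ b_u - r)     # restricted OLS
    err_u[i] = b_u - beta
    err_c[i] = b_c - beta

mse_u = err_u.T @ err_u / reps      # empirical MSE matrices
mse_c = err_c.T @ err_c / reps
# With lambda < 1/2, the smallest eigenvalue should be >= 0 up to
# Monte Carlo noise.
print("smallest eigenvalue of MSE(b_u) - MSE(b_c):",
      np.linalg.eigvalsh(mse_u - mse_c).min())
```

With these numbers $\lambda$ should land well below $1/2$, so the smallest eigenvalue should come out nonnegative up to simulation noise; making the restriction more false pushes $\lambda$ past $1/2$ and flips the comparison.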