Independence of Least Squares Estimators (LSE)

Tags: covariance, least squares, linear algebra, statistics, variance

Consider the quadratic polynomial regression model $y_i = \beta_0 + \beta_1x_i + \beta_2x_i^2 + \epsilon_i$ with independent normal noise $\epsilon_i \sim N(0, \sigma^2)$. Assume that observations of the response $y_i$ are available at the design points $x_1 = -10, x_2 = -9, \ldots, x_{20} = 9, x_{21} = 10$. Prove that the Least Squares Estimator (LSE) of $\beta_1$ is independent of the LSE of $\beta_2$.

Best Answer

We know $$\widehat{\beta}=\beta+\left(X'X\right)^{-1}X'\epsilon$$

which means that $\widehat{\beta}$ is normally distributed with mean $\beta$ and covariance matrix $$\Sigma=\sigma^2\left(X'X\right)^{-1}.$$ For this design,
$$X'X=\begin{pmatrix} n & \sum_i x_i & \sum_i x_i^2 \\ \sum_i x_i & \sum_i x_i^2 & \sum_i x_i^3 \\ \sum_i x_i^2 & \sum_i x_i^3 & \sum_i x_i^4 \end{pmatrix},$$ and because the design points are symmetric about zero, the odd-power sums $\sum_i x_i$ and $\sum_i x_i^3$ vanish. Hence $X'X$ has the pattern $\begin{pmatrix} a & 0 & b \\ 0 & c & 0 \\ b & 0 & d \end{pmatrix}$: the middle coordinate decouples from the other two, and inverting the matrix preserves this zero pattern. In particular, the (2,3) element of $\Sigma$ is zero, so the covariance between $\widehat{\beta}_1$ and $\widehat{\beta}_2$ is zero, and since these two estimators are jointly normally distributed, zero covariance implies independence.
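
As a quick numerical sanity check (not part of the proof), here is a minimal Python/NumPy sketch, assuming the 21 design points above; it builds the quadratic design matrix and confirms that the $(\beta_1, \beta_2)$ entry of $(X'X)^{-1}$ vanishes.

```python
import numpy as np

# Design points x = -10, -9, ..., 10 and the quadratic design matrix
# with columns [1, x, x^2] corresponding to (beta_0, beta_1, beta_2).
x = np.arange(-10, 11, dtype=float)
X = np.column_stack([np.ones_like(x), x, x**2])

XtX = X.T @ X
XtX_inv = np.linalg.inv(XtX)

# Odd-power sums vanish for a design symmetric about zero, so the
# (1,2), (2,1), (2,3), (3,2) entries of X'X are zero ...
print(XtX)
# ... and the (beta_1, beta_2) entry of (X'X)^{-1} is zero as well,
# i.e. Cov(beta_1_hat, beta_2_hat) = sigma^2 * 0 = 0.
print(XtX_inv[1, 2])  # prints 0.0 (up to floating-point noise)
```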
