[Math] Finding the covariance matrix of a least square estimator

Tags: least-squares, linear-algebra, matrices, probability, statistics

So given that the least squares estimator of $\beta$ is:

$$ \mathbf{\hat{\beta}} = (\mathbf{X}^T \mathbf{X})^{-1}\mathbf{X}^T \mathbf{Y} $$

And $\mathbf{Y} = \mathbf{X} \mathbf{\beta} + \epsilon$, where $\epsilon$ is a vector of independent zero-mean normals all with the same variance $\sigma^2$.

What is the covariance matrix? I've done this before, but I decided to attempt this differently this time.

Here is my attempt. According to the solutions, the answer should be
$\sigma^2 (X^T X)^{-1}$, but I am not getting that:

*(screenshot of the attempted derivation omitted)*

Does anyone know where my mistake is?

Best Answer

Seems like you have some stuff backwards.

Remember that $\mathrm{Cov}(\hat\beta)$ should be $p\times p,$ so if you're using the convention where $\beta$ is $p\times 1$ and $Y$ is $n\times 1,$ you want to take $$ \mathrm{Cov}(\hat\beta)=E(\hat\beta\hat\beta^T) -E(\hat\beta)E(\hat\beta)^T.$$
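To make the expansion below explicit, substitute the model $Y = X\beta + \epsilon$ into the estimator:

$$ \hat\beta = (X^TX)^{-1}X^T(X\beta + \epsilon) = \beta + (X^TX)^{-1}X^T\epsilon, $$

so $E(\hat\beta) = \beta$ because $E(\epsilon) = 0$, and all of the randomness in $\hat\beta$ comes from the $(X^TX)^{-1}X^T\epsilon$ term.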

Then you have $$ \hat\beta\hat\beta^T = ((X^TX)^{-1}X^T)\,YY^T\,(X(X^TX)^{-1}), $$ and since $E(\epsilon)=0$ kills the cross terms in expectation, the last term from the expansion of $E(\hat\beta\hat\beta^T)$ will have the form $$((X^TX)^{-1}X^T)E(\epsilon\epsilon^T)(X(X^TX)^{-1}) = ((X^TX)^{-1}X^T)\sigma^2I(X(X^TX)^{-1}) = \sigma^2(X^TX)^{-1},$$ using $(X^TX)^{-1}X^TX = I$. The remaining term $\beta\beta^T$ cancels against $E(\hat\beta)E(\hat\beta)^T$, leaving $\mathrm{Cov}(\hat\beta) = \sigma^2(X^TX)^{-1}$ as required.
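If you want a sanity check, a quick Monte Carlo simulation in NumPy can confirm that the sample covariance of $\hat\beta$ across many noise draws matches $\sigma^2(X^TX)^{-1}$. The dimensions, seed, and $\beta$ below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 3                    # arbitrary sample size and number of predictors
sigma = 2.0                      # noise standard deviation
X = rng.normal(size=(n, p))      # fixed design matrix
beta = np.array([1.0, -0.5, 3.0])

XtX_inv = np.linalg.inv(X.T @ X)
theory = sigma**2 * XtX_inv      # the claimed covariance sigma^2 (X^T X)^{-1}

# Draw many independent noise vectors and compute beta_hat for each.
B = 20000
eps = rng.normal(scale=sigma, size=(B, n))
Y = X @ beta + eps               # each row is one realization of Y
# beta_hat = (X^T X)^{-1} X^T Y, vectorized over rows of Y
# (transpose of the usual formula; (X^T X)^{-1} is symmetric)
beta_hats = Y @ X @ XtX_inv      # shape (B, p)

emp = np.cov(beta_hats, rowvar=False)   # empirical p x p covariance
print(np.max(np.abs(emp - theory)))     # should be small
```

With 20,000 replications the empirical covariance agrees with the formula to a few decimal places, entry by entry.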