Proving that $H_1-H_0$ is idempotent

idempotents, linear algebra, matrices, statistics

I want to prove that for symmetric, idempotent matrices $H_1$ and $H_0$ (these are "hat matrices" of linear regression models), $(H_1-H_0)=(H_1-H_0)^2$, in order to show a property of a distribution. So far I have only found that the square equals $H_1-2H_0H_1+H_0,$ where I used the symmetry to change the order of matrix multiplication. But would this not mean that I need $H_0-2H_0H_1=-H_0\iff H_0=H_0H_1\iff H_1=I,$ which would mean the matrix is in fact not idempotent, since $H_1$ is not necessarily equal to $I$? Thank you for your time.

Best Answer

You need an additional hypothesis: The column space of $H_0$ is a subset of the column space of $H_1.$

If $H_0$ and $H_1$ are $n\times n$ symmetric idempotent matrices and the column space of $H_0$ is a subset of the column space of $H_1,$ then $H_0 H_1 = H_1 H_0 = H_0.$
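A quick numerical sanity check of this claim (a sketch in NumPy; the design matrices `X0`, `X1` and the `hat` helper are my own illustration, not part of the question): nesting the columns of the design matrices guarantees the column-space hypothesis by construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def hat(X):
    """Orthogonal-projection ("hat") matrix onto the column space of X."""
    return X @ np.linalg.inv(X.T @ X) @ X.T

# Nested design matrices: the columns of X0 are a subset of those of X1,
# so the column space of H0 is contained in that of H1 by construction.
X1 = rng.standard_normal((10, 4))
X0 = X1[:, :2]

H0, H1 = hat(X0), hat(X1)
D = H1 - H0

print(np.allclose(H0 @ H1, H0))   # True: H0 H1 = H0
print(np.allclose(H1 @ H0, H0))   # True: H1 H0 = H0
print(np.allclose(D @ D, D))      # True: H1 - H0 is idempotent
```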

If $x$ is in the column space of a symmetric idempotent real matrix $H,$ then $Hx=x$ (write $x=Hy$; then $Hx=H^2y=Hy=x$), and if $x$ is orthogonal to the column space, then $Hx=0$ (orthogonality to every column of $H$ means $H^\top x=0,$ and $H^\top=H$).
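Both behaviors are easy to exhibit numerically (again a sketch; the matrix `X` and the vectors `x_in`, `x_perp` are hypothetical examples of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((6, 2))
H = X @ np.linalg.inv(X.T @ X) @ X.T    # symmetric idempotent projection

x_in = X @ rng.standard_normal(2)       # a vector in the column space of H
x_perp = rng.standard_normal(6)
x_perp -= H @ x_perp                    # subtract the projection: orthogonal part

print(np.allclose(H @ x_in, x_in))      # True: H acts as the identity on col(H)
print(np.allclose(H @ x_perp, 0))       # True: H annihilates the orthogonal complement
```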

If $x$ is any of the columns of $H_0$ and the aforementioned additional hypothesis holds, then $H_1 x = x.$ The columns of $H_1H_0$ are therefore just the columns of $H_0,$ so $H_1H_0= H_0.$ And since both matrices are symmetric, taking transposes gives $H_0 H_1 = (H_1 H_0)^\top = H_0^\top = H_0.$
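With $H_0H_1 = H_1H_0 = H_0$ in hand, the expansion from the question closes directly:

$$(H_1-H_0)^2 = H_1^2 - H_1H_0 - H_0H_1 + H_0^2 = H_1 - H_0 - H_0 + H_0 = H_1 - H_0.$$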

If $H_0$ had a right inverse matrix $A,$ then we could write: $$ \require{cancel} \xcancel{ \begin{align} H_1 H_0 & = H_0. \\[6pt] (H_1 H_0) A & = H_0 A = I. \\[6pt] H_1 (H_0A) & = I. \\[6pt] H_1 I & = I. \\[6pt] H_1 & = I. \end{align}} $$ But a square matrix has a one-sided inverse only if it has a two-sided inverse, and an idempotent matrix other than $I$ is singular (its rank equals its trace, which is less than $n$), so no such $A$ exists and the crossed-out derivation never gets started.
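The singularity of a hat matrix is also easy to see numerically (a sketch; the $8\times 3$ design matrix `X0` is my own example): its rank equals the number of regressors, which is less than $n,$ so no right inverse can exist.

```python
import numpy as np

rng = np.random.default_rng(2)
X0 = rng.standard_normal((8, 3))
H0 = X0 @ np.linalg.inv(X0.T @ X0) @ X0.T   # 8x8 hat matrix of rank 3

# rank(H0) = 3 < 8, so H0 is singular and has no right inverse.
print(np.linalg.matrix_rank(H0))  # 3
```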