Proving the sum of squares identity

linear-algebra, probability, probability-theory, statistics

Given $$SS_{reg}=y^T(H-\vec{1}(\vec{1}^T\vec{1})^{-1}\vec{1}^T)y=\sum\hat{y}_i^2-\frac{(\sum y_i)^2}{n}$$

and $$SS_{tot}=y^TB^TBy$$

and $$SS_{res}=y^T(I-H)y$$

Where

$X$ is the design matrix

$H=X(X^TX)^{-1}X^T$ is the $n \times n$ hat matrix

$I$ is the $n \times n$ identity matrix

$B=(I-\vec1(\vec1^T\vec{1})^{-1}\vec{1}^T)$

$y$ is an $n \times 1$ vector

How does one show $SS_{tot}=SS_{reg}+SS_{res}$?
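As a numerical sanity check on the given formula for $SS_{reg}$, one can verify the matrix expression against the summation form on made-up data (the design matrix and response below are hypothetical; any full-rank $X$ with an intercept column works):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 15, 3

# Hypothetical design matrix (first column is the intercept) and response
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = rng.normal(size=n)

one = np.ones((n, 1))
H = X @ np.linalg.solve(X.T @ X, X.T)  # hat matrix X(X^T X)^{-1} X^T
y_hat = H @ y                          # fitted values

# Matrix form vs. summation form of SS_reg
lhs = y @ (H - one @ one.T / n) @ y
rhs = np.sum(y_hat**2) - np.sum(y)**2 / n
assert np.isclose(lhs, rhs)
```

The two forms agree because $y^THy=\hat{y}^T\hat{y}$ (as $H$ is symmetric and idempotent) and $y^T\vec{1}(\vec{1}^T\vec{1})^{-1}\vec{1}^Ty=(\sum y_i)^2/n$.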

This question is based on "Prove that total sum of squares is given by $y^TB^TBy$" and "How to show sum of squares regression formula?".

Best Answer

First note that $B=B^T$ and $B=BB$: since $\vec{1}^T\vec{1}=n$, we have $B=I-\frac{1}{n}\vec{1}\vec{1}^T$, which is symmetric, and
$$BB=I-\frac{2}{n}\vec{1}\vec{1}^T+\frac{1}{n^2}\vec{1}(\vec{1}^T\vec{1})\vec{1}^T=I-\frac{1}{n}\vec{1}\vec{1}^T=B.$$
Then
\begin{align}
SS_{reg}+SS_{res}&=y^T(H-\vec{1}(\vec{1}^T\vec{1})^{-1}\vec{1}^T)y+y^T(I-H)y\\
&=y^T(I-\vec{1}(\vec{1}^T\vec{1})^{-1}\vec{1}^T)y\\
&=y^TBy=y^TBBy=y^TB^TBy=SS_{tot}.
\end{align}
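The whole argument can be checked numerically. The sketch below (with a hypothetical design matrix and response) confirms the properties of $B$ and the decomposition $SS_{tot}=SS_{reg}+SS_{res}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 3

# Hypothetical data: design matrix with an intercept column, and a response y
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = rng.normal(size=n)

one = np.ones((n, 1))
H = X @ np.linalg.solve(X.T @ X, X.T)          # hat matrix, n x n
B = np.eye(n) - one @ one.T / (one.T @ one)    # centering matrix I - (1/n) 1 1^T

# B is symmetric and idempotent, as used in the proof
assert np.allclose(B, B.T)
assert np.allclose(B, B @ B)

ss_reg = y @ (H - one @ one.T / n) @ y
ss_res = y @ (np.eye(n) - H) @ y
ss_tot = y @ B.T @ B @ y

# The sum of squares decomposition
assert np.isclose(ss_tot, ss_reg + ss_res)
```

Note that the cancellation of $H$ in the first line of the derivation happens before any special structure of $H$ is needed; only the symmetry and idempotence of $B$ are used.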