Solved – Explained Sums of Squares in matrix notation

Tags: matrix, multiple-regression, regression

I am currently reading Appendix C of Gujarati's Basic Econometrics, 5th edition.
It deals with the Matrix Approach to the Linear Regression Model.

I am unable to work out how the author went from equation (7.4.19) to (C.3.17).


Best Answer

In short, the author is not going from 7.4.19 to C.3.17.

C.3.17 is just a definition, from which we can construct 7.4.19.

The total sum of squares in pure matrix form is the following: \begin{align} y^TM_{\iota}y = y^T(I - \iota(\iota^T\iota)^{-1}\iota^T)y = y^Ty - n\bar{y}^2 = \sum_{i=1}^{n}(y_i - \bar{y})^2 \end{align}

Here $M_{\iota}$ is the orthogonal projection matrix that centers a vector (it subtracts the mean from every element), $\iota$ is an $n \times 1$ column of ones, and $I$ is the $n \times n$ identity matrix. The middle equality uses $\iota^T\iota = n$ and $\iota^Ty = n\bar{y}$, so that $y^T\iota(\iota^T\iota)^{-1}\iota^Ty = n\bar{y}^2$.
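If it helps to see the total-sum-of-squares identity numerically, here is a minimal NumPy sketch; the data are random and the variable names (`y`, `iota`, `M_iota`) are mine, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
y = rng.normal(size=(n, 1))           # arbitrary response vector

iota = np.ones((n, 1))                # column of ones
I = np.eye(n)                         # n x n identity
M_iota = I - iota @ np.linalg.inv(iota.T @ iota) @ iota.T  # centering matrix

tss_matrix = (y.T @ M_iota @ y).item()      # y' M_iota y
tss_sum = np.sum((y - y.mean()) ** 2)       # sum_i (y_i - ybar)^2

print(tss_matrix, tss_sum)            # the two agree up to floating-point rounding
```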

The Explained Sum of Squares is defined in C.3.17, but I will start from a more familiar definition so that it is clearer how the author arrived there.

\begin{align} \sum_{i=1}^n(\hat{y}_i - \bar{y})^2 &= \hat{y}^TM_{\iota}\hat{y}\\ &= (X\hat{\beta})^TM_{\iota}(X\hat{\beta})\\ &= \hat{\beta}^TX^T(I - \iota(\iota^T\iota)^{-1}\iota^T)X\hat{\beta}\\ &= \hat{\beta}^TX^TX\hat{\beta} - \hat{\beta}^TX^T\iota(\iota^T\iota)^{-1}\iota^TX\hat{\beta}\\ &= \hat{\beta}^TX^TX\big((X^TX)^{-1}X^Ty\big) - n\bar{\hat{y}}^2\\ &= \hat{\beta}^TX^Ty - n\bar{y}^2 \end{align}

In the second-to-last line, $\hat{\beta} = (X^TX)^{-1}X^Ty$ is substituted into the first term, and the second term collapses because $\iota^TX\hat{\beta} = \iota^T\hat{y} = n\bar{\hat{y}}$. In the last line, $X^TX(X^TX)^{-1} = I$, and $\bar{\hat{y}} = \bar{y}$ because the residuals of a regression with an intercept sum to zero. This is exactly the definition in C.3.17: $\text{ESS} = \hat{\beta}^TX^Ty - n\bar{y}^2$.
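As a quick sanity check, here is a small NumPy sketch with synthetic data (the names `X`, `y`, `beta_hat` and the random data are my own, not from the book) comparing the familiar definition of the explained sum of squares with the matrix expression in C.3.17:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 60, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # intercept + k regressors
y = X @ rng.normal(size=k + 1) + rng.normal(size=n)         # synthetic response

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # OLS estimate (X'X)^{-1} X'y
y_hat = X @ beta_hat

ess_familiar = np.sum((y_hat - y.mean()) ** 2)        # sum_i (yhat_i - ybar)^2
ess_matrix = beta_hat @ X.T @ y - n * y.mean() ** 2   # beta_hat' X' y - n ybar^2

print(ess_familiar, ess_matrix)  # equal up to floating-point rounding
```

Note that the agreement relies on the intercept column in `X`; without it, $\bar{\hat{y}}$ need not equal $\bar{y}$ and the two expressions can differ.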
