I have no access to the source, and searching for "Gentle Matrix Algebra" delivers many "Gentle Introduction to Matrix Algebra" entries...
So I will use the setting from
Moore-Penrose inverse
which seems to be similar, although the notation differs.
Let us do some computations, but of course in a setting without indices. Let
$$
\begin{aligned}
M&=
\begin{bmatrix}
A & B\\ C & D
\end{bmatrix}\text{ be a partitioned matrix, and associate}
\\
Z &:= D-CA^-B\ ,\text{ the generalized Schur complement, and define}\\
M^-&:=
\left[
\begin{array}{c|c}
A^- + A^-BZ^-CA^- & -A^-BZ^- \\\hline
-Z^-CA^- & Z^-
\end{array}
\right]
\ .\text{ Then we have:}
\\[3mm]
MM^-
&=
\begin{bmatrix}
A & B\\ C & D
\end{bmatrix}
\left[
\begin{array}{c|c}
A^- + A^-BZ^-CA^- & -A^-BZ^- \\\hline
-Z^-CA^- & Z^-
\end{array}
\right]
\\
&=
\left[
\begin{array}{c|c}
AA^- - (I-AA^-)BZ^-CA^- & (I-AA^-)BZ^- \\\hline
CA^- - (D-CA^-B)Z^-CA^- & (D-CA^-B)Z^-
\end{array}
\right]
\\
&=
\left[
\begin{array}{c|c}
AA^- - (I-AA^-)BZ^-CA^- & (I-AA^-)BZ^- \\\hline
CA^- - ZZ^-CA^- & ZZ^-
\end{array}
\right]
\ ,
\\
N:=MM^-M
&=
\left[
\begin{array}{c|c}
AA^- - (I-AA^-)BZ^-CA^- & (I-AA^-)BZ^- \\\hline
CA^- - ZZ^-CA^- & ZZ^-
\end{array}
\right]
\begin{bmatrix}
A & B\\ C & D
\end{bmatrix}
\\
&\text{then has the entries}
\\[3mm]
N_{11}
&=
AA^-A - (I-AA^-)BZ^-CA^-A \\
&\qquad\qquad + (I-AA^-)BZ^-C
\\
&=A + (I-AA^-)BZ^-C(I-A^-A)\ ,\\[3mm]
N_{12}
&=AA^-B - (I-AA^-)BZ^-CA^-B \\
&\qquad\qquad
+ (I-AA^-)BZ^-D\\
&=AA^-B + (I-AA^-)BZ^-Z\ ,\\[3mm]
N_{21}
&= CA^-A - ZZ^-CA^-A \\
&\qquad\qquad+
ZZ^-C\\
&= CA^-A + ZZ^-C(I-A^-A)\ ,\\[3mm]
N_{22}
&= CA^-B - ZZ^-CA^-B \\
&\qquad\qquad+
ZZ^-D\\
&= CA^-B + ZZ^-(D-CA^-B)\\
&= CA^-B + ZZ^-Z\\
&= CA^-B + Z\\
&= D\ .
\end{aligned}
$$
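As a quick sanity check (my own addition, not part of the quoted computation): when $A$ and $Z$ are invertible we may take $A^-=A^{-1}$ and $Z^-=Z^{-1}$; then $I-AA^-=0$ and $I-A^-A=0$, so all correction terms above vanish and indeed $MM^-M=M$. A pure-Python sketch with $1\times1$ blocks and exact fractions (the `matmul` helper is ad hoc):

```python
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

# 1x1 blocks chosen so that A and Z are invertible; then A^- = A^{-1},
# Z^- = Z^{-1}, the factors (I - AA^-) and (I - A^-A) vanish, and the
# correction terms in N_{11}, N_{12}, N_{21} disappear.
A, B, C, D = F(2), F(1), F(3), F(5)
Ai = 1 / A                # A^-
Z = D - C * Ai * B        # Z = D - C A^- B
Zi = 1 / Z                # Z^-

M = [[A, B], [C, D]]
Mi = [[Ai + Ai * B * Zi * C * Ai, -Ai * B * Zi],
      [-Zi * C * Ai, Zi]]

N = matmul(matmul(M, Mi), M)
print(N == M)             # True: M M^- M = M in the invertible case
```

So the formula behaves well in the invertible case; the trouble starts only when the generalized inverses are proper.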
So only the value $D$ is guaranteed. (Note that $I$ stands for the identity matrix of the appropriate size in each occurrence, so that $IB$, resp. $CI$, make sense.) We need more. There are some properties that can be used, but I could not find anything that yields a better comparison. (Please compare the above entries of $N$ with the four entries $A,B,C,D$ of $M$ by taking the difference. Some (non-commutative) factors occur, and sufficient conditions can be stated, but these are very restrictive for the given generality of the question.)
I was searching for a simple counterexample, so here is my first quick try.
The relation $MM^-M=M$ cannot hold in general (with my choices of the minus elements, alias generalized inverses). For this, take $A=0$ (in some block dimension) and $D=0$ (in another one), so $Z=0$; my choices are then $A^-=0$, $Z^-=0$ (of transposed shapes). Then we compute
$$
MM^-M
=
\begin{bmatrix}
0 & B\\ C & 0
\end{bmatrix}
\begin{bmatrix}
0 & 0\\ 0 & 0
\end{bmatrix}
\begin{bmatrix}
0 & B\\ C & 0
\end{bmatrix}
=
\begin{bmatrix}
0 & 0\\ 0 & 0
\end{bmatrix}
\ ,
$$
which is not $M$ as soon as $B$ or $C$ is nonzero.
This was a special case, and I know it is not fair to restrict to such a special, hand-picked choice. But experiments with generic (for instance random) blocks fail in the same way, so the restriction is not the issue: the formula does not hold in general.
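The counterexample can be checked mechanically; here is a pure-Python sketch with $1\times1$ blocks $A=D=0$, $B=C=1$ (the `matmul` helper is my own):

```python
def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

# 1x1 blocks A = D = 0 and B = C = 1, so Z = 0; with the choices
# A^- = 0 and Z^- = 0 the block formula for M^- is the zero matrix.
M = [[0, 1], [1, 0]]
Mi = [[0, 0], [0, 0]]

print(matmul(matmul(M, Mi), M))   # [[0, 0], [0, 0]] -- not M
```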
It does have a left inverse.
$$\begin{pmatrix}
1 & 1 & 0 & 0 & 0 \\
0 & 1 & 1 & 0 & 0 \\
4 & 4 & 4 & 0 & 0
\end{pmatrix}$$
The usual proof that $A^TA$ is invertible when $A$ has independent columns (see the question "Why is $A^TA$ invertible if $A$ has independent columns?") uses dot products. But dot products don't have good properties over finite fields, because it is possible that $\langle x, x\rangle = 0$ even when $x \ne 0$. The columns of $A^TA$ in your example all have this property.
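For concreteness (my own example, not from the question): over $\mathbb{F}_5$ the nonzero vector $x=(1,2)$ is self-orthogonal, since $1^2+2^2=5\equiv 0 \pmod 5$. In code:

```python
# Over GF(5), x = (1, 2) is nonzero but self-orthogonal:
# <x, x> = 1^2 + 2^2 = 5 = 0 (mod 5).
p = 5
x = [1, 2]
print(sum(c * c for c in x) % p)   # 0, although x != 0
```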
Best Answer
\begin{align*}
\begin{bmatrix} A^\ast & A^\ast A\\ I_m+AA^\ast & A \end{bmatrix}^{-1}
&= \left(\begin{bmatrix} A^\ast A&A^\ast\\ A&I_m+AA^\ast \end{bmatrix}
\begin{bmatrix} 0&I_n\\ I_m&0 \end{bmatrix}\right)^{-1}\\
&= \begin{bmatrix} 0&I_m\\ I_n&0 \end{bmatrix}
\color{red}{\begin{bmatrix} A^\ast A&A^\ast\\ A&I_m+AA^\ast \end{bmatrix}}^{-1}.
\end{align*}
Since both $A^\ast A$ and the Schur complement $S=I_m + AA^\ast - A(A^\ast A)^{-1}A^\ast$ are invertible, you can compute the inverse on the last line by the matrix inversion formula
$$
\begin{bmatrix}A&B\\ C&D\end{bmatrix}^{-1}
= \begin{bmatrix}A^{-1}+A^{-1}BS^{-1}CA^{-1} & -A^{-1}BS^{-1}\\ -S^{-1}CA^{-1} & S^{-1} \end{bmatrix}
$$
where $S=D-CA^{-1}B$ (to apply this formula, you should replace $A$ by $A^\ast A$ and $B$ by $A^\ast$ etc.).
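As a quick check of the permutation step (my own sketch, in the scalar case $m=n=1$ with real $A=2$, so $A^\ast=2$; `inv2` is an ad-hoc $2\times2$ inverse helper):

```python
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def inv2(M):
    """Closed-form inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Scalar case m = n = 1 with real A = 2, so A* = 2.
A = F(2)
Lhs = [[A, A * A], [1 + A * A, A]]    # [A*, A*A; I_m + AA*, A]
Red = [[A * A, A], [A, 1 + A * A]]    # the red matrix [A*A, A*; A, I_m + AA*]
P = [[F(0), F(1)], [F(1), F(0)]]      # row-swap permutation [0, I_m; I_n, 0]

print(inv2(Lhs) == matmul(P, inv2(Red)))   # True
```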
Alternatively, if you perform a singular value decomposition $A=U_{m\times m}\begin{bmatrix}\Sigma_{n\times n}\\ 0_{(m-n)\times n}\end{bmatrix}V_{n\times n}^\ast$, where $U$ and $V$ are unitary, it is easy to see that
$$
\color{red}{\begin{bmatrix} A^\ast A&A^\ast\\ A&I_m+AA^\ast \end{bmatrix}}
= \begin{bmatrix}V\\ &U\end{bmatrix}
\begin{bmatrix}\Sigma^2&\Sigma&0\\ \Sigma&I_n+\Sigma^2&0\\ 0&0&I_{m-n}\end{bmatrix}
\begin{bmatrix}V^\ast\\ &U^\ast\end{bmatrix}
$$
and hence
$$
\color{red}{\begin{bmatrix} A^\ast A&A^\ast\\ A&I_m+AA^\ast \end{bmatrix}}^{-1}
= \begin{bmatrix}V\\ &U\end{bmatrix}
\begin{bmatrix}\Sigma^{-2}+\Sigma^{-4}&-\Sigma^{-3}&0\\ -\Sigma^{-3}&\Sigma^{-2}&0\\ 0&0&I_{m-n}\end{bmatrix}
\begin{bmatrix}V^\ast\\ &U^\ast\end{bmatrix}.
$$
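One can verify the claimed inverse of the middle factor directly: the $2\times2$ block $\begin{bmatrix}\Sigma^2&\Sigma\\ \Sigma&I_n+\Sigma^2\end{bmatrix}$ has determinant $\Sigma^4$, which yields the stated entries, and the $I_{m-n}$ part is trivial. A pure-Python check with a sample scalar singular value $\sigma=2$ (my own sketch, `matmul` is an ad-hoc helper):

```python
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

s = F(2)   # a sample singular value sigma

# The 2x2 middle block and its claimed inverse.
B1 = [[s**2, s], [s, 1 + s**2]]
B2 = [[s**-2 + s**-4, -s**-3], [-s**-3, s**-2]]
I2 = [[F(1), F(0)], [F(0), F(1)]]

print(matmul(B1, B2) == I2)   # True
```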