There are closed-form expressions in the full-rank cases:

- If $A$ has linearly independent columns, $A^+=\left(A^*A\right)^{-1}A^*$.
- If $A$ has linearly independent rows, $A^+=A^*\left(AA^*\right)^{-1}$.
- Otherwise, use the SVD.
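As a quick numeric sanity check of the full-column-rank formula $A^+=\left(A^*A\right)^{-1}A^*$ (here a real matrix, so $A^*$ is just $A^T$), a short sketch on a small example of my own choosing:

```python
# Sanity check of A^+ = (A^T A)^{-1} A^T for a real matrix with
# linearly independent columns; the example matrix is my own.

def matmul(X, Y):
    """Plain matrix product of nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 0], [0, 1], [1, 1]]          # 3x2, columns are independent
At = [list(row) for row in zip(*A)]   # A^T
G = matmul(At, A)                     # Gram matrix A^T A = [[2, 1], [1, 2]]

# Invert the 2x2 Gram matrix by the adjugate formula.
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
Ginv = [[ G[1][1] / det, -G[0][1] / det],
        [-G[1][0] / det,  G[0][0] / det]]

Ap = matmul(Ginv, At)                 # A^+ = (A^T A)^{-1} A^T
print(matmul(Ap, A))                  # ~ 2x2 identity, up to float rounding
```

Since the columns are independent, $A^+A=I$ holds, which the printed product confirms up to rounding.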
Is it possible to avoid using the SVD?
I've found the following method (https://www.omnicalculator.com/math/pseudoinverse#how-to-calculate-the-pseudoinverse):
- Start by calculating $AA^T$ and row reduce it to reduced row echelon form.
- Take the non-zero rows of the result and make them the columns of a new matrix $P$.
- Similarly, row-reduce $A^TA$ and use its non-zero rows for the columns of the new matrix $Q$.
- With your newly found $P$ and $Q$, calculate $M=P^TAQ$.
- Finally, calculate the pseudoinverse $A^+=QM^{-1}P^T$.
It works fine for most cases, but it doesn't work for $$ A = \begin{bmatrix} 0 & 1 & 0 & -i \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}. $$
Is the method wrong?
Best Answer
The method works just fine for the proposed matrix, so long as appropriate care is taken with conjugates: for a complex matrix, replace every transpose in the recipe with the conjugate transpose $A^*$.
In particular, with $$ A = \begin{bmatrix} 0 & 1 & 0 & -i \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix} $$ we get $$ A A^* = \begin{bmatrix} 2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{bmatrix},$$ so we let $$ P = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{bmatrix}. $$
Similarly, $$ A^* A = \begin{bmatrix} 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & -i \\ 0 & 0 & 1 & 0 \\ 0 & i & 0 & 1 \end{bmatrix}, $$ so $$ Q = \begin{bmatrix} 0 & 0 \\ 1 & 0 \\ 0 & 1 \\ i & 0 \end{bmatrix}. $$ Note the $i$ in the final row, not a $-i$, because of us taking the appropriate conjugate.
Then $$ M = P^* A Q = \begin{bmatrix} 2 & 0 \\ 0 & 1 \end{bmatrix}, $$ which is invertible. If the conjugates were not taken when constructing $P$ and $Q$ (really only $Q$ in this particular example), the first entry of $M$ would instead be $0$, making $M$ non-invertible.
Finally, $$ A^+ = Q M^{-1} P^* = \begin{bmatrix} 0 & 0 & 0 \\ 1/2 & 0 & 0 \\ 0 & 1 & 0 \\ i/2 & 0 & 0 \end{bmatrix} $$ which is indeed the pseudoinverse of $A$.
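The conjugate-corrected recipe can be sketched end to end in plain Python (the helper names are my own; the row reduction is an ordinary Gauss-Jordan with partial pivoting):

```python
# Sketch of the P-Q pseudoinverse method with conjugates taken correctly:
# columns of P are conjugates of the non-zero rows of rref(A A^*), and
# columns of Q are conjugates of the non-zero rows of rref(A^* A).

def conj_t(M):
    """Conjugate transpose M^*."""
    return [[M[i][j].conjugate() for i in range(len(M))] for j in range(len(M[0]))]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def rref(M, tol=1e-12):
    """Reduced row echelon form via Gauss-Jordan with partial pivoting."""
    M = [row[:] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        p = max(range(r, rows), key=lambda i: abs(M[i][c]))
        if abs(M[p][c]) < tol:
            continue                          # no pivot in this column
        M[r], M[p] = M[p], M[r]
        piv = M[r][c]
        M[r] = [x / piv for x in M[r]]
        for i in range(rows):
            if i != r and abs(M[i][c]) > tol:
                f = M[i][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
        if r == rows:
            break
    return M

def nonzero_rows(M, tol=1e-12):
    return [row for row in M if any(abs(x) > tol for x in row)]

def inverse(M, tol=1e-12):
    """Invert a square matrix by row-reducing the augmented block [M | I]."""
    n = len(M)
    aug = [list(M[i]) + [1.0 if i == j else 0.0 for j in range(n)]
           for i in range(n)]
    return [row[n:] for row in rref(aug, tol)]

def pseudoinverse(A):
    # conj_t of the stacked non-zero rows both conjugates them and makes
    # them columns, which is exactly the corrected construction of P and Q.
    P = conj_t(nonzero_rows(rref(matmul(A, conj_t(A)))))
    Q = conj_t(nonzero_rows(rref(matmul(conj_t(A), A))))
    M = matmul(matmul(conj_t(P), A), Q)       # M = P^* A Q
    return matmul(matmul(Q, inverse(M)), conj_t(P))

A = [[0, 1, 0, -1j],
     [0, 0, 1, 0],
     [0, 0, 0, 0]]
print(pseudoinverse(A))
```

Running this on the matrix from the question reproduces the $A^+$ above, with $1/2$ and $i/2$ in the first column, confirming that the conjugates are the only missing ingredient.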