Right-inverse approximation in Frobenius norm

Tags: linear-algebra, linear-programming, matrices

Let $A \in \mathbb{R}^{m\times n}$, with $m \geq n$, be a matrix of rank $r$, and suppose we have an SVD $A = U\Sigma V^t$.

We define the pseudo-inverse of $A$ as $A^{\dagger} := V\Sigma^{\dagger}U^t$, where $\Sigma^{\dagger} \in \mathbb{R}^{n \times m}$ is obtained by transposing $\Sigma$ and replacing each nonzero singular value $\sigma_i$ by $1/\sigma_i$ (zero entries stay zero).
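As a concrete sketch of this definition (variable names are my own, not from the question), one can build $\Sigma^\dagger$ directly from the SVD and check it against NumPy's built-in pseudo-inverse:

```python
import numpy as np

# Build A^dagger = V @ Sigma^dagger @ U^T from the full SVD, where
# Sigma^dagger is the transpose of Sigma with nonzero singular values inverted.
rng = np.random.default_rng(0)
m, n, r = 5, 3, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-r matrix

U, s, Vt = np.linalg.svd(A, full_matrices=True)   # A = U @ Sigma @ Vt
Sigma_dag = np.zeros((n, m))                      # transposed shape, n x m
tol = max(m, n) * np.finfo(float).eps * s[0]      # numerical rank cutoff
k = int(np.sum(s > tol))
Sigma_dag[:k, :k] = np.diag(1.0 / s[:k])          # invert nonzero sigma_i

A_dag = Vt.T @ Sigma_dag @ U.T
assert np.allclose(A_dag, np.linalg.pinv(A))
```

The rank cutoff `tol` mirrors the usual numerical-rank convention; exact zeros in $\Sigma$ simply stay zero in $\Sigma^\dagger$.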

The question is to prove that $A^{\dagger}$ is the matrix of smallest Frobenius norm among the solutions of the problem

$$
\min_{B \in \mathbb{R}^{n \times m}} \|AB - I_m\|_F
$$

Best Answer

If $B=[b_1,\ldots,b_m]$, where $b_i$ are the columns of $B$ and $e_i$ are the columns of the identity $I_m$, you can write
$$
\|AB-I_m\|_F^2=\sum_{i=1}^m\|Ab_i-e_i\|_2^2,
$$
so the Frobenius norm minimization splits into $m$ independent minimizations:
$$
\min_{B}\|AB-I_m\|_F^2=\sum_{i=1}^m\min_{b_i}\|Ab_i-e_i\|_2^2.
$$

Each term on the right is a linear least-squares problem, and the pseudo-inverse solution $b_i=A^\dagger e_i$ is a minimizer of $\|Ab_i-e_i\|_2$. Hence the $i$th column of a minimizing $B$ can be taken to be the $i$th column of the Moore-Penrose pseudo-inverse, so $B=A^\dagger$ solves the problem.

For the minimal-norm claim: among all minimizers of each least-squares problem, $A^\dagger e_i$ is the one of smallest $2$-norm. Since $\|B\|_F^2=\sum_{i=1}^m\|b_i\|_2^2$, minimizing each column's norm independently minimizes the Frobenius norm of $B$, so $A^\dagger$ has the smallest Frobenius norm among all solutions of the problem.
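The column-wise argument can be checked numerically. The sketch below (variable names are my own) solves the $m$ least-squares problems with `np.linalg.lstsq`, which returns the minimum-norm solution, and also shows that perturbing $A^\dagger$ in the null space of $A$ leaves the residual unchanged but increases the Frobenius norm:

```python
import numpy as np

# Solve the m column problems min ||A b_i - e_i||_2 and assemble them into B;
# for a rank-deficient A the minimum-norm solutions reproduce A^dagger exactly.
rng = np.random.default_rng(1)
m, n, r = 6, 4, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-deficient

A_dag = np.linalg.pinv(A)

# m independent least-squares problems, one per column e_i of I_m.
B = np.column_stack([np.linalg.lstsq(A, e, rcond=None)[0] for e in np.eye(m)])
assert np.allclose(B, A_dag)

# A null-space perturbation N satisfies A @ N = 0, so the residual is the same,
# but the columns of A^dagger lie in the row space of A, orthogonal to null(A),
# so the Frobenius norm strictly increases.
N = (np.eye(n) - A_dag @ A) @ rng.standard_normal((n, m))
assert np.allclose(A @ (A_dag + N), A @ A_dag)
assert np.linalg.norm(A_dag + N) > np.linalg.norm(A_dag)
```

The rank-deficient $A$ matters here: with full column rank each least-squares problem has a unique solution and the minimal-norm property is vacuous.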