Is the Frobenius norm minimizer the same as the $2$-norm minimizer?

Tags: least-squares, linear-algebra, matrices, normed-spaces, optimization

Given matrices $A \in \mathbb{R}^{n \times m}$ and $B \in \mathbb{R}^{n \times k}$, consider the least squares minimizer

$$\arg \min_{X \in \mathbb{R}^{m \times k}} \| AX - B \|_F$$

where $\| M \|_F$ denotes the Frobenius norm, $ \bigl(\sum_i \sum_j M_{ij}^2 \bigr)^{1/2}$.

I was wondering if this is identical to the minimizer

$$\arg \min_{X \in \mathbb{R}^{m \times k}} \| AX - B \|_2$$

where $\| M \|_2$ denotes the matrix 2-norm, $\sigma_1 (M)$.

Since $\sqrt{\operatorname{rank}(M)} \cdot \| M \|_2 \geq \| M \|_F \geq \| M \|_2$, I feel there must be a simple argument showing that these minimizers are identical…
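As a quick numerical sanity check (a sketch assuming NumPy, not part of any argument), the inequality holds on a batch of random matrices:

```python
import numpy as np

# Check sqrt(rank(M)) * ||M||_2 >= ||M||_F >= ||M||_2
# on random Gaussian matrices.
rng = np.random.default_rng(0)
ok = True
for _ in range(100):
    M = rng.standard_normal((5, 7))
    fro = np.linalg.norm(M, "fro")   # (sum_ij M_ij^2)^(1/2)
    spec = np.linalg.norm(M, 2)      # largest singular value, sigma_1(M)
    r = np.linalg.matrix_rank(M)     # = 5 almost surely here
    ok &= bool(np.sqrt(r) * spec >= fro >= spec)
print(ok)
```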

If it is not so, can you describe what assumptions on $A, B$ are necessary for the minimizers to be identical?

Thanks.

Best Answer

Good question. Yes, there is a simple argument. Let $P_A = A(A'A)^{-1}A'$ denote the orthogonal projection onto the column space of $A$ (where $A'$ is the transpose of $A$), and let $M_A = I - P_A$. Then

$$(AX-B)'(AX-B) - B'M_AB = (AX-B)'P_A(AX-B) \succeq 0,$$

where $\succeq 0$ means that the left-hand side is positive semidefinite. (To see this, split $AX-B$ into $P_A(AX-B) + M_A(AX-B)$ and note that $M_A A = 0$, so $M_A(AX-B) = -M_AB$.) Equality holds exactly when $P_A(AX-B) = 0$, i.e. when $AX = P_AB$. Hence $X = (A'A)^{-1}A'B$ is the minimizer for both norms.

I'm using the minor simplifying assumption that $A'A$ is invertible. If that fails, you can achieve the same result with the Moore-Penrose pseudoinverse in place of $(A'A)^{-1}A'$.
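A small numerical sketch of the conclusion (assuming NumPy; the pseudoinverse is used so it also covers the rank-deficient case): the least-squares solution gives a residual no larger than random perturbations of it, in both norms.

```python
import numpy as np

# X* = pinv(A) @ B equals (A'A)^{-1} A' B when A'A is invertible.
rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))   # n x m, full column rank a.s.
B = rng.standard_normal((8, 4))   # n x k
X_star = np.linalg.pinv(A) @ B

best_fro = np.linalg.norm(A @ X_star - B, "fro")
best_spec = np.linalg.norm(A @ X_star - B, 2)

# Perturbed candidates should never beat X* in either norm.
fro_ok = spec_ok = True
for _ in range(50):
    X = X_star + 0.1 * rng.standard_normal(X_star.shape)
    R = A @ X - B
    fro_ok &= bool(np.linalg.norm(R, "fro") >= best_fro)
    spec_ok &= bool(np.linalg.norm(R, 2) >= best_spec)
print(fro_ok and spec_ok)
```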

In response to comment:

If $Y-Z$ is positive semidefinite, then $v'(Y-Z)v \geq 0$ for all vectors $v$. So if $v$ is a unit eigenvector of $Z$ with eigenvalue $\lambda$, then $v'Yv \geq v'Zv = \lambda$. Applying this to the top eigenvector of $Z$ shows that $\lambda_{\max}(Z) \leq \lambda_{\max}(Y)$, and summing over an orthonormal eigenbasis of $Z$ shows that $\operatorname{tr}(Z) \leq \operatorname{tr}(Y)$. These are exactly the squared spectral and Frobenius norms of the residual, which is why the single positive-semidefinite inequality above settles both minimization problems at once.
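A numerical illustration of this eigenvalue fact (a sketch assuming NumPy; $Y$ and $Z$ here are arbitrary constructed examples, not the matrices from the answer):

```python
import numpy as np

# If Y - Z is positive semidefinite, then
# lambda_max(Z) <= lambda_max(Y) and tr(Z) <= tr(Y).
rng = np.random.default_rng(2)
S = rng.standard_normal((4, 4))
Z = S @ S.T                  # symmetric positive semidefinite
T = rng.standard_normal((4, 6))
Y = Z + T @ T.T              # Y - Z = T T' is PSD by construction

# eigvalsh returns eigenvalues in ascending order.
lam_ok = bool(np.linalg.eigvalsh(Z)[-1] <= np.linalg.eigvalsh(Y)[-1])
tr_ok = bool(np.trace(Z) <= np.trace(Y))
print(lam_ok and tr_ok)
```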