Low-rank approximation of a non-symmetric square matrix

matrix-rank, positive-definite, svd

I have asked this in a previous post, and perhaps the question was not properly posed. I will try again.

I have a square, real, non-symmetric matrix $A$ which satisfies $A + A^T \geq 0$. From $A$, I want to find a low-rank approximation matrix $B$ so that it also satisfies $B + B^T \geq0$.

Will Singular Value Decomposition do the job? I know it would for the symmetric case, but I am not sure I can assume the same for the non-symmetric case.

For the time being, I am trying to see if I can avoid heuristics, and I was hoping that the condition $A + A^T \geq 0$ helps in applying the SVD or something similar.

Thanks!

Best Answer

By $A+A^T\geq 0$, do you mean that the symmetric part $A+A^T$ is positive semidefinite, or that it is entrywise nonnegative?

If you mean that $A+A^T$ is positive semidefinite, then I believe the rank-truncated SVD $B$ also satisfies $B+B^T \geq 0$.

The key is that if $A+A^T$ is positive semidefinite, then $A$ satisfies $x^TAx \geq 0$ for all $x$. This is true because, for real $A$ and $x$, the scalar $x^TAx$ equals its own transpose: $x^TAx = (x^TAx)^T = x^TA^Tx$. Therefore, $$x^T(A+A^T)x = x^TAx + x^TA^Tx = 2x^TAx.$$
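If it helps, this identity is easy to check numerically. Below is a minimal numpy sketch (the random matrix and vector are purely illustrative) verifying that $x^TAx = x^TA^Tx$ and that $x^T(A+A^T)x = 2\,x^TAx$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))   # any real square matrix works for these identities
x = rng.standard_normal(n)

quad_A   = x @ A @ x              # x^T A x
quad_At  = x @ A.T @ x            # x^T A^T x
quad_sym = x @ (A + A.T) @ x      # x^T (A + A^T) x

# A scalar equals its own transpose, so the first two agree,
# and the symmetric-part form is exactly twice either one.
print(np.isclose(quad_A, quad_At))        # True
print(np.isclose(quad_sym, 2 * quad_A))   # True
```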

Geometrically, $x^TAx \geq 0$ means that the image $Ax$ of $x$ and $x$ itself have nonnegative inner product. Roughly, $Ax$ cannot end up in a direction "too far away" from $x$.

In terms of the SVD this means $v_i^Tu_i \geq 0$ for every $i$ with $\sigma_i > 0$. Truncating the SVD does not alter this property.
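As a quick numerical sanity check (a sketch only; the construction $A = S + K$ with $S$ symmetric positive semidefinite and $K$ skew-symmetric is just one convenient way to get $x^TAx \geq 0$), you can verify that $u_i^Tv_i \geq 0$ whenever $\sigma_i > 0$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

# A = S + K: S symmetric PSD, K skew-symmetric (K contributes nothing to x^T A x),
# so the symmetric part of A is S, which is positive semidefinite.
M = rng.standard_normal((n, n))
S = M @ M.T
K = rng.standard_normal((n, n))
K = K - K.T
A = S + K

U, s, Vt = np.linalg.svd(A)       # A = U @ np.diag(s) @ Vt

# For each sigma_i > 0, sigma_i * (u_i . v_i) = v_i^T A v_i >= 0,
# so the alignment u_i . v_i is nonnegative.
for i in range(n):
    if s[i] > 1e-12:
        print(i, U[:, i] @ Vt[i])   # all printed values are >= 0
```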

We can prove this directly. Write the SVD of $A$ as $$ A = U\Sigma V^T = \sum_{i=1}^{n} \sigma_i u_iv_i^T. $$

First, note that for any $x$, $$ 0\leq x^TAx = \sum_{i=1}^{n} \sigma_i (x^Tu_i)(v_i^Tx). $$
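This expansion can be checked numerically as well; here is a short sketch (using the same kind of example matrix as above) confirming $x^TAx = \sum_i \sigma_i (x^Tu_i)(v_i^Tx)$ for a random $x$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6

# Example matrix with positive semidefinite symmetric part (PSD part plus skew part).
M = rng.standard_normal((n, n))
K = rng.standard_normal((n, n))
A = M @ M.T + (K - K.T)

U, s, Vt = np.linalg.svd(A)
x = rng.standard_normal(n)

lhs = x @ A @ x
rhs = sum(s[i] * (x @ U[:, i]) * (Vt[i] @ x) for i in range(n))

print(np.isclose(lhs, rhs))   # True: the SVD expansion of x^T A x
print(lhs >= 0)               # True here, since the symmetric part of A is PSD
```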

Since the $v_i$ are orthonormal ($v_i^Tv_j = 0$ for $i\neq j$), taking $x = v_i$ leaves only the $i$-th term: $$ 0\leq v_i^TAv_i = \sigma_i (v_i^Tu_i)(v_i^Tv_i) = \sigma_i\, v_i^Tu_i. $$

Now write $y$ as a linear combination of the $v_i$: $y = \sum_i c_i v_i$. Then, since $|c_i|^2 \geq 0$, $$ y^T(\sigma_i u_iv_i^T)y = \sigma_i |c_i|^2 (v_i^Tu_i)(v_i^Tv_i) \geq 0. $$

Therefore the rank-$r$ truncation $B = \sum_{i=1}^{r} \sigma_i u_iv_i^T$ also satisfies $x^TBx \geq 0$, i.e. $B + B^T$ is positive semidefinite.
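For completeness, here is a minimal numpy sketch of forming the rank-$r$ truncation (numpy's `svd` returns $U$, the singular values in decreasing order, and $V^T$) and checking the quantity $v_i^TBv_i = \sigma_i\, v_i^Tu_i$ used above along the retained directions; the example matrix is again an arbitrary construction with positive semidefinite symmetric part:

```python
import numpy as np

rng = np.random.default_rng(3)
n, r = 6, 3

# Example A with positive semidefinite symmetric part, as before.
M = rng.standard_normal((n, n))
K = rng.standard_normal((n, n))
A = M @ M.T + (K - K.T)

U, s, Vt = np.linalg.svd(A)

# Rank-r truncated SVD: keep the r largest singular values.
B = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# Along each retained right singular direction v_i, the quadratic form of B
# collapses to sigma_i * (v_i . u_i), which is nonnegative as shown above.
for i in range(r):
    v_i = Vt[i]
    print(i, v_i @ B @ v_i, s[i] * (v_i @ U[:, i]))   # the two values agree
```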