[Math] SVD alternatives for symmetric matrices

linear-algebra, matrices

Given any symmetric real-valued matrix $A \in \mathbb{R}^{n\times n}$, I can decompose $A$ as the product of two complex matrices:

$$
A = E'E
$$

Practically, this can be done easily using the SVD; however, from a computational point of view, the SVD can be very slow on large matrices (its running time is $O(n^3)$).

Is there any known faster alternative for decomposing such symmetric matrices?

Best Answer

I tried to comment on this question, since it is rather badly posed, but I do not yet have enough reputation on MathOverflow to comment. Unfortunately, it appears that nobody with enough computational matrix theory knowledge has edited or commented on it.

(a) You have not said why you want to do this decomposition. Numerical computation experts have designed many alternative solutions depending on the usage. As a start, please read "Matrix Computations" by Gene Golub and Charles Van Loan before trying to reinvent the wheel. If software is required, then google "LAPACK". The original package is in Fortran but has been translated to almost all popular high-level languages.

(b) You have not defined what is meant by $E'$. Is it the standard transpose operation on $E$ or the complex conjugate transpose operation?

(c) If it is the transpose operation, then the product $E'E$, in general, is not real if $E$ is complex.

(d) Is the symmetric matrix $A$ positive definite or at least positive semi-definite? If $A$ is positive definite, then $E$ can be taken as a real upper triangular matrix and the decomposition $A = E'E$ is known as the Cholesky factorisation. If $A$ is positive semi-definite, then a permutation matrix $P$ can be used for the decomposition $PAP' = E'E$ where $E$ is upper triangular. Equivalently, $A = P'E'EP$.
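
To make (d) concrete, here is a minimal sketch in Python (NumPy/SciPy assumed; the test matrix is invented purely for illustration):

```python
import numpy as np
from scipy.linalg import cholesky

# Invented symmetric positive definite test matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M.T @ M + 5.0 * np.eye(5)   # diagonal shift guarantees positive definiteness

# Cholesky factorisation: E is real upper triangular and A = E'E.
E = cholesky(A, lower=False)

print(np.allclose(E.T @ E, A))  # True
```

This sketch covers only the strictly positive-definite case; for the semi-definite case with pivoting, LAPACK provides the pivoted Cholesky routine `?pstrf`, which computes $PAP' = E'E$ as described above.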

(e) If the matrix $A$ is sign indefinite, then the matrix $E$ cannot be taken to be real. In this case, we use the $E'DE$ decomposition (sometimes known as the $RDR'$ decomposition), where $E$ is upper triangular and $D$ is a block diagonal matrix whose blocks are $1 \times 1$ or $2 \times 2$. Note that blocks of the form $$ \left[ \begin{array}{cc} 0 & 1 \\ 1 & 0 \end{array} \right] $$ cannot be diagonalised without complex arithmetic. The normal convention is to avoid complex arithmetic, as much as possible, for real matrix problems. Thus, $2 \times 2$ blocks which cannot be diagonalised without complex arithmetic are left alone. Of course, if you really want to find the square roots, then operate only on these blocks.
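
A hedged sketch of such an indefinite factorisation, using SciPy's `scipy.linalg.ldl` (which wraps the LAPACK Bunch-Kaufman routines and returns the $LDL'$ convention rather than $E'DE$); the indefinite matrix is invented for illustration:

```python
import numpy as np
from scipy.linalg import ldl

# Invented symmetric indefinite test matrix.
A = np.array([[ 0.,  1.,  2.],
              [ 1., -3.,  0.],
              [ 2.,  0.,  1.]])

# ldl returns L with the symmetric permutation folded in, a block-diagonal D
# (1x1 and/or 2x2 blocks), and the permutation itself.
L, D, perm = ldl(A, lower=True)

print(np.allclose(L @ D @ L.T, A))  # True: A = L D L'
print(D)                            # block-diagonal factor
```

Taking $E = L'$ recovers the $E'DE$ form above, up to the symmetric permutation that `ldl` folds into its outer factor.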

(f) As far as I know, this problem cannot be solved using the SVD. Please indicate the steps if this problem can be solved using the SVD.

(g) The complexity of the problem (based on multiplication counts) does not always reflect the running time. There are very efficient programs based on level 3 BLAS (Basic Linear Algebra Subprograms) operations which optimally utilise modern hardware. Usually, cache management is more important than raw multiplication counts, since certain arithmetic operations can be done in parallel.
