MATLAB: Hi! How do I decrease the computation time of calculating eigenvalues and eigenvectors of large sparse matrices, like 10000x10000? (I need all the eigenvalues, so I can't use eigs.)

eig, eigs

I need to find all the eigenvalues of a large matrix, 10000x10000, and I have to loop this some 1000 times. My computer has sufficient memory; the problem is that it takes too much time, about 1 minute per eigenvalue computation. I cannot use parallel computing, because the next loop iteration depends on the eigenvalues from the previous one. But I do know something about the nature of the eigenvalues I get. The eigenvalues are symmetric (this may not be the proper term); what I mean is, for example: 2, -2, 1, -1, 2+3i, -2-3i, 3.5, -3.5. So is there any way I could make use of this information to run my code faster?
Thanks in advance.

Best Answer

The simple answer is, I doubt it. But that might not be the complete answer, or even the correct one. And note that any answer requiring a significant amount of computation on the matrix will probably slow things down enough that just calling eig on the complete matrix would be faster anyway. Regardless, it is worth a bit of thought.
But your question reduces to computing the roots of the characteristic polynomial, IF you knew that all of the roots come in pairs that differ only in sign. So if lambda is a root, then so is -lambda.
In that case, you could write your polynomial in the form
(x - lambda1)*(x + lambda1)*(x - lambda2)*(x + lambda2)* ...
Of course, this reduces to
(x^2 - lambda1^2)*(x^2 - lambda2^2)* ...
Essentially your polynomial will have ALL even powers of x, and ONLY even powers of x. If there are any odd powers at all with non-zero coefficients, then your claim about the roots fails completely.
But if you do have only those even powers, then there is a simple transformation available. Just substitute y = x^2. That is, create a new polynomial in y whose coefficients are the even-power coefficients of the original, so every power is halved. This new polynomial has half the degree of the original, just as if you were able to turn a 10kx10k matrix into a 5kx5k one. Compute its roots, take the square root of each root, attach a +/- sign to each square root, and you are done.
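
For illustration only, here is a minimal sketch of that substitution on a tiny example matrix. It assumes charpoly from the Symbolic Math Toolbox; for a 10000x10000 matrix, forming the characteristic polynomial this way would itself be expensive, so treat it as a demonstration of the idea, not a drop-in replacement for eig.

% Minimal sketch of the y = x^2 substitution on a small example matrix.
% charpoly requires the Symbolic Math Toolbox and is used here only to
% illustrate the idea; it is not a fast path for a 10000x10000 matrix.
A = [0 2; 2 0];                  % eigenvalues are +2 and -2
p = double(charpoly(sym(A)));    % coefficients of x^n, x^(n-1), ..., x^0

% The +/- pairing claim requires every odd-power coefficient to vanish.
assert(all(abs(p(2:2:end-1)) < 1e-10), 'odd powers present: +/- pairing fails')

% Keep only the even-power coefficients: a polynomial of half the degree in y = x^2.
q = p(1:2:end);
y = roots(q);                    % each root is some lambda^2
lambda = [sqrt(y); -sqrt(y)]     % recover the +/- eigenvalue pairs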
Note this is not equivalent to computing the eigenvalues of A*A. Not only will A*A be far less sparse than A, it will still be of the same size as A. And if A has the desired property of eigenvalues in +/- pairs, then A*A will have each squared eigenvalue with multiplicity 2.
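
A quick sanity check of that statement, reusing the same tiny example matrix:

A = [0 2; 2 0];
eig(A)       % returns +2 and -2
eig(A*A)     % returns 4 twice: the matrix is still 2x2, each squared eigenvalue duplicated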
The problem is, knowing only that the characteristic polynomial has only even powers of lambda is not sufficient to determine the structure of your matrix, or to tell me how to reduce your matrix to one of half the size. This must be the case, since many different matrices produce the same characteristic polynomial. However, it may be that you know some important fact here that you are not telling us. For example, how do you know that your matrix has eigenvalues with the property you claim?