Hi, everyone!
I have a problem concerning the computation of the dominant eigenvector. When I want to approximate the dominant eigenvector of a large sparse matrix via the famous Arnoldi method, I am wondering how to choose the reduced order $k$ (i.e., the number of Arnoldi iterations). I know that the approximation accuracy depends on the choice of $k$. As $k$ approaches $n$ (the dimension of the full space), the accuracy becomes very high. Now the problem is: given an accuracy $e$, can we find an appropriate $k$ (depending on $e$) such that the difference between my approximate dominant eigenvector from the Krylov subspace and the exact one is less than $e$? That is to say, I need an a priori error bound, rather than an a posteriori error bound (however, I have only found a posteriori error bound results in the papers I have seen). Could you give me some suggestions? Thanks!
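For concreteness, the iteration whose length $k$ I am asking about can be sketched as follows (a minimal illustrative implementation I wrote for this question, not production code; the function name `arnoldi` and the breakdown tolerance `1e-12` are my own choices):

```python
import numpy as np

def arnoldi(A, v0, k):
    """Build an orthonormal basis V of the Krylov subspace K_k(A, v0)
    together with the (k+1) x k upper Hessenberg matrix H satisfying
    A V[:, :k] = V H (the Arnoldi relation)."""
    n = len(v0)
    V = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(k):
        w = A @ V[:, j]
        for i in range(j + 1):           # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:          # invariant subspace found: exact answer
            return V[:, : j + 1], H[: j + 1, : j + 1]
        V[:, j + 1] = w / H[j + 1, j]
    return V, H
```

The approximate dominant eigenvector is then the Ritz vector `V[:, :k] @ y`, where `y` is the dominant eigenvector of the small matrix `H[:k, :k]`; my question is how large $k$ must be, in advance, for this Ritz vector to be within $e$ of the true one.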
Best Answer
This is a very preliminary answer; my knowledge of these things is rather dated, but I got curious and found something of potential interest, so I thought I might as well share it.
There is a recent paper
(In case you do not have access to the journal, a similar technical report can be found at http://www-users.cs.umn.edu/~saad/reports.html (year 2007). However, the discussion below refers to the article.)
At the end of Section 4.1 (where they analyse the algorithm under some assumptions) one can find the following:
And at the beginning of the conclusions:
So, it seems to me that the a priori bounds you seek might not, in general, exist or be known. If you know something more about the matrix (in particular, that it is symmetric/Hermitian), then the situation seems to be better.
Of course, I quoted only the 'negative' passages of the paper, and it mainly contains 'positive' results (so, if you are not yet aware of it, it might well be worth a look), but as far as I could see, none of the strength you seem to be hoping for.
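As a practical fallback in the absence of a usable a priori bound, one commonly grows $k$ adaptively and stops using the cheap a posteriori residual: for a Ritz pair $(\theta, V_k y)$ one has $\|A(V_k y) - \theta (V_k y)\| = h_{k+1,k}\,|e_k^T y|$, which costs nothing extra to evaluate at each step. A hedged sketch (the helper name `dominant_ritz_pair`, the cap `k_max`, and the tolerance handling are my own, not from the paper):

```python
import numpy as np

def dominant_ritz_pair(A, v0, tol, k_max=200):
    """Run Arnoldi, enlarging the Krylov subspace one vector at a time,
    and stop once the a posteriori residual h_{k+1,k} * |y_k| <= tol.
    Here tol plays the role of the target accuracy e in the question."""
    n = len(v0)
    V = np.zeros((n, k_max + 1))
    H = np.zeros((k_max + 1, k_max))
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(k_max):
        w = A @ V[:, j]
        for i in range(j + 1):           # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        # Dominant Ritz pair of the current k x k Hessenberg matrix, k = j + 1
        evals, evecs = np.linalg.eig(H[: j + 1, : j + 1])
        m = np.argmax(np.abs(evals))
        theta, y = evals[m], evecs[:, m]
        residual = H[j + 1, j] * abs(y[-1])   # ||A x - theta x|| for x = V y
        if residual <= tol or H[j + 1, j] < 1e-12:
            return theta, V[:, : j + 1] @ y, j + 1
        V[:, j + 1] = w / H[j + 1, j]
    raise RuntimeError("tolerance not reached within k_max iterations")
```

This is only an a posteriori strategy, of course: it tells you when to stop, not in advance how large $k$ will be, which is exactly the gap the question is about.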