You could use the Arnoldi iteration algorithm. It only needs the matrix $A$ through matrix-vector products, so you can treat the map $v \mapsto Av$ as a black box. The iteration produces an upper Hessenberg matrix $H$ whose eigenvalues can be computed cheaply (by a direct method or Rayleigh quotient iteration) and which approximate the eigenvalues of $A$. Arnoldi iteration converges fastest to the dominant eigenvalue, so I suspect you won't need many iterations before you have a good estimate.
An excellent introduction to this is "Numerical Linear Algebra" by Trefethen and Bau (p. 250).
The basic algorithm can be found here: http://en.wikipedia.org/wiki/Arnoldi_iteration
Now the only thing required to make this a fully functional algorithm is a termination condition. Since you don't seem to need the dominant eigenvalue to high accuracy, I wouldn't worry much: just stop when the dominant-eigenvalue estimate stops changing appreciably from one iteration to the next, as in the sketch below.
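Here is a minimal sketch of the idea in Python/NumPy (the function name, random start vector, and tolerance are my own choices, not part of any standard API): it builds the Hessenberg matrix column by column from black-box matvecs and stops once the dominant Ritz value settles down.

```python
import numpy as np

def arnoldi_dominant_eig(matvec, n, tol=1e-6, max_iter=50, seed=0):
    """Estimate the dominant eigenvalue of A given only the map v -> A v."""
    rng = np.random.default_rng(seed)
    Q = np.zeros((n, max_iter + 1))          # orthonormal Krylov basis
    H = np.zeros((max_iter + 1, max_iter))   # upper Hessenberg matrix
    q0 = rng.standard_normal(n)
    Q[:, 0] = q0 / np.linalg.norm(q0)
    est = None
    for k in range(max_iter):
        w = matvec(Q[:, k])
        for i in range(k + 1):               # modified Gram-Schmidt step
            H[i, k] = Q[:, i] @ w
            w = w - H[i, k] * Q[:, i]
        H[k + 1, k] = np.linalg.norm(w)
        # Ritz values: eigenvalues of the small Hessenberg block
        ritz = np.linalg.eigvals(H[:k + 1, :k + 1])
        new_est = ritz[np.argmax(np.abs(ritz))]
        if H[k + 1, k] < 1e-12:              # exact invariant subspace found
            return new_est
        Q[:, k + 1] = w / H[k + 1, k]
        # Terminate when the estimate stops changing appreciably
        if est is not None and abs(new_est - est) < tol * abs(new_est):
            return new_est
        est = new_est
    return est

# Example: an explicit symmetric matrix standing in for the black box
A = np.random.default_rng(1).standard_normal((200, 200))
A = (A + A.T) / 2
print(arnoldi_dominant_eig(lambda v: A @ v, 200))
```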
If you have Matlab, you can always use the built-in function eigs(Afun,n,...), where Afun is a black-box function handle that computes Av.
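Outside Matlab, SciPy exposes the same black-box interface through LinearOperator; a short sketch (the explicit random matrix here is just a stand-in for your matvec):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigs

n = 200
A = np.random.default_rng(0).standard_normal((n, n))  # stand-in matrix

Aop = LinearOperator((n, n), matvec=lambda v: A @ v)  # black-box v -> Av
lam = eigs(Aop, k=1, which='LM', return_eigenvectors=False)[0]
print(lam)  # largest-magnitude eigenvalue (ARPACK under the hood)
```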
The information you have does not determine the dominant eigenvector.
Let $G$ be the graph with vertex set $\{0,1,\ldots,7\}$ and adjacency matrix
$$
\left(\begin{array}{rrrrrrrr}
0 & 1 & 1 & 0 & 0 & 0 & 0 & 0 \\
1 & 0 & 1 & 0 & 0 & 0 & 0 & 0 \\
1 & 1 & 0 & 1 & 0 & 0 & 0 & 0 \\
0 & 0 & 1 & 0 & 1 & 1 & 0 & 0 \\
0 & 0 & 0 & 1 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 1 & 1 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 1 & 0 & 1 \\
0 & 0 & 0 & 0 & 0 & 0 & 1 & 0
\end{array}\right)
$$
Construct a second graph $H_2$ by joining a new vertex to vertex 2, and a third graph $H_5$ by joining a new vertex to vertex 5. Then
$$
(A(H_2)^k e)_2 = (A(H_5)^k e)_5
$$
for all $k$, where $e$ denotes the all-ones vector. (For $k=0,\ldots,8$ the actual numbers are $1,\ 4,\ 8,\ 25,\ 57,\ 163,\ 392,\ 1073,\ 2656$.)
The Perron vectors are
$$
(1, 1, 1.579071, 1.460275, 1.019079, 1.168003, 0.5330099, 0.206667, 0.612263)
$$
for $H_2$ and, for $H_5$,
$$
(1, 1, 1.579071, 2.0725388, 1.631342, 2.134811, 0.974205, 0.377735, 0.827744)
$$
If you want positive matrices, take the sixth powers of $A(H_2)$ and $A(H_5)$. The relevant
property of $G$ is that the graphs $G\setminus2$ and $G\setminus5$ are cospectral, and their
complements are cospectral too.
All computations were carried out in Sage.
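If you want to re-check the walk-count identity without Sage, a small NumPy script reproduces it (the helper name attach_pendant is mine):

```python
import numpy as np

# Adjacency matrix of G from above
A = np.array([
    [0,1,1,0,0,0,0,0],
    [1,0,1,0,0,0,0,0],
    [1,1,0,1,0,0,0,0],
    [0,0,1,0,1,1,0,0],
    [0,0,0,1,0,1,0,0],
    [0,0,0,1,1,0,1,0],
    [0,0,0,0,0,1,0,1],
    [0,0,0,0,0,0,1,0],
])

def attach_pendant(A, v):
    """Adjacency matrix of the graph with a new vertex joined to vertex v."""
    n = A.shape[0]
    B = np.zeros((n + 1, n + 1), dtype=int)
    B[:n, :n] = A
    B[n, v] = B[v, n] = 1
    return B

H2, H5 = attach_pendant(A, 2), attach_pendant(A, 5)
e = np.ones(9, dtype=int)
for k in range(9):
    lhs = (np.linalg.matrix_power(H2, k) @ e)[2]
    rhs = (np.linalg.matrix_power(H5, k) @ e)[5]
    print(k, lhs, rhs)   # 1, 4, 8, 25, 57, 163, 392, 1073, 2656
```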
In a sense the problem is that you are getting a bit of information about each eigenspace,
whereas you want detailed information about a particular eigenspace.
See (68) here: if $\lambda$ is a simple eigenvalue, $Av=\lambda v$, and $v$ is normalized to unit length, then $$ \frac{\partial v}{\partial A_{jk}} = (\lambda I - A)^+ E^{jk} v, $$ where $E^{jk}$ is the matrix with a $1$ in position $(j,k)$ and $0$ everywhere else, and the superscript $+$ denotes the Moore-Penrose pseudoinverse.
The formula is easy to evaluate explicitly because it only requires the eigenvector itself and the pseudoinverse. Moreover, if the eigenvalue is simple then you know that $\lambda I-A$ has rank $n-1$, so there is no (potentially difficult) rank decision to make when forming the pseudoinverse.
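As a sanity check, here is one way the formula might be evaluated in NumPy (a sketch assuming a symmetric matrix with a simple dominant eigenvalue; all names are illustrative). Note that $E^{jk}v = v_k e_j$, so the derivative is just $v_k$ times column $j$ of the pseudoinverse:

```python
import numpy as np

# Illustrative symmetric test matrix with a simple dominant eigenvalue
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2

w, V = np.linalg.eigh(A)
lam, v = w[-1], V[:, -1]                  # dominant eigenpair, unit-norm v

n = A.shape[0]
P = np.linalg.pinv(lam * np.eye(n) - A)   # Moore-Penrose pseudoinverse

def dv_dA(j, k):
    """Return dv/dA_jk = (lam*I - A)^+ E^{jk} v for the simple eigenvalue lam."""
    Ejk = np.zeros((n, n))
    Ejk[j, k] = 1.0
    return P @ Ejk @ v

# E^{jk} v = v[k] * e_j, so the derivative is v[k] times column j of P
assert np.allclose(dv_dA(1, 2), v[2] * P[:, 1])
```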
See also this related answer for a technique that can produce formulas of this kind.