Well, if you add $cI$ to your matrix, for some reasonable value of $c$, it will become nonsingular. As for the left inverse, since your matrix is sparse, you can use the conjugate gradient method for the linear solve in each step of the backward (inverse) iteration, which will be very fast (there are fancier Krylov-subspace methods, but you need not go there unless conjugate gradient fails).
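A minimal sketch of the shift-plus-CG idea, with a made-up sparse matrix (a path-graph Laplacian, which is symmetric, positive semidefinite, and singular); the value of $c$ and the matrix are illustrative assumptions, not from the question:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

# Hypothetical example: a path-graph Laplacian is sparse, symmetric,
# positive semidefinite, and singular (the all-ones vector is in its kernel).
n = 200
main = np.full(n, 2.0)
main[0] = main[-1] = 1.0
off = -np.ones(n - 1)
A = sp.diags([off, main, off], [-1, 0, 1], format="csr")

# Shift by c*I to make the matrix nonsingular (here, positive definite).
c = 1e-2
M = A + c * sp.eye(n)

# Conjugate gradient solves M x = b; this is exactly the linear solve
# needed in each step of a backward / inverse iteration.
b = np.random.default_rng(0).standard_normal(n)
x, info = cg(M, b)            # info == 0 means CG converged
residual = np.linalg.norm(M @ x - b)
```

Note that plain CG requires the shifted matrix to be symmetric positive definite; for a nonsymmetric matrix one would reach for those fancier Krylov methods (GMRES, BiCGStab) after all.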
First, the definition of $positive$ should be clarified. It could mean all entries strictly positive, or merely nonnegative, or that the matrix is primitive (all entries nonnegative and some power strictly positive).
I will use $\rho(M)$ to denote the spectral radius of a matrix $M$, which in case $M$ is nonnegative, is the Perron eigenvalue.
Note a weird example: if $M$ is strictly upper triangular with all entries above the diagonal strictly positive, then its spectral radius is $0$, and the same holds for $M^T$. However, $M+M^T$ is primitive, so its spectral radius exceeds zero. We can modify this to obtain a strictly positive $M$ with $\rho(M)$ small but $\rho(M+M^T)$ big, simply by adding a tiny amount to all the zero entries. So with $A = M$, $B = M^T$, and $C = A+B$, $\rho(C)$ can be made much larger than $\rho(A) + \rho(B)$.
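The weird example is easy to check numerically; for the all-ones strictly upper triangular $M$ of size $4$, $M+M^T$ is the all-ones matrix minus the identity, whose spectral radius is $3$ (the size and entries here are just a convenient instance):

```python
import numpy as np

def spectral_radius(M):
    return max(abs(np.linalg.eigvals(M)))

# M strictly upper triangular with all entries above the diagonal positive:
# M is nilpotent, so rho(M) = rho(M^T) = 0 exactly (numerically, eigenvalues
# of a nilpotent matrix come out as tiny nonzero values).
n = 4
M = np.triu(np.ones((n, n)), k=1)

# M + M^T is the all-ones matrix minus the identity: primitive,
# with eigenvalues n-1 = 3 and -1, hence spectral radius 3.
C = M + M.T
```

Here $\rho(M) + \rho(M^T) = 0$ while $\rho(C) = 3$, so the sum of spectral radii tells you nothing.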
A well-known result ($well-known$ means I can't give a reference off the top of my head) is that if $C \geq A$ entrywise, $C \neq A$, and $A$ is strictly positive (or merely primitive), then $\rho(C) > \rho(A)$. (But the argument is easy: use an H-transform to convert $A$ to a matrix with all column sums equal; adding anything nonzero will then strictly increase the spectral radius.) So if $A$ and $B$ are primitive, then $\rho(A+B) > \max\{\rho(A), \rho(B)\}$.
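The strict inequality $\rho(A+B) > \max\{\rho(A),\rho(B)\}$ can be sanity-checked on random strictly positive matrices (the size and seed are arbitrary assumptions):

```python
import numpy as np

def perron(M):
    # For a nonnegative matrix, the spectral radius is the Perron eigenvalue
    return max(abs(np.linalg.eigvals(M)))

rng = np.random.default_rng(1)
A = rng.random((5, 5)) + 0.01   # strictly positive
B = rng.random((5, 5)) + 0.01   # strictly positive

gap = perron(A + B) - max(perron(A), perron(B))
```

By the monotonicity result, `gap` is strictly positive for any such pair.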
I doubt there's much of anything beside the obvious that can be said about the Perron eigenvectors.
Perhaps you have matrices in special forms in mind?
You use inverse iteration, as described very well in Sanghavi's UTexas notes.
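Since the notes themselves aren't quoted here, a minimal dense-matrix sketch of standard inverse iteration (the test matrix, shift, and iteration count are illustrative assumptions):

```python
import numpy as np

def inverse_iteration(A, shift, iters=100):
    """Shift-and-invert (inverse) iteration: converges to the eigenpair
    whose eigenvalue is closest to `shift`."""
    n = A.shape[0]
    x = np.random.default_rng(0).standard_normal(n)
    M = A - shift * np.eye(n)       # in practice, factor M once and reuse it
    for _ in range(iters):
        x = np.linalg.solve(M, x)   # one linear solve per iteration
        x /= np.linalg.norm(x)
    lam = x @ A @ x                 # Rayleigh quotient (A symmetric here)
    return lam, x

# Hypothetical symmetric test matrix with eigenvalues near 1, 3, and 7
A = np.diag([1.0, 3.0, 7.0]) + 0.01 * np.ones((3, 3))
lam, x = inverse_iteration(A, shift=2.9)   # picks out the eigenvalue near 3
```

For a large sparse matrix, the `np.linalg.solve` step is where an iterative solver such as conjugate gradient (as in the other answer) would be substituted.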