Note that two things can go wrong for irregular matrices.
One thing that can go wrong is that the matrix is reducible. What this means for the Markov chain is that it can remain forever within a proper subset of the available states. In such a case, we end up with a whole space of steady-state vectors, corresponding to multiple possible "final distributions".
Another thing that can go wrong is that the matrix is periodic. What this means is that the chain is caught in cycles of a particular length, so that you can only return to where you started on steps that are multiples of some period $n$. In such a case, you keep most of the same properties as before, except that there are now multiple (complex) eigenvalues of absolute value $1$. There is still, however, a unique steady-state vector.
It turns out that these are the only things that can go wrong, in some combination.
For more information on this and everything I've said here, see this wiki page and, in particular, the Perron–Frobenius theorem for irreducible matrices.
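Both failure modes can be seen directly in the spectrum. Below is a small numerical sketch (the two matrices are hypothetical examples, not from the original answer): a reducible chain has eigenvalue $1$ with multiplicity greater than one, while a periodic chain has several eigenvalues on the unit circle.

```python
import numpy as np

# Hypothetical 4-state reducible chain: states {0,1} and {2,3} never mix.
P_red = np.array([[0.5, 0.5, 0.0, 0.0],
                  [0.4, 0.6, 0.0, 0.0],
                  [0.0, 0.0, 0.2, 0.8],
                  [0.0, 0.0, 0.7, 0.3]])

# Periodic 2-state chain: deterministic swap, period 2.
P_per = np.array([[0.0, 1.0],
                  [1.0, 0.0]])

ev_red = np.linalg.eigvals(P_red)
ev_per = np.linalg.eigvals(P_per)

# Reducible case: eigenvalue 1 appears twice, one steady-state
# vector per closed subset of states.
print(np.sum(np.isclose(ev_red, 1.0)))   # 2

# Periodic case: eigenvalues 1 and -1, both of absolute value 1.
print(sorted(np.round(ev_per.real, 6)))  # [-1.0, 1.0]
```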
Consider the iterations $$v_{i+1}^T=v_i^TA$$
If $v_i$ indicates the probability distribution of states at time $i$, then $v_{i+1}$ indicates the distribution of states at time $i+1$.
In fact, convergence is not guaranteed. For example, consider $A = \begin{bmatrix} 0 & 1 \\ 1 & 0\end{bmatrix}$ and let $v_0=\begin{bmatrix} 0.3 \\ 0.7\end{bmatrix}$. The sequence of vectors simply alternates between $\begin{bmatrix} 0.3 \\ 0.7\end{bmatrix}$ and $\begin{bmatrix} 0.7 \\ 0.3\end{bmatrix}$, so it does not converge. The same happens for every initial vector except $v_0 = \begin{bmatrix} 0.5 \\ 0.5\end{bmatrix}$.
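The oscillation is easy to reproduce numerically; a minimal sketch of the iteration $v_{i+1}^T = v_i^T A$ for the swap matrix above:

```python
import numpy as np

# The 2x2 swap matrix from the example; A[i, j] = P(i -> j).
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

v = np.array([0.3, 0.7])  # initial distribution v_0
for i in range(4):
    v = v @ A             # v_{i+1}^T = v_i^T A
    print(v)              # alternates [0.7 0.3], [0.3 0.7], ...

# Only the uniform vector is a fixed point of the swap:
w = np.array([0.5, 0.5])
print(w @ A)              # [0.5 0.5]
```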
Hence, the initial distribution can determine both whether the iteration converges and, if so, to which equilibrium distribution. There are also cases where the equilibrium distribution is unique and independent of the initial distribution.
Short answer when things are nice:
Let's consider the special case where $\lim_{n \rightarrow \infty} A^n =B$ exists.
Notice that $B$ has the property that $BA=B$.
Let's consider the vector $y^T=v_0^TB$.
Then $y^TA = v_0^TBA = v_0^TB = y^T$.
Hence $y$ is a (left) eigenvector of $A$ with eigenvalue $1$. Each row of $B$ is a distribution, and $v_0$ determines the weights in the combination of those rows. In the event that all rows of $B$ are identical, the result is independent of the choice of $v_0$. An ergodic unichain guarantees that all rows of $B$ are identical.
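This construction can be checked numerically. A sketch with a hypothetical ergodic 2-state chain: approximate $B = \lim_{n\to\infty} A^n$, observe that its rows are identical, and verify that $y^T = v_0^T B$ is stationary.

```python
import numpy as np

# Hypothetical ergodic (irreducible, aperiodic) 2-state chain.
A = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Approximate B = lim A^n with a large power of A.
B = np.linalg.matrix_power(A, 50)
print(B)                      # both rows ~ [0.8333, 0.1667]

# y^T = v_0^T B satisfies y^T A = y^T, regardless of v_0.
v0 = np.array([0.3, 0.7])
y = v0 @ B
print(np.allclose(y @ A, y))  # True
```

Because the rows of $B$ coincide here, any other choice of $v_0$ yields the same $y$.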
Best Answer
Finding a full eigendecomposition costs $O(n^3)$ operations and $O(n^2)$ memory, where $n$ is the dimension of the matrix. The size $n$ of the Markov matrix is the number of indexed web pages, which is on the order of $10^9$, so a full factorization would be prohibitive.
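This is why one uses power iteration instead: each step is a single matrix-vector product, $O(n^2)$ for a dense matrix and only $O(\text{nnz})$ for a sparse one, and it converges to the dominant eigenvector without ever factorizing $A$. A sketch on a small random row-stochastic matrix (a toy stand-in, not the actual web matrix):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the web matrix: a random row-stochastic matrix.
n = 500
A = rng.random((n, n))
A /= A.sum(axis=1, keepdims=True)   # make each row sum to 1

# Power iteration: only matrix-vector products, never an O(n^3)
# factorization of A.
v = np.full(n, 1.0 / n)             # uniform initial distribution
for _ in range(100):
    v_next = v @ A
    if np.linalg.norm(v_next - v, 1) < 1e-12:
        break
    v = v_next

print(np.allclose(v @ A, v))        # True: v is the steady-state vector
```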