Does this mean, then, that the projection operator associated with $\lambda_i$ can be identified with the sum of outer products of the eigenvectors sharing that eigenvalue?
Yes. When $T$ is normal (and hence unitarily diagonalizable), it can be decomposed as
$$
T = \lambda_1 P_1 + \lambda_2 P_2 + \cdots
$$
The projection matrices $P_i = \sum_j q_j q_j^*$, where the sum runs over the orthonormal eigenvectors $q_j$ belonging to $\lambda_i$, project onto the eigenspaces. For a repeated eigenvalue, the corresponding eigenvectors form a basis of its eigenspace, and these eigenspaces are mutually orthogonal.
(For instance, I believe that $V$ needs to be an inner product space for the spectral decomposition to be formulated.)
The spectral theorem states that every normal matrix can be diagonalized by a complete set of orthonormal eigenvectors, so I don't think any condition other than normality needs to be satisfied.
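To make this concrete, here is a minimal NumPy sketch (the small real symmetric matrix below is just an illustrative choice): it groups the orthonormal eigenvectors by eigenvalue, builds each projector $P_i=\sum_j q_jq_j^*$, and checks that the $P_i$ are self-adjoint, idempotent, mutually annihilating, and that $T=\sum_i\lambda_iP_i$.

```python
import numpy as np

# Illustrative normal (here: real symmetric) matrix with a repeated eigenvalue.
T = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# eigh returns real eigenvalues and orthonormal eigenvectors (columns of Q).
eigvals, Q = np.linalg.eigh(T)

# Group eigenvectors by (numerically) equal eigenvalue and form
# P_i = sum_j q_j q_j^* over each group.
projectors = {}
for lam, q in zip(eigvals, Q.T):
    key = round(lam, 8)                        # collapse repeated eigenvalues
    projectors[key] = projectors.get(key, 0) + np.outer(q, q.conj())

# Each P_i is a self-adjoint idempotent, distinct P_i annihilate each other,
# and T = sum_i lambda_i P_i.
for lam_i, P in projectors.items():
    assert np.allclose(P @ P, P) and np.allclose(P, P.conj().T)
    for lam_j, R in projectors.items():
        if lam_i != lam_j:
            assert np.allclose(P @ R, np.zeros_like(P), atol=1e-10)
assert np.allclose(sum(l * P for l, P in projectors.items()), T)
```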
According to the Perron–Frobenius theorem (applied to the primitive matrix $A$), the characteristic polynomial of $A$ has the form $\chi_A(x)=(x-\lambda)f(x)$, where $\lambda>0$ and the roots of $f$ have modulus $<\lambda$. Moreover, there is a (unique up to a factor) vector $v>0$ such that $Av=\lambda v$. Since $A^T$ is also primitive, there is a (unique up to a factor) vector $u>0$ such that $u^TA=\lambda u^T$. Since $u^Tv>0$, we can choose the above factors so that $u^Tv=1$.
There is a basis of the form $v,\cdots$ such that, for the corresponding change-of-basis matrix $P\in M_n(\mathbb{R})$, $A=P\operatorname{diag}(\lambda,B)P^{-1}$ with $B\in M_{n-1}(\mathbb{R})$ and $\chi_B(x)=f(x)$; then $B$ has spectral radius $\rho(B)<\lambda$, that is, $\rho(\lambda^{-1}B)=\mu<1$.
Thus $(\lambda^{-1}A)^k=P\operatorname{diag}(1,(\lambda^{-1}B)^k)P^{-1}$ tends, as $k\rightarrow\infty$, to the rank-$1$ projector $M=P\operatorname{diag}(1,0_{n-1})P^{-1}$; moreover, for every $\epsilon>0$ with $\mu+\epsilon<1$, $\|(\lambda^{-1}A)^k-M\|=O((\mu+\epsilon)^k)$.
A projector is uniquely defined by $im(M)$ and $\ker(M)=(im(M^T))^{\perp}$, that is, by $im(M), im(M^T)$.
Notice that $Mv=\lim_k (\lambda^{-1}A)^kv=v$ and $u^TM=\lim_k u^T(\lambda^{-1}A)^k=u^T$. Then the rank $1$ projector $M$ is uniquely defined by $im(M)=span(v)$ and $im(M^T)=span(u)$.
Now $R=vu^T$ is also a projector (indeed $R^2=v(u^Tv)u^T=vu^T=R$, using $u^Tv=1$) with image $span(v)$ and $im(R^T)=span(u)$. Then $M=vu^T$.
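For readers who like a numerical check, here is a small NumPy sketch under the same hypotheses (the positive matrix $A$ below is only an illustrative choice): it normalizes the right and left Perron vectors so that $u^Tv=1$ and verifies that $(\lambda^{-1}A)^k$ approaches the rank-$1$ projector $M=vu^T$.

```python
import numpy as np

# Illustrative primitive (entrywise positive) matrix.
A = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4]])

eigvals, V = np.linalg.eig(A)
i = np.argmax(eigvals.real)                   # Perron root (real, simple)
lam = eigvals[i].real
v = V[:, i].real                              # right Perron vector, A v = lam v
w, U = np.linalg.eig(A.T)
u = U[:, np.argmax(w.real)].real              # left Perron vector, u^T A = lam u^T

# Fix signs so that v, u > 0, then scale so that u^T v = 1.
v = v * np.sign(v[0])
u = u * np.sign(u[0])
u = u / (u @ v)

# (lam^{-1} A)^k converges to the rank-1 projector M = v u^T.
M = np.outer(v, u)
Ak = np.linalg.matrix_power(A / lam, 50)
assert np.allclose(Ak, M, atol=1e-8)
assert np.allclose(M @ M, M)                  # M is a projector since u^T v = 1
```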
Best Answer
I ended up finding the answers myself, so I'm recording them here for posterity.
Firstly, recall that one definition of an eigenvector $v$ with eigenvalue $\lambda$ of a matrix $A$ is that $$(A-\lambda I)v = 0.$$ The eigenvectors of the (symmetric) matrix $A$ form an orthonormal set, so using the expansion $A = \sum_{i=1}^n \lambda_i \mathbf{u}_i \mathbf{u}_i^T$ and writing $$\mathbf{y} = \sum^n_{i=1}\alpha_i \mathbf{u}_i,$$ we have
$$(A+\beta I)\mathbf{y} = A\mathbf{y}+\beta I\mathbf{y} = \sum_{i=1}^n\sum_{j=1}^n \alpha_j \lambda_i \mathbf{u}_i \mathbf{u}_i^T \mathbf{u}_j + \beta\sum_{i=1}^n\alpha_i \mathbf{u}_i$$
and since, by orthonormality, $\mathbf{u}_i^T \mathbf{u}_j = 1$ for $i=j$ and $0$ otherwise, this reduces to
$$(A+\beta I)\mathbf{y} = \sum_{i=1}^n \alpha_i(\lambda_i + \beta)\mathbf{u}_i$$
so $(A+\beta I)$ acts as a matrix $A'$ with the same eigenvectors as $A$, but with eigenvalues $\lambda_i' = \lambda_i + \beta$.
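As a quick sanity check of the shift property (the symmetric matrix and the value of $\beta$ below are arbitrary illustrative choices), one can compare the eigendecompositions of $A$ and $A+\beta I$ numerically:

```python
import numpy as np

# Illustrative symmetric matrix and shift.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
beta = 0.7

lam, U = np.linalg.eigh(A)                            # A = sum_i lam_i u_i u_i^T
lam_shift, U_shift = np.linalg.eigh(A + beta * np.eye(3))

# Eigenvalues are shifted by beta; eigenvectors agree up to sign.
assert np.allclose(lam_shift, lam + beta)
assert np.allclose(np.abs(U.T @ U_shift), np.eye(3), atol=1e-8)
```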
Secondly, as shown in this MSE question, the eigenvectors of an invertible matrix $M$ are the same as those of its inverse $M^{-1}$, with reciprocal corresponding eigenvalues. So $$M^{-1} = \sum_{i=1}^n \frac{1}{\lambda_i} \mathbf{u}_i \mathbf{u}_i^T,$$ where $\mathbf{u}_i$ and $\lambda_i$ are the $i$th eigenvector and eigenvalue, respectively, of $M$. Using the above composite matrix $A'=A+\beta I$ for $M$ (provided $\lambda_i+\beta\neq 0$ for all $i$), we have $$(A+\beta I)^{-1} = \sum_{i=1}^n \frac{1}{\lambda_i + \beta} \mathbf{u}_i \mathbf{u}_i^T.$$
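Putting the two steps together, here is a short NumPy sketch (again with an arbitrary symmetric $A$ and shift $\beta$) that reconstructs $(A+\beta I)^{-1}$ from the eigenpairs of $A$ alone and compares it against a direct inverse:

```python
import numpy as np

# Illustrative symmetric matrix and shift.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
beta = 0.7

lam, U = np.linalg.eigh(A)                    # eigenpairs of A (symmetric)

# (A + beta I)^{-1} = sum_i (1 / (lam_i + beta)) * u_i u_i^T
inv_from_eig = sum(np.outer(u, u) / (l + beta) for l, u in zip(lam, U.T))
assert np.allclose(inv_from_eig, np.linalg.inv(A + beta * np.eye(3)))
```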