I think it really comes down to looking at the lattice of invariant subspaces for each $A_i$. I believe this is what Igor basically is saying.
For a single real linear transformation $V \rightarrow V$ with distinct characteristic roots, $V$ decomposes as a sum of 1- and 2-dimensional invariant subspaces, and these generate the lattice of invariant subspaces. When you perturb the matrix by a known amount, the subspaces can wander around over a certain range --- how much they shift depends on how close together the characteristic roots are, and also on the angles between the subspaces. For simplicity,
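To make the dependence on the gaps concrete, here is a small numerical sketch (my own illustration; the helper `eigvec_shift` is hypothetical, not from the post): the same size of perturbation moves an eigenvector much farther when two characteristic roots are close together.

```python
import numpy as np

rng = np.random.default_rng(0)

def eigvec_shift(eigvals, eps=1e-3):
    """Angle (radians) by which the first eigenvector moves under a
    random symmetric perturbation of size ~eps.  (Hypothetical helper,
    for illustration only.)"""
    A = np.diag(np.asarray(eigvals, dtype=float))
    E = rng.standard_normal(A.shape)
    _, V = np.linalg.eigh(A + eps * (E + E.T) / 2)
    # eigh sorts ascending, so column 0 corresponds to the smallest root;
    # the unperturbed eigenvector is e_1, so |V[0,0]| = cos(angle).
    return np.arccos(min(1.0, abs(V[0, 0])))

well_separated = eigvec_shift([1.0, 2.0, 3.0])    # gap 1 to nearest root
nearly_repeated = eigvec_shift([1.0, 1.001, 3.0]) # gap 0.001
# The closer the roots, the farther the same-sized perturbation
# pushes the invariant subspace around.
print(well_separated, nearly_repeated)
```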
think of the case with all real distinct eigenvalues. The projectivized action, on
$\mathbb{RP}^{n-1}$, then has $n$ fixed points. The first derivative of the projectivized map is simple to determine, given the eigenvectors and eigenvalues --- the eigenvalues of the derivative at the fixed point for eigenvalue $\lambda$ are the ratios of the other eigenvalues to $\lambda$. Therefore, if you change the map by adding multiples of the other eigenvectors as a linear function of the projection to the eigenspace under consideration, you shift your given eigenvector by a known linear function of the coefficients.
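The eigenvalue-ratio claim for the derivative is easy to check numerically. A sketch (my own, with made-up matrices): build $A$ with eigenvalues $5, 2, 1$, write the projectivized map in an affine chart around the fixed point for $\lambda = 5$, and differentiate; the derivative's eigenvalues come out as $2/5$ and $1/5$.

```python
import numpy as np

rng = np.random.default_rng(1)
lams = np.array([5.0, 2.0, 1.0])        # distinct real eigenvalues
P = rng.standard_normal((3, 3))         # a random (generic) eigenbasis
A = P @ np.diag(lams) @ np.linalg.inv(P)

def proj_map(x):
    """Projectivized action of A on RP^2, written in the affine chart
    around the fixed point given by the lambda = 5 eigenvector."""
    w = P @ np.concatenate(([1.0], x))   # lift the chart point to R^3
    c = np.linalg.solve(P, A @ w)        # eigenbasis coordinates of A w
    return c[1:] / c[0]                  # back to the chart

# Jacobian at the fixed point x = 0 by central differences.
h = 1e-6
J = np.column_stack([(proj_map(h * e) - proj_map(-h * e)) / (2 * h)
                     for e in np.eye(2)])

print(np.sort(np.linalg.eigvals(J)))  # the ratios 1/5 and 2/5 of the other roots
```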
If you have several matrices with all real distinct eigenvalues that are near enough to a commuting set of matrices, then presumably the eigenvectors are also near to each other, so you can tell which eigenvectors need to line up with each other. You have a good measurement of how hard it is to move each eigenvector in a given direction, so you can
find a point that's approximately equally hard for each (do this separately, for each matching collection of eigenvectors.) Then adjust each of the eigenvectors to go to that point, and repeat on the other eigenspaces.
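The matching-and-averaging step might be sketched like this (a rough illustration of one possible procedure, not a claim about the exact algorithm intended): match eigenvectors by overlap, average each matched pair into a compromise basis $V$, and rebuild both matrices with their own eigenvalues in that common basis --- the rebuilt pair then commutes exactly.

```python
import numpy as np

rng = np.random.default_rng(2)
D1, D2 = np.diag([1.0, 2.0, 4.0]), np.diag([3.0, 7.0, 5.0])
P = np.eye(3) + 0.2 * rng.standard_normal((3, 3))
A1 = P @ D1 @ np.linalg.inv(P)
A2 = P @ D2 @ np.linalg.inv(P) + 1e-2 * rng.standard_normal((3, 3))
# A1, A2 are near, but not equal to, a commuting pair.

l1, V1 = np.linalg.eig(A1)
l2, V2 = np.linalg.eig(A2)

# Match each eigenvector of A1 with the closest eigenvector of A2.
overlap = np.abs(V1.T @ V2)
perm = overlap.argmax(axis=1)
V2m = V2[:, perm]
# Align signs, then average the matched pairs into a compromise basis.
signs = np.sign(np.sum(V1 * V2m, axis=0))
V = (V1 + signs * V2m) / 2

# Rebuild both matrices with their own eigenvalues in the common basis.
B1 = V @ np.diag(l1) @ np.linalg.inv(V)
B2 = V @ np.diag(l2[perm]) @ np.linalg.inv(V)
print(np.linalg.norm(B1 @ B2 - B2 @ B1))  # ~ machine zero: B1, B2 commute
```

Here the eigenvalues are averaged implicitly through the common basis only; one could equally well weight the average by how hard each eigenvector is to move, as described above.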
When there are complex roots, there are 2-dimensional invariant subspaces, which appear as invariant circles in $\mathbb{RP}^{n-1}$. You can do much the same thing with these.
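A small illustration (my own): for a matrix whose complex-conjugate pair of roots dominates the remaining real root, the corresponding 2-dimensional invariant subspace is an attracting invariant circle for the projectivized action.

```python
import numpy as np

# Roots 2e^{±i*0.7} on the first two coordinates, root 1 on the third.
theta = 0.7
A = np.eye(3)
A[:2, :2] = 2.0 * np.array([[np.cos(theta), -np.sin(theta)],
                            [np.sin(theta),  np.cos(theta)]])

x = np.array([1.0, 0.0, 0.5])       # start off the invariant plane
for _ in range(60):
    x = A @ x
    x /= np.linalg.norm(x)          # projectivized iteration

print(abs(x[2]))  # tiny: the orbit has been attracted to the circle x3 = 0
```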
When there are multiple real or complex roots, then it's necessary to look at larger-dimensional invariant subspaces. One strategy is to first fix up invariant subspaces
associated with the product of all factors of the characteristic polynomial whose roots lie in a small neighborhood, or in a pair of complex conjugate neighborhoods. The goal is to get such subspaces to agree with, to contain, or to be contained in the similar subspaces for the other matrices. If there are repeated roots, then first see if there's a small perturbation that produces multiple eigenvectors (or multiple 2-dimensional invariant subspaces) --- when this is possible, it gives a bigger target for matching commuting matrices. If not, look at the largest invariant subspace it contains.
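A quick numerical illustration of the first step (my own sketch): group the roots lying in a small neighborhood and take the invariant subspace they span together. That subspace is well-conditioned even when the individual eigenvectors inside it are not.

```python
import numpy as np

rng = np.random.default_rng(3)
P = np.eye(4) + 0.2 * rng.standard_normal((4, 4))
# Two characteristic roots crowded together near 1, two far away.
A = P @ np.diag([1.0, 1.001, 5.0, 9.0]) @ np.linalg.inv(P)

vals, vecs = np.linalg.eig(A)
cluster = np.abs(vals - 1.0) < 0.1       # the roots in a small neighborhood
Q, _ = np.linalg.qr(vecs[:, cluster])    # orthonormal basis of their subspace

# The span is A-invariant: projecting A @ Q off the subspace leaves
# essentially nothing, even though each eigenvector within the cluster
# is individually ill-conditioned.
residual = (np.eye(4) - Q @ Q.T) @ (A @ Q)
print(np.linalg.norm(residual))  # ~ machine zero
```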
I doubt if there's an easy way other than looking at these invariant subspace structures. Perhaps in the actual applications you have in mind, they're not so complicated.
The answer to the first question is "yes"; see p. 21 of
http://arxiv.org/pdf/math/0703532v1
On that same page, if you look at conditions (6-8), that should answer your second question with a little work...
Best Answer
Suppose for simplicity's sake that $A$ and $B$ are diagonalizable over $\mathbb{R}$ and are non-singular.
Let $V_a$ be the $a$-eigenspace of $A$. Then by anti-commutativity we find $BV_a\subseteq V_{-a}$, etc. As $A$ and $B^2$ commute, there is an eigenvector $v\in V_a$ with $B^2v=b^2 v$ for some $b$, which we may take to be real since $B$ is diagonalizable over $\mathbb{R}$ (so the eigenvalues of $B^2$ are squares of real numbers). If we let $w=bv+Bv$ then $Bw=bw$, so $b$ is an eigenvalue of $B$ (replacing $b$ by $-b$ if $w=0$). On the space $W$ spanned by $v$ and $u=b^{-1}Bv$, the linear transformation $A+B$ has matrix $$\left(\begin{array}{rr} a&b\\\\ b&-a\\\\ \end{array}\right)$$ which has eigenvectors with eigenvalues $\pm\sqrt{a^2+b^2}$.
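A quick numerical check of this argument with the simplest anticommuting pair, $A=\sigma_z$ and $B=\sigma_x$ (my example, not from the post), where $a=b=1$:

```python
import numpy as np

A = np.array([[1.0, 0.0], [0.0, -1.0]])   # sigma_z: a = 1 on v = e_1
B = np.array([[0.0, 1.0], [1.0, 0.0]])    # sigma_x: B^2 = I, so b = 1

assert np.allclose(A @ B + B @ A, 0)      # A and B anticommute

a, b = 1.0, 1.0
v = np.array([1.0, 0.0])
w = b * v + B @ v
assert np.allclose(B @ w, b * w)          # w is a b-eigenvector of B

# Eigenvalues of A + B are ±sqrt(a^2 + b^2) = ±sqrt(2).
print(np.sort(np.linalg.eigvals(A + B)))
```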