Johnson graphs do not cause difficulty for existing programs. Actually they are rather easy; nauty can handle them up to tens of millions of vertices, and so can other programs such as Traces and Bliss.
The difficulty that Babai refers to is more theoretical. The Johnson graph $J(v,t)$, which has $n=\binom vt$ vertices, has the property that $\Theta(n^{1/t})$ vertices need to be fixed before partition refinement produces a partition with all cells of size at most $cn$ (for some $c<1$); moreover, its automorphism group has no nice recursive structure like a wreath product. Babai says that a major part of his breakthrough was to realise that only the Johnson graphs (and closely related graphs) have these undesirable properties. With that knowledge, special processing takes care of them. Previously, for all anyone knew, other families of graphs with similar properties might have existed.
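To make this concrete, here is a minimal sketch in plain Python (standard library only; `johnson_graph` is my own helper, not part of nauty or any other package). It builds $J(v,t)$ on the $t$-subsets of $\{0,\dots,v-1\}$, with two subsets adjacent when they intersect in $t-1$ elements, and checks the vertex count and regularity:

```python
from itertools import combinations
from math import comb

def johnson_graph(v, t):
    # vertices are the t-subsets of {0,...,v-1}; two subsets are
    # adjacent exactly when they share t-1 elements
    vertices = [frozenset(s) for s in combinations(range(v), t)]
    edges = {u: {w for w in vertices if w != u and len(u & w) == t - 1}
             for u in vertices}
    return vertices, edges

v, t = 7, 3
vertices, edges = johnson_graph(v, t)
assert len(vertices) == comb(v, t)                           # n = C(v,t)
assert all(len(edges[u]) == t * (v - t) for u in vertices)   # t(v-t)-regular
```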
I think this description is reasonably close. I recently listened to Babai speaking on his algorithm for 5 hours but some parts remain unclear to me.
Btw, the automorphism group of $J(v,t)$ for $t\lt v/2$ is $S_v$ acting on $t$-sets. Programs find these graphs easy because of this rich group.
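As a quick sanity check of that claim (reusing the illustrative `johnson_graph` helper from the sketch above): a permutation of the ground set preserves intersection sizes of $t$-sets, so it induces an automorphism of $J(v,t)$, giving an embedding of $S_v$ into the automorphism group.

```python
import random

def is_automorphism(perm, vertices, edges):
    # the permutation of {0,...,v-1} acts on t-sets elementwise;
    # check that it maps neighbourhoods onto neighbourhoods
    img = lambda s: frozenset(perm[x] for x in s)
    return all({img(w) for w in edges[u]} == edges[img(u)] for u in vertices)

vertices, edges = johnson_graph(7, 3)
perm = list(range(7))
random.shuffle(perm)                  # a random element of S_7
assert is_automorphism(perm, vertices, edges)
```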
About the "supplementary problem", fixing $O(\log n)$ vertices helps very little. It is necessary to match $\Theta(n^{1/t})$ vertices in each graph before elementary methods will provide much further information about isomorphisms. It is hard to define "elementary methods", but they include all the obvious things, such as classifying vertices according to their adjacencies with the fixed vertices and, more generally, fixed-dimensional Weisfeiler-Leman refinement.
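For illustration, here is a toy sketch of such an elementary method: individualization (each fixed vertex gets a unique initial color) followed by naive color refinement, i.e. 1-dimensional WL. The function `individualize_and_refine` is my own illustrative code, again reusing `johnson_graph` from above; it prints the size of the largest cell left after fixing two vertices of a small Johnson graph.

```python
from collections import Counter

def individualize_and_refine(vertices, edges, fixed):
    # initial coloring: each vertex in `fixed` gets its own color,
    # everything else shares one color
    color = {u: fixed.index(u) if u in fixed else len(fixed)
             for u in vertices}
    while True:
        # signature = own color plus the multiset of neighbours' colors
        sig = {u: (color[u],
                   tuple(sorted(Counter(color[w] for w in edges[u]).items())))
               for u in vertices}
        palette = {s: c for c, s in enumerate(sorted(set(sig.values())))}
        new = {u: palette[sig[u]] for u in vertices}
        if new == color:          # partition is stable
            return color
        color = new

vertices, edges = johnson_graph(7, 3)                 # n = 35
stable = individualize_and_refine(vertices, edges, vertices[:2])
largest = max(Counter(stable.values()).values())
print(f"largest cell after fixing 2 vertices: {largest} of {len(vertices)}")
```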
Babai's statement that you quote is easy to misunderstand. It isn't the Johnson graphs themselves that were a scourge to theorists. It's more that the Johnson graphs showed that certain difficult structural properties (mentioned above) are possible, and nobody knew a general method for handling these properties. Once Babai showed that only the Johnson graphs caused that structure, the difficulty vanished.
Best Answer
The first place to look is E. M. Luks's algorithm for polynomial-time GI on graphs of bounded degree. The algorithm is group-theoretic, and uses the fact that subgroups of products of $S_k$ for bounded $k$ always have subgroups of bounded index, which makes it possible to divide and conquer efficiently over cosets of a subgraph stabilizer inside the automorphism group (for bounded-degree graphs this stabilizer is contained in a product of groups $S_k$ induced by automorphisms of the neighbourhoods of each vertex).
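In symbols, the coset identity behind this divide and conquer is the standard one (stated here just for orientation, with $\mathrm{Iso}_G(X,Y)$ denoting the isomorphisms from $X$ to $Y$ that lie in $G$): if $H\le G$ has index $m$ with coset representatives $\sigma_1,\dots,\sigma_m$, then
$$\mathrm{Iso}_G(X,Y)=\bigcup_{i=1}^{m}\mathrm{Iso}_{H\sigma_i}(X,Y),$$
so one search over $G$ splits into $m$ searches over cosets of $H$, and bounded index $m$ keeps the recursion polynomial.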
A recent breakthrough of Babai solves GI in $\exp(O((\log n)^c))$ time. One of the main ideas (which is not particularly novel to this paper) is that the (primitive) quotient of such a stabilizer either has subgroups of small index (which allows the previous result to be applied), or contains the automorphism group of a large Johnson scheme. The paper then proceeds to treat the second case with an efficient symmetry-breaking procedure, which is novel and highly non-trivial.
This is an oversimplified and rather imprecise statement, but it implies that the hardest cases for GI are graphs obtained from a Johnson graph in a non-trivial way.
More about regular graphs (as asked in the OP): all hard cases are regular, of course, but some regular graphs are more susceptible than others. One usual method of solving GI is to promote several vertices by giving them unique colors, and then to do WL refinement or other tricks; if promoting a small set refines most of the graph, then the graph is easy for GI. The majority of regular graphs are easy with respect to this approach. The Johnson graph is hard in the following sense: to reduce the size of the largest equivalence class of vertices below $cn$ for $c < 1$, we may need to promote $\Omega(n^{c'})$ vertices, which prevents the trivial "try all sets" method from working within reasonable time, as the rough calculation below illustrates.
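As a rough back-of-the-envelope illustration (the choice $t = 3$, and taking $c' = 1/t$ as in the Johnson graph case, are my own assumptions):

```python
from math import comb

# if Θ(n^{1/t}) vertices must be promoted, brute force has to try on the
# order of C(n, n^{1/t}) sets, which grows faster than any polynomial in n
t = 3
for v in (10, 15, 20, 25):
    n = comb(v, t)                  # number of vertices of J(v,t)
    k = round(n ** (1 / t))         # vertices that may need promoting
    print(f"n={n:5d}  k={k:3d}  C(n,k) ~ {comb(n, k):.2e}")
```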