Why aren’t the eigenvectors of a graph’s adjacency matrix useful for the graph isomorphism problem?

computational-complexity, graph-theory, graph-isomorphism, spectral-graph-theory

I know eigenvalues of a graph's adjacency matrix are very heavily studied because of their relationship to the structure of the graph, but how come no one studies the eigenvectors? Even if they don't say much about the structure, isn't it possible that they might say enough to avoid the problem of isospectral graphs?

Best Answer

You have to realize that graph isomorphism is very easy in the typical case, and hard only in very unusual cases: when we are comparing two graphs that are very similar and also highly symmetric.

In the typical case, if the graphs are not isomorphic, the eigenvalues are already likely to tell us that. (Usually, that's not how graph isomorphism algorithms proceed, though; they try to identify canonical ways to partition vertices into different types, and refine those partitions, in a less algebraic and more combinatorial way.)
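That eigenvalue check is easy to sketch. A minimal version with numpy, where the helper names are mine and the check is a necessary condition only (isospectral non-isomorphic graphs exist, so `True` means "maybe isomorphic", never "isomorphic"):

```python
import numpy as np

def spectrum(adj):
    """Sorted eigenvalues of a symmetric adjacency matrix."""
    return np.sort(np.linalg.eigvalsh(adj))

def maybe_isomorphic(a, b, tol=1e-8):
    """Necessary condition only: different spectra => not isomorphic."""
    ea, eb = spectrum(a), spectrum(b)
    return ea.shape == eb.shape and np.allclose(ea, eb, atol=tol)

# Path P3 vs. triangle K3: same vertex count, different spectra.
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
tri  = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)
print(maybe_isomorphic(path, tri))  # False
```

For the typical pair of random non-isomorphic graphs, this test already reports `False` and no harder work is needed.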

In highly symmetric cases, looking at the eigenvectors is much more difficult, because the eigenvectors can be rewritten in many ways:

  1. As already pointed out in the comments, they need to be compared up to a permutation of the vertices; if we knew in which order to compare them, we'd already be done.
  2. Highly symmetric graphs have many repeated eigenvalues. For example, the $4$-dimensional hypercube graph has eigenvalues $-4, -2, 0, 2, 4$ with multiplicity $1, 4, 6, 4, 1$ respectively. So we're really comparing two eigenspaces, which have many different bases. (Even for a $1$-dimensional eigenspace, the eigenvector is only defined up to scaling - or up to sign, if we normalize.)
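The hypercube multiplicities quoted above are easy to verify numerically. A sketch, assuming numpy is available, using the standard bit-flip construction of the $4$-cube (two vertices are adjacent iff their binary labels differ in one bit):

```python
import numpy as np
from collections import Counter

n = 4
size = 1 << n  # 16 vertices, labelled by 4-bit strings
adj = np.zeros((size, size))
for v in range(size):
    for i in range(n):
        adj[v, v ^ (1 << i)] = 1  # flip bit i to get a neighbor

# Eigenvalues of the n-cube are n - 2k with multiplicity C(n, k),
# so here we expect -4, -2, 0, 2, 4 with multiplicities 1, 4, 6, 4, 1.
eigs = np.round(np.linalg.eigvalsh(adj)).astype(int)
print(sorted(Counter(eigs).items()))
```

The $6$-dimensional eigenspace of $0$ alone has infinitely many orthonormal bases, which is exactly why "compare the eigenvectors" is not well-posed here.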

Asking "are these two sets of eigenvectors the same, up to the various symmetries we have?" is in many ways the same problem as asking "are these two adjacency matrices the same, up to a permutation of the vertices?" So we haven't made the problem easier by finding the eigenvectors.

I can imagine that, situationally, eigenvectors might be useful. For example, if you compute the eigenvalues and find one with multiplicity $1$ (and not a useless one, such as the top eigenvalue of a connected regular graph, whose eigenvector is constant), then you could look at its eigenvector and label each vertex with the corresponding value. The values might not all be different, but this would give us a starting partition to feed into a stronger isomorphism algorithm.

For all I know, these algorithms already do that! But it seems like one of many possible heuristics with many branching cases, and a very complicated isomorphism algorithm might want to pull this one out when the circumstances warrant.
