[Math] Spectrum of adjacency matrix of complete graph

eigenvalues-eigenvectors, graph-theory, spectral-graph-theory

Fooling around in MATLAB, I did an eigenvalue decomposition of the adjacency matrix of $K_5$.

 A = [0     1     1     1     1;
      1     0     1     1     1;
      1     1     0     1     1;
      1     1     1     0     1;
      1     1     1     1     0];

The eigenvalues are $4, -1, -1, -1, -1$. It now seems obvious that the largest eigenvalue of the adjacency matrix of $K_n$ would be $n-1$. What I don't understand is why $-1$ is an eigenvalue of multiplicity $n-1$. Is there any special meaning to this?

Best Answer

It is possible to give a lower bound on the multiplicities of the eigenvalues of the adjacency or Laplacian matrix of a graph $G$ using representation theory.

Namely, the vector space of functions from the vertices of $G$ to $\mathbb{C}$ is a representation of the symmetry (automorphism) group of $G$, and both the adjacency and Laplacian matrices respect this group action. It follows that each irreducible subrepresentation lies in an eigenspace of the adjacency and Laplacian matrices, so their dimensions give a lower bound on the multiplicities.

In this particular case, $K_n$ has symmetry group the full symmetric group $S_n$, and the corresponding representation decomposes into the trivial representation (this corresponds to the eigenvalue $n-1$) and an $(n-1)$-dimensional irreducible representation, so there can only be one other eigenvalue and it must have multiplicity $n-1$. Moreover, the sum of the eigenvalues is $0$ because the trace of the adjacency matrix is zero, so the remaining eigenvalue must be $-1$.
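Concretely, the decomposition just described splits the functions on the vertices as

$$\mathbb{C}^n = \underbrace{\operatorname{span}\{(1,1,\dots,1)\}}_{\text{trivial representation}} \oplus \underbrace{\{v \in \mathbb{C}^n : \textstyle\sum_i v_i = 0\}}_{\text{standard representation, dimension } n-1},$$

and these two subspaces are exactly the $n-1$ and $-1$ eigenspaces of the adjacency matrix of $K_n$.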

This is all just a very fancy way of saying that

Any permutation of an eigenvector of $K_n$ is also an eigenvector with the same eigenvalue. If the eigenvector doesn't have all of its entries equal, then its permutations span a vector space of dimension $n-1$, so the remaining eigenvalue has multiplicity $n-1$.

Which is in turn just a very fancy way of saying that

Any vector $v$ all of whose entries sum to zero is an eigenvector of the adjacency matrix $A_n$ of $K_n$; indeed, $A_n + I$ is the all-ones matrix, so every entry of $(A_n + I)v$ equals the sum of the entries of $v$, which is zero. Hence $(A_n + I)v = 0$ for any such $v$, i.e. $A_n v = -v$.

That is a straightforward explanation but it misses the bigger lesson, which is that

Highly symmetric graphs have eigenvalues of high multiplicity.
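The two eigenvector claims above are easy to check numerically; here is a minimal pure-Python sketch (the choice $n = 5$ and the particular sum-zero vector are arbitrary):

```python
# Adjacency matrix of K_n: zeros on the diagonal, ones elsewhere.
n = 5
A = [[0 if i == j else 1 for j in range(n)] for i in range(n)]

def matvec(M, v):
    """Plain matrix-vector product."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# The all-ones vector has eigenvalue n - 1.
ones = [1] * n
assert matvec(A, ones) == [n - 1] * n

# Any vector with entries summing to zero has eigenvalue -1,
# since A + I is the all-ones matrix and annihilates it.
v = [3, -1, -1, -1, 0]          # entries sum to zero
assert matvec(A, v) == [-x for x in v]   # A v = -v
```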


Another argument proceeds using the observation that, on the one hand, $\text{tr}(A^k)$ counts the number of walks of length $k$ which begin and end at the same vertex and, on the other hand, $$\text{tr}(A^k) = \sum_{i=1}^n \lambda_i^k$$

where $\lambda_i$ are the eigenvalues. This sequence uniquely determines the eigenvalues (exercise), so to prove the desired result it suffices to prove that $$\text{tr}(A^k) = (n-1)^k + (n-1)(-1)^k.$$
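The claimed trace formula can be verified directly by computing powers of $A$; the following pure-Python sketch checks it for $n = 5$ and small $k$ (both choices are arbitrary):

```python
# Check tr(A^k) = (n-1)^k + (n-1)(-1)^k for the adjacency matrix of K_n.
n = 5
A = [[0 if i == j else 1 for j in range(n)] for i in range(n)]

def matmul(X, Y):
    return [[sum(X[i][m] * Y[m][j] for m in range(n)) for j in range(n)]
            for i in range(n)]

def trace(M):
    return sum(M[i][i] for i in range(n))

P = [[1 if i == j else 0 for j in range(n)] for i in range(n)]  # P = A^0 = I
for k in range(8):
    assert trace(P) == (n - 1) ** k + (n - 1) * (-1) ** k
    P = matmul(P, A)  # advance to A^(k+1)
```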

Now, the total number of walks of length $k-1$ is just $n(n-1)^{k-1}$, where the first $n$ counts the number of possible starting vertices and each factor of $n-1$ counts the possible next vertices. If such a walk ends at the original vertex (so is itself a closed walk of length $k-1$), it cannot be extended to a closed walk of length $k$; otherwise, it can be uniquely extended as such. Thus it follows that $$n(n-1)^{k-1} = \text{tr}(A^{k-1}) + \text{tr}(A^k)$$

and then the desired result follows by induction from the initial condition $\text{tr}(A^0) = n$.
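The recurrence itself is also easy to sanity-check numerically; this pure-Python sketch does so for $n = 6$ and $k \le 6$ (arbitrary choices):

```python
# Check n(n-1)^(k-1) = tr(A^(k-1)) + tr(A^k) for the adjacency matrix of K_n.
n = 6
A = [[0 if i == j else 1 for j in range(n)] for i in range(n)]

def matmul(X, Y):
    return [[sum(X[i][m] * Y[m][j] for m in range(n)) for j in range(n)]
            for i in range(n)]

def trace(M):
    return sum(M[i][i] for i in range(n))

# Precompute A^0, A^1, ..., A^6.
powers = [[[1 if i == j else 0 for j in range(n)] for i in range(n)]]
for _ in range(6):
    powers.append(matmul(powers[-1], A))

for k in range(1, 7):
    assert n * (n - 1) ** (k - 1) == trace(powers[k - 1]) + trace(powers[k])
```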
