It is possible to give a lower bound on the multiplicities of the eigenvalues of the adjacency or Laplacian matrix of a graph $G$ using representation theory.
Namely, the vector space of functions from the vertex set of $G$ to $\mathbb{C}$ is a representation of the automorphism group of $G$, and both the adjacency and Laplacian matrices commute with this group action. It follows that each eigenspace of these matrices is a subrepresentation, so the whole space decomposes into irreducibles each lying in a single eigenspace, and the dimensions of the irreducibles give lower bounds on the multiplicities.
In this particular case, $K_n$ has symmetry group the full symmetric group $S_n$, and the corresponding representation decomposes into the trivial representation (this corresponds to the eigenvalue $n-1$) and an $(n-1)$-dimensional irreducible representation (the standard representation), so there can only be one other eigenvalue, and it must have multiplicity $n-1$. Moreover, the sum of the eigenvalues is $0$ because the trace of the adjacency matrix is zero, so the remaining eigenvalue must be $-1$.
This is all just a very fancy way of saying that
Any permutation of an eigenvector of $K_n$ is also an eigenvector with the same eigenvalue. If the eigenvector doesn't have all of its entries equal, then its permutations span a vector space of dimension at least $n-1$: subtracting from the eigenvector a copy with two unequal entries swapped gives a multiple of $e_i - e_j$, and the permutations of $e_i - e_j$ span the whole $(n-1)$-dimensional space of vectors whose entries sum to zero. So the remaining eigenvalue has multiplicity $n-1$.
Which is in turn just a very fancy way of saying that
Any vector $v$ all of whose entries sum to zero is an eigenvector of the adjacency matrix $A_n$ of $K_n$; moreover, it is easy to see that $(A_n + I)v = 0$ for any such $v$, hence $A_n v = -v$.
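As a quick sanity check, both eigenvalue claims can be verified numerically with plain Python lists; the size $n = 5$ below is an arbitrary example choice.

```python
# Adjacency matrix of K_n: all ones off the diagonal.
n = 5
A = [[0 if i == j else 1 for j in range(n)] for i in range(n)]

def matvec(M, v):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

# The all-ones vector is an eigenvector with eigenvalue n - 1.
ones = [1] * n
assert matvec(A, ones) == [n - 1] * n

# Any vector whose entries sum to zero, e.g. e_0 - e_1, satisfies
# (A + I) v = 0, i.e. A v = -v.
v = [1, -1] + [0] * (n - 2)
assert matvec(A, v) == [-x for x in v]
```

The same check works verbatim for any $n \ge 2$.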
That is a straightforward explanation but it misses the bigger lesson, which is that
Highly symmetric graphs have eigenvalues of high multiplicity.
Another argument proceeds using the observation that, on the one hand, $\text{tr}(A^k)$ counts the number of walks of length $k$ which begin and end at the same vertex and, on the other hand,
$$\text{tr}(A^k) = \sum_{i=1}^n \lambda_i^k$$
where $\lambda_i$ are the eigenvalues. This sequence uniquely determines the eigenvalues (exercise), so to prove the desired result it suffices to prove that
$$\text{tr}(A^k) = (n-1)^k + (n-1)(-1)^k.$$
Now, the total number of walks of length $k-1$ in $K_n$ is just $n(n-1)^{k-1}$, where the first factor of $n$ counts the possible starting vertices and each factor of $n-1$ counts the possible next vertices. If such a walk ends at the original vertex (so is itself a closed walk of length $k-1$), it cannot be extended to a closed walk of length $k$, since $K_n$ has no loops; otherwise, it extends uniquely to one, by stepping back to the starting vertex. Thus it follows that
$$n(n-1)^{k-1} = \text{tr}(A^{k-1}) + \text{tr}(A^k)$$
and then the desired result follows by induction from the initial condition $\text{tr}(A^0) = n$.
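Both the closed formula and the recursion are easy to check numerically; in the sketch below, $n = 6$ and the range of $k$ are arbitrary example choices.

```python
# Verify tr(A^k) = (n-1)^k + (n-1)(-1)^k and the recursion
# n (n-1)^{k-1} = tr(A^{k-1}) + tr(A^k) for the complete graph K_n.
n = 6
A = [[0 if i == j else 1 for j in range(n)] for i in range(n)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(M):
    return sum(M[i][i] for i in range(n))

P = [[1 if i == j else 0 for j in range(n)] for i in range(n)]  # A^0 = I
traces = []
for k in range(8):
    traces.append(trace(P))  # traces[k] = tr(A^k)
    P = matmul(P, A)

for k in range(8):
    assert traces[k] == (n - 1) ** k + (n - 1) * (-1) ** k
for k in range(1, 8):
    assert n * (n - 1) ** (k - 1) == traces[k - 1] + traces[k]
```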
Your $C_n$ is a circulant matrix. All $n\times n$ complex circulant matrices share a common set of eigenvectors, and there is an explicit formula for their eigenvalues (see Wikipedia). In your specific case, the eigenvalues are $\omega_j+\omega_j^{n-1}$ for $j=0,1,2,\ldots,n-1$, where $\omega_j=e^{2\pi ij/n}$; since $\omega_j^{n-1}=\overline{\omega_j}$, these are $2\cos(2\pi j/n)$.
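To illustrate, the following sketch (with $n = 7$ chosen arbitrarily) checks that each discrete-Fourier vector $(1, \omega_j, \omega_j^2, \ldots, \omega_j^{n-1})$ is an eigenvector of the adjacency matrix of $C_n$ with the stated eigenvalue.

```python
import cmath

# Adjacency matrix of the n-cycle C_n: vertex i is adjacent to i +/- 1 mod n.
n = 7  # arbitrary example size
A = [[1 if (i - j) % n in (1, n - 1) else 0 for j in range(n)]
     for i in range(n)]

for j in range(n):
    w = cmath.exp(2 * cmath.pi * 1j * j / n)  # omega_j
    v = [w ** k for k in range(n)]            # common circulant eigenvector
    lam = w + w ** (n - 1)                    # claimed eigenvalue
    Av = [sum(A[r][c] * v[c] for c in range(n)) for r in range(n)]
    assert all(abs(Av[r] - lam * v[r]) < 1e-9 for r in range(n))
```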
Best Answer
Think of a cyclic permutation on $n$ vertices, or a cyclic directed graph with edges pointing only one way. For example, such a cycle matrix $M_4$ on four vertices would look like $$ M_4 = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \end{pmatrix} $$ You can see that for this matrix $M_n$ we have $M_n^n = I$, and furthermore $M_n^k \neq I$ for any $1 \leq k < n$. In fact the minimal polynomial of $M_n$ has degree $n$: the vectors $e_1, M_n e_1, \ldots, M_n^{n-1} e_1$ are exactly the standard basis vectors, hence linearly independent, so no polynomial of smaller degree can annihilate $M_n$. Since $M_n^n = I$, the minimal polynomial is therefore $x^n - 1$, and it coincides with the characteristic polynomial. Therefore the eigenvalues of $M_n$ are the $n$th roots of unity, $1, \omega, \omega^2, \ldots, \omega^{n-1}$, where $\omega = \exp(2 \pi i / n)$.
Now you need to somehow see that the eigenvectors of this matrix $M_n$ are of the form $(1, 1, \ldots, 1)$, $(1, \omega, \omega^2, \ldots, \omega^{n-1})$, $(1, \omega^2, \omega^4, \ldots, \omega^{2n - 2})$ with eigenvalues $1, \omega, \omega^2$ and so on. (This is not hard to do if you explicitly write out a vector, declare it to be an eigenvector with a certain eigenvalue and just apply the matrix to it and check the conditions).
Now comes the last part. The matrix $M_n^{-1}$ has the same eigenvectors, with inverse eigenvalues. So the vector $(1, \omega^2, \omega^4, \ldots, \omega^{2n - 2})$ is an $M_n$-eigenvector with eigenvalue $\omega^2$, and an $M_n^{-1}$-eigenvector with eigenvalue $\omega^{-2}$. Therefore their sum $M_n + M_n^{-1}$ has eigenvalues $1 + 1 = 2$, $\omega + \omega^{-1} = 2 \cos (2 \pi / n)$, $\omega^2 + \omega^{-2} = 2 \cos(4 \pi / n)$, and so on. Furthermore, $M_n + M_n^{-1}$ is the adjacency matrix of the cycle graph.
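A short numerical sketch of this last step (the size $n = 5$ is an arbitrary example): it builds the shift matrix $M_n$, checks that $M_n + M_n^{-1}$ is the adjacency matrix of the cycle graph, and verifies the eigenvalues $2\cos(2\pi j/n)$.

```python
import cmath
import math

n = 5
# Shift matrix: 1s just above the diagonal and in the bottom-left corner,
# so that (M v)_i = v_{i+1}, indices taken mod n.
M = [[1 if (j - i) % n == 1 else 0 for j in range(n)] for i in range(n)]
# Its inverse shifts the other way; it is just the transpose.
Minv = [[1 if (i - j) % n == 1 else 0 for j in range(n)] for i in range(n)]
S = [[M[i][j] + Minv[i][j] for j in range(n)] for i in range(n)]

# S is the adjacency matrix of the cycle graph C_n.
assert all(S[i][j] == (1 if (i - j) % n in (1, n - 1) else 0)
           for i in range(n) for j in range(n))

# Each vector (1, w^j, w^{2j}, ...) is an eigenvector of S
# with eigenvalue w^j + w^{-j} = 2 cos(2 pi j / n).
for j in range(n):
    w = cmath.exp(2 * cmath.pi * 1j * j / n)
    v = [w ** k for k in range(n)]
    lam = 2 * math.cos(2 * math.pi * j / n)
    Sv = [sum(S[r][c] * v[c] for c in range(n)) for r in range(n)]
    assert all(abs(Sv[r] - lam * v[r]) < 1e-9 for r in range(n))
```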