The abstract point of view is that the graph is characterized by its adjacency matrix up to permutation. When we view the adjacency matrix as an abstract linear transformation, we are looking at it up to conjugation, which is a coarser equivalence relation. So we lose some information, but the information that remains is still interesting. And any time we can apply linear algebra to a situation, that is a good thing, because linear algebra is really easy compared to almost anything else. (This sounds trite, but it is one of the most-used principles in mathematics.)
The eigenvalues of the adjacency matrix describe closed walks on the graph. More precisely, if $A$ is the adjacency matrix, then the total number of closed walks of length $n$ on the graph is given by $\text{tr } A^n$, which is the same as $\sum_i \lambda_i^n$, where the $\lambda_i$ are the eigenvalues. So knowing the eigenvalues says something about how the number of closed walks grows. Unfortunately, this perspective does not provide a direct interpretation of the eigenvectors.
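As a quick sanity check of this identity, here is a small NumPy sketch on the triangle $K_3$ (my own example): each vertex starts two closed walks of length $3$, one around the triangle in each direction, for $6$ in total.

```python
import numpy as np

# Adjacency matrix of the triangle K_3
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])

n = 3
# Count closed walks of length n directly: tr(A^n)
walks = np.trace(np.linalg.matrix_power(A, n))

# Same count from the eigenvalues (2, -1, -1): sum of lambda_i^n
eigenvalues = np.linalg.eigvalsh(A)
walks_spectral = np.sum(eigenvalues ** n)

print(walks, round(walks_spectral))  # both equal 6
```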
Spectral methods apply particularly well to graphs with a lot of structure, such as strongly regular graphs. In the best case one can write down a matrix equation the adjacency matrix satisfies, and analyzing what this says about the eigenvectors and eigenvalues puts strong constraints on the graph. This is the mechanism behind the standard proof of the classification of Moore graphs of girth $5$.
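As a concrete illustration (my choice of example, not part of the Moore graph proof itself), the Petersen graph is the strongly regular graph $\mathrm{srg}(10,3,0,1)$, so its adjacency matrix satisfies the equation $A^2 + A - 2I = J$, which in turn forces its spectrum to be $\{3, 1, -2\}$. A short NumPy sketch verifying both claims:

```python
import numpy as np
from itertools import combinations

# Petersen graph as the Kneser graph K(5,2):
# vertices are 2-element subsets of {0,...,4}, adjacent iff disjoint.
vertices = list(combinations(range(5), 2))
n = len(vertices)  # 10
A = np.array([[1 if set(u).isdisjoint(v) else 0 for v in vertices]
              for u in vertices])

# As srg(10, 3, 0, 1), the Petersen graph satisfies A^2 + A - 2I = J.
I = np.eye(n, dtype=int)
J = np.ones((n, n), dtype=int)
assert np.array_equal(A @ A + A - 2 * I, J)

# This matrix equation pins down the eigenvalues: 3, 1, and -2.
print(sorted({int(x) for x in np.linalg.eigvalsh(A).round()}))  # [-2, 1, 3]
```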
There is a fairly concrete interpretation of the eigenvalues and the eigenvectors of the Laplacian matrix, as I explained in this math.SE answer. The short version is that one can set up three natural differential equations using the Laplacian, and these all give concrete interpretations of the eigenvalues and eigenvectors:
- The wave equation. Here the eigenvectors are the "harmonics" of the graph: they describe the standing wave solutions of the wave equation, and the eigenvalues describe the frequencies at which these waves resonate.
- The heat equation. Here the eigenvectors are still the "harmonics" of the graph, but the eigenvalues describe the rate at which heat decays for each harmonic. Morally this is about "Brownian motion" on the graph, so it is related to random walks.
- The Schrödinger equation. Here the eigenvectors are energy eigenstates of a continuous-time quantum random walk, and the eigenvalues are (up to a constant) energy eigenvalues.
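To make the heat-equation picture concrete, here is a NumPy sketch on a small path graph (my own example): expanding the initial heat distribution in the Laplacian eigenbasis, each harmonic decays at a rate set by its eigenvalue, and only the $\lambda = 0$ harmonic (the constant vector) survives in the long run.

```python
import numpy as np

# Path graph on 4 vertices: Laplacian L = D - A
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
L = np.diag(A.sum(axis=1)) - A

# Heat equation du/dt = -Lu: expand the initial heat distribution in the
# orthonormal eigenbasis of L; each harmonic then decays as exp(-lambda * t).
eigenvalues, eigenvectors = np.linalg.eigh(L)
u0 = np.array([1.0, 0.0, 0.0, 0.0])   # all heat starts at one endpoint
c = eigenvectors.T @ u0               # coefficients in the eigenbasis

def heat_at(t):
    return eigenvectors @ (np.exp(-eigenvalues * t) * c)

# Total heat is conserved (the lambda = 0 harmonic is the constant vector),
# and every other harmonic dies off, so heat equilibrates to the average.
assert np.isclose(heat_at(5.0).sum(), 1.0)
print(np.round(heat_at(100.0), 3))  # ~[0.25 0.25 0.25 0.25]
```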
When the graph is regular, the Laplacian and adjacency matrix are related in a very simple way; the eigenvectors are the same, and the eigenvalues can be easily related to each other. Things are more complicated in the irregular case.
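Concretely, a $k$-regular graph has $L = kI - A$, so the Laplacian eigenvalues are $k - \lambda_i$ and the eigenvectors coincide. A small sketch on the $2$-regular cycle $C_5$:

```python
import numpy as np

# Cycle graph C_5 is 2-regular, so its Laplacian is L = 2I - A.
n, k = 5, 2
A = np.zeros((n, n), dtype=int)
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
L = np.diag(A.sum(axis=1)) - A
assert np.array_equal(L, k * np.eye(n, dtype=int) - A)

# Laplacian eigenvalues are k minus the adjacency eigenvalues
# (sorted, since the shift reverses the order).
adj_eigs = np.linalg.eigvalsh(A)
lap_eigs = np.linalg.eigvalsh(L)
assert np.allclose(np.sort(lap_eigs), np.sort(k - adj_eigs))
```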
The adjacency matrix is symmetric, so it has an orthonormal basis of eigenvectors. For a regular graph, one of these eigenvectors is a multiple of $[1\;1\;\cdots\;1]$, and its orthogonality to the others is just the condition that the entries of every other eigenvector must sum to $0$.
This is probably not an ideal strategy for checking if a graph is connected, for a couple of reasons.
First, checking whether an $n$-vertex graph is connected by depth-first search takes $O(n^2)$ time when the graph is stored as an adjacency matrix. This is probably better than you're going to do by finding eigenvalues or computing matrix powers.
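A minimal sketch of such a depth-first search over an adjacency matrix (each visited vertex scans its whole row, hence the $O(n^2)$ bound):

```python
def is_connected(adj):
    """Check connectivity of an n-vertex graph given as an adjacency
    matrix (list of lists of 0/1) by depth-first search in O(n^2)."""
    n = len(adj)
    if n == 0:
        return True
    visited = [False] * n
    stack = [0]
    visited[0] = True
    while stack:
        u = stack.pop()
        for v in range(n):            # scanning a row costs O(n)
            if adj[u][v] and not visited[v]:
                visited[v] = True
                stack.append(v)
    return all(visited)

# A path on 3 vertices is connected; two disjoint edges are not.
assert is_connected([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
assert not is_connected([[0, 1, 0, 0], [1, 0, 0, 0],
                         [0, 0, 0, 1], [0, 0, 1, 0]])
```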
Second, if you're going to be finding eigenvalues of some matrix, find the eigenvalues of the Laplacian matrix $L_G$ instead. More precisely, find the multiplicity of the $0$ eigenvalue: this tells you the number of connected components $G$ has. This can be done just by row-reducing $L_G$ to put it in row-echelon form, then counting the number of zero rows.
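A sketch of this count in NumPy. Note that `numpy.linalg.matrix_rank` uses an SVD rather than the row reduction described above, but $n - \operatorname{rank}(L_G)$ is the same number either way:

```python
import numpy as np

def num_components(A):
    """Number of connected components = multiplicity of the eigenvalue 0
    of the Laplacian L = D - A, which equals n - rank(L)."""
    n = len(A)
    L = np.diag(A.sum(axis=1)) - A
    # matrix_rank computes an SVD instead of row-reducing, but
    # n - rank(L) equals the number of zero rows in echelon form.
    return n - np.linalg.matrix_rank(L)

# Two disjoint edges: 4 vertices, 2 components.
A = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]])
assert num_components(A) == 2
```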
Gaussian elimination takes $O(n^3)$ arithmetic operations, so this is still worse than a non-algebraic approach. However, it is better to do this than to implement a strategy where you have to find all the eigenvalues and then do some more work with them.