[Math] Connectedness of a regular graph and the multiplicity of its eigenvalue

graph-theory, linear-algebra, spectral-graph-theory

Suppose $X$ is a $k$-regular graph with adjacency matrix $A$. I wish to show that if $k$ has multiplicity $1$ as an eigenvalue of $A$ then $X$ is connected.

By way of contradiction, I assume that $X$ is not connected. Then $A$ can be block diagonalized, $A=\operatorname{diag}(A_1, A_2, \ldots, A_t)$, where each $A_i$ is the adjacency matrix of a connected component and is of order $n_i\times n_i$. Furthermore, if $v$ is an eigenvector corresponding to $k$, then the first $n_1$ entries of $Av$ are obtained as linear combinations of the columns of $A_1$, the next $n_2$ entries as linear combinations of the columns of $A_2$, and so on. At this point I am stuck. Ideally I would like to construct two linearly independent eigenvectors corresponding to $k$ to derive a contradiction, but it is not clear to me how to do that.
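Concretely, if I write $v = (w_1, \ldots, w_t)$ with $w_i \in \mathbb{R}^{n_i}$ (splitting $v$ along the blocks), the equation $Av = kv$ decomposes block-wise:

$$Av = \begin{pmatrix} A_1 w_1 \\ \vdots \\ A_t w_t \end{pmatrix} = k\begin{pmatrix} w_1 \\ \vdots \\ w_t \end{pmatrix}, \qquad\text{i.e., } A_i w_i = k w_i \text{ for each } i.$$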

Can someone help, please?

PS: My intuition is that the vector consisting of the first $n_1$ entries of $v$ followed by zeroes is an eigenvector, and that $n_1$ zeroes followed by the corresponding $n_2$ entries of $v$, followed by zeroes, is another eigenvector, independent of the first. Is this correct?

Best Answer

Your intuition is totally correct.

$v = \mathbf{1}$ is an eigenvector for the eigenvalue $k$: every vertex has exactly $k$ neighbors, so each row of $A$ sums to $k$, and hence $A \mathbf{1} = k \mathbf{1}$.

Now, let $v_i$ denote the characteristic vector of the $i$'th connected component, i.e., $v_i(x) = 1$ for all vertices $x$ within the $i$'th component and $v_i(x) = 0$ otherwise; thus $\mathbf{1} = \sum_{i=1}^t v_i$.

For any $i = 1, \ldots, t$ you get that $A v_i = k v_i$.
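To see this, write $C_i$ for the vertex set of the $i$'th component (a label I introduce just to spell the computation out): every vertex of $C_i$ has all $k$ of its neighbors inside $C_i$, while a vertex outside $C_i$ has none of its neighbors there, so for every vertex $x$

$$(A v_i)(x) = \sum_{y \sim x} v_i(y) = \begin{cases} k & \text{if } x \in C_i,\\ 0 & \text{otherwise,} \end{cases}$$

which is exactly $k \, v_i(x)$.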

Thus, as you argued, if $X$ is not connected, then the block diagonalized form shows that the $v_i$ are $t>1$ distinct (even pairwise orthogonal) eigenvectors for the eigenvalue $k$, so the multiplicity of $k$ is at least $t > 1$.

Clearly, $\mathbf{1}$ is still an eigenvector, but it is a sum of the other eigenvectors $v_i$, which span the $t$-dimensional eigenspace of the eigenvalue $k$.

Further, you do not even need the block diagonalized form, although it clarifies what is going on. In general, multiplying the adjacency matrix by a vector $w \in \{0,1\}^n$ gives a vector whose entry at a vertex counts that vertex's neighbors carrying the value $1$ in $w$. So (for $k \geq 1$) the positions of the zero-entries in $Aw$ match the positions of the zero-entries in $w$ if and only if $w$ is the characteristic vector of a union of connected components.
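As a sanity check, here is a minimal numerical sketch (assuming NumPy; the example graph, two disjoint triangles, is my own choice): it confirms that for this $2$-regular graph with $t = 2$ components the eigenvalue $k = 2$ has multiplicity $2$, that the characteristic vectors of the components are eigenvectors, and that a $0/1$ vector which is not a characteristic vector of a union of components breaks the zero-pattern.

```python
import numpy as np

# Two disjoint triangles: a 2-regular graph with t = 2 connected components.
A1 = np.ones((3, 3)) - np.eye(3)            # adjacency matrix of a triangle (K_3)
A = np.block([[A1, np.zeros((3, 3))],
              [np.zeros((3, 3)), A1]])      # A = diag(A1, A1)

k = 2
eigenvalues = np.linalg.eigvalsh(A)
print(np.sum(np.isclose(eigenvalues, k)))   # multiplicity of k: prints 2

# Characteristic vectors of the two components are eigenvectors for k:
v1 = np.array([1, 1, 1, 0, 0, 0], dtype=float)
v2 = np.array([0, 0, 0, 1, 1, 1], dtype=float)
print(np.allclose(A @ v1, k * v1), np.allclose(A @ v2, k * v2))  # True True

# A 0/1 vector that is not a characteristic vector of a union of components
# does not preserve the zero-pattern: A @ w is nonzero where w is zero.
w = np.array([1, 0, 0, 0, 0, 0], dtype=float)
print(w)      # [1. 0. 0. 0. 0. 0.]
print(A @ w)  # [0. 1. 1. 0. 0. 0.]
```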
