Smallest eigenvalue of a nearest neighbor matrix in $2$ dimensions.

Tags: eigenvalues-eigenvectors, linear-algebra, matrices, symmetric-matrices

Consider a 2D square lattice with $n \times n$ lattice sites. A matrix $M_n$ of size $n^2 \times n^2$ is constructed by setting $M_{ij} = u$ (where $0 \leq u \leq 1$) if sites $i$ and $j$ are nearest neighbors, $M_{ij} = 0$ otherwise, and all the diagonal elements $M_{ii} = 1$.

For example, with $2 \times 2$ lattice sites, we have
$$M_2 = \begin{pmatrix} 1 & u & u & 0 \\ u & 1 & 0 & u \\ u & 0 & 1 & u \\ 0 & u & u & 1 \end{pmatrix}$$ (i.e. site 1 has nearest neighbors 2, 3; site 2 has nearest neighbors 1, 4; etc.). The smallest eigenvalue of $M_2$ is $\lambda_\text{min}^{(2)} = 1-2u$, for $M_3$ it is $\lambda_\text{min}^{(3)} = 1-2\sqrt{2}\,u$, and for $M_4$ it is $\lambda_\text{min}^{(4)} = 1-(1+\sqrt{5})u$. Numerically I seem to get $\lambda_{\text{min}}^{(n)} \to 1-4u$ as $n \to \infty$, but I am not sure how to prove it.

Using the Gershgorin circle theorem, I am able to get the bound $\lambda_{\text{min}}^{(n)} \geq 1-4u$, so it seems like the matrix saturates this lower bound in the limit $n \to \infty$. Is there a way to prove this?
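For reference, the small cases and the apparent limit can be checked numerically. The following is a sketch using NumPy; the helper name `nn_matrix` and the site labeling $(r, c) \mapsto rn + c$ are my own choices, not part of the question.

```python
import numpy as np

def nn_matrix(n, u):
    """Build M_n for an n x n square lattice; site (r, c) is labeled r*n + c."""
    M = np.eye(n * n)
    for r in range(n):
        for c in range(n):
            i = r * n + c
            if c + 1 < n:                      # horizontal neighbor
                M[i, i + 1] = M[i + 1, i] = u
            if r + 1 < n:                      # vertical neighbor
                M[i, i + n] = M[i + n, i] = u
    return M

u = 0.3
mins = {n: np.linalg.eigvalsh(nn_matrix(n, u)).min() for n in (2, 3, 4, 40)}
# mins[2], mins[3], mins[4] match 1 - 2u, 1 - 2*sqrt(2)*u, 1 - (1 + sqrt(5))*u,
# and mins[40] is already close to 1 - 4u.
```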

Best Answer

It is clear that when $u = 0$, $M_n$ has $1$ as its only eigenvalue. With that in mind, I will consider only the case where $0<u\leq 1$. $I_k$ denotes the $k\times k$ identity matrix.

Note that the matrix $A_n := \frac 1{u} (M_n - I_{n^2})$ is the adjacency matrix of the $n \times n$ square grid graph. Because this grid is the Cartesian product of the length-$n$ path with itself, we have
$$ A_n = B_n \otimes I_n + I_n \otimes B_n, $$
where $B_n$ denotes the adjacency matrix of the path graph on $n$ vertices and $\otimes$ denotes a Kronecker product. It follows that the eigenvalues of $A_n$ are exactly the sums $\lambda_i + \lambda_j$ for $1 \leq i,j \leq n$, where $\lambda_1,\dots,\lambda_n$ are the eigenvalues of $B_n$. It is known that the eigenvalues of $B_n$ are given by
$$ \lambda_j = 2 \cos \left(\frac{\pi j}{n+1} \right), \quad j = 1,2,\dots,n, $$
so the smallest eigenvalue of $B_n$ is $\lambda_n = 2 \cos\left(\frac{\pi n}{n+1} \right) = -2\cos\left(\frac{\pi}{n+1}\right)$, and the smallest eigenvalue of $A_n$ is $2 \lambda_n = -4\cos\left(\frac{\pi}{n+1}\right)$. Hence the smallest eigenvalue of $M_n = I_{n^2} + uA_n$ is given by
$$ \lambda_{\min} = 1 - 4u \cos \left(\frac{\pi}{n+1} \right). $$
Since $\cos\left(\frac{\pi}{n+1}\right) \to 1$ as $n \to \infty$, we get $\lambda_{\min}^{(n)} \to 1-4u$, as observed numerically; note that the Gershgorin bound $1-4u$ is approached but never attained for finite $n$.
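The closed form above can be checked against a brute-force eigendecomposition of the Kronecker-sum construction. A minimal sketch (function names are my own):

```python
import numpy as np

def path_adjacency(n):
    """Adjacency matrix B_n of the path graph on n vertices."""
    return np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)

def min_eig(n, u):
    """Smallest eigenvalue of M_n = I + u * A_n, with A_n = B ⊗ I + I ⊗ B."""
    B, I = path_adjacency(n), np.eye(n)
    A = np.kron(B, I) + np.kron(I, B)
    return np.linalg.eigvalsh(np.eye(n * n) + u * A)[0]  # eigvalsh sorts ascending

u = 0.7
checks = [(n, min_eig(n, u), 1 - 4 * u * np.cos(np.pi / (n + 1)))
          for n in (2, 3, 4, 10, 25)]
```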


Proof of the Cartesian product formula:

Suppose that we have graphs $G_1$ over nodes $[m]=\{1,\dots,m\}$ and $G_2$ over nodes $[n] = \{1,\dots,n\}$. Let $A^i$ denote the adjacency matrix of graph $G_i$ (for $i = 1,2$) and let $A$ denote the adjacency matrix of the Cartesian product $G = G_1 \square G_2$. Take the elements of $[m]\times[n]$ in lexicographical order, so that the entry $(e_i^{(m)} \otimes e_j^{(n)})^TA(e_p^{(m)} \otimes e_q^{(n)})$ is equal to $1$ iff the nodes $(i,j)$ and $(p,q)$ are adjacent within the Cartesian product $G$ (where $e_1^{(n)},\dots,e_n^{(n)}$ is the standard basis of $\Bbb R^n$).

We assume that $G_1$ and $G_2$ contain no loops, so $i = j$ and $i \sim j$ cannot simultaneously be true.

By the definition of the Cartesian product, we have $(e_i^{(m)} \otimes e_j^{(n)})^TA(e_p^{(m)} \otimes e_q^{(n)}) = 1$ iff $i=p$ and $j \sim q$ ($j$ is adjacent to $q$) or $i \sim p$ and $j = q$. Verify that this is logically equivalent to saying that $$ (e_i^{(m)} \otimes e_j^{(n)})^TA(e_p^{(m)} \otimes e_q^{(n)}) = \delta_{ip} A^{2}_{j,q} + A^{1}_{i,p} \delta_{jq}. $$ On the other hand, $$ (e_i^{(m)} \otimes e_j^{(n)})^T[A^1 \otimes I_n + I_m \otimes A^2](e_p^{(m)} \otimes e_q^{(n)}) = \\ (e_i^{(m)} \otimes e_j^{(n)})^T[A^1 \otimes I_n ](e_p^{(m)} \otimes e_q^{(n)}) + (e_i^{(m)} \otimes e_j^{(n)})^T[I_m \otimes A^2](e_p^{(m)} \otimes e_q^{(n)}) =\\ ((e_i^{(m)})^T A^1 e_p^{(m)})((e_j^{(n)})^T e_q^{(n)}) + ((e_i^{(m)})^T e_p^{(m)})((e_j^{(n)})^T A^2 e_q^{(n)}) =\\ A^{1}_{i,p} \delta_{jq} + \delta_{ip}A^2_{j,q}. $$ So, we indeed have $A = A^1 \otimes I_n + I_m \otimes A^2$. In the case of the square grid, $m = n$ and $A^1 = A^2 = B_n$, which gives the formula for $A_n$ used above.
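The identity can also be sanity-checked on random loop-free graphs by building the product adjacency directly from the definition. A sketch under the same lexicographic ordering of $[m]\times[n]$ as in the proof (helper names are my own):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_adjacency(k):
    """Symmetric 0/1 adjacency matrix with zero diagonal (no loops)."""
    upper = np.triu(rng.integers(0, 2, size=(k, k)), 1)
    return upper + upper.T

m, n = 3, 4
A1, A2 = random_adjacency(m), random_adjacency(n)

# Adjacency of the Cartesian product, straight from the definition:
# (i, j) ~ (p, q)  iff  (i == p and j ~ q)  or  (i ~ p and j == q),
# with node (i, j) at lexicographic index i*n + j.
A = np.zeros((m * n, m * n))
for i in range(m):
    for j in range(n):
        for p in range(m):
            for q in range(n):
                if (i == p and A2[j, q]) or (A1[i, p] and j == q):
                    A[i * n + j, p * n + q] = 1

ok = np.array_equal(A, np.kron(A1, np.eye(n)) + np.kron(np.eye(m), A2))
```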
