**Intro**

I'll present an incomplete interpretation framework.

Let's begin by considering two points in one dimension. The results generalize easily to more than one dimension.

Our matrix is given by,

$$M=\begin{bmatrix} 0 & |x_1-x_2| \\ |x_1-x_2| & 0 \end{bmatrix}$$

The eigenvalues are given by
$$\lambda_1=-\lambda_2=|x_1-x_2|$$
The eigenvectors are given by
$$v_1=\begin{bmatrix} 1 \\ 1 \end{bmatrix} \quad v_2=\begin{bmatrix} 1 \\ -1 \end{bmatrix}$$
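We can check this numerically. Here's a quick sketch using NumPy, with hypothetical positions $x_1=3$ and $x_2=7$ (so $|x_1-x_2|=4$):

```python
import numpy as np

# Hypothetical positions for the two points; any values work.
x1, x2 = 3.0, 7.0
d = abs(x1 - x2)  # |x1 - x2| = 4

M = np.array([[0.0, d],
              [d,   0.0]])

# M is symmetric, so eigh applies; eigenvalues come back in ascending order.
eigenvalues, eigenvectors = np.linalg.eigh(M)
print(eigenvalues)   # [-4.  4.], i.e. -|x1 - x2| and +|x1 - x2|
# The columns are unit eigenvectors; up to scaling they are (1, -1) and (1, 1).
print(eigenvectors)
```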

There are three parts to providing an interpretation of the above. First, we need to know what $M$ does. Second, we need to know what the vectors it acts on represent. Third, we need to know what the eigenvalues and eigenvectors mean in that context.

**Interpretation**

What does $M$ do? The best way to figure this out is to express it in terms of things we already know. Comparing it to a reflection matrix, we see that our matrix swaps the components of a vector. In addition, it scales the vector by the distance between the data points.

So our matrix $M$ operates on a vector $v$. What is $v$? Since the question doesn't specify, I've taken the liberty of choosing its meaning: $v$ records the number of trips taken from each data point. As an example, consider

$$v=\begin{bmatrix} n_1 \\ n_2 \end{bmatrix}$$

Here $n_1$ denotes the number of trips between $x_1$ and the other data points, and $n_2$ denotes the number of trips between $x_2$ and the other data points. If we interpret the $x_i$ as actual points in physical space, then $M v = d$ is a vector whose components $d_i$ give the total distance traveled from $x_i$ to the other points in space.

**Example**

Here's an explicit example. Imagine that you're a busy young adult thinking about where to live so you can juggle trips between work, your mom's house, and a friend's house. Work is located at $x_1=55$, Mom's at $x_2=0$, and your friend's house is at $x_3=30$. We have,

$$M=\begin{bmatrix} 0 & 55 & 25 \\ 55 & 0 & 30 \\ 25 & 30 & 0 \end{bmatrix}$$

So let's imagine that you want to decide where to live. These are the only places you ever go, so you really want to pick a place close to one of the locations, effectively right next to one of them if possible. You know that you have to go to and from work 14 times a week wherever you live. You also know that mom's going to want to see you twice a week, so you'll be making 4 trips to and from her place. Finally, you're in a band with your friend, and three practices a week means another six to-and-from trips. You also know that you'll always have to stop by where you live to get ready before going back out: work clothes aren't good for band practice or dinner with mom, lugging a guitar around isn't convenient, and mom will be furious if you leave her place straight for work or your friend's. Putting this together we have,

$$v=\begin{bmatrix} 14 \\ 4 \\ 6 \end{bmatrix}$$

So to get the total distance you'll have to travel if you live at each of these locations, we multiply $M$ by $v$.

$$M \cdot v=d=\begin{bmatrix} d_1 \\ d_2 \\ d_3 \end{bmatrix}=\begin{bmatrix} 370 \\ 950 \\ 470 \end{bmatrix}$$

So if you live next to work, you'll have to travel $d_1=370$ units per week.
If you live next to mom, you'll have to travel $d_2=950$ units per week.
If you live next to your friend's, you'll have to travel $d_3=470$ units per week.
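As a quick sanity check, the multiplication above can be reproduced in a few lines of NumPy:

```python
import numpy as np

# Distance matrix for work (x1 = 55), mom's (x2 = 0), and the friend's (x3 = 30).
M = np.array([[0, 55, 25],
              [55, 0, 30],
              [25, 30, 0]])

# Weekly trip counts: 14 for work, 4 for mom's, 6 for band practice.
v = np.array([14, 4, 6])

d = M @ v
print(d)  # [370 950 470]
```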

So clearly, it's best to live next to work. Note that this is very similar to what happens in real life!

However, it may also be important to find $v$ such that,

$$M \cdot v=\lambda \cdot v$$

For instance, imagine the scenario above with a slight modification. Instead of wondering where to live, suppose our young adult has a less hectic job with a flexible schedule, something more reasonable than working seven days a week. In that case, they might want to choose the number of visits to each location so that the total distance traveled, based at any location, is directly proportional to the number of trips made. That sounds a little idealistic, but it's great for weekly planning.
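Such a trip plan is exactly the Perron eigenvector of $M$: since $M$ is symmetric with nonnegative entries (and the graph is connected), its largest eigenvalue has an eigenvector whose entries all share one sign. A sketch of finding it with NumPy:

```python
import numpy as np

M = np.array([[0.0, 55.0, 25.0],
              [55.0, 0.0, 30.0],
              [25.0, 30.0, 0.0]])

eigenvalues, eigenvectors = np.linalg.eigh(M)  # ascending order
lam = eigenvalues[-1]        # largest eigenvalue
v = eigenvectors[:, -1]      # its eigenvector (the Perron vector)
if v[0] < 0:                 # fix the overall sign so the entries are positive
    v = -v

# For this trip plan, distance traveled is proportional to trips made:
print(lam)
print(M @ v / v)             # every component equals lam
```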

**Generalization**

Generally speaking, the situation I talked about was a weighted graph: each vertex represents a location, and each edge is weighted by the distance between its endpoints. The distance matrix is simply the weighted adjacency matrix of this graph.
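For instance, given coordinates for the locations, the distance matrix is just the weighted adjacency matrix of the complete graph on those points. A minimal sketch:

```python
import numpy as np

# One-dimensional coordinates for work, mom's, and the friend's house.
points = np.array([55.0, 0.0, 30.0])

# Weighted adjacency matrix of the complete graph: entry (i, j) is the
# distance |x_i - x_j|; the diagonal (no self-loops) is zero.
M = np.abs(points[:, None] - points[None, :])
print(M)
```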

Using this, we can describe new situations using graphs. There are also other posts dealing with the meaning of adjacency-matrix eigenvalues.

Here's a formal proof: suppose that $v$ denotes a length-$1$ eigenvector of the covariance matrix, which is defined by
$$
\Sigma = \Bbb E[XX^T]
$$
where $X = (X_1,X_2,\dots,X_n)$ is a column vector of random variables with mean zero (which is to say that we've already absorbed the mean into the variables' definition). So we have $\Sigma v = \lambda v$ (for some $\lambda \geq 0$), and $v^Tv = 1$.

Now, what do we really mean by "the variance of $v$"? $v$ is not a random variable. Really, what we mean is the variance of the associated component of $X$. That is, we're asking about the variance of $v^TX$ (the dot product of $X$ with $v$). Note that, since the $X_i$s have mean zero, so does $v^TX$. We then find
$$
\Bbb E([v^TX]^2) = \Bbb E([v^TX][X^Tv]) = \Bbb E[v^T(XX^T)v] =
v^T\Bbb E(XX^T) v \\
= v^T\Sigma v = v^T\lambda v = \lambda(v^Tv) = \lambda
$$
and this is what we wanted to show.
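The identity can also be checked empirically. The sketch below draws mean-zero samples with a hypothetical covariance and confirms that the variance of the projection $v^TX$ matches the eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical 2x2 covariance to generate mean-zero data from.
Sigma_true = np.array([[2.0, 0.5],
                       [0.5, 1.0]])
X = rng.multivariate_normal(np.zeros(2), Sigma_true, size=100_000)

# Sample covariance Sigma = E[X X^T] (the mean is already zero).
Sigma = (X.T @ X) / len(X)

eigenvalues, eigenvectors = np.linalg.eigh(Sigma)
v = eigenvectors[:, -1]   # unit-length eigenvector, v^T v = 1
lam = eigenvalues[-1]

# The variance of the component v^T X equals the eigenvalue lam.
proj = X @ v
print(proj.var(), lam)    # the two agree closely
```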

## Best Answer

I don't know if this is the intuition you are looking for, but basically eigenvectors and diagonalization uncouple complex problems into a number of simpler problems.

In physics, a stress or movement in one direction, say the $x$ direction, will often cause a stress or movement in the $y$ and $z$ directions. An eigenvector is a direction in which stress or movement stays in that same direction, so choosing an eigenbasis replaces a complex $3$-dimensional problem with three $1$-dimensional problems.
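Concretely, for a symmetric coupling matrix, changing to the eigenbasis diagonalizes it, leaving three independent one-dimensional problems. A sketch with a hypothetical stiffness matrix:

```python
import numpy as np

# Hypothetical symmetric "stiffness" matrix: a push along x also
# produces responses along y and z because of the off-diagonal coupling.
A = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.5, 0.2],
              [0.1, 0.2, 1.0]])

eigenvalues, Q = np.linalg.eigh(A)  # columns of Q are the eigendirections

# In the eigenbasis the matrix is diagonal: three uncoupled 1-D problems.
D = Q.T @ A @ Q
print(np.round(D, 10))
```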

You talk about pairwise combinations of inputs; eigenvectors simplify these into functions of just one input each.

Look up weakly coupled oscillatory systems in a dynamics book. Systems with complex oscillations are decomposed into eigenvectors, each with a simple periodic oscillation.
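As a small illustration of that idea, take two unit masses between walls, coupled by unit springs, so $\ddot{x} = -Kx$. The eigenvectors of $K$ are the normal modes (masses swinging together or in opposition), each oscillating at a single frequency:

```python
import numpy as np

# Two unit masses, each tied to a wall and to the other by unit springs:
#   x1'' = -2*x1 + x2,   x2'' = x1 - 2*x2,   i.e.  x'' = -K x
K = np.array([[2.0, -1.0],
              [-1.0, 2.0]])

omega_sq, modes = np.linalg.eigh(K)   # eigenvalues are squared frequencies

print(np.sqrt(omega_sq))  # mode frequencies: 1 and sqrt(3)
# Column (1, 1)/sqrt(2): masses swing together; (1, -1)/sqrt(2): in opposition.
print(modes)
```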