Adjacency matrix and changing order of vertices


I'm having trouble seeing how changing the vertex order permutes the rows and columns of the adjacency matrix.
The definition given:
Let $G$ be a graph with an ordered listing of vertices $v_1,v_2,\ldots,v_n$.
The adjacency matrix of $G$ is the $n\times n$ matrix $A = [a_{ij}]$ with
$$a_{ij} = \#\text{edges connecting $v_i$ and $v_j$.}$$
The entries $a_{ij}$ depend on the order in which the vertices have been numbered.
Changing the vertex order corresponds to permuting rows and columns.
I'm given these two graphs (the left one is the original graph, and the right one is the same graph with vertices $v_2$ and $v_3$ swapped):

The adjacency matrices are
$$A_1 = \begin{pmatrix}0 & 2 & 0 \\ 2 & 0 & 1 \\ 0 & 1 & 1\end{pmatrix}$$
$$A_2 = \begin{pmatrix}0 & 0 & 2 \\ 0 & 1 & 1 \\ 2 & 1 & 0\end{pmatrix}$$

I can't see the "permuting" of rows/columns here.
I see some values do swap around (which makes sense), but it looks like the main diagonal was permuted as well.
I want to understand this so I can handle the general case ($|V| = n$, and we swap, say, vertices $v_1, v_4, v_7$ around).

Best Answer

As aPaulT stated previously, this is a joint swap of the 2nd and 3rd rows and of the 2nd and 3rd columns (performed in either order). This can be represented by left- and right-multiplication of the adjacency matrix by appropriate permutation matrices.

Let $P$ be the permutation matrix that performs the row swap when multiplied on the left. Right-multiplying by $P^T$ then performs the same swap on the columns:

$P = \begin{bmatrix} 1&0&0 \\ 0&0&1 \\ 0&1&0 \end{bmatrix}$

In this case, $P$ is equal to $P^T$; however, it is not hard to find an example where this is not the case.
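
For instance, the permutation matrix corresponding to the $3$-cycle $v_1 \to v_2 \to v_3 \to v_1$ is not symmetric:

$$P = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{pmatrix} \neq P^T = \begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}$$

Only permutations that are products of disjoint transpositions (like the single swap here) satisfy $P = P^T$.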

To produce the adjacency matrix $A'$ of the vertex-swapped graph, perform the following operation:

$A' = PAP^T$
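
As a quick sanity check, here is a small NumPy sketch (my own illustration, using the matrices $A_1$, $P$, $A_2$ from this post) showing that $PAP^T$ applied to $A_1$ really does give $A_2$:

```python
import numpy as np

# Adjacency matrix of the original graph (vertex order v1, v2, v3).
A1 = np.array([[0, 2, 0],
               [2, 0, 1],
               [0, 1, 1]])

# Permutation matrix that swaps the 2nd and 3rd rows when applied on the left.
P = np.array([[1, 0, 0],
              [0, 0, 1],
              [0, 1, 0]])

# Adjacency matrix of the graph with vertices v2 and v3 swapped.
A2 = np.array([[0, 0, 2],
               [0, 1, 1],
               [2, 1, 0]])

# A' = P A P^T: swap rows 2 and 3, then columns 2 and 3.
A_prime = P @ A1 @ P.T
print(A_prime)
assert np.array_equal(A_prime, A2)  # matches the matrix of the relabelled graph
```

The same identity covers the general case from the question: for $|V| = n$ and any relabelling of the vertices (for example one moving $v_1$, $v_4$, $v_7$ around), build the $n \times n$ permutation matrix $P$ for that relabelling and compute $PAP^T$.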
