I'm trying to follow a blog post about Graph Convolutional Neural Networks. To set up some notation, the blog post denotes a graph $\mathcal{G}$, its adjacency matrix $A$, and its degree matrix $D$.
A section of that blog post then says:
I understand how an adjacency matrix can be row-normalized with $A_{row} = D^{-1}A$, or column-normalized with $A_{col} = AD^{-1}$.
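To make these two normalizations concrete, here is a small NumPy sketch on a hypothetical 4-node path graph (the graph is my own example, not from the blog post). Row normalization makes every row of the result sum to 1, column normalization does the same for columns:

```python
import numpy as np

# Example graph (an assumption for illustration): a path 0-1-2-3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

deg = A.sum(axis=1)          # node degrees: [1, 2, 2, 1]
D_inv = np.diag(1.0 / deg)   # D^{-1}

A_row = D_inv @ A            # D^{-1} A : each row sums to 1
A_col = A @ D_inv            # A D^{-1} : each column sums to 1

print(A_row.sum(axis=1))     # -> [1. 1. 1. 1.]
print(A_col.sum(axis=0))     # -> [1. 1. 1. 1.]
```

So multiplying a feature vector by $A_{row}$ replaces each node's value with the plain average of its neighbours' values.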
My question: is there some intuitive interpretation of a symmetrically normalized adjacency matrix $A_{sym} = D^{-1/2}AD^{-1/2}$?
Best Answer
TL;DR: $\mathrm{A}_{sym}$ takes a kind of average over your neighbours while accounting for their own number of neighbours (a connection to a node that is connected to every other node carries less information than a connection to a node connected only to you). The square roots ensure that the largest eigenvalue is $\lambda_1 = 1$, which makes it possible to stack a large number of layers.
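The eigenvalue claim is easy to check numerically. Continuing with the same hypothetical path graph as an example (my own choice, not from the post): $A_{sym} = D^{-1/2}AD^{-1/2}$ is symmetric, so it has real eigenvalues, and for a graph with no isolated nodes its largest eigenvalue is exactly $1$ (it is $I$ minus the normalized Laplacian, whose smallest eigenvalue is $0$):

```python
import numpy as np

# Example graph (an assumption for illustration): a path 0-1-2-3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

deg = A.sum(axis=1)
D_inv_sqrt = np.diag(deg ** -0.5)
A_sym = D_inv_sqrt @ A @ D_inv_sqrt   # D^{-1/2} A D^{-1/2}, symmetric

eigvals = np.linalg.eigvalsh(A_sym)   # real eigenvalues of a symmetric matrix
print(eigvals.max())                  # largest eigenvalue, ~1.0
```

Note also that $A_{row} = D^{-1}A$ is similar to $A_{sym}$ (conjugate by $D^{1/2}$), so they share the same eigenvalues; the symmetric version just keeps the operator symmetric, which is convenient both numerically and for interpretation.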
As requested, this answer is based on intuition, although Math Stack Exchange might not be the best place for that.
Preliminaries:
Answer:
Here is an intuitive explanation of what would happen for different choices of $\hat{\mathrm{A}}_{?}$: