Weighted adjacency matrix normalization for GCNs: how to normalize, and what about the self-loop values?

graph-theory, graph-neural-network, machine-learning, normalization

I am implementing a GCN that will work on a weighted graph. The edge weights are in the range [1, 250]. When it comes to normalizing the adjacency matrix for GCNs, the standard formula for a convolutional layer is:

$$H^{(l+1)} = \sigma \left( \tilde{D}^{-\frac{1}{2}} \tilde{A} \tilde{D}^{-\frac{1}{2}} H^{(l)} W^{(l)} \right)$$

where

$$\tilde{A} = A + I_N$$
$$\tilde{D}_{ii} = \sum_j \tilde{A}_{ij}$$
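
For concreteness, here is a minimal dense NumPy sketch of this propagation rule (the function name `gcn_layer` and the choice of `tanh` as $\sigma$ are my own, not taken from any particular library):

```python
import numpy as np

def gcn_layer(A, H, W, activation=np.tanh):
    """One GCN propagation step: sigma(D~^{-1/2} (A + I) D~^{-1/2} H W).

    A: (N, N) weighted adjacency matrix
    H: (N, F_in) node features from the previous layer
    W: (F_in, F_out) weight matrix of this layer
    """
    N = A.shape[0]
    A_tilde = A + np.eye(N)                        # add self-loops with weight 1
    d_tilde = A_tilde.sum(axis=1)                  # weighted degrees D~_ii
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d_tilde))   # D~^{-1/2}
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt      # symmetrically normalized adjacency
    return activation(A_hat @ H @ W)
```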

In the case of a weighted graph, the adjacency matrix can contain values much greater than 1. When self-loops are added (by adding the identity matrix to the adjacency matrix), they get a weight of 1, while the off-diagonal entries of $A$ can be far larger. As a result, after multiplying the feature matrix by the adjacency matrix normalized with the degree matrix, the attributes of the node for which we are computing an embedding have a much smaller influence than those of its neighbours, especially when the edges connecting this node to its neighbours carry large weights.
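
To make the imbalance concrete, here is a toy example (a made-up two-node graph with a single edge of weight 250):

```python
import numpy as np

# Two nodes joined by one edge of weight 250; self-loops get weight 1.
A = np.array([[  0.0, 250.0],
              [250.0,   0.0]])
A_tilde = A + np.eye(2)                        # weighted degrees are 251 for both nodes
d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
A_hat = np.diag(d_inv_sqrt) @ A_tilde @ np.diag(d_inv_sqrt)

print(A_hat)
# [[0.00398406 0.99601594]
#  [0.99601594 0.00398406]]
# A node's own features contribute with coefficient 1/251 ~ 0.004,
# while the neighbour's contribute with 250/251 ~ 0.996.
```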

How can I solve this problem? How is it handled in the literature? Could you point me to some examples?
I couldn't find anything specific.

Thank you

Best Answer

I think this can be addressed in two ways:

  • Pick a suitable self-connection weight based on the meaning of the edges. For example, in a correlation graph, where the edge strength measures the correlation between nodes, a self-loop weight of $1$ makes sense because a node's correlation with itself is $1$. In your case, you have to decide how strongly a node should be connected to itself. You can also achieve this by rescaling your edge weights beforehand, e.g. mapping [1, 250] to [1/250, 1].
  • Introduce a new hyperparameter (HP) $\lambda$ and reformulate the self-loop step as $\tilde A = A + \lambda I_N$. Tune $\lambda$ as you would any other HP (a sketch of both options is given below).

Otherwise, it seems inevitable that you will hit the issue you're describing.
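
A minimal sketch of both options, assuming a dense NumPy adjacency matrix and a known maximum edge weight of 250; the helper name `normalize_adj` and the $\lambda$ value used here are illustrative, not taken from a specific paper or library:

```python
import numpy as np

def normalize_adj(A, lam=1.0):
    """Symmetric normalization with a tunable self-loop weight:
    D~^{-1/2} (A + lam * I) D~^{-1/2}."""
    A_tilde = A + lam * np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    return np.diag(d_inv_sqrt) @ A_tilde @ np.diag(d_inv_sqrt)

# A small weighted graph with raw edge weights in [1, 250].
A = np.array([[  0.0, 250.0,  10.0],
              [250.0,   0.0,   1.0],
              [ 10.0,   1.0,   0.0]])

# Option 1: rescale the edge weights to [1/250, 1] first, so that a self-loop
# of weight 1 is as strong as the strongest possible edge.
A_hat_scaled = normalize_adj(A / 250.0, lam=1.0)

# Option 2: keep the raw weights and treat the self-loop weight as a
# hyperparameter, tuned like any other (the value 100.0 is arbitrary).
A_hat_lambda = normalize_adj(A, lam=100.0)
```

In both cases the self-loop contribution becomes comparable to the neighbour contributions instead of being drowned out by edge weights near 250.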
