[Math] Measuring the entropy of a graph representing a transition probability matrix of a first-order Markov chain

entropy, graph-theory, markov-chains, matrices

There's a research project I'm currently working on that requires me to analyze various aspects of "worlds" represented by transition probability matrices, where the nodes represent objects in the world that our "learner" travels through.
I'm looking for a way to measure the degree to which the structure of the connections in a given graph provides the learner with any predictive power. For example, in a graph where every transition out of a node is equally likely, the learner has no ability to predict the next node given the current one.

The graphs are all weighted and directed, with the sum of outgoing connections from each node normalized to one.

I'm having a bit of trouble coming up with a good way to measure this and would really appreciate some advice.
Thanks,
Ron

Best Answer

The quantity you are after is the entropy rate of the Markov chain, $h(p,P)=-\sum_i p_i\sum_j P_{i,j}\log P_{i,j}$, where $p$ is the stationary distribution of $P$ (the left eigenvector for eigenvalue $1$, normalized to sum to $1$). Here is an example (not a very nice one, though):

$P=\begin{pmatrix}1/6&1/2&1/3\\1/2&0&1/2\\0&1/3&2/3\end{pmatrix}$

Left eigenvector for eigenvalue $1$ (i.e. $pP=p$): $p=(1, 5/3, 7/2)$

Normalizing so the entries sum to $1$: $p_\text{normalized}=(6/37, 10/37, 21/37)$
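
If you want to reproduce this step numerically, here is a minimal NumPy sketch (variable names are mine, not part of the answer) that recovers the stationary distribution as a left eigenvector of $P$:

```python
import numpy as np

# Transition matrix from the example above (each row sums to 1)
P = np.array([[1/6, 1/2, 1/3],
              [1/2, 0.0, 1/2],
              [0.0, 1/3, 2/3]])

# The stationary distribution p satisfies pP = p, so it is a right
# eigenvector of P.T for eigenvalue 1, rescaled so its entries sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
p = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
p = p / p.sum()

print(p)  # ~ [0.1622, 0.2703, 0.5676], i.e. (6/37, 10/37, 21/37)
```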

So the entropy rate (after simplifying a bit by collecting terms with equal logarithms) is:

$h(p,P)=-\frac{1}{37}\cdot\log(1/6)-\frac{9}{37}\cdot\log(1/3)-\frac{13}{37}\cdot\log(1/2)-\frac{14}{37}\cdot\log(2/3)$

This gives, using the natural logarithm, an approximate value of:

$h(p,P)\approx 0.7126$

One final thing: if your matrix $P$ has zero entries, the corresponding terms $P_{i,j}\log P_{i,j}$ are taken to be zero, even though the logarithm alone would be negative infinity (the usual convention $0\cdot\log 0=0$; multiplying by zero wins here).
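
To make that convention concrete, here is one way (again my sketch, not the answerer's code) to compute $h(p,P)$ in NumPy while skipping the zero entries:

```python
import numpy as np

P = np.array([[1/6, 1/2, 1/3],
              [1/2, 0.0, 1/2],
              [0.0, 1/3, 2/3]])
p = np.array([6/37, 10/37, 21/37])  # stationary distribution from above

# log P with the convention 0 * log 0 = 0: zero entries of P are simply
# left at zero instead of producing -inf.
logP = np.zeros_like(P)
mask = P > 0
logP[mask] = np.log(P[mask])

# Entropy rate h(p, P) = -sum_i p_i sum_j P_ij log P_ij  (in nats)
h = -np.sum(p[:, None] * P * logP)
print(h)  # ~ 0.7126; divide by np.log(2) to get bits
```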
