Solved – The units of entropy with different log bases


Entropy is a measure of the randomness (uncertainty) of a random variable:
$$H(X) = -\sum_{x\in X}p(x)\log_{2}(p(x))$$

When $\log_{2}$ is used, the units are bits, i.e., the number of bits required to store the information present in the random variable $X$.
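For concreteness, here is a minimal Python sketch (the helper name `entropy` is just for illustration) that computes this quantity for a discrete distribution:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy of a discrete distribution, in the given log base."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally random: exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so it carries less entropy.
print(entropy([0.9, 0.1]))   # ~0.469 bits
```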

What are the units of entropy when another base is used for the log, e.g., the natural log, log base 10, or any other? How do we interpret those units?

Best Answer

For the natural log the units are called "nats"; for log base 10 the unit is called the hartley (also known as a ban or dit). I believe it's just a convention to define entropy with the natural log, and it probably stems from thermodynamic entropy, which uses nats for convenience. As Wikipedia puts it: "Physical systems of natural units that normalize Boltzmann's constant to 1 are effectively measuring thermodynamic entropy in nats".

Since the main concern about entropy is its role in the definition of mutual information between random variables, there is no practical effect of using different bases for the log: by the change-of-base rule $\log_{b}(x) = \log_{2}(x)/\log_{2}(b)$, entropies in different bases differ only by a constant multiplicative factor (e.g., $H_{\text{nats}} = H_{\text{bits}} \cdot \ln 2$).
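To see that the base only rescales the result, here is a short Python check (using an illustrative three-outcome distribution) that entropy in nats equals entropy in bits times $\ln 2$:

```python
import math

probs = [0.5, 0.25, 0.25]

h_bits = -sum(p * math.log2(p) for p in probs)  # base 2: 1.5 bits
h_nats = -sum(p * math.log(p) for p in probs)   # base e: nats

# Changing the log base only rescales entropy by a constant factor:
# H_nats = H_bits * ln(2)
print(h_nats)                 # ~1.0397
print(h_bits * math.log(2))   # ~1.0397 as well
```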
