Are there temperature and energy in mathematics?

entropy · information-theory · random-variables · soft-question

Temperature, energy, entropy, etc. are quantities in statistical mechanics, and are properties of specific physical systems.

However, recently I discovered that random variables have "entropy" $S$:

$S(X) = \mathbb{E}\left[ -\log p(X) \right] = -\sum_x p(x) \log p(x)$
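The sum above is straightforward to evaluate numerically. Here is a minimal sketch of that formula in Python (the function name `entropy` and the example distributions are my own choices, not from the question):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy S(X) = -sum_x p(x) log p(x), in bits by default.

    Terms with p(x) = 0 are skipped, following the convention 0 log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin has maximal entropy for two outcomes: 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # ≈ 0.469
```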

This makes me wonder: do random variables/distributions also have defined analogues of temperature and energy?

In statistical mechanics, these quantities satisfy the relation $\dfrac{\partial S}{\partial U} = \dfrac{1}{T}$.

Best Answer

Entropy in one branch of mathematics, information theory, is a measure of the information content of a message. It's not actually related to the entropy in physics except by analogy. A completely ordered message consisting of just ones or just zeros has no entropy, whereas a totally random unpredictable message has high entropy.
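The ordered-versus-random contrast can be made concrete by computing the empirical per-symbol entropy of a message (the helper `message_entropy` below is an illustration of this point, not from the answer):

```python
import math
from collections import Counter

def message_entropy(msg):
    """Empirical per-symbol entropy of a message, in bits."""
    counts = Counter(msg)
    n = len(msg)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# A message of all zeros is completely predictable: zero entropy.
print(message_entropy("00000000"))   # 0.0
# A message with equal numbers of 0s and 1s attains the maximum
# per-symbol entropy for a binary alphabet: 1 bit.
print(message_entropy("01101001"))   # 1.0
```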

As far as I know, there is no equivalent use of the words "temperature" or "energy" in information theory; for example, there is no standard notion of the "temperature" of an information source or the amount of "energy" in a message.

Even if these terms were used, they would be analogies to physics concepts rather than independently meaningful quantities.
