Information Theory – Can Entropy of a Countable Random Variable Be Infinite?

entropy, information theory, sequences-and-series

Consider a random variable $X$ taking values over $\mathbb{N}$. Let $\mathbb{P}(X = i) = p_i$ for $i \in \mathbb{N}$. The entropy of $X$ is defined by
$$H(X) = \sum_i -p_i \log p_i.$$
Is it possible for $H(X)$ to be infinite?
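
As a sanity check, here is a minimal numerical sketch (the helper `partial_entropy` and the geometric example are purely illustrative) of the partial sums $\sum_{i \le n} -p_i \log_2 p_i$; the question is whether these can be made to diverge.

```python
import math

def partial_entropy(p, n):
    """Partial sum of -p(i) * log2 p(i) for i = 1..n, for a pmf given as a function p."""
    return -sum(p(i) * math.log2(p(i)) for i in range(1, n + 1) if p(i) > 0)

# Example: the geometric pmf p_i = 2**(-i) has total entropy 2 bits,
# and the partial sums converge quickly to that value.
geometric = lambda i: 2.0 ** (-i)
print(partial_entropy(geometric, 60))   # ~= 2.0
```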

Best Answer

Consider independent random variables $X_1, X_2, X_3, \ldots$, where $X_k$ has a discrete uniform distribution over $2^k$ values, the supports being non-overlapping: $X_1 \sim U[2..3]$, $X_2 \sim U[4..7]$, $X_3 \sim U[8..15]$, etc. The respective entropies are $H(X_k) = k$ bits.
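
As a quick numerical check (a sketch; the helper name `uniform_entropy` is just for illustration), a uniform distribution over $2^k$ outcomes has entropy exactly $k$ bits:

```python
import math

def uniform_entropy(n):
    """Entropy in bits of a uniform distribution over n outcomes (equals log2 n)."""
    p = 1.0 / n
    return -n * p * math.log2(p)

for k in range(1, 6):
    # X_k is uniform over the 2**k values 2**k, ..., 2**(k+1) - 1.
    print(k, uniform_entropy(2 ** k))   # prints k alongside k.0
```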

Let $Y = X_M$, where $M$ is a random variable (independent of the $X_k$) with pmf $m_k$, $\sum_{k=1}^\infty m_k = 1$. This is known as a mixture, with mixing distribution $(m_k)$. Because the component supports do not overlap, $M$ is a function of $Y$, so $H(Y) = H(Y, M) = H(M) + H(Y \mid M)$, and the resulting entropy is

$$\begin{aligned} H(Y) &= H(M) + m_1 H(X_1) + m_2 H(X_2) + m_3 H(X_3) + \cdots \\ &= H(M) + m_1 + 2\,m_2 + 3\,m_3 + \cdots \\ &= H(M) + \sum_{k=1}^\infty k\, m_k = H(M) + E(M). \end{aligned}$$
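
As a sketch of this decomposition (assuming, purely for the sake of enumeration, a finite mixing distribution; the uniform choice $m_k = 1/K$ below is just illustrative), one can build the mixture pmf explicitly and compare $H(Y)$ with $H(M) + E(M)$:

```python
import math

def H(pmf):
    """Entropy in bits of a finite pmf given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# A finite mixing distribution over k = 1..K (illustrative choice).
K = 6
m = [1.0 / K] * K

# Component X_k is uniform over the 2**k integers 2**k .. 2**(k+1) - 1,
# so Y = X_M puts probability m_k / 2**k on each of those values.
mixture = []
for k, mk in enumerate(m, start=1):
    mixture += [mk / 2 ** k] * (2 ** k)

lhs = H(mixture)                                             # H(Y) computed directly
rhs = H(m) + sum(k * mk for k, mk in enumerate(m, start=1))  # H(M) + E(M)
print(lhs, rhs)                                              # the two agree
```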

Hence, by choosing any mixing distribution $m_k$ with infinite mean ($E(M) = \sum k\, m_k = +\infty$), we obtain infinite entropy. For example, $m_k = A/k^2$ with normalizing constant $A = 6/\pi^2$ (the example in David's answer essentially corresponds to this choice), or, similarly, $m_k = \frac{1}{k} - \frac{1}{k+1} = \frac{1}{k^2+k}$.
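
For the second choice, $m_k = 1/(k^2+k)$, the weights sum to $1$ exactly (telescoping), and a short sketch (helper names are illustrative) shows the partial entropy sums of $Y$ growing without bound:

```python
import math

def m(k):
    """Mixing weights m_k = 1/(k^2 + k) = 1/k - 1/(k+1); they telescope to 1."""
    return 1.0 / (k * k + k)

def block_contribution(k):
    """Total -p log2 p contribution of component k: the 2**k equiprobable
    values of X_k each carry probability m(k) / 2**k under the mixture Y."""
    return m(k) * (k - math.log2(m(k)))

# Partial sums of H(Y); they grow like a harmonic series because
# the terms k * m_k = 1/(k+1) alone already diverge.
for n in (10, 1000, 10 ** 6):
    print(n, sum(block_contribution(k) for k in range(1, n + 1)))
```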