# Is the entropy of a binomial distribution an increasing function of $n$?

Tags: entropy, information theory, probability, probability distributions, random variables

I was thinking about the entropy of a binomial distribution.
The Wikipedia page on the binomial distribution says that the entropy of $$\text{Binomial}(n,p)$$ is asymptotically $$\frac{1}{2} \log _{2}(2 \pi e n p(1-p))+O\left(\frac{1}{n}\right).$$
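As a quick sanity check, the asymptotic formula can be compared against the exact entropy computed directly from the pmf. A stdlib-only Python sketch (the function names are my own):

```python
import math

def binom_entropy(n, p):
    """Exact Shannon entropy (in bits) of Binomial(n, p), summed over the pmf."""
    h = 0.0
    for k in range(n + 1):
        pk = math.comb(n, k) * p**k * (1 - p)**(n - k)
        if pk > 0:  # skip zero-probability (or underflowed) terms
            h -= pk * math.log2(pk)
    return h

def asymptotic(n, p):
    """Leading term (1/2) log2(2*pi*e*n*p*(1-p)) of the expansion above."""
    return 0.5 * math.log2(2 * math.pi * math.e * n * p * (1 - p))

for n in (10, 100, 1000):
    print(n, binom_entropy(n, 0.3), asymptotic(n, 0.3))
```

The gap between the two columns shrinks as $$n$$ grows, consistent with the $$O(1/n)$$ error term.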

I noted that in the non-asymptotic regime there exist lower and upper bounds on the binomial entropy, as can be seen here.

I was wondering: is the binomial entropy an increasing (strictly speaking, non-decreasing) function of $$n$$ in general? My intuitive answer is yes, but I can't quite prove it. Can someone help me prove this?

My attempt: $$H(X+Y)=H(X)+H(Y)-H(X|X+Y)$$ for $$X,Y$$ i.i.d. Bernoulli RVs. Also, conditioning reduces entropy, so $$H(X+Y)\ge H(X)$$. Iterate this argument $$n$$ times to reach the binomial. Is this right?

That's right.

A Binomial $$(n,p)$$ can be expressed as the sum of $$n$$ i.i.d. Bernoulli $$(p)$$ variables. Hence it can also be expressed as the sum of an independent Binomial $$(n-1,p)$$ and a Bernoulli $$(p)$$.

Now, if $$Z = X + Y$$ (for independent $$X,Y$$) we have

$$H(Z,Y)= H(Y|Z) + H(Z) = H(Z | Y) + H(Y) \implies \\ H(Z)=H(X+Y)= H(X) + H(Y)-H(Y|Z) \ge H(X)$$

where the implication uses $$H(Z|Y)=H(X+Y|Y)=H(X|Y)=H(X)$$ (by independence), and the last inequality holds because conditioning reduces entropy: $$H(Y|Z) \le H(Y)$$.

Then, letting $$Z,X$$ be the Binomials and $$Y$$ the Bernoulli, we get the desired result.
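The decomposition can be verified numerically: build the joint pmf of $$(Y,Z)$$ with $$X \sim \text{Binomial}(n-1,p)$$ and $$Y \sim \text{Bernoulli}(p)$$ independent, $$Z=X+Y$$, and check both the identity and the inequality. A stdlib-only Python sketch (variable names are my own):

```python
import math

def H(probs):
    """Shannon entropy (bits) of a probability vector / iterable."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

n, p = 10, 0.3
# X ~ Binomial(n-1, p) and Y ~ Bernoulli(p), independent; Z = X + Y ~ Binomial(n, p).
px = [math.comb(n - 1, k) * p**k * (1 - p)**(n - 1 - k) for k in range(n)]
joint = {}  # joint pmf of (Y, Z): P(Y=y, Z=x+y) = P(Y=y) P(X=x)
for y, py in ((0, 1 - p), (1, p)):
    for x, pxk in enumerate(px):
        joint[(y, x + y)] = joint.get((y, x + y), 0.0) + py * pxk
pz = [sum(joint.get((y, z), 0.0) for y in (0, 1)) for z in range(n + 1)]

HX, HY, HZ = H(px), H((1 - p, p)), H(pz)
H_Y_given_Z = H(joint.values()) - HZ  # chain rule: H(Y|Z) = H(Y,Z) - H(Z)
assert abs(HZ - (HX + HY - H_Y_given_Z)) < 1e-9  # the identity above
assert H_Y_given_Z < HY                          # conditioning reduces entropy
assert HZ > HX                                   # entropy grew from n-1 to n
```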

Notice that the inequality actually is strict if $$0 < p < 1$$, since then $$H(Y|Z) < H(Y)$$.
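A direct numerical check of the monotonicity claim across several values of $$p$$ (stdlib-only Python; a sanity check, not a proof):

```python
import math

def binom_entropy(n, p):
    """Exact Shannon entropy (bits) of Binomial(n, p)."""
    h = 0.0
    for k in range(n + 1):
        pk = math.comb(n, k) * p**k * (1 - p)**(n - k)
        if pk > 0:
            h -= pk * math.log2(pk)
    return h

for p in (0.1, 0.3, 0.5, 0.9):
    hs = [binom_entropy(n, p) for n in range(1, 60)]
    # Strictly increasing in n, matching the strict inequality for 0 < p < 1.
    assert all(a < b for a, b in zip(hs, hs[1:]))
```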