Entropy – Upper Bound of Log of Multinomial Coefficient

entropy, multinomial-coefficients, upper-lower-bounds

A well-known approximation for multinomial coefficients uses the entropy; see this question: Expression for the size of type class, or multinomial coefficient.

Numerical examples suggest that we actually have the following upper bound
for any nonnegative integers $x_1,\ldots,x_m$ with $x_1+\cdots+x_m = n$ (with the convention $0 \lg 0 = 0$):
$$
\lg\left(\binom{n}{x_1,\ldots,x_m}\right)
\;\le\; n\cdot \mathcal H\left(\frac{x_1}n,\ldots,\frac{x_m}n\right)
\;=\;
n \lg n - \sum_{i=1}^m x_i \lg(x_i).
$$
How can we formally prove this?
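For reference, here is a minimal numerical spot-check of the claimed bound (a sketch; the helper names `log2_multinomial` and `entropy_bound` are mine, not from any library):

```python
from math import comb, log2

def log2_multinomial(counts):
    # lg of n!/(x_1! ... x_m!), built exactly as the integer product
    # C(x_1, x_1) * C(x_1 + x_2, x_2) * ... before taking log2.
    total, value = 0, 1
    for x in counts:
        total += x
        value *= comb(total, x)
    return log2(value)

def entropy_bound(counts):
    # n * H(x_1/n, ..., x_m/n) in bits = n lg n - sum_i x_i lg x_i,
    # with the convention 0 * lg 0 = 0.
    n = sum(counts)
    return n * log2(n) - sum(x * log2(x) for x in counts if x > 0)

# Spot-check the conjectured inequality on a few arbitrary compositions.
for counts in [(3, 5, 2), (1, 1, 1, 1), (10, 0, 4), (7, 13)]:
    lhs, rhs = log2_multinomial(counts), entropy_bound(counts)
    assert lhs <= rhs + 1e-9, (counts, lhs, rhs)
    print(counts, round(lhs, 4), "<=", round(rhs, 4))
```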

I tried inserting Stirling's formula for $n!$ (or for $\ln(n!)$ directly) and computing, but this either gives only an asymptotic statement, so that the above inequality holds only for large enough values of all $x_i$, which is not useful for me; or, when I use the more precise inequality versions like the one by Robbins (bottom of section; stated below for reference), it leaves me with terms that I cannot prove to imply the above bound either.
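For reference, the Robbins bounds mentioned above state that for all integers $n \ge 1$:
$$
\sqrt{2\pi n}\left(\frac{n}{e}\right)^n e^{1/(12n+1)}
\;<\; n! \;<\;
\sqrt{2\pi n}\left(\frac{n}{e}\right)^n e^{1/(12n)}.
$$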

Best Answer

Your conjecture seems correct, and I think you can find a proof of it in Csiszár & Shields, Information Theory and Statistics: A Tutorial (2004). Check Lemma 2.2 on pp. 429–430: if I read it correctly, it proves the bounds $$n H(\textbf{x}/n) - \ln\binom{n+m-1}{m-1} \leq \ln\binom{n}{\textbf{x}} \leq n H(\textbf{x}/n),$$ where $\textbf{x} := (x_1,\dotsc,x_m)$ and $H$ is taken with the natural logarithm (dividing through by $\ln 2$ recovers the binary version in the question).
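For completeness, the upper bound also follows in a few lines from the multinomial theorem (this is the standard type-class argument, not necessarily the exact proof in Csiszár & Shields). Expand $1 = \left(\frac{x_1}{n}+\cdots+\frac{x_m}{n}\right)^n$ and keep only the term with $y_i = x_i$, which is valid since all terms are nonnegative:
$$
1 \;=\; \sum_{y_1+\cdots+y_m=n} \binom{n}{y_1,\ldots,y_m} \prod_{i=1}^m \left(\frac{x_i}{n}\right)^{y_i}
\;\ge\; \binom{n}{x_1,\ldots,x_m} \prod_{i=1}^m \left(\frac{x_i}{n}\right)^{x_i}.
$$
Rearranging gives $\binom{n}{x_1,\ldots,x_m} \le \prod_{i=1}^m (x_i/n)^{-x_i}$, and taking $\lg$ yields exactly $\lg\binom{n}{x_1,\ldots,x_m} \le n \lg n - \sum_{i=1}^m x_i \lg x_i$.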
