Find $\delta$ such that $\delta \log \frac{1}{\delta} + (1-\delta) \log \frac{1}{1-\delta} < \epsilon$ for every $\epsilon > 0$

entropy, inequality, probability, upper-lower-bounds

Consider a random variable $$X = \begin{cases}x_0:&\text{with probability } \delta \\ x_1:& \text{with probability } 1-\delta \end{cases}$$ The entropy of $X$ is $$H(X) = \delta \log \frac{1}{\delta} + (1-\delta) \log \frac{1}{1-\delta}$$
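For concreteness (the base of the logarithm is not fixed above; the values below assume $\log = \log_2$, and a different base only rescales $H$ by a constant factor): $$H(X)\big|_{\delta = 1/2} = \tfrac12\log_2 2 + \tfrac12\log_2 2 = 1 \quad\text{(the maximum)},$$ $$H(X)\big|_{\delta = 0.01} = 0.01\log_2 100 + 0.99\log_2\tfrac{1}{0.99} \approx 0.066 + 0.014 \approx 0.08,$$ so shrinking $\delta$ toward $0$ appears to drive $H(X)$ toward $0$.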

How can we show that for all $\epsilon>0$, we can always find a $\delta > 0$ such that $H(X) < \epsilon$?

I'm not able to find a way to do this cleanly. For example, I've tried setting $\delta = c \epsilon$ for different values of $c$ (such as $\delta = \epsilon$, $\delta = \frac{\epsilon}{1000}$, etc.), but I haven't been able to find a choice of $\delta$ or $c$ that works.
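As a quick numerical sketch (not a proof), the experiment above can be reproduced directly; `binary_entropy` below is just a hypothetical helper, and the base-2 logarithm is an assumption since the post does not fix the base:

```python
import math

def binary_entropy(delta: float, base: float = 2.0) -> float:
    """H(X) = delta*log(1/delta) + (1-delta)*log(1/(1-delta)).

    Base-2 logarithm is an assumption; a different base only
    rescales H by a constant factor.
    """
    if delta <= 0.0 or delta >= 1.0:
        return 0.0  # limiting values at the endpoints
    return (delta * math.log(1.0 / delta, base)
            + (1.0 - delta) * math.log(1.0 / (1.0 - delta), base))

for eps in (0.5, 0.1, 0.01):
    # The linear guesses from the post: delta = eps and delta = eps/1000.
    print(f"eps={eps}: H(eps)={binary_entropy(eps):.4f}, "
          f"H(eps/1000)={binary_entropy(eps / 1000.0):.6f}")

    # Brute force: keep halving delta until H(delta) < eps.
    delta = 0.5
    while binary_entropy(delta) >= eps:
        delta /= 2.0
    print(f"  delta={delta:.6f} gives H={binary_entropy(delta):.4f} < eps")
```

For any fixed $c$, the ratio $\delta = c\epsilon$ eventually fails for small $\epsilon$ (the factor $\log\frac{1}{\delta}$ grows without bound), but simply shrinking $\delta$ further always succeeds, which is what the limit argument in the answer below makes precise.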

Other approaches I tried included the log-sum inequality ($\sum_i a_i \log \frac{a_i}{b_i} \geq (\sum_i a_i) \log \frac{\sum_i a_i}{\sum_i b_i}$) and elementary bounds on the logarithm such as $1-\frac{1}{x} \leq \ln x \leq x-1$, but the best I've been able to do is show that $H(X) \leq 1$, which isn't useful here.

Best Answer

This is equivalent to proving that $$\lim_{\delta \to 0^+}\delta \log\frac{1}{\delta}=0$$ and $$\lim_{\delta \to 1^-} \delta \log\frac{1}{\delta} = 0.$$ (The second limit is just the term $(1-\delta)\log\frac{1}{1-\delta}$ of $H(X)$ after the substitution $\delta \mapsto 1-\delta$, so together the two limits say that $H(X) \to 0$ as $\delta \to 0^+$.)

The second limit is trivial (just substitute $\delta = 1$ since the function is continuous). For the first one you can write $$\delta \log\frac{1}{\delta} = \frac{\log\frac{1}{\delta}}{\frac{1}{\delta}}$$ and use L'Hospital's Rule.
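Carrying out the L'Hospital step explicitly (taking $\log$ to be the natural logarithm; any other base only multiplies the expression by a constant), both numerator and denominator tend to $+\infty$ as $\delta \to 0^+$, so $$\lim_{\delta \to 0^+}\frac{\log\frac{1}{\delta}}{\frac{1}{\delta}} = \lim_{\delta \to 0^+}\frac{-\frac{1}{\delta}}{-\frac{1}{\delta^2}} = \lim_{\delta \to 0^+} \delta = 0.$$ Since $H(X) \to 0$ as $\delta \to 0^+$, the definition of the limit gives, for every $\epsilon > 0$, a $\delta > 0$ with $H(X) < \epsilon$, which is exactly what was asked.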