[Math] An elementary proof for a bound on $x \log x$

information-theory, real-analysis

During one of our information theory classes, the Professor used the following bound to prove a result.

For any $x,y \in (0,1)$, $x \neq y$, show that
$$|x \log(x) - y \log(y)| \leq |x-y|\log \left(\frac{1}{|x-y|}\right)$$

Note that the RHS is positive when written this way. Also note that the base of the logarithm is immaterial here (I take base $e$ to simplify calculations). Looking at $|x-y|$, I thought of using the mean value theorem. W.l.o.g. let $x > y$. Then, applying the mean value theorem to $u \log u$, we have

$$x \log(x) - y \log(y) = (x-y)(1 + \log z) = (x-y)\log(ez)$$

for some $z \in (y,x)$. Now I do not know how to show that (after removing the logs)

$$ez \leq \left(\frac{1}{x-y}\right)$$

I'd like to know whether my approach has any promise, as well as any alternative ways to get this. As always, any hints are appreciated.

Edit: It looks like I made an error while posting this problem; apparently the result also requires that $|x-y| \le 0.5$. Kindly note this.
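
For a concrete illustration of why such a restriction is needed (natural logarithm, values rounded), take $x = 0.99$ and $y = 0.19$, so that $|x-y| = 0.8 > 0.5$:

$$|x\log x - y\log y| = \left|{-0.00995} - ({-0.31554})\right| \approx 0.306, \qquad |x-y|\log\frac{1}{|x-y|} = 0.8\log(1.25) \approx 0.179,$$

so the claimed bound fails for this pair.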

Best Answer

Assume $x,y\in(0,1)$ and $|x-y|=h$. We want to prove:

$$ \sup_{\substack{x,y\in (0,1)\\ |x-y|=h}}\left|x \log x-y\log y\right|\leq -h\log h \tag{1}$$ but since $f(x)=x\log x$ is a convex function on $I=(0,1)$, it is enough to prove $(1)$ for $\min(x,y)\to 0^+$ and for $\max(x,y)\to 1^-$ (the reduction is spelled out below). The first case is trivial: since $\lim_{x\to 0^+}x\log x = 0$, the difference tends to $h\log h$, whose absolute value is exactly $-h\log h$.
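
One way to spell out the convexity reduction: for fixed $h$ and $y\in(0,1-h)$,

$$\frac{d}{dy}\bigl(f(y+h)-f(y)\bigr) = \log(y+h)-\log y > 0,$$

so $f(y+h)-f(y)$ increases from $h\log h$ (as $y\to 0^+$) to $-(1-h)\log(1-h)$ (as $y\to (1-h)^-$); a monotone function attains its largest absolute value at an endpoint, which leaves only the two limit cases above.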

So we just have to check: $$ \forall h\in(0,1),\qquad -(1-h)\log(1-h) \leq -h\log h \tag{2}$$ but that holds only if $\color{red}{h\leq\frac{1}{2}}$.
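
One way to verify that $(2)$ holds exactly for $h\leq\frac{1}{2}$: set

$$\varphi(h) = -h\log h + (1-h)\log(1-h),$$

so that $(2)$ reads $\varphi(h)\geq 0$. We have $\varphi(0^+) = \varphi\!\left(\tfrac{1}{2}\right) = 0$ and

$$\varphi'(h) = -\log\bigl(h(1-h)\bigr) - 2,$$

which vanishes exactly once on $\left(0,\tfrac{1}{2}\right)$ (where $h(1-h) = e^{-2} \approx 0.135 < \tfrac{1}{4}$), being positive before and negative after that point. A function that increases and then decreases between two zero endpoint values is non-negative in between, so $(2)$ holds on $\left(0,\tfrac{1}{2}\right]$. Moreover $\varphi(1-h) = -\varphi(h)$, so $(2)$ fails for every $h\in\left(\tfrac{1}{2},1\right)$.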
