Approximate a function of binary entropy

Tags: approximation, entropy, information theory, real-analysis

The binary entropy of a probability value $p$ is defined as $$h(p)=-p\log_2(p)-(1-p)\log_2(1-p).$$

I have the following function: $$f(p_1,p_2)=\frac{p_1 h(p_2)-p_2 h(p_1)}{p_1-p_2}.$$
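For reference, here is a minimal numerical sketch of these two definitions (Python; the names `h` and `f` are just illustrative, and `f` assumes $p_1 \neq p_2$):

```python
import numpy as np

def h(p):
    """Binary entropy in bits: h(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def f(p1, p2):
    """f(p1, p2) = (p1*h(p2) - p2*h(p1)) / (p1 - p2); assumes p1 != p2."""
    return (p1 * h(p2) - p2 * h(p1)) / (p1 - p2)
```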

I am interested in finding an approximation to this function when $p_1=e^{-1/a}$ and $p_2=e^{-1/b}$. I know that $b/a=k$, where $k$ is a constant.

Can someone help me simplify $f(p_1,p_2)$?

If I could assume $p_1$ or $p_2$ to be approximately $0$ or $1$, this would be simple, but that is not valid in my case.

Best Answer

If $k=b/a$ is neither very small nor very large, then I'd parametrize in this way:

$$\begin{align} p_1 &= p^{1+x} \\ p_2 &= p^{1-x} \\ x &= \frac{\log(p_1/p_2)}{\log(p_1 p_2)} = \frac{b-a }{b+a} =\frac{k-1}{k+1} \\ p &= \sqrt{p_1 p_2}=\exp\left(- \frac{a+b}{2ab}\right)= \exp\left(- \frac{1}{a} \frac{1}{1+x}\right) \end{align} $$

Notice that $p$ is somewhere between $p_1$ and $p_2$, and if (say) $\frac{1}{9}<k<9$ then $-0.8<x<0.8$.
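As a quick sanity check of this parametrization, here's a small sketch (the values of $a$ and $k$ are arbitrary examples):

```python
import numpy as np

a = 2.0                  # arbitrary example value
k = 3.0                  # k = b/a, arbitrary example value
b = k * a
p1, p2 = np.exp(-1 / a), np.exp(-1 / b)

x = (k - 1) / (k + 1)    # = log(p1/p2) / log(p1*p2) = (b-a)/(b+a)
p = np.sqrt(p1 * p2)     # = exp(-(a+b)/(2ab))

# p1 = p^(1+x) and p2 = p^(1-x) should hold exactly
print(np.isclose(p1, p**(1 + x)), np.isclose(p2, p**(1 - x)))  # True True
```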

Doing a Taylor expansion of $f$ around $x=0$, we get the approximation

$$ f \approx -\log\left(1-p\right)+\frac{p\,(2p-1)\log^{2}(p)}{6\,(1-p)^{2}}\,x^{2} $$

(I'm using natural logarithms here, so the entropy is in nats; to get it in bits, just divide everything by $\log 2$.)
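Here's a rough numerical check of this second-order approximation, in nats (a sketch; `f_exact` uses the parametrized form $p_1=p^{1+x}$, $p_2=p^{1-x}$, and the loop values are arbitrary examples):

```python
import numpy as np

def h_nats(p):
    """Binary entropy in nats."""
    return -p * np.log(p) - (1 - p) * np.log(1 - p)

def f_exact(p, x):
    """Exact f evaluated at p1 = p^(1+x), p2 = p^(1-x); assumes x != 0."""
    p1, p2 = p**(1 + x), p**(1 - x)
    return (p1 * h_nats(p2) - p2 * h_nats(p1)) / (p1 - p2)

def f_approx(p, x):
    """Second-order Taylor approximation around x = 0, in nats."""
    lp = np.log(p)
    return -np.log(1 - p) + p * (2*p - 1) * lp**2 / (6 * (1 - p)**2) * x**2

for p in (0.3, 0.5, 0.7):        # example values of p
    for x in (0.2, 0.5, 0.8):    # example values of x
        print(f"p={p}, x={x}: exact={f_exact(p, x):.5f}, approx={f_approx(p, x):.5f}")
```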

The approximation seems quite good, assuming $|x|$ is not very near $1$ and $p$ is not too small. Here's a graph for three values of $p$ and $x\in[-0.8,0.8]$. The dashed lines correspond to the approximation.

[Figure: exact $f$ (solid) and the approximation (dashed) for three values of $p$, $x\in[-0.8,0.8]$]

If you need more precision, you can add the next term, which is

$$\frac{p\left(7-28p+47p^{2}-8p^{3}\right)\log^{4}(p)}{360\,(1-p)^{4}}\,x^{4}$$
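The quartic correction can be folded into the earlier sketch like this (self-contained; the point value in the final comment is just an illustrative check at $p=0.5$, $x=0.5$):

```python
import numpy as np

def f_approx4(p, x):
    """Second-order approximation plus the x^4 correction term, in nats."""
    lp = np.log(p)
    quadratic = -np.log(1 - p) + p * (2*p - 1) * lp**2 / (6 * (1 - p)**2) * x**2
    quartic = (p * (7 - 28*p + 47*p**2 - 8*p**3) * lp**4
               / (360 * (1 - p)**4) * x**4)
    return quadratic + quartic

# e.g. f_approx4(0.5, 0.5) gives roughly 0.69435, while the exact f is roughly 0.69451
```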