Binary entropy and mixing property to simplify a function

Tags: entropy, information theory, probability, probability theory

Let $ f(\vec{p},\vec{q})=h(\sum_{i=1}^{n}p_iq_i)-\sum_{i=1}^{n}p_ih(q_i)$ where $\vec{p}=[p_1,p_2,\ldots, p_n]$ and $\vec{q}=[q_1,q_2,\ldots,q_n].$

Here, $h(\cdot)$ denotes the binary entropy function, and $\sum_{i=1}^{n}p_i=1$. (The $q_i$ need not sum to 1, but each $q_i < 1$.)
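For reference, $h(x) = -x\log_2 x - (1-x)\log_2(1-x)$. A minimal numerical sketch of $h$ and $f$ (NumPy, base-2 logs; the helper names `h` and `f` are just illustrative):

```python
import numpy as np

def h(x):
    """Binary entropy in bits, with the convention h(0) = h(1) = 0."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    m = (x > 0) & (x < 1)
    out[m] = -x[m] * np.log2(x[m]) - (1 - x[m]) * np.log2(1 - x[m])
    return out

def f(p, q):
    """f(p, q) = h(sum_i p_i q_i) - sum_i p_i h(q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(h(p @ q)[0] - p @ h(q))
```

Since $h$ is concave, Jensen's inequality gives $f(\vec{p},\vec{q}) \ge 0$, with equality when all $q_i$ are equal.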

I am looking for alternate representations or simplified forms of $ f(\vec{p},\vec{q})$.

One of my thoughts is to extend this identity, which makes use of the mixing property of entropy. Is such an extension possible?

Also, are there any other characterizations of $ f(\vec{p},\vec{q})$?

Why I am interested in this:
In my case, the $q_i$ are exponential functions (for example, $q_i= e^{-\frac{a}{i+b}}$), whereas the $p_i$ come from a binomial PMF. Hence, in $ f(\vec{p},\vec{q})$, the terms $h(q_i)$ are hard to compute, so I am interested in alternate forms of $ f(\vec{p},\vec{q})$.
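As a concrete sanity check, the direct computation is cheap for moderate $n$; the specific values of $a$, $b$, and the binomial parameters below are made-up placeholders, not from my actual setting:

```python
import numpy as np
from math import comb

def h(x):
    """Binary entropy in bits, with h(0) = h(1) = 0."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    m = (x > 0) & (x < 1)
    out[m] = -x[m] * np.log2(x[m]) - (1 - x[m]) * np.log2(1 - x[m])
    return out

# Placeholder parameters (assumptions, for illustration only).
a, b, s, n = 2.0, 1.0, 0.4, 10

i = np.arange(1, n + 1)
q = np.exp(-a / (i + b))                          # q_i = e^{-a/(i+b)}, each < 1
# p_i from a Binomial(n-1, s) PMF, shifted onto i = 1..n.
p = np.array([comb(n - 1, k) * s**k * (1 - s)**(n - 1 - k) for k in range(n)])

f = float(h(p @ q)[0] - p @ h(q))
```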

Best Answer

In case this helps:

Let $X_i$ be independent Bernoulli random variables with parameter $q_i = P(X_i =1)$.

Let $W$ be a random variable (independent of the $X_i$) taking values in $\{1,2,\ldots,n\}$ with $P(W=i)=p_i$.

Let's define a new Bernoulli variable $Y$, which corresponds to picking one of the $X_i$ with probability $p_i$, i.e. $Y= X_W$.

Then $$\begin{align} A &= h(\sum_{i=1}^{n}p_iq_i)-\sum_{i=1}^{n} p_i h(q_i) \\ &= H(Y) - H(Y|W) \\ &= I(Y;W) \\ &= H(W) - H(W|Y) \\ &= H(\{p_i\}) - b \, H\left(\frac{\{p_i q_i\}}{b}\right) -(1-b)H\left(\frac{\{p_i (1-q_i)\}}{1-b}\right) \end{align} $$

where $b= \sum_i p_i q_i$.
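The last equality can be checked numerically; a quick sketch (the random choices of $\vec{p}$ and $\vec{q}$ below are illustrative):

```python
import numpy as np

def h(x):
    """Binary entropy in bits, h(0) = h(1) = 0."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    m = (x > 0) & (x < 1)
    out[m] = -x[m] * np.log2(x[m]) - (1 - x[m]) * np.log2(1 - x[m])
    return out

def H(dist):
    """Shannon entropy (bits) of a probability vector."""
    d = np.asarray(dist, dtype=float)
    d = d[d > 0]
    return float(-(d * np.log2(d)).sum())

rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(5))            # p_i >= 0, summing to 1
q = rng.uniform(0.05, 0.95, size=5)      # each q_i < 1

lhs = float(h(p @ q)[0] - p @ h(q))      # h(sum p_i q_i) - sum p_i h(q_i)
b = float(p @ q)                         # b = sum_i p_i q_i = P(Y = 1)
# H(W) - b H(W | Y=1) - (1-b) H(W | Y=0)
rhs = H(p) - b * H(p * q / b) - (1 - b) * H(p * (1 - q) / (1 - b))
```

Here `p * q / b` and `p * (1 - q) / (1 - b)` are the conditional distributions of $W$ given $Y=1$ and $Y=0$, respectively.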

Added: In this related question I develop an approximation for nearly constant $q_i$.
