Entropy of bivariate negative binomial distribution

entropy, probability, probability-distributions, statistics

The probability mass function (PMF) of a bivariate negative binomial distribution [1] is given by:

$$P(X=x, Y=y) = \frac{(a + x + y - 1)!}{(a-1)!\, x!\, y!}\, p_0^a p_1^x p_2^y $$

where $a, p_0, p_1, p_2 > 0$ and $p_0 + p_1 + p_2 = 1$.
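
For concreteness, the PMF can be evaluated numerically in log space (writing $(a+x+y-1)! = \Gamma(a+x+y)$, etc.); a minimal Python sketch, where the helper name `bnb_logpmf` is just for illustration:

```python
import numpy as np
from scipy.special import gammaln  # log Gamma: gammaln(n + 1) == log(n!)

def bnb_logpmf(x, y, a, p0, p1, p2):
    """Log of the bivariate negative binomial PMF above, evaluated via
    log-Gamma terms to avoid factorial overflow."""
    return (gammaln(a + x + y) - gammaln(a) - gammaln(x + 1) - gammaln(y + 1)
            + a * np.log(p0) + x * np.log(p1) + y * np.log(p2))
```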


I would like to calculate the entropy

$$H(x, y) = - \sum_{x=0}^{\infty}\sum_{y=0}^{\infty} P(x, y) \log P(x, y)$$

of this distribution.
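
A brute-force numerical value is useful as a reference for any closed form. A sketch, assuming the tail mass beyond a hand-chosen `cutoff` is negligible for the parameters of interest:

```python
import numpy as np
from scipy.special import gammaln

def bnb_entropy_bruteforce(a, p0, p1, p2, cutoff=400):
    """-sum(p * log p) over a truncated (cutoff x cutoff) grid; `cutoff`
    must be large enough that the neglected tail mass is negligible."""
    k = np.arange(cutoff)
    X, Y = np.meshgrid(k, k, indexing="ij")
    logp = (gammaln(a + X + Y) - gammaln(a) - gammaln(X + 1) - gammaln(Y + 1)
            + a * np.log(p0) + X * np.log(p1) + Y * np.log(p2))
    p = np.exp(logp)
    return -np.sum(p * logp)
```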


After some arithmetic manipulation, I arrived at the following expression:

\begin{multline}
H(x, y) = - a \log p_0 - \mathbb{E}[x] \log(p_1) - \mathbb{E}[y] \log(p_2) + \log\big((a-1)!\big) \\
- \mathbb{E}\big[\log\big((a + x + y - 1)!\big)\big] + \mathbb{E}[\log(x!)] + \mathbb{E}[\log(y!)]
\end{multline}
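
As a sanity check on this decomposition, each expectation can be approximated by the same truncated double sum and the result compared against the direct $-\sum p \log p$ value above; a sketch (the function name and truncation scheme are mine):

```python
import numpy as np
from scipy.special import gammaln

def bnb_entropy_decomposed(a, p0, p1, p2, cutoff=400):
    """Evaluate the decomposition term by term, approximating every
    expectation by a truncated double sum over the joint PMF."""
    k = np.arange(cutoff)
    X, Y = np.meshgrid(k, k, indexing="ij")
    logp = (gammaln(a + X + Y) - gammaln(a) - gammaln(X + 1) - gammaln(Y + 1)
            + a * np.log(p0) + X * np.log(p1) + Y * np.log(p2))
    p = np.exp(logp)

    def E(stat):
        # expectation of an array-valued statistic under the truncated joint
        return np.sum(p * stat)

    return (-a * np.log(p0) - E(X) * np.log(p1) - E(Y) * np.log(p2)
            + gammaln(a)                       # log((a - 1)!)
            - E(gammaln(a + X + Y))            # E[log((a + x + y - 1)!)]
            + E(gammaln(X + 1)) + E(gammaln(Y + 1)))
```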

The terms $\mathbb{E}[\log(x!)]$ and $\mathbb{E}[\log(y!)]$ can be computed by evaluating a definite integral over $[0, 1]$, as shown in [2] (equation 23). However, it is not clear to me how the same approach extends to the expected log-factorial $\mathbb{E}\big[\log\big((a + x + y - 1)!\big)\big]$ taken over the joint distribution.

Can anyone shed some light on this?


References:

[1] Dunn (1967), "Characterization of the Bivariate Negative Binomial Distribution."

[2] Cheraghchi (2018), "Expressions for the Entropy of Binomial-Type Distributions."

Best Answer

In your reference [1], Dunn notes at the bottom of page 79 that the marginal distribution of $\mathsf{X}+\mathsf{Y}$ is equally simple: with his parameterization of the negative binomial and the bivariate negative binomial, $\mathsf{X}+\mathsf{Y}$ is negative binomial with $p = p_1 + p_2$ and $A = a$ (the parameter Dunn calls $A$ is what Cheraghchi calls $r$ in [2]). You can therefore compute the remaining term from the second-to-last displayed equation on page 14 of [2]: since $(x+y+a-1)!$ is $\Gamma(x+y+a)$, take $\alpha = a$, $r = a$, and $p = p_1 + p_2$ in that formula.
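
The same reduction also makes the troublesome term cheap to check numerically without any two-dimensional sum: since $S = X+Y$ is univariate negative binomial with $p = p_1 + p_2$, the expectation $\mathbb{E}[\log\Gamma(S+a)]$ is a one-dimensional sum. A sketch (the truncation and function name are mine) for cross-checking whatever closed form you obtain from [2]:

```python
import numpy as np
from scipy.special import gammaln

def expected_log_gamma(a, p1, p2, cutoff=10_000):
    """E[log((a + x + y - 1)!)] = E[log Gamma(S + a)], where S = X + Y is
    negative binomial with parameters a and p = p1 + p2 (Dunn 1967, p. 79).
    Approximated by truncating the one-dimensional sum at `cutoff`."""
    p = p1 + p2
    s = np.arange(cutoff)
    # log P(S = s) = log[ Gamma(s + a) / (Gamma(a) s!) * (1 - p)^a * p^s ]
    log_pmf = (gammaln(s + a) - gammaln(a) - gammaln(s + 1)
               + a * np.log1p(-p) + s * np.log(p))
    return np.sum(np.exp(log_pmf) * gammaln(s + a))
```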