Sum of Gamma distributions weighted by different multipoles

Tags: distributions, gamma distribution, inverse gamma distribution, normal distribution, probability

1) Introduction:

I am interested in computing the variance of an observable
$$
O=\frac{\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell} a_{\ell m}^{2}}{\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell}\left(a_{\ell m}^{\prime}\right)^{2}}
$$

where $\left(a_{\ell m}, \ell \in\{1, \cdots, N\},|m| \leq \ell\right)$ and $\left(a_{\ell m}^{\prime}, \ell \in\{1, \cdots, N\},|m| \leq \ell\right)$ are independent random variables, with
$a_{\ell m} \sim \mathcal{N}\left(0, C_{\ell}\right)$ for each $|m| \leq \ell$ and $a_{\ell m}^{\prime} \sim \mathcal{N}\left(0, C_{\ell}^{\prime}\right)$ for each $|m| \leq \ell$. We recall the properties of a few basic distributions. We have:

  1. if $X \sim \mathcal{N}(0, C)$, then $X^{2} \sim C\,\chi^{2}(1)=\Gamma\left(\frac{1}{2}, 2 C\right)$ (shape-scale parametrization),
  2. $\langle\Gamma(k, \theta)\rangle=k \theta$ and $\operatorname{Var}(\Gamma(k, \theta))=k \theta^{2}$, and
  3. $\sum_{i=1}^{N} \Gamma\left(k_{i}, \theta\right) = \Gamma\left(\sum_{i=1}^{N} k_{i}, \theta\right)$ for independent summands.
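These three facts are easy to check numerically. Below is a quick Monte Carlo sketch; the value of $C$, the shapes, and the sample size are arbitrary illustrative choices, not taken from the problem:

```python
# Monte Carlo check of facts 1-3 above (illustrative constants).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
C = 1.7

# Fact 1: the square of N(0, C) is Gamma(1/2, 2C) in shape-scale form,
# so by fact 2 its mean is k*theta = C and its variance is k*theta^2 = 2*C^2.
x2 = rng.normal(0.0, np.sqrt(C), n) ** 2
assert abs(x2.mean() - C) < 0.02
assert abs(x2.var() - 2 * C**2) < 0.1

# Fact 3: independent Gammas with a common scale add their shapes:
# five Gamma(1/2, 2C) draws summed should match one Gamma(5/2, 2C) draw.
g = rng.gamma(0.5, 2 * C, (5, n)).sum(axis=0)
h = rng.gamma(2.5, 2 * C, n)
assert abs(g.mean() - h.mean()) < 0.05
assert abs(g.var() - h.var()) < 0.5
```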

2) Important precision: for each $\ell$ I have the relation $C_\ell=\dfrac{b}{b'}C'_\ell$, with $b$ and $b'$ constants. I wonder how this could help in the rest of the post.

3) Partial solution, not finished (mean $\langle O\rangle$ only):

By points 1 and 3, we have
$$
\begin{aligned}
\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell} a_{\ell m}^{2} & = \sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell} \Gamma\left(1 / 2,2 C_{\ell}\right) \\
& = \sum_{\ell=1}^{N} \Gamma\left((2 \ell+1) / 2,2 C_{\ell}\right)
\end{aligned}\quad(1)
$$

where the summands are independent. Similarly, using points 1 and 3 again, we obtain
$$
\begin{aligned}
\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell}\left(a_{\ell m}^{\prime}\right)^{2} & = \sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell} \Gamma\left(1 / 2,2 C_{\ell}^{\prime}\right) \\
& = \sum_{\ell=1}^{N} \Gamma\left((2 \ell+1) / 2,2 C_{\ell}^{\prime}\right)
\end{aligned}\quad(2)
$$

where the summands are independent. By independence of the sequences $\left(a_{\ell m}, \ell \in\{1, \cdots, N\},|m| \leq \ell\right)$ and $\left(a_{\ell m}^{\prime}, \ell \in\{1, \cdots, N\},|m| \leq \ell\right)$, together with equations (1) and (2), we obtain
$$
\begin{aligned}
\langle O\rangle &=\left\langle\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell}\left(a_{\ell m}\right)^{2}\right\rangle\left\langle\left(\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell}\left(a_{\ell m}^{\prime}\right)^{2}\right)^{-1}\right\rangle \\
&=\left\langle\sum_{\ell=1}^{N} \Gamma\left((2 \ell+1) / 2,2 C_{\ell}\right)\right\rangle\left\langle\left(\sum_{\ell=1}^{N} \Gamma\left((2 \ell+1) / 2,2 C_{\ell}^{\prime}\right)\right)^{-1}\right\rangle
\end{aligned}
$$

The first factor simplifies:

$$\left\langle\sum_{\ell=1}^{N} \Gamma\left((2 \ell+1) / 2,2 C_{\ell}\right)\right\rangle=\sum_{\ell=1}^{N}(2 \ell+1) C_{\ell}$$
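A quick simulation confirms this closed form for the first factor; the spectrum $C_\ell$ below is a hypothetical illustrative choice:

```python
# Monte Carlo check: <sum_{ell,m} a_{ell m}^2> = sum_ell (2*ell + 1) * C_ell.
import numpy as np

rng = np.random.default_rng(1)
N = 4
C = {ell: 1.0 / ell**2 for ell in range(1, N + 1)}   # hypothetical C_ell

trials = 200_000
num = np.zeros(trials)
for ell in range(1, N + 1):
    # (2*ell + 1) independent a_{ell m} ~ N(0, C_ell), squared and summed
    a = rng.normal(0.0, np.sqrt(C[ell]), (2 * ell + 1, trials))
    num += (a**2).sum(axis=0)

closed_form = sum((2 * ell + 1) * C[ell] for ell in range(1, N + 1))
assert abs(num.mean() - closed_form) / closed_form < 0.01
```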

As you can see, I cannot conclude on the second factor (the expectation of the inverse of a sum of Gamma variables), which I cannot manage to simplify.

I have searched the web but have not found a solution so far.

UPDATE 1:

From the following link, Expectation of inverse of sum of random variables: let $X_i$ ($i=1,\dots,n$) be i.i.d. random variables with mean $\mu$ and variance $\sigma^2$; the following method can be used to compute $\mathbb{E}[1/(X_1+\cdots+X_n)]$:

Assuming the expectation exists, and further assuming the $X_i$ to be positive random variables:
$$
\mathbb{E}\left(\frac{1}{X_{1}+\cdots+X_{n}}\right)=\mathbb{E}\left(\int_{0}^{\infty} \exp \left(-t\left(X_{1}+\cdots+X_{n}\right)\right) \mathrm{d} t\right)
$$

Interchanging the integral over $t$ with the expectation:
$$
\mathbb{E}\left(\int_{0}^{\infty} \exp \left(-t\left(X_{1}+\cdots+X_{n}\right)\right) \mathrm{d} t\right)=\int_{0}^{\infty} \mathbb{E}\left(\exp \left(-t\left(X_{1}+\cdots+X_{n}\right)\right)\right) \mathrm{d} t
$$

Using the i.i.d. property:
$$
\int_{0}^{\infty} \mathbb{E}\left(\exp \left(-t\left(X_{1}+\cdots+X_{n}\right)\right)\right) \mathrm{d} t=\int_{0}^{\infty} \mathbb{E}(\exp (-t X))^{n} \mathrm{~d} t
$$

So if you know the Laplace transform $\mathcal{L}_{X}(t)=\mathbb{E}\left(\mathrm{e}^{-t X}\right)$, we have:
$$
\mathbb{E}\left(\frac{1}{X_{1}+\cdots+X_{n}}\right)=\int_{0}^{\infty} \mathcal{L}_{X}(t)^{n} \mathrm{~d} t
$$

How could I apply this in my case with the $\Gamma$ distribution, i.e. for the expectation $\Bigg\langle\left(\sum_{\ell=1}^{N} \Gamma\left((2 \ell+1) / 2,2 C_{\ell}^{\prime}\right)\right)^{-1}\Bigg\rangle$?
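The i.i.d. assumption is not essential here: independence alone makes the Laplace transform of the sum factorise, and for $\Gamma(k,\theta)$ one has $\mathcal{L}(t)=(1+\theta t)^{-k}$, so $\Bigl\langle\bigl(\sum_{\ell=1}^{N} \Gamma((2\ell+1)/2,\,2C'_\ell)\bigr)^{-1}\Bigr\rangle=\int_0^\infty \prod_{\ell=1}^N (1+2C'_\ell t)^{-(2\ell+1)/2}\,\mathrm{d}t$ (finite since the total shape exceeds 1). This integral is easy to evaluate numerically; a sketch with a hypothetical spectrum $C'_\ell$, checked against Monte Carlo:

```python
# E[1/sum] via the Laplace-transform integral, for independent Gammas
# with different scales (hypothetical illustrative spectrum Cp_ell).
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(2)
N = 4
Cp = {ell: 0.5 / ell for ell in range(1, N + 1)}   # hypothetical C'_ell

def integrand(t):
    # prod_ell (1 + 2*Cp_ell*t)^(-(2*ell+1)/2): product of Gamma Laplace transforms
    out = 1.0
    for ell in range(1, N + 1):
        out *= (1.0 + 2.0 * Cp[ell] * t) ** (-(2 * ell + 1) / 2.0)
    return out

laplace_value, _ = quad(integrand, 0.0, np.inf)

# Monte Carlo estimate of the same expectation for comparison.
trials = 400_000
den = np.zeros(trials)
for ell in range(1, N + 1):
    den += rng.gamma((2 * ell + 1) / 2.0, 2.0 * Cp[ell], trials)
mc_value = (1.0 / den).mean()

assert abs(laplace_value - mc_value) / laplace_value < 0.02
```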

From 2) Important precision, the only reformulation I can make concerns the scale parameter, via the factor $\dfrac{b'}{b}$:

$$\Bigg\langle\left(\sum_{\ell=1}^{N} \Gamma\left((2 \ell+1) / 2,2 C_{\ell}^{\prime}\right)\right)^{-1}\Bigg\rangle=\Bigg\langle\left(\sum_{\ell=1}^{N} \dfrac{b'}{b}\Gamma\left((2 \ell+1) / 2,2 C_{\ell}\right)\right)^{-1}\Bigg\rangle$$
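This reformulation rests on the scale property of the Gamma distribution: if $Y\sim\Gamma(k,\theta)$, then $cY\sim\Gamma(k,c\theta)$. A quick moment-based check, where $b$, $b'$, $C$, and $k$ are arbitrary illustrative values:

```python
# Check the Gamma scale property: Gamma(k, 2*C') with C' = (b'/b)*C matches
# (b'/b)*Gamma(k, 2*C) in distribution (illustrative constants).
import numpy as np

rng = np.random.default_rng(3)
b, bp, C, k = 2.0, 3.0, 1.2, 2.5
Cp = (bp / b) * C
n = 1_000_000

y1 = rng.gamma(k, 2 * Cp, n)             # Gamma(k, 2C')
y2 = (bp / b) * rng.gamma(k, 2 * C, n)   # (b'/b) * Gamma(k, 2C)
assert abs(y1.mean() - y2.mean()) < 0.05
assert abs(y1.var() - y2.var()) < 0.5
```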

UPDATE 2:

I wonder if I should rather write only:

$$\begin{aligned} \sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell}\left(a_{\ell m}^{\prime}\right)^{2} & \sim \sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell} \Gamma\left(1 / 2,2 C_{\ell}^{\prime}\right) \\ & \sim \sum_{\ell=1}^{N} \Gamma\left((2 \ell+1) / 2,2 C_{\ell}^{\prime}\right) \end{aligned}$$

What do you think about this slight modification? It seems small but has important consequences for what follows.

Best Answer

$$O=\frac{\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell} a_{\ell m}^{2}}{\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell}\left(a_{\ell m}^{\prime}\right)^{2}}$$

Preface: I assume that $C_{\ell}=\operatorname{Var}(a_{\ell m})$.

Let $a_{\ell m}\sim N(0,C_{\ell})$, so $\frac{a_{\ell m}}{\sqrt{C_{\ell}}}\sim N(0,1)$. Using this, we get $a_{\ell m}^2=C_{\ell}\cdot \left(\frac{a_{\ell m}}{\sqrt{C_{\ell}}}\right)^2\sim C_{\ell}\,\chi^2_{1}$. Summing up, we get $\sum_{m=-\ell}^{\ell} {a_{\ell m}^{2}}\sim C_{\ell}\,\chi^2_{2\ell+1}$. Since the scales $C_{\ell}$ differ across $\ell$, the overall sum is not exactly $\chi^2$; replacing each term by the mean-matched $\chi^2_{(2\ell+1)C_{\ell}}$ (exact when $C_{\ell}=1$), we get for the overall sum

$$\sum_{\ell=1}^{N} {\sum_{m=-\ell}^{\ell} a_{\ell m}^{2}}\approx\chi^2\left(\sum_{\ell=1}^{N}{(2\ell+1)C_{\ell}}\right)$$ which can also be written as $\Gamma(\frac{1}{2}\sum_{\ell=1}^{N}{(2\ell+1)C_{\ell}},2)$. For simplicity, denote $K=\sum_{\ell=1}^{N}{(2\ell+1)C_{\ell}}$, so the numerator has (approximately) a $\Gamma(K/2,2)$ distribution.

Let's observe the denominator:

$$\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell}\left(a_{\ell m}^{\prime}\right)^{2}\approx\Gamma\left(\frac{1}{2}\sum_{\ell=1}^{N}{(2\ell+1)C'_{\ell}},2\right)$$ and as $C'_{\ell}=\frac{b'}{b}C_{\ell}$, this is $\Gamma\left(\frac{b'}{2b}K,2\right)$.

As these are two independent Gamma variables with the same scale, the ratio $O=\frac{\sum_{\ell=1}^{N} {\sum_{m=-\ell}^{\ell} a_{\ell m}^{2}}}{\sum_{\ell=1}^{N} {\sum_{m=-\ell}^{\ell} \left(a'_{\ell m}\right)^{2}}}$ has a beta prime distribution:

$$\frac{\sum_{\ell=1}^{N} {\sum_{m=-\ell}^{\ell} a_{\ell m}^{2}}}{\sum_{\ell=1}^{N} {\sum_{m=-\ell}^{\ell} \left(a'_{\ell m}\right)^{2}}}\sim\beta'\left(\frac{K}{2}, \frac{b'}{2b}K\right)$$

So, according to the properties of the beta prime distribution, with shape parameters $\alpha=\frac{K}{2}$ and $\beta=\frac{b'}{2b}K$, and provided $\frac{b'}{2b}K>1$, $$E\left[ \frac{\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell} a_{\ell m}^{2}}{\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell}\left(a_{\ell m}^{\prime}\right)^{2}} \right]=\frac{\alpha}{\beta-1}=\frac{K/2}{\frac{b'}{2b}K-1}=\frac{bK}{b'K-2b}=\frac{b\sum_{\ell=1}^{N}{(2\ell+1)C_{\ell}}}{b'\sum_{\ell=1}^{N}{(2\ell+1)C_{\ell}} - 2b}$$

and if $\frac{b'}{2b}K > 2$,

$$Var\left( \frac{\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell} a_{\ell m}^{2}}{\sum_{\ell=1}^{N} \sum_{m=-\ell}^{\ell}\left(a_{\ell m}^{\prime}\right)^{2}} \right)=\frac{\alpha(\alpha+\beta-1)}{(\beta-1)^2(\beta-2)}=\frac{\frac{K}{2}\left(\frac{K}{2}+\frac{b'}{2b}K-1\right)}{\left(\frac{b'}{2b}K-1\right)^2\left(\frac{b'}{2b}K-2\right)}=\frac{2b^2K\left(bK+b'K-2b\right)}{(b'K-2b)^2(b'K-4b)}.$$

Dirty, but that's it.
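As a final numerical cross-check, the exact $\langle O\rangle$ can be computed as the product of the closed-form first factor and the Laplace-transform integral from Update 1, and compared with a direct simulation of $O$. The spectrum and the ratio $b/b'$ below are hypothetical illustrative choices:

```python
# End-to-end check: <O> = [sum_ell (2*ell+1)*C_ell] * E[1/denominator],
# with E[1/den] from the Laplace-transform integral (illustrative values).
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(4)
N, b, bp = 4, 2.0, 3.0
C = {ell: 1.0 / ell for ell in range(1, N + 1)}           # hypothetical C_ell
Cp = {ell: (bp / b) * C[ell] for ell in range(1, N + 1)}  # C'_ell = (b'/b) C_ell

first_factor = sum((2 * ell + 1) * C[ell] for ell in range(1, N + 1))

def integrand(t):
    # Product of the Gamma Laplace transforms of the denominator's terms.
    out = 1.0
    for ell in range(1, N + 1):
        out *= (1.0 + 2.0 * Cp[ell] * t) ** (-(2 * ell + 1) / 2.0)
    return out

mean_inv_den, _ = quad(integrand, 0.0, np.inf)
predicted_mean = first_factor * mean_inv_den

# Direct Monte Carlo of O itself.
trials = 400_000
num = np.zeros(trials)
den = np.zeros(trials)
for ell in range(1, N + 1):
    num += rng.gamma((2 * ell + 1) / 2.0, 2.0 * C[ell], trials)
    den += rng.gamma((2 * ell + 1) / 2.0, 2.0 * Cp[ell], trials)
O = num / den

assert abs(O.mean() - predicted_mean) / predicted_mean < 0.02
```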