As a practical matter for the questions you're addressing, such high quantiles will generally be quite sensitive to model choice (especially such things as whether you model the right-censoring correctly and how heavy the tails of the components are).

But in any case - especially when dealing with high quantiles, where ordinary simulation becomes impractical - fast computation of the compound distribution has great value in its own right; it's an interesting question from both theoretical and practical standpoints.

A couple of other approaches to this problem are (1) using the Fast Fourier Transform and (2) direct numerical integration.
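To make the FFT route concrete, here is a rough sketch for a compound negative binomial-gamma model (the parameters are chosen to match the simulation example later in this answer; the grid spacing and length are arbitrary illustrative choices):

```r
## Sketch of the FFT approach: discretize the severity, then apply the
## frequency pgf to its discrete Fourier transform
h <- 1                               # grid spacing (arbitrary choice)
M <- 2^14                            # number of grid points (arbitrary choice)
x <- (0:(M - 1)) * h
## discretize the gamma(shape = 5, rate = 0.1) severity by the rounding method
fx <- diff(pgamma(c(0, x + h/2), shape = 5, rate = 0.1))
## negative binomial frequency with mean 4 and size 2, so p = size/(size + mu)
size <- 2; mu <- 4; p <- size / (size + mu)
## apply the frequency pgf  P(z) = (p / (1 - (1 - p) z))^size  in the frequency domain
fs <- Re(fft((p / (1 - (1 - p) * fft(fx)))^size, inverse = TRUE)) / M
Fs <- cumsum(fs)                     # approximate cdf of the compound sum
x[which(Fs >= 0.99)[1]]              # e.g. an approximate 0.99 quantile
```

The grid must extend well past the quantile of interest, or wrap-around (aliasing) error will contaminate the tail.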

One useful reference on this topic is Luo and Shevchenko (2009)$^{[1]}$.

In it they develop an adaptive direct numerical integration approach that's faster than simulation and competitive with FFT.

The more traditional approach in actuarial work has been (3) Panjer recursion, which can be found in numerous texts. Embrechts and Frei (2009)$^{[2]}$ discuss and compare Panjer recursion and FFT. (Note that both of these techniques involve discretization of the continuous distribution.)
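As a sketch of how Panjer recursion proceeds for the same negative binomial-gamma example: the negative binomial is in the $(a,b,0)$ class with $a = 1-p$ and $b = (1-p)(\text{size}-1)$, and the recursion runs over a discretized severity (grid choices here are arbitrary):

```r
## Sketch of Panjer recursion for a compound negative binomial-gamma sum
size <- 2; mu <- 4; p <- size / (size + mu)
a <- 1 - p; b <- (1 - p) * (size - 1)        # (a,b,0) parameters of the NB
h <- 1; M <- 2048                            # grid spacing and length (arbitrary)
## discretized gamma(shape = 5, rate = 0.1) severity, rounding method
fx <- diff(pgamma(c(0, (0:(M - 1)) * h + h/2), shape = 5, rate = 0.1))
fs <- numeric(M)
fs[1] <- (p / (1 - (1 - p) * fx[1]))^size    # f_S(0) is the pgf of N at f_X(0)
for (k in 1:(M - 1)) {
  j <- 1:k
  fs[k + 1] <- sum((a + b * j / k) * fx[j + 1] * fs[k - j + 1]) / (1 - a * fx[1])
}
Fs <- cumsum(fs)                             # approximate cdf on the grid
```

The recursion is $O(M^2)$ in the grid length, which is why FFT can win for fine grids.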

On the other hand, doing a very unsophisticated version of simulation, and with no effort whatever made to be efficient, generating from a compound gamma-negative binomial isn't particularly onerous. This is timing on my kids' little laptop:

```r
system.time(replicate(100000, sum(rgamma(MASS::rnegbin(1, 4, 2), 5, 0.1))))
##    user  system elapsed 
##    2.82    0.00    2.84 
```

I think 2.8 seconds to generate 100,000 simulations of a compound distribution on a slow little laptop really isn't bad. With some effort to be efficient (of which one might suggest many possibilities), I imagine that could be made a good deal faster.

Here's the ecdf for $10^6$ simulations (which took about 29 seconds):

[figure: empirical cdf of the $10^6$ simulated values]

We see the characteristic discrete jump at zero that one expects with a compound distribution.
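The plot can be reproduced with something like the following (a smaller replicate count is used here for speed, and the seed is an arbitrary choice):

```r
## Simulate the compound negative binomial-gamma sum and plot its ecdf
set.seed(1)                          # arbitrary seed, for reproducibility
sims <- replicate(100000, sum(rgamma(MASS::rnegbin(1, 4, 2), 5, 0.1)))
plot(ecdf(sims), main = "ecdf of compound negative binomial-gamma simulations")
mean(sims == 0)                      # the jump at zero, near P(N = 0) = (1/3)^2
```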

[While it should be easy to make simulation a lot faster, all three alternatives mentioned here - if carried out sensibly - should be a lot faster still.]

You should note that the `actuar` package supports computation with compound distributions, and offers several methods for calculation with them.

See, for example, this vignette which discusses this facility.

[Of possibly some further passing interest, note that there is an R package for the Poisson-lognormal distribution -- `poilog`; if you need that distribution at some point it may be useful.]

Added in edit:

A potential quick approximation where the gamma scale parameter isn't changing -

In the gamma case, because a convolution of gammas with a common scale parameter is another gamma, you could write down the distribution of $Y|N=n$, evaluate the cdf and the density at a large number of grid values for each $n$, and then simply accumulate the sum over $n$ directly (rather as one would for a KDE). Truncating the sum yields only a lower bound to the true cdf, but if the negative binomial is not heavy tailed it should be quite rapid.
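A minimal sketch of that gamma-mixture calculation, assuming the same parameters as the simulation example (the truncation tolerance and the quantile level are arbitrary choices):

```r
## S | N = n is gamma(5n, rate 0.1), so the cdf of S is a negative binomial
## mixture of gamma cdfs; truncating the mixture at nmax gives a lower bound
size <- 2; mu <- 4; shape <- 5; rate <- 0.1
nmax <- qnbinom(1 - 1e-12, size = size, mu = mu)   # truncation point for the sum
pn <- dnbinom(0:nmax, size = size, mu = mu)
Fs <- function(x)                                   # lower bound to the true cdf
  pn[1] * (x >= 0) + sum(pn[-1] * pgamma(x, shape * (1:nmax), rate))
## invert numerically for a quantile, e.g. the 0.995 quantile
uniroot(function(x) Fs(x) - 0.995, c(0, 5000))$root
```

Note `pgamma` is vectorized over the shape argument, so each cdf evaluation is a single call.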

**References**:

[1]: Luo, X. and Shevchenko, P.V. (2009), "Computing Tails of Compound Distributions Using Direct Numerical Integration," *Journal of Computational Finance*, **13**(2), 73-111. [arXiv preprint available]

[2]: Embrechts, P. and Frei, M. (2009), "Panjer recursion versus FFT for compound distributions," *Mathematical Methods of Operations Research*, **69**(3), 497-508. [pre-publication version available]

## Best Answer

The endpoints of a confidence interval are quantiles of the sampling distribution of the statistic.

So yes, theoretically these are the 0.01- and 0.99-quantiles of the sampling distribution of the 0.01-quantile of a standard normal distribution. But isn't this a mouthful? Why not say it is a 98% two-sided confidence interval for the 0.01-quantile of the standard normal?
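To make this concrete, the sampling distribution of the quantile estimator can be simulated directly (the sample size of 100 and the replicate count are purely illustrative choices):

```r
## Simulated sampling distribution of the 0.01-quantile estimator for N(0,1)
set.seed(1)                           # arbitrary seed, for reproducibility
qhat <- replicate(10000, quantile(rnorm(100), 0.01))
quantile(qhat, c(0.01, 0.99))         # endpoints of a 98% percentile interval
```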