Compound Geometric Distribution – Explanation and Examples


I'm given an exercise to find the distribution of a random variable $S_N$ constructed as follows:
$$S_N = \sum_{k=1}^{N+1} X_k,$$
where $N \sim \mathrm{Geom}(p)$ and the $X_k \sim \mathrm{Exp}(\lambda)$ are i.i.d. and independent of $N$, for $p \in (0,1)$, $\lambda > 0$.

I recall from a Non-Life Insurance course the following property of the Laplace transform/PGF for a compound sum $S = \sum_{k=1}^{N} X_k$:
$$L_{S}(t) = g_N(L_X(t)).$$
Since the sum here runs up to $N+1$, the transform picks up one extra factor of $L_X$:
$$L_{S_N}(t) = L_X(t)\, g_N(L_X(t)).$$
With $L_X(t) = \frac{\lambda}{\lambda+t}$ and $g_N(t) = \frac{p}{1-(1-p)t}$, plugging one expression into the other I obtained:
$$L_{S_N}(t) = \frac{\lambda}{\lambda+t}\cdot\frac{p(\lambda+t)}{p\lambda+t} = \frac{p\lambda}{p\lambda+t}$$
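As a sanity check on the algebra, SymPy can carry out the composition symbolically. This is only a sketch: it computes both the plain compound-sum identity $g_N(L_X(t))$ (a sum of exactly $N$ terms) and the version with the extra factor of $L_X$ (a sum of $N+1$ terms, as for $S_N$ here), using the PGF and Laplace transform quoted above.

```python
import sympy as sp

t = sp.Symbol('t', positive=True)
p = sp.Symbol('p', positive=True)
lam = sp.Symbol('lambda', positive=True)

L_X = lam / (lam + t)                    # Laplace transform of Exp(lambda)
g_N = lambda z: p / (1 - (1 - p) * z)    # PGF of N, P(N = n) = p (1 - p)^n

# Compound sum with exactly N terms: g_N composed with L_X
A = sp.simplify(g_N(L_X))
# Compound sum with N + 1 terms, as for S_N here: one extra factor of L_X
B = sp.simplify(L_X * g_N(L_X))

print(A)   # equivalent to p*(lambda + t)/(p*lambda + t)
print(B)   # equivalent to p*lambda/(p*lambda + t)
```

The second expression is the Laplace transform of an exponential law with rate $p\lambda$, which is what the inversion question below is about.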

So here is my actual question: is it possible to somehow recover a CDF from a given Laplace transform of the corresponding distribution? I found a script where this is done, but it is stated simply as a "property"; I would like to know more detail and the theory behind it.

Any hint would be highly appreciated 🙂

Best Answer

For fixed $n$, the random variable $S_{n+1} = \sum_{k=1}^{n+1} X_k$ has an Erlang density, i.e., \begin{equation} \nonumber f_{S_{n+1}} (t) = \lambda^{n+1} \frac{t^n}{n!} e^{-\lambda t}, \end{equation} which implies \begin{equation} \nonumber \Pr(S_{n+1}\le t) = \int_0^t \lambda^{n+1} \frac{z^n}{n!} e^{-\lambda z} \,\mathrm{d}z. \end{equation} Take $N$ geometric on $\{0,1,2,\dots\}$ with $\Pr(N=n) = p(1-p)^n$, which matches the PGF $g_N(s) = \frac{p}{1-(1-p)s}$ quoted in the question. Then $$ \Pr(S_N\le t) = \Pr\left( \bigcup_{n\ge 0} \left( \{S_N\le t\} \cap \{N=n\}\right) \right)\\ = \sum_{n\ge 0} \Pr\left( \{S_{N}\le t\} \cap \{N=n\} \right) \\ = \sum_{n\ge 0} \Pr\left( S_{N}\le t \mid N=n\right) \Pr\left(N=n \right)\\ = \sum_{n\ge 0} \Pr\left( S_{n+1}\le t\right) \Pr\left(N=n \right)\\ = \sum_{n\ge 0} \int_0^t \lambda^{n+1} \frac{z^n}{n!} e^{-\lambda z} \,\mathrm{d}z \; p(1-p)^n\\ = \lambda p \int_0^t \sum_{n\ge 0} \frac{((1-p) \lambda z)^n}{n!} e^{-\lambda z} \,\mathrm{d}z \\ = \lambda p \int_0^t e^{(1-p) \lambda z} e^{-\lambda z} \,\mathrm{d}z \\ = \lambda p \int_0^t e^{- p\lambda z} \,\mathrm{d}z\\ = 1- e^{- p\lambda t}, $$ from which we recognize that $S_N$ follows the exponential law with parameter $p\lambda$. Note that the summation and the integral can be interchanged because all the terms involved are nonnegative (Tonelli).
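A quick Monte Carlo check of this conclusion is easy, since a sum of $n+1$ i.i.d. $\mathrm{Exp}(\lambda)$ variables is exactly $\mathrm{Gamma}(n+1,\,1/\lambda)$. This sketch uses the convention $\Pr(N=n)=p(1-p)^n$ implied by the PGF in the question, under which $S_N \sim \mathrm{Exp}(p\lambda)$; with the opposite convention $\Pr(N=n)=p^n(1-p)$ the rate becomes $\lambda(1-p)$.

```python
import numpy as np

rng = np.random.default_rng(0)
p, lam = 0.3, 2.0
n_sim = 200_000

# N on {0, 1, 2, ...} with P(N = n) = p (1 - p)^n;
# numpy's geometric counts trials (support {1, 2, ...}), so subtract 1
N = rng.geometric(p, size=n_sim) - 1

# Sum of N + 1 i.i.d. Exp(lam) variables is Gamma(N + 1, scale = 1/lam),
# so S_N can be sampled directly from a gamma with a random shape
S = rng.gamma(shape=N + 1, scale=1.0 / lam)

# An Exp(p * lam) law has mean and standard deviation both 1 / (p * lam)
print(S.mean(), S.std(), 1 / (p * lam))
```

Both the empirical mean and the empirical standard deviation land near $1/(p\lambda) \approx 1.667$ here, consistent with an exponential law of rate $p\lambda$.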

The same conclusion can be obtained with a characteristic function argument.
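As for the asker's actual question, the mechanical inversion of a Laplace transform back to a density (and then a CDF) can be delegated to a computer algebra system. This sketch inverts the generic transform $a/(a+s)$, where the rate $a$ stands for whichever compound-geometric rate the chosen convention yields ($p\lambda$ or $\lambda(1-p)$):

```python
import sympy as sp

s, t, u = sp.symbols('s t u', positive=True)
a = sp.Symbol('a', positive=True)   # the rate, e.g. a = p*lambda

# Laplace transform of an Exp(a) density: a / (a + s)
L = a / (a + s)

# Mechanical inversion recovers the density a * exp(-a*t);
# drop the Heaviside step factor since we restrict to t > 0
f = sp.inverse_laplace_transform(L, s, t).subs(sp.Heaviside(t), 1)

# Integrating the density from 0 to t gives the CDF 1 - exp(-a*t)
F = sp.integrate(f.subs(t, u), (u, 0, t))

print(sp.simplify(f))
print(sp.simplify(F))
```

For the theory behind why this inversion is unique, the keyword to look up is the uniqueness theorem for Laplace transforms (or Lévy's inversion theorem on the characteristic-function side): two distributions with the same transform on a neighbourhood of the origin coincide.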
