[Math] Maximum of a sum of random variables

independence, normal distribution, probability distributions, probability theory, random variables

Let $X_1, \dots, X_n$ be independent and identically distributed random variables with $E(X_i) = 0$ and $$S_k = \sum_{i \leq k} X_i$$

  1. What is the probability distribution of $M_2 = \max \{ X_1, X_1+X_2 \}$?

We may suppose the $X_i$ have a normal distribution; note that $X_1$ and $X_1 + X_2$ are not independent, which is why all my attempts at computing $P(M_2 \leq t)$ failed.

  2. What is the probability distribution of $M_3 =\max \{ X_1, X_1+X_2, X_1+X_2+X_3 \}$, and, more generally, of $M_n = \max\limits_{k \le n} S_k$?
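For intuition, here is a minimal Monte Carlo sketch (Python; the standard normal choice, sample sizes, and function name are mine) estimating the distribution of $M_n$ empirically. As a check, for symmetric continuous increments the Sparre Andersen fluctuation result gives $P(M_2 \le 0) = \binom{4}{2}/4^2 = 3/8$ exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

def max_partial_sums(n, trials=200_000, sigma=1.0):
    """Simulate M_n = max_{1<=k<=n} S_k for i.i.d. N(0, sigma^2) increments."""
    x = rng.normal(0.0, sigma, size=(trials, n))
    return np.cumsum(x, axis=1).max(axis=1)  # one sample of M_n per trial

m2 = max_partial_sums(2)
# Empirical P(M_2 <= 0); Sparre Andersen gives exactly 3/8 = 0.375
# for symmetric continuous increments.
p0 = np.mean(m2 <= 0.0)
```

The same function with `n=3` (or any `n`) gives an empirical picture of $M_3$ and $M_n$, even though a closed form is hard to come by.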

Best Answer

Through the Spitzer identity it is possible to find a kind of transform of the distribution of $M_n$. Well, not exactly: the Spitzer identity involves the expressions $M^+_n = \max_{0\le k\le n} S_k$, where $S_0 = 0$ and $S_k = X_1 + \dots + X_k$ for $k\ge 1$, so it deals with the positive part of the expression you are interested in. But the original question reduces to this one since, as already mentioned in the previous answers, $M_n$ has the same distribution as $X + M_{n-1}^+$, where $X$ is independent of $M_{n-1}^+$.
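The reduction $M_n \stackrel{d}{=} X + M_{n-1}^+$ is easy to check numerically; here is a hedged sketch (Python, with Gaussian steps and sample sizes chosen by me for concreteness) comparing the two empirical CDFs:

```python
import numpy as np

rng = np.random.default_rng(1)
trials, n = 200_000, 5

# Left side: M_n = max_{1<=k<=n} S_k
m_n = np.cumsum(rng.normal(size=(trials, n)), axis=1).max(axis=1)

# Right side: X + M_{n-1}^+, with X independent of M_{n-1}^+
# (M_{n-1}^+ includes S_0 = 0, hence the maximum with 0.0)
m_plus = np.maximum(
    np.cumsum(rng.normal(size=(trials, n - 1)), axis=1).max(axis=1), 0.0
)
rhs = rng.normal(size=trials) + m_plus

# The two empirical CDFs should agree within Monte Carlo error
grid = np.linspace(-2.0, 5.0, 30)
max_gap = np.abs(
    (m_n[:, None] <= grid).mean(axis=0) - (rhs[:, None] <= grid).mean(axis=0)
).max()
```

With 200,000 trials the pointwise Monte Carlo error is on the order of $10^{-3}$, so the maximal CDF gap should be well below $0.01$.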

Going back to the Spitzer identity, it reads $$ \sum_{n=0}^\infty s^n \mathsf E[e^{-\lambda M_n^+}] = \exp\left\{\sum_{n=1}^\infty \frac{s^n}n \mathsf E[e^{-\lambda S_n^+}] \right\}, $$ where $S_n^+ = \max\{S_n,0\}$.

The right-hand side can be calculated more or less explicitly in some cases. For example, if $X\simeq N(0,\sigma^2)$, then $S_n \simeq N(0,\sigma^2 n)$, so $$ \mathsf E[e^{-\lambda S_n}\mathbf{1}_{S_n\ge 0}] = \int_0^\infty e^{-\lambda \sigma x\sqrt{n}} e^{-x^2/2}\frac{dx}{\sqrt{2\pi}} = \sqrt{n} \int_0^\infty e^{-\lambda \sigma y n} e^{-y^2n/2}\frac{dy}{\sqrt{2\pi}}. $$ Therefore, $$ \sum_{n=1}^\infty \frac{s^n}n \mathsf E[e^{-\lambda S_n}\mathbf{1}_{S_n\ge 0}] = \sum_{n=1}^\infty \int_0^\infty \frac{s^n e^{-\lambda\sigma yn} e^{-y^2n/2}}{\sqrt{2\pi n}} dy = \int_0^\infty \operatorname{Li}_{1/2}(s e^{-\lambda\sigma y}e^{-y^2/2})\frac{dy}{\sqrt{2\pi}}, $$ where $\operatorname{Li}_{s}(x) = \sum_{n=1}^\infty{x^n}n^{-s}$ denotes the polylogarithm. Hence, $$ \sum_{n=1}^\infty \frac{s^n}n \mathsf E[e^{-\lambda S_n^+}] = \sum_{n=1}^\infty \frac{s^n}n \mathsf E[e^{-\lambda S_n}\mathbf{1}_{S_n\ge 0}] + \frac12 \sum_{n=1}^\infty \frac{s^n}{n}\\= \int_0^\infty \operatorname{Li}_{1/2}(s e^{-\lambda\sigma y}e^{-y^2/2})\frac{dy}{\sqrt{2\pi}} - \frac12\log(1-s). $$
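As a sanity check, both sides of the series identity can be evaluated numerically (a Python sketch; the parameter values $s = 0.5$, $\lambda = \sigma = 1$ and the truncation lengths are arbitrary choices of mine). Note that under the substitution the Gaussian factor enters the polylogarithm's argument as $e^{-y^2/2}$, since $e^{-y^2 n/2} = (e^{-y^2/2})^n$. The left side uses the Gaussian identity $\mathsf E[e^{-\lambda Z}\mathbf{1}_{Z\ge 0}] = e^{\lambda^2 v/2}\,\Phi(-\lambda\sqrt v)$ for $Z\simeq N(0,v)$.

```python
import numpy as np
from math import erfc, exp, sqrt, pi

s, lam, sigma = 0.5, 1.0, 1.0  # arbitrary test values, |s| < 1

def li_half(x, terms=300):
    """Li_{1/2}(x) = sum_{n>=1} x^n / sqrt(n), by its defining series (|x| < 1)."""
    n = np.arange(1, terms + 1)
    return float(np.sum(x ** n / np.sqrt(n)))

# Left side: sum_n (s^n / n) E[e^{-lam S_n} 1_{S_n >= 0}], S_n ~ N(0, sigma^2 n),
# via E[e^{-lam Z} 1_{Z >= 0}] = e^{lam^2 v / 2} * Phi(-lam sqrt(v)), Z ~ N(0, v)
lhs = sum(
    s ** n / n
    * exp(lam ** 2 * sigma ** 2 * n / 2)
    * 0.5 * erfc(lam * sigma * sqrt(n) / sqrt(2))
    for n in range(1, 200)
)

# Right side: integral of Li_{1/2}(s e^{-lam*sigma*y} e^{-y^2/2}) dy / sqrt(2 pi),
# by the trapezoidal rule on [0, 10] (the integrand is negligible beyond that)
y = np.linspace(0.0, 10.0, 4001)
vals = np.array([li_half(s * exp(-lam * sigma * yy - yy ** 2 / 2)) for yy in y])
rhs = float(np.sum((vals[1:] + vals[:-1]) / 2 * np.diff(y))) / sqrt(2 * pi)
```

The two numbers should agree to well within the truncation and quadrature errors.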

This does not look too neat, but it is not as hopeless as it might seem. From here you can compute expectations (quite easily), variances (not so easily), and obtain expressions for higher-order moments of $M_n^+$.
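For instance, differentiating the identity at $\lambda = 0$ yields the classical consequence $\mathsf E[M_n^+] = \sum_{k=1}^n \mathsf E[S_k^+]/k$, and for Gaussian steps $\mathsf E[S_k^+] = \sigma\sqrt{k/(2\pi)}$. A quick Monte Carlo cross-check (Python sketch; the parameters and sample size are mine):

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials, sigma = 10, 400_000, 1.0

# Monte Carlo estimate of E[M_n^+], with M_n^+ = max(0, S_1, ..., S_n)
x = rng.normal(0.0, sigma, size=(trials, n))
mc = np.maximum(np.cumsum(x, axis=1).max(axis=1), 0.0).mean()

# Spitzer-derived formula: E[M_n^+] = sum_{k=1}^n E[S_k^+] / k, and
# E[S_k^+] = sigma * sqrt(k / (2 pi)) for S_k ~ N(0, sigma^2 k), so
# E[M_n^+] = sigma / sqrt(2 pi) * sum_{k=1}^n 1 / sqrt(k)
exact = sigma / np.sqrt(2 * np.pi) * np.sum(1.0 / np.sqrt(np.arange(1, n + 1)))
```

The simulated mean should match the formula to within Monte Carlo error (a few thousandths here).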
