[Math] Maximum of a set of sums of iid random variables

pr.probability

Consider a probability distribution $D$ over the non-negative reals with finite expectation $\mu$. For any positive integer $T$, consider sums of $T$ iid random variables drawn from $D$. A single such sum is $S(T) = \sum_{i = 1}^T x_i$, where each $x_i$ is an iid random sample from $D$.

We will consider $n$ such sums $S_1(T), S_2(T), \ldots, S_n(T)$.

My question: Is it true that for every distribution $D$ and every positive integer $n$, there exists a finite positive $T$ (which may depend on $n$ and $D$) such that $E\left(\max_{1 \leq j \leq n} S_j(T)\right) \leq 2 T \mu$?

This is true for, say, Bernoulli random variables, but I'd like to know the mildest condition under which such a statement holds. For example, is it true for all distributions with finite fourth moments?
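For intuition, here is a minimal Monte Carlo sketch (not from the original post; the function name, the Bernoulli parameter, and the values of $n$ and $T$ are illustrative) that estimates $E(\max_{1 \leq j \leq n} S_j(T))$ and compares it to $2T\mu$:

```python
import numpy as np

def estimate_max_of_sums(sampler, n, T, trials=2000, seed=0):
    """Monte Carlo estimate of E(max_{1<=j<=n} S_j(T)),
    where S_j(T) is a sum of T iid draws from D via `sampler`."""
    rng = np.random.default_rng(seed)
    maxima = np.empty(trials)
    for t in range(trials):
        # n independent sums, each of T iid samples from D
        sums = sampler(rng, (n, T)).sum(axis=1)
        maxima[t] = sums.max()
    return maxima.mean()

# Bernoulli(p) example from the question: mu = p.
p, n, T = 0.3, 50, 200
bernoulli = lambda rng, size: rng.binomial(1, p, size=size)
estimate = estimate_max_of_sums(bernoulli, n, T)
print(f"E(max_j S_j(T)) ~ {estimate:.1f}  vs  2*T*mu = {2 * T * p:.1f}")
```

For these (illustrative) parameters, each $S_j(T)$ concentrates around $T\mu = 60$, so the maximum over $50$ sums stays well below $2T\mu = 120$.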

Best Answer

It is always true. Split $x_i = y_i + z_i$, where the $y_i$ are bounded (e.g., $y_i = \min(x_i, M)$ for a large threshold $M$) and $E z_i \le \frac{\mu}{10n}$. The $y_i$ cause no problems: if they were alone, each $S_j$ would be concentrated in a very strong sense around $\nu T$ for large $T$, where $\nu = E y_i \le \mu$ (see Didier's argument for details, or recall the Bernstein inequality for exponential moments of sums of bounded variables). But the $z_i$ cause no problems either: even if you add up all their contributions over all $n$ sums, instead of taking the maximum, you end up with a mere $n \cdot T \cdot \frac{\mu}{10n} = 0.1\, T\mu$ in expectation.
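To spell the bound out (a sketch under the assumptions above; the constant $C_n$, which depends on $n$ and the bound $M$ on the $y_i$, is not made explicit in the answer, and the case $\mu = 0$ is trivial since $D$ is then the point mass at $0$): writing $y_{j,i}, z_{j,i}$ for the split of the $i$-th sample in the $j$-th sum, and using $z_{j,i} \ge 0$,

$$E\Big(\max_{1 \leq j \leq n} S_j(T)\Big) \leq E\Big(\max_{1 \leq j \leq n} \sum_{i=1}^T y_{j,i}\Big) + E\Big(\max_{1 \leq j \leq n} \sum_{i=1}^T z_{j,i}\Big) \leq \big(\nu T + C_n \sqrt{T}\big) + \sum_{j=1}^n \sum_{i=1}^T E z_{j,i} \leq \nu T + C_n \sqrt{T} + 0.1\, T\mu \leq 2 T \mu$$

for all sufficiently large $T$, since $\nu \leq \mu$ and $C_n \sqrt{T} = o(T)$.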
