As part of an assignment, I am trying to prove a result using Kolmogorov's maximal inequality (for reference, here). I believe that I am quite close; however, I am unable to justify what appears to be applying Kolmogorov's result over $\{M, \ldots, N\}$ for $N>M\geq1$ instead of the usual $\{1, \ldots, N\}$.
Being more specific, let $\{X_i\}_{i\in\mathbb{N}}$ be independent random variables with zero mean and finite variances, and define $S_n := X_1 + \ldots + X_n$ to be the $n$-th partial sum. For $\epsilon > 0$, I would like to show
$$P\left(\max_{M \leq n \leq N} |S_n - S_M| > \epsilon\right) \leq \frac{1}{\epsilon^2}\sum_{i=M+1}^{N} \textrm{Var}(X_i).$$
My approach so far has been to try and re-index the difference of partial sums on the LHS, so that it looks like
$$\max_{1 \leq n-M+1 \leq N-M+1} \left| \sum_{i=M+1}^{n} X_{i} \right| = \max_{1 \leq n-M+1 \leq N-M+1} \left| \sum_{i=1}^{n-M} X_{M+i} \right|.$$
At which point, if we define $k:=n-M+1$, it follows that
$$\max_{1 \leq n-M+1 \leq N-M+1} \left| \sum_{i=M+1}^{n} X_{i} \right| = \max_{1 \leq k \leq N-M+1} \left| \sum_{i=1}^{k-1} X_{M+i} \right|.$$
Finally, defining $Z_i := X_{i+M}$, I get that
$$\max_{1 \leq n-M+1 \leq N-M+1} \left| \sum_{i=M+1}^{n} X_{i} \right| = \max_{1 \leq k \leq N-M+1} \left| \sum_{i=1}^{k-1} Z_i \right|.$$
The RHS in the last equation is very close to the form required by Kolmogorov's inequality, but I can't quite massage it into the form specified in the theorem. I would appreciate any hints or pushes in the correct direction, preferably without reference to martingale theory, since we have not yet covered it in my class. Thanks!
Best Answer
Let $X'_i:=0$ for $i\leqslant M$ and $X'_i:=X_i$ for $i\geqslant M+1$, and set $S'_n:=\sum_{i=1}^nX'_i$. Then $S'_n=S_n-S_M$ for $n\geqslant M$ and $S'_n=0$ for $n\leqslant M$, hence $$ \max_{M \leqslant n \leqslant N} |S_n - S_M|=\max_{1\leqslant n\leqslant N}\lvert S'_n\rvert. $$ Since $\left(X'_i\right)_{i\geqslant 1}$ is independent with zero means and finite variances, Kolmogorov's maximal inequality applies; and because $\mathrm{Var}(X'_i)=0$ for $i\leqslant M$, the resulting bound is $$\frac{1}{\epsilon^2}\sum_{i=1}^{N}\mathrm{Var}(X'_i)=\frac{1}{\epsilon^2}\sum_{i=M+1}^{N}\mathrm{Var}(X_i),$$ which is exactly the claimed inequality.
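As a quick numerical sanity check (not part of the proof), here is a Monte Carlo sketch of the shifted inequality using i.i.d. Gaussian increments as a convenient special case; all parameter values ($M$, $N$, $\epsilon$, $\sigma$, the trial count) are illustrative choices, not anything fixed by the problem.

```python
import numpy as np

# Monte Carlo check of
#   P( max_{M<=n<=N} |S_n - S_M| > eps ) <= (1/eps^2) * sum_{i=M+1}^N Var(X_i)
# with i.i.d. N(0, sigma^2) increments (a special case of an independent,
# mean-zero, finite-variance sequence).

rng = np.random.default_rng(0)

M, N, eps, sigma, trials = 5, 20, 1.0, 0.2, 20_000

X = rng.normal(0.0, sigma, size=(trials, N))
S = np.cumsum(X, axis=1)                  # S[:, n-1] holds S_n

# max_{M <= n <= N} |S_n - S_M| in each trial (the n = M term is 0)
devs = np.abs(S[:, M - 1 : N] - S[:, M - 1 : M])
lhs = np.mean(devs.max(axis=1) > eps)     # empirical probability
rhs = (N - M) * sigma**2 / eps**2         # (1/eps^2) * sum_{i=M+1}^N Var(X_i)

print(f"empirical LHS = {lhs:.3f}  <=  bound RHS = {rhs:.3f}")
```

With these parameters the bound on the right is $15 \cdot 0.04 = 0.6$, so the check is non-trivial (the bound is below 1), and the empirical frequency should land comfortably beneath it.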