Show a distributional identity for the maximum of partial sums of i.i.d. random variables

Tags: probability, probability-distributions

This is Exercise 20 from Chapter 2, Section 8 of Shiryaev's Probability (GTM 95), Volume 1.

Problem: Let $X, X_1, \ldots, X_n$ be independent identically distributed random variables, and set $S_n = \sum_{i=1}^n X_i$, $S_0 = 0$, $\overline{M_n} = \max_{0 \leq j \leq n} S_j$, $\overline{M} = \sup_{n \geq 0} S_n$. Show the following.
(Here the notation $\xi \stackrel{d}{=} \eta$ means that $\xi$ and $\eta$ have the same distribution.)

(a) $\overline{M_n} \stackrel{d}{=} (\overline{M_{n-1}}+X)^+, n \geq 1$;

(b) if $S_n \rightarrow -\infty$ ($P$-a.s.), then $\overline{M} \stackrel{d}{=} (\overline{M}+X)^+$.

My thoughts on these two parts are as follows.

(a). If $\max_{0 \leq j \leq n-1} S_j + X > 0$, then $\max_{0 \leq j \leq n-1} S_j + X = S_k + X$ for the maximizing index $k$, and with $X$ playing the role of $X_{k+1}$ this equals $S_{k+1} = \max_{1 \leq j \leq n} S_j$. So for $a > 0$, $P((\max_{0 \leq j \leq n-1} S_j + X)^+ \leq a) = P(\max_{1 \leq j \leq n} S_j \leq a)$.
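
As a sanity check before trying to make this rigorous, I also compared the two sides of (a) empirically. This is a minimal sketch of my own (not part of the problem); the choices $n = 10$ and $N(-0.2, 1)$ increments are arbitrary, and any i.i.d. law should give the same agreement:

```python
# Monte Carlo sanity check of (a): compare the empirical distributions of
# M_n = max_{0<=j<=n} S_j and (M_{n-1} + X)^+ built from independent draws.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, trials = 10, 100_000  # arbitrary choices for this sketch

def running_max(steps):
    """Max of the partial sums S_0 = 0, S_1, ..., S_k along axis 1."""
    partial_sums = np.cumsum(steps, axis=1)
    return np.maximum(partial_sums.max(axis=1), 0.0)  # the 0 accounts for S_0

# Left-hand side: M_n from n i.i.d. increments per trial.
lhs = running_max(rng.normal(-0.2, 1.0, size=(trials, n)))

# Right-hand side: (M_{n-1} + X)^+ with X independent of the first n-1 steps.
m_prev = running_max(rng.normal(-0.2, 1.0, size=(trials, n - 1)))
x = rng.normal(-0.2, 1.0, size=trials)
rhs = np.maximum(m_prev + x, 0.0)

# A large two-sample KS p-value is consistent with equality in distribution.
print(stats.ks_2samp(lhs, rhs))
```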

(b). If $S_n \rightarrow -\infty$, then $\overline{M_n}$ converges to $\overline{M}$ (the sequence $\overline{M_n}$ is nondecreasing in $n$, and the drift to $-\infty$ makes the limit finite). Letting $n \rightarrow \infty$ in (a), $\overline{M_n} \stackrel{d}{=} (\overline{M_{n-1}}+X)^+$ should become $\overline{M} \stackrel{d}{=} (\overline{M}+X)^+$.
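
I think the limit step can be justified like this (I am not certain it is fully rigorous): on a common probability space, $\overline{M_{n-1}} \uparrow \overline{M}$ pointwise, so $(\overline{M_{n-1}}+X)^+ \uparrow (\overline{M}+X)^+$ pointwise as well; almost sure convergence implies convergence in distribution, so at every continuity point $a$ of the limiting distribution functions,

$$P(\overline{M} \leq a) = \lim_{n \to \infty} P(\overline{M_n} \leq a) = \lim_{n \to \infty} P\big((\overline{M_{n-1}}+X)^+ \leq a\big) = P\big((\overline{M}+X)^+ \leq a\big).$$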

I am not sure if I am on the right track. I need some guidance on this problem. Thanks!

For the bounty: a complete answer would be appreciated.

Best Answer

Let $\Psi(z_1,z_2,\ldots,z_n) := \max\{\sum_{j=1}^k z_j \,:\, k=0,1,\ldots,n\}$ with the convention that an empty sum equals zero. Then (a) means that $\Psi(X_1,X_2,\ldots,X_n)$ has the same distribution as $\Psi(X,X_1,X_2,\ldots,X_{n-1})$, which is immediate because the vectors $(X_1,\ldots,X_n)$ and $(X,X_1,\ldots,X_{n-1})$ have the same joint distribution.
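
To spell out the translation into $\Psi$: by definition $\overline{M_n} = \Psi(X_1,\ldots,X_n)$, and since the first argument of $\Psi$ is a common summand of every nonempty partial sum,

$$\Psi(X, X_1, \ldots, X_{n-1}) = \max\Big(0, \max_{0 \leq k \leq n-1}\big(X + S_k\big)\Big) = \max\big(0, X + \overline{M_{n-1}}\big) = (\overline{M_{n-1}} + X)^+,$$

so (a) is exactly the statement $\Psi(X_1,\ldots,X_n) \stackrel{d}{=} \Psi(X,X_1,\ldots,X_{n-1})$.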

Part (b) holds even without the assumption that $S_n \to -\infty$ a.s.: the same argument works if you allow random variables to take the value $+\infty$. Your proof also gives that conclusion. The additional assumption ensures that the variables on both sides of (b) are almost surely finite.
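
If a numerical illustration of (b) helps, here is a minimal sketch under the assumption of $N(-0.5, 1)$ increments (negative mean, so $S_n \to -\infty$ a.s. by the strong law of large numbers). The supremum is truncated at a horizon of $T = 500$ steps, which is essentially harmless here because with this drift the supremum is attained very early with overwhelming probability:

```python
# Numerical check of (b): with negative drift, M = sup_n S_n is a.s. finite
# and should satisfy M  =d=  (M + X)^+. Truncate the sup at a large horizon T.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
T, trials = 500, 20_000
mu, sigma = -0.5, 1.0  # E[X] < 0, so S_n -> -infty a.s. by the strong law

def truncated_sup(size):
    """Approximate M = sup_{n>=0} S_n by the max over the first T steps."""
    s = np.cumsum(rng.normal(mu, sigma, size=(size, T)), axis=1)
    return np.maximum(s.max(axis=1), 0.0)  # the 0 accounts for S_0

m = truncated_sup(trials)                # samples of M
m_copy = truncated_sup(trials)           # an independent copy of M
x = rng.normal(mu, sigma, size=trials)   # X, independent of that copy
rhs = np.maximum(m_copy + x, 0.0)        # samples of (M + X)^+

# A large two-sample KS p-value is consistent with the identity in (b).
print(stats.ks_2samp(m, rhs))
```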
