Understanding the proof of Azuma's inequality

conditional-expectation, martingales, probability

I am trying to understand the proof of Azuma's inequality, though one step isn't quite clear to me:
To give some context: $V_1,V_2,\dots$ is a martingale difference sequence with respect to the random variables $X_1,X_2,\dots$. That means, for all $i\in\mathbb{N}$, the random variable $V_i$ is a function of $X_1,\dots,X_i$ and $\mathbb{E}[V_{i+1}|X_1,\dots,X_i] = 0$. Furthermore, we write $S_k = \sum_{i=1}^k V_i$ for all $k\in\mathbb{N}$. Now, using Chernoff's bounding technique, the proof begins with $\forall t>0:$
$\begin{align*}
\mathbb{P}[S_n\geq\varepsilon] &\leq e^{-t\varepsilon}\cdot\mathbb{E}\left[e^{tS_n}\right]\\
&=e^{-t\varepsilon}\cdot\mathbb{E}\left[e^{tS_{n-1}}\mathbb{E}[e^{tV_n}|X_1,\dots,X_{n-1}]\right]
\end{align*}$
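For reference, the first inequality is just Markov's inequality applied to the nonnegative random variable $e^{tS_n}$ (for $t>0$ the map $x\mapsto e^{tx}$ is increasing, so $\{S_n\geq\varepsilon\}=\{e^{tS_n}\geq e^{t\varepsilon}\}$):
$$\mathbb{P}[S_n\geq\varepsilon]=\mathbb{P}\left[e^{tS_n}\geq e^{t\varepsilon}\right]\leq e^{-t\varepsilon}\cdot\mathbb{E}\left[e^{tS_n}\right].$$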

Can anybody explain the last equality to me? I feel like it's not that complicated, but I just don't see it.

Best Answer

\begin{align*}
\mathbb E[e^{t S_n}] &= \mathbb E[\mathbb E[e^{t S_n} \mid X_1, \dots, X_{n-1}]] & \text{(tower property of conditional expectation)} \\
&= \mathbb E[\mathbb E[e^{t S_{n-1}}e^{t V_n} \mid X_1, \dots, X_{n-1}]] & \text{($S_n = S_{n-1} + V_n$)} \\
&= \mathbb E[e^{tS_{n-1}} \mathbb E[e^{t V_n} \mid X_1, \dots, X_{n-1}]] & \text{($S_{n-1}$ is measurable w.r.t. $\sigma(X_1, \dots, X_{n-1})$)}
\end{align*}

The last step is the "take out what is known" property of conditional expectation: since $e^{tS_{n-1}}$ is a function of $X_1,\dots,X_{n-1}$ alone, it behaves like a constant given those variables and factors out of the conditional expectation.
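As a sanity check, here is a minimal Python sketch that estimates both sides of the factorization by Monte Carlo; the i.i.d. Rademacher increments, horizon $n$, parameter $t$, and sample size are illustrative choices, not part of the proof. For $\pm 1$ increments, $\mathbb{E}[e^{tV_n}\mid X_1,\dots,X_{n-1}] = \cosh(t)$.

```python
import numpy as np

# Monte Carlo check of
#   E[e^{t S_n}] = E[ e^{t S_{n-1}} * E[e^{t V_n} | X_1,...,X_{n-1}] ]
# for i.i.d. Rademacher steps V_i in {-1, +1} (a martingale difference
# sequence); here the conditional expectation is the constant cosh(t).

rng = np.random.default_rng(0)
n, t, trials = 10, 0.3, 200_000

V = rng.choice([-1.0, 1.0], size=(trials, n))   # V_1, ..., V_n per trial
S_n = V.sum(axis=1)                             # S_n
S_nm1 = V[:, :-1].sum(axis=1)                   # S_{n-1}

lhs = np.exp(t * S_n).mean()                    # estimate of E[e^{t S_n}]
rhs = (np.exp(t * S_nm1) * np.cosh(t)).mean()   # estimate of the factored form

print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```

Iterating the same factorization $n$ times, and bounding each conditional factor (e.g. via Hoeffding's lemma when $|V_i|\le c_i$), is what turns this identity into the usual Azuma bound.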
