If $(X_n)$ is a martingale, why do we have $\mathbb E[(H\cdot X)_n]=\mathbb E[X_0]$?

Tags: martingales, probability theory

Let $(X_n)_{n\in\mathbb N}$ be a martingale with respect to the filtration $(\mathcal F_n)_n$. Set $$(H\cdot X)_n=\sum_{k=1}^nH_k(X_k-X_{k-1}),$$
where $H$ is predictable, i.e. $H_n$ is $\mathcal F_{n-1}$-measurable. Why is $$\mathbb E[(H\cdot X)_n]=\mathbb E[X_0] \ \ ?$$
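For instance (just to illustrate the issue), for the constant strategy $H_k\equiv 1$ the sum telescopes:
$$(H\cdot X)_n=\sum_{k=1}^n(X_k-X_{k-1})=X_n-X_0,$$
so $\mathbb E[(H\cdot X)_n]=\mathbb E[X_n]-\mathbb E[X_0]=0$, which already seems to contradict the claim unless $\mathbb E[X_0]=0$.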

My attempt

$$\mathbb E[(H\cdot X)_n]=\sum_{k=1}^n\mathbb E[H_k(X_k-X_{k-1})]=\sum_{k=1}^n\mathbb E\big[H_k\,\mathbb E[X_k-X_{k-1}\mid \mathcal F_{k-1}]\big],$$
where the second equality uses the tower property and the fact that $H_k$ is $\mathcal F_{k-1}$-measurable. Since $(X_n)$ is a martingale, we have $\mathbb E[X_k-X_{k-1}\mid \mathcal F_{k-1}]=0$, so in the end $$\mathbb E[(H\cdot X)_n]=0$$ and not $\mathbb E[X_0]$. Am I right?
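As a sanity check on this computation, here is a minimal simulation sketch (my own illustration, not from the book), assuming a symmetric $\pm1$ random walk started at $X_0=5$ and a bounded predictable strategy $H_k$ that depends only on $X_{k-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, x0 = 100_000, 20, 5.0

# Martingale X: symmetric +/-1 random walk started at X_0 = x0.
steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))
X = np.concatenate([np.full((n_paths, 1), x0),
                    x0 + np.cumsum(steps, axis=1)], axis=1)

# Predictable H: H_k is a bounded function of X_{k-1} only.
H = np.where(X[:, :-1] > x0, 2.0, 1.0)

# (H . X)_n with the "sum only" convention (no X_0 term).
transform = np.sum(H * (X[:, 1:] - X[:, :-1]), axis=1)

print(transform.mean())  # empirically close to 0, not to E[X_0] = 5
```

The empirical mean comes out near $0$, consistent with the computation above.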

(This comes from the book Continuous Martingales and Brownian Motion, third edition, by Revuz and Yor.)

Here is the whole proof from the book: [image of the proof omitted]

Best Answer

$\newcommand{\E}{\mathbb E}$ In the proof of Proposition 1.3 on page 52 (the proposition before the one you've shown), the book defines $H \cdot X$ to be the process $Y$ given by
$$ Y_0 = X_0, \qquad Y_n = Y_{n-1} + H_n (X_n - X_{n-1}). $$
This means we actually have
$$ (H \cdot X)_n = X_0 + \sum_{k=1}^n H_k (X_k - X_{k-1}), $$
from which we get $\E[(H \cdot X)_n] = \E[X_0]$. I agree that this is not the standard convention, though, and even this book changes convention when it formally defines the stochastic integral. In fact, on page 138 it stresses that $K \cdot M$ vanishes at $0$.
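One way to make the last step explicit, using the same tower-property argument as in the question: from the recursion,
$$\E[Y_n]=\E[Y_{n-1}]+\E\big[H_n\,\E[X_n-X_{n-1}\mid\mathcal F_{n-1}]\big]=\E[Y_{n-1}],$$
so by induction $\E[Y_n]=\E[Y_0]=\E[X_0]$. In the simulation sketch above (my illustration), adding the $X_0$ term, i.e. computing `(X[:, 0] + transform).mean()`, gives a value close to $\E[X_0]=5$, matching the book's convention.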