[Math] Determine whether a sequence of random variables is a martingale

martingales, probability-theory

I'm trying to solve a problem from an old exam. It is an easy but somewhat lengthy exercise, divided into subproblems. Since they build on each other and are probably quite short, I hope posting them together is acceptable. Any comments on what I've done so far would be appreciated. For the parts I haven't solved at all, I would be more than satisfied with the basic ideas and perhaps a reference for any results that are needed.


Let $(X_n)_{n \geq 1}$ be a sequence of i.i.d. Bernoulli random
variables with $\mathbb{P}(X_n = 0) = 1-\mathbb{P}(X_n = 1) = 1-p$
with $p \in [0,1]$. Let $M_n = \sum_{k=1}^n (X_k - p)$ and let
$\mathcal{F}_n = \sigma(X_1, \dots, X_n)$.

(a) Show that $(M_n)_{n\geq 1}$ is a martingale.

In order for $(M_n)$ to be a martingale, we need the following three properties.

  1. $M_n \in \mathcal{F}_n$ for all $n \geq 1$
  2. $M_n \in \mathcal{L}^{1}(\Omega, \mathcal{F}, \mathbb{P})$ for all $n \geq 1$
  3. $\mathbb{E}[M_{n+1} | \mathcal{F}_n] = M_n$ a.s.

Starting with 1. By construction, $X_k \in \mathcal{F}_k \subset \mathcal{F}_n$ for all $k \leq n$, so it is immediate that $M_n = \sum_{k=1}^n (X_k-p) \in \mathcal{F}_n$. Turning to 2. First observe that $\mathbb{E}X_n = p$ for all $n$. By the triangle inequality and the linearity of expectation, we have that

$$ \mathbb{E}|M_n| = \mathbb{E}\left|\sum_{k=1}^n (X_k-p)\right| \leq \sum_{k=1}^n \mathbb{E}|X_k -p| \leq np + \sum_{k=1}^n \mathbb{E}X_k = 2np. $$

Hence $M_n \in \mathcal{L}^1$ for every $n$. Now for 3. Using the fact that $\sigma(X_{n+1})$ and $\mathcal{F}_n = \sigma(X_1, \dots, X_n)$ are independent, we get

$$\mathbb{E}[M_{n+1}-M_n | \mathcal{F}_n] = \mathbb{E}[X_{n+1}-p | \mathcal{F}_n] = \mathbb{E}[X_{n+1}]-p =0$$

and hence $(M_n)$ has the martingale property.
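Since each $X_k$ takes only the values $0$ and $1$, the martingale property can be checked exactly for small $n$ by enumerating all histories. A minimal numerical sketch, where the parameter value $p = 0.3$ and the horizon $n = 4$ are arbitrary choices for illustration:

```python
from itertools import product

p = 0.3  # example parameter; any p in [0, 1] works
n = 4    # horizon for the exhaustive check

def M(path):
    # M_n = sum_{k<=n} (X_k - p), evaluated on a 0/1 path
    return sum(x - p for x in path)

# Martingale property: for every history x_1..x_n,
# E[M_{n+1} | X_1..X_n = x] = (1-p) M(x,0) + p M(x,1) must equal M(x).
for path in product([0, 1], repeat=n):
    cond_exp = (1 - p) * M(path + (0,)) + p * M(path + (1,))
    assert abs(cond_exp - M(path)) < 1e-12
```

The loop is the discrete analogue of condition 3: conditioning on $\mathcal{F}_n$ amounts to fixing the first $n$ coordinates and averaging over the two possible values of $X_{n+1}$.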


Let $\widetilde{M}$ be another martingale adapted to the same filtration
and set $m = \mathbb{E} \widetilde{M}_1$. Since $\widetilde{M}$ is
adapted, there exist (by the Doob–Dynkin lemma) functions $f_n$ such that
$\widetilde{M}_n = f_n(X_1, \dots, X_n)$.

(b) Show that the $f_n$ obey the backward recursion
$$f_n(x_1, \dots, x_n) = (1-p)f_{n+1}(x_1, \dots, x_n, 0) + pf_{n+1}(x_1, \dots, x_n, 1), $$
for $x_i \in \{0,1\}$.

I'm not sure how to do this. If I understand it correctly, combining the properties $\widetilde{M}_n = f_n(X_1, \dots, X_n)$ and $\widetilde{M}_n \overset{a.s.}{=} \mathbb{E}[\widetilde{M}_{n+1}\mid\mathcal{F}_n]$, and using that $X_{n+1}$ is independent of $\mathcal{F}_n$, we have for any $x_i \in \{0,1\}$ with $1 \leq i \leq n$ that

$$\begin{aligned} f_n(x_1, \dots, x_n) &= \mathbb{E}[f_{n+1}(x_1, \dots, x_n, X_{n+1}) \mid \mathcal{F}_n] = \mathbb{E}[f_{n+1}(x_1, \dots, x_n, X_{n+1})] \\ &= \mathbb{P}(X_{n+1}=0)f_{n+1}(x_1, \dots, x_n, 0) + \mathbb{P}(X_{n+1}=1)f_{n+1}(x_1, \dots, x_n, 1) \\ &= (1-p)f_{n+1}(x_1, \dots, x_n, 0) + pf_{n+1}(x_1, \dots, x_n, 1). \end{aligned}$$
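One way to convince yourself of the recursion is to run it backwards: start from an arbitrary terminal function $f_N$, define $f_{N-1}, \dots, f_0$ by the recursion, and check that the expectations $\mathbb{E}\,f_n(X_1,\dots,X_n)$ all agree, as the tower property demands. A sketch under arbitrary example choices ($p = 0.3$, $N = 4$, a fixed random seed):

```python
import random
from itertools import product

p = 0.3  # example parameter
N = 4    # terminal time

random.seed(0)
# f[n] maps each 0/1 history of length n to the value f_n(x_1,...,x_n).
# Start from arbitrary terminal values f_N and apply the backward recursion.
f = {N: {path: random.random() for path in product([0, 1], repeat=N)}}
for n in range(N - 1, -1, -1):
    f[n] = {path: (1 - p) * f[n + 1][path + (0,)] + p * f[n + 1][path + (1,)]
            for path in product([0, 1], repeat=n)}

def prob(path):
    # P(X_1 = x_1, ..., X_n = x_n) for i.i.d. Bernoulli(p)
    pr = 1.0
    for x in path:
        pr *= p if x == 1 else 1 - p
    return pr

# E[f_n(X_1..X_n)] is the same for every n (namely f_0 = m), as it must
# be for a martingale.
m = f[0][()]
for n in range(N + 1):
    exp_n = sum(prob(path) * f[n][path] for path in product([0, 1], repeat=n))
    assert abs(exp_n - m) < 1e-12
```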


(c) Let $Y_n = f_{n}(X_1, \dots, X_{n-1}, 1) - f_{n}(X_1, \dots, X_{n-1}, 0)$. Show that $\widetilde{M}_n = m + (Y \cdot M)_n$, that is,
$$\widetilde{M}_n = m +\sum_{k=1}^nY_k(X_k -p).$$

I assume that $f_0 = m$ (a constant, which is consistent with $m = \mathbb{E}\widetilde{M}_1$ and the recursion from (b)). Then, telescoping,

$$\tilde{M}_n = f_n = m + \sum_{j=1}^{n}(f_{j}-f_{j-1}).$$
Furthermore,

$$\begin{aligned} f_j(X_1, \dots, X_{j}) - f_{j-1}(X_1, \dots, X_{j-1}) &= f_j(X_1, \dots, X_{j})-(1-p)f_j(X_1, \dots, X_{j-1},0)-pf_j(X_1, \dots, X_{j-1},1) \\ &= [f_j(X_1, \dots, X_{j})-f_{j}(X_1, \dots, X_{j-1},0)]-p[f_j(X_1, \dots, X_{j-1},1)-f_j(X_1, \dots, X_{j-1},0)]. \end{aligned}$$

Here I got stuck again. Is this right so far, and if so, how should I complete it? I am also not entirely sure why we get the equation $f_{j+1}-f_j = a_jX_{j+1}+b_j$.
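For what it's worth, the representation in (c) can be verified pathwise on a concrete example, namely the exponential martingale from part (d) below, $f_n(x_1,\dots,x_n) = (pe+1-p)^{-n}\exp(\sum_i x_i)$, for which $m = \mathbb{E}\widetilde{M}_1 = 1$. A minimal numerical sketch ($p = 0.3$ is an arbitrary choice):

```python
import math
from itertools import product

p = 0.3
c = p * math.e + 1 - p  # E[e^{X_1}] = pe + 1 - p

def f(x):
    # f_n(x_1..x_n) = (pe+1-p)^{-n} exp(x_1 + ... + x_n), with n = len(x)
    return c ** (-len(x)) * math.exp(sum(x))

def Y(x, k):
    # Y_k = f_k(x_1..x_{k-1}, 1) - f_k(x_1..x_{k-1}, 0)
    head = x[:k - 1]
    return f(head + (1,)) - f(head + (0,))

m = f(())  # = 1 = E[M~_1], since E[e^{X_1}] = c

# Check M~_n = m + sum_{k<=n} Y_k (X_k - p) on every 0/1 path of length 4.
for path in product([0, 1], repeat=4):
    rhs = m + sum(Y(path, k) * (path[k - 1] - p) for k in range(1, 5))
    assert abs(f(path) - rhs) < 1e-12
```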


(d) Let $\widetilde{M}_n = (pe+1-p)^{-n}\exp\left(\sum_{i=1}^nX_i\right)$, $n\geq 0$. Show that $\widetilde{M}$ is a martingale and that
$$Y_n = \frac{e-1}{pe+1-p}\widetilde{M}_{n-1}$$

We have that $\sum_{i=1}^n X_i \in \mathcal{F}_n$ for all $n\geq 0$. Since $\exp$ is measurable, $\exp(\sum_{i=1}^nX_i) \in \mathcal{F}_n$, and multiplying by the constant $(pe+1-p)^{-n}$ preserves measurability. To show that $\tilde{M}_n \in \mathcal{L}^1$, we observe that, by independence,

$$\mathbb{E}|\tilde{M}_n| = \mathbb{E}[\tilde{M}_n]=(pe+1-p)^{-n}\mathbb{E}\left[e^{\sum_{i=1}^n X_i}\right] = (pe+1-p)^{-n}\left(\mathbb{E}e^{X_1}\right)^n = 1.$$

It remains to show that $(\tilde{M}_n)$ has the martingale property. A computation shows that

$$\mathbb{E}[\tilde{M}_{n+1}\mid\mathcal{F}_n] = (pe+1-p)^{-n-1}\mathbb{E}[e^{X_{n+1}}e^{\sum_{i=1}^nX_i}\mid\mathcal{F}_n]\overset{(*)}{=} (pe+1-p)^{-n-1}e^{\sum_{i=1}^nX_i}\,\mathbb{E}[e^{X_{n+1}}]=(pe+1-p)^{-n}e^{\sum_{i=1}^nX_i} = \tilde{M}_n \text{ (a.s.)}$$
Why is $(*)$ justified? That is, why can we factor $e^{\sum_{i=1}^nX_i}$ out of the conditional expectation?
The second part was, as stated, just a matter of computation:

$$\frac{e-1}{pe+1-p}\tilde{M}_{n-1} = \frac{e-1}{(pe+1-p)^n}e^{\sum_{i=1}^{n-1}X_i} = (pe+1-p)^{-n}\left(e^{1+\sum_{i=1}^{n-1}X_i} - e^{0+\sum_{i=1}^{n-1}X_i}\right) = f_n(X_1, \dots, X_{n-1}, 1) - f_n(X_1, \dots, X_{n-1},0) = Y_n.$$
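As a sanity check, the identity $Y_n = \frac{e-1}{pe+1-p}\widetilde{M}_{n-1}$ can also be confirmed numerically over all histories (again with the arbitrary example choice $p = 0.3$):

```python
import math
from itertools import product

p = 0.3
c = p * math.e + 1 - p  # E[e^{X_1}] = pe + 1 - p

def f(x):
    # the martingale from (d): (pe+1-p)^{-n} exp(sum x_i), with n = len(x)
    return c ** (-len(x)) * math.exp(sum(x))

# Y_n = f_n(x_1..x_{n-1}, 1) - f_n(x_1..x_{n-1}, 0) should equal
# (e-1)/(pe+1-p) * M~_{n-1} for every history x_1..x_{n-1}.
n = 4
for head in product([0, 1], repeat=n - 1):
    Y_n = f(head + (1,)) - f(head + (0,))
    assert abs(Y_n - (math.e - 1) / c * f(head)) < 1e-12
```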


And again, if this is too much for a single question, please let me know and I will edit it. Keep in mind that I'm not specifically asking for solutions but rather to be pointed in the right direction.

Best Answer

You are doing discrete stochastic integration and a discrete Clark–Ocone (martingale representation) theorem. (a) and (b) are fine.

For (c), write $\tilde M_n = m + \sum_j (f_{j+1} - f_j)$. For fixed $X_1,\dots,X_j$ the summand takes on only two values, according to $X_{j+1} = 1, 0$ respectively. You can therefore match it: $f_{j+1} - f_j = a_j X_{j+1} + b_j$. Solve for $a_j, b_j$ and find them to be what they are supposed to be. More specifically, $f_{j+1}(1) - f_j = a_j + b_j$ and $f_{j+1}(0) - f_j = b_j$, so $a_j = f_{j+1}(1) - f_{j+1}(0)$. Hence
$$f_{j+1} - f_j = \big(f_{j+1}(1) - f_{j+1}(0)\big)X_{j+1} + f_{j+1}(0) - f_j = \big(f_{j+1}(1) - f_{j+1}(0)\big)(X_{j+1} - p) + \text{constant},$$
and I'll leave the rest to you. The constant term vanishes because of the relation you proved in (b).

For (d), just compute. It only depends on the fact that $\mathbb{E}(e^{X_1}) = pe + 1 - p$ (and the i.i.d.-ness of the $X_i$), and then you have a formula for $Y_n$ into which you can plug. As to your question: in $(*)$ the term you took out is $\mathcal{F}_n$-measurable; this is sometimes called "taking out what's known".
