Apply Martingale Convergence Theorem to Standard Gaussian Random Variables

Tags: martingales, normal-distribution, random-variables, sobolev-spaces, stochastic-processes

I am fairly new to stochastic processes, so my question might be a little fundamental.

Let $(f_n)$ be an orthonormal basis of $H_0^1(D)$, where $D$ is an open, simply connected, proper subset of $\mathbb{C}$. Further, let $X_n$ be i.i.d. standard Gaussian random variables.

How do I show that for any $f\in H^1_0(D)$ and $\displaystyle h_N = \sum_{n=1}^NX_nf_n$, the series
$$
(h_N, f)_\nabla :=\sum_{n=1}^NX_n (f_n, f)_\nabla, \text{ where }(f, g)_\nabla = \int_D(\nabla f)\cdot (\nabla g)
$$

converges almost surely and in $L^2(\mathbb{P})$ as $N \to \infty$?

The lecture notes (page 14) use the martingale convergence theorem, but I don't quite see what the martingale is here that the theorem should be applied to.

Best Answer

Let $(a_n)_{n \in \mathbb{N}}$ be a sequence of real numbers, and let $(X_n)_{n \in \mathbb{N}}$ be independent random variables with $E[X_n]=0$ and $E[|X_n|^2]=\sigma^2<\infty$ for all $n$; in particular $E[|X_n|]<\infty$ for all $n$. Define $S_n:=\sum_{1\leq \ell \leq n}a_\ell X_\ell$. Then $(S_n)$ is a martingale with respect to the filtration $\mathscr{F}_n:=\sigma(X_k,\,k\leq n)$: indeed $E[|S_n|]\leq \sum_{1\leq \ell \leq n}|a_\ell|\,E[|X_\ell|]<\infty$ and $E[S_n-S_{n-1}\mid\mathscr{F}_{n-1}]=a_nE[X_n]=0$.

We also have $E[S_n]=0$ for all $n$, so by independence
$$E[|S_n|^2]=V[S_n]=\sum_{1\leq \ell \leq n}a_\ell^2\,V[X_\ell]=\sigma^2\sum_{1\leq \ell \leq n}a_\ell^2, \qquad M:=\sup_{n \in \mathbb{N}}E[|S_n|^2]=\sigma^2\sum_{\ell\geq 1}a_\ell^2\in[0,\infty].$$

If $M<\infty$, the de la Vallée Poussin criterion shows that the family $(S_n)_{n \in \mathbb{N}}$ is uniformly integrable: take the convex, nonnegative, increasing function $\Phi(x)=x^2$ on $[0,\infty)$, which satisfies $\Phi(x)/x \to \infty$ and $\sup_n E[\Phi(|S_n|)]=M<\infty$. By the (sub)martingale convergence theorem, a uniformly integrable martingale converges almost surely (and in $L^1$) to a random variable $S\in L^1$; since the martingale is in fact bounded in $L^2$, it also converges in $L^2(\mathbb{P})$.

In your setting, take $a_n=(f_n,f)_\nabla$ and $\sigma^2=1$ (standard Gaussians). Since $(f_n)$ is an orthonormal basis of $H_0^1(D)$, Parseval's identity gives $\sum_n a_n^2=\|f\|_\nabla^2<\infty$, so $M<\infty$ and the series $(h_N,f)_\nabla$ converges almost surely and in $L^2(\mathbb{P})$.
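For intuition only (this is not part of the argument above), here is a minimal numerical sketch. It simulates $S_N=\sum_{n\leq N}a_nX_n$ with i.i.d. standard Gaussians and the hypothetical square-summable sequence $a_n=1/n$, standing in for the coefficients $(f_n,f)_\nabla$: individual sample paths settle down as $N$ grows (almost-sure convergence), and the empirical variance of $S_N$ stays close to $\sum_n a_n^2$ ($L^2$ boundedness).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical square-summable coefficients a_n = 1/n, standing in for (f_n, f)_nabla;
# any sequence with sum(a_n^2) < infinity would do.
N = 2000
a = 1.0 / np.arange(1, N + 1)

# A few independent sample paths of S_N = sum_{n<=N} a_n X_n with X_n ~ N(0, 1).
X = rng.standard_normal((5, N))
S = np.cumsum(a * X, axis=1)  # partial sums along each path

# Almost-sure convergence: the tail of each path barely moves.
print("one path, partial sums at N = 500, 1000, 2000:", S[0, [499, 999, 1999]])

# L^2 boundedness: Var(S_N) = sum_{n<=N} a_n^2, uniformly bounded by pi^2/6.
paths = rng.standard_normal((5000, N)) * a  # 5000 independent paths (broadcasting a)
S_N = paths.sum(axis=1)
print("empirical Var(S_N):", S_N.var(), " theoretical:", a.dot(a))
```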
