[Math] Lévy’s theorem for characteristic functions.

characteristic-functions, measure-theory, probability, probability-theory, self-learning

The statement of Lévy's theorem for characteristic functions is the following:

Let $\{X_n\}_{n=1}^{\infty}$ be a sequence of random variables whose characteristic functions $\{\varphi_n\}_{n=1}^{\infty}$ converge pointwise, i.e. $\varphi_n(t)\to\varphi(t)$ for each $t\in\mathbb{R}$, where $\varphi$ is continuous at $0$. Then there exists a random variable $X$ with characteristic function $\varphi$, and $X_n\stackrel{L}{\rightarrow} X$.

I was told to prove that $\{P_{X_n}:n\geq1\}$ is a tight family of probability measures, and then use the following criterion of weak convergence (which is a simple consequence of Prokhorov's theorem):
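One standard route to tightness (a sketch of the usual estimate, which the post does not spell out) is the truncation inequality relating tail probabilities to the behavior of the characteristic function near $0$: for any $\delta>0$,

```latex
% Truncation inequality (standard; the integral is real since
% \varphi_n(-t) = \overline{\varphi_n(t)}):
\[
P\!\left(|X_n|\geq \frac{2}{\delta}\right)
\;\leq\;
\frac{1}{\delta}\int_{-\delta}^{\delta}\bigl(1-\varphi_n(t)\bigr)\,dt .
\]
```

Since $|1-\varphi_n|\leq 2$ and $\varphi_n\to\varphi$ pointwise, dominated convergence lets $n\to\infty$ on the right-hand side; continuity of $\varphi$ at $0$ together with $\varphi(0)=\lim_n\varphi_n(0)=1$ then makes the limit arbitrarily small for small $\delta$, which is exactly tightness of $\{P_{X_n}\}$.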

If $\{P_n\}_{n=1}^{\infty}$ is a tight sequence of probability measures, and every weakly convergent subsequence converges to the same probability measure $\mu$, then $P_n\stackrel{w}{\rightarrow}\mu$.
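For completeness, the usual argument for this criterion (a sketch): if $P_n\not\stackrel{w}{\rightarrow}\mu$, there are a bounded continuous $f$, an $\varepsilon>0$, and a subsequence along which

```latex
% The subsequence stays a fixed distance from the candidate limit:
\[
\left|\int f\,dP_{n_k}-\int f\,d\mu\right|\;\geq\;\varepsilon
\qquad\text{for all }k .
\]
```

By Prokhorov's theorem, tightness yields a further subsequence $\{P_{n_{k_j}}\}$ converging weakly to some probability measure, which by hypothesis must be $\mu$; this contradicts the displayed inequality.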

If $\{P_{X_{n_{k}}}\}_{k=1}^{\infty}$ is a subsequence that converges weakly to $\mu$, then, since $x\mapsto e^{itx}$ is bounded and continuous, $\varphi_{n_k}(t)\rightarrow\varphi_\mu(t)$ for every $t$; but also $\varphi_{n_k}(t)\rightarrow\varphi(t)$, so $\varphi_\mu=\varphi$. Hence every weakly convergent subsequence has the same limit, and by the criterion, $P_{X_n}\stackrel{w}{\rightarrow} \mu$.

My question is: how do I know that, on my fixed underlying probability space $(\Omega,\mathcal{F},P)$, there exists a random variable $X$ such that $P_X=\mu$? By Kolmogorov's theorem I know that we can construct *some* probability space carrying a random variable with a given law, but what happens if the probability space is fixed in advance?
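One standard resolution (a sketch, under the assumption that the fixed space carries a uniformly distributed random variable; this assumption is not made in the post): if $U:\Omega\to(0,1)$ is uniform, the quantile transform produces a random variable with law $\mu$:

```latex
% Quantile (generalized inverse) construction of a random variable
% with distribution function F_\mu on a space carrying a uniform U:
\[
X(\omega) := F_\mu^{-1}\bigl(U(\omega)\bigr),
\qquad
F_\mu^{-1}(u) := \inf\{x\in\mathbb{R}: F_\mu(x)\geq u\},
\]
% using F_\mu^{-1}(u) \le x \iff u \le F_\mu(x):
\[
P(X\leq x) \;=\; P\bigl(U\leq F_\mu(x)\bigr) \;=\; F_\mu(x).
\]
```

If the fixed space is too poor to support such a $U$ (e.g. a finite $\Omega$), no such $X$ need exist on it; but convergence in law involves only the distributions, so the limit $X$ may live on a different space.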

Best Answer

The existence of the random variable $X$ having the same distribution as $\mu$ is established in the answer by Anthony Quas.
