[Math] Prove that $X,Y$ are independent iff the characteristic function of $(X,Y)$ equals the product of the characteristic functions of $X$ and $Y$

characteristic-functions, measure-theory, probability-theory

Let

  • $(\Omega,\mathcal A,\operatorname P)$ be a probability space
  • $X$ and $Y$ be random variables on $(\Omega,\mathcal A,\operatorname P)$ with values in $\mathbb{R}^m$ and $\mathbb{R}^n$, respectively
  • $\varphi_Z$ denote the characteristic function of a random variable $Z$

Claim: $\;$ $X$ and $Y$ are independent iff $$\varphi_{(X,Y)}(s,t)=\varphi_X(s)\varphi_Y(t)\;\;\;\text{for all }s\in\mathbb{R}^m\;\text{and}\;t\in\mathbb{R}^n\tag{1}$$

Proof: $\;$ "$\Rightarrow$":

  • Let $Z:=(X,Y)$ and $u:=(s,t)\in\mathbb{R}^m\times\mathbb{R}^n$
  • $X$ and $Y$ are independent $\Rightarrow$ $e^{i\langle s,\;\cdot\;\rangle}\circ X$ and $e^{i\langle t,\;\cdot\;\rangle}\circ Y$ are independent $\Rightarrow $

\begin{equation}
\begin{split}
\varphi_Z(u)&\stackrel{\text{def}}{=}\operatorname E\left[e^{i\langle u,Z\rangle}\right]\\
&=\operatorname E\left[e^{i\langle s,X\rangle+i\langle t,Y\rangle}\right]\\
&=\operatorname E\left[e^{i\langle s,X\rangle}e^{i\langle t,Y\rangle}\right]\\
&=\operatorname E\left[e^{i\langle s,X\rangle}\right]\operatorname E\left[e^{i\langle t,Y\rangle}\right]\\
&\stackrel{\text{def}}{=}\varphi_X(s)\varphi_Y(t)
\end{split}
\end{equation}
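The factorization above can be checked numerically: for independent samples, the Monte Carlo estimate of $\varphi_{(X,Y)}(s,t)$ should agree with the product of the marginal estimates up to sampling error. A minimal sketch (the specific distributions and the points $s,t$ are illustrative choices, not from the post):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Independent X ~ N(0,1) and Y ~ Exp(1) -- an arbitrary illustrative choice.
X = rng.standard_normal(N)
Y = rng.exponential(1.0, N)

def cf(samples, t):
    """Monte Carlo estimate of the characteristic function E[exp(i t Z)]."""
    return np.mean(np.exp(1j * t * samples))

s, t = 0.7, -1.3
joint = np.mean(np.exp(1j * (s * X + t * Y)))  # estimate of phi_{(X,Y)}(s,t)
prod = cf(X, s) * cf(Y, t)                     # estimate of phi_X(s) * phi_Y(t)

print(abs(joint - prod))  # small, on the order of 1/sqrt(N)
```

The discrepancy shrinks like $1/\sqrt{N}$, consistent with the exact identity $\varphi_{(X,Y)}(s,t)=\varphi_X(s)\varphi_Y(t)$.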

"$\Leftarrow$":

  • Let $\tilde X\sim X$ and $\tilde Y\sim Y$ be independent
  • Since a finite measure on $\mathbb{R}^d$ is uniquely determined by its characteristic function, $$\varphi_X=\varphi_{\tilde X}\;\;\;\text{and}\;\;\;\varphi_Y=\varphi_{\tilde Y}\tag{2}$$
  • Thus, \begin{equation}
    \begin{split}
    \varphi_{(X,Y)}(s,t)&\stackrel{(1)}{=}\varphi_X(s)\varphi_Y(t)\\
    &\stackrel{(2)}{=}\varphi_{\tilde X}(s)\varphi_{\tilde Y}(t)\\
    &=\varphi_{(\tilde X,\tilde Y)}(s,t)
    \end{split}
    \end{equation} by "$\Rightarrow$"
  • Again, since the distribution of $(X,Y)$ is uniquely determined by $\varphi_{(X,Y)}$, we've got $$(X,Y)\sim (\tilde X,\tilde Y)$$
  • In particular, $Z:=(X,Y)$ and $\tilde Z:=(\tilde X,\tilde Y)$ have the same distribution function $F$

Now I'm stuck. From the definition of $F$ and the definition of independence, it seems obvious that we can conclude the independence of $X$ and $Y$. But how do we argue this in detail?

Best Answer

You're saying that the pair $(X,Y)$ has the same distribution as the pair $(\bar X,\bar Y)$, where $\bar X$ and $\bar Y$ are independent, and you want to prove that $X$ and $Y$ are independent. For arbitrary measurable sets $A\subseteq\mathbb{R}^m$ and $B\subseteq\mathbb{R}^n$, \begin{align} & \Pr(X\in A\ \&\ Y\in B) \\[10pt] = {} & \Pr((X,Y)\in A\times B) \\[10pt] = {} & \Pr((\bar X,\bar Y)\in A\times B) & & \text{(since the joint distributions are the same)} \\[10pt] = {} & \Pr(\bar X\in A)\Pr(\bar Y\in B) & & \text{(since $\bar X,\bar Y$ are independent)} \\[10pt] = {} & \Pr(X\in A)\Pr(Y\in B) & & \text{(since $X\sim\bar X$ and $Y\sim\bar Y$)}. \end{align} Hence $X,Y$ are independent.

The part of this that took some work to prove is that the joint distributions are the same, and you seem to have done that part already.
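The chain of equalities above reduces to the product rule on rectangles $A\times B$, which is easy to sanity-check by simulation. A minimal sketch, with the distributions and the sets $A$, $B$ chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Independent samples (illustrative distributions, not from the post).
X = rng.standard_normal(N)
Y = rng.uniform(-1.0, 1.0, N)

# A = (-inf, 0.5], B = [0, 1] -- arbitrary sides of the rectangle A x B.
in_A = X <= 0.5
in_B = (Y >= 0.0) & (Y <= 1.0)

lhs = np.mean(in_A & in_B)           # estimate of Pr(X in A, Y in B)
rhs = np.mean(in_A) * np.mean(in_B)  # estimate of Pr(X in A) * Pr(Y in B)
print(abs(lhs - rhs))                # small, on the order of 1/sqrt(N)
```

Since rectangles generate the product $\sigma$-algebra, agreement on rectangles is exactly what the displayed computation needs.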
