Multivariate normal random variables

gaussian, normal distribution, probability, probability distributions, probability theory

Prove that $X$ is a multivariate normal random vector, i.e. $X\sim N(\mu,\Sigma)$, if and only if $\langle X,x\rangle$ has a normal distribution for each $x\in\mathbb{R}^n$.

Hint: use the Cramér–Wold theorem.

My idea for the $(\Leftarrow)$ part is to use the characteristic function, but I do not know how to formalize my reasoning correctly. For the opposite direction I have no idea how to use the hint.

Can someone help me?

Edit:

Definition of a standard normal random vector: a random vector $X$ has the standard normal distribution in $\mathbb{R}^n$ if it has the density
\begin{equation}
f(x)=\frac{1}{(2\pi)^{n/2}}e^{-\frac{\|x\|_2^2}{2}}
\end{equation}

We write $X\sim N(0,I_n)$.

Definition: a random vector $X$ has a general normal distribution in $\mathbb{R}^n$, and we write $X\sim N(\mu,\Sigma)$ for $\mu\in\mathbb{R}^n$ and an $n\times n$ positive definite matrix $\Sigma$ (so that $\Sigma^{-1/2}$ is well defined), if and only if $Z=\Sigma^{-1/2}(X-\mu)\sim N(0,I_n)$.
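For reference, this definition recovers the usual density formula: since $\Sigma$ is positive definite, a short change-of-variables sketch with $x=\mu+\Sigma^{1/2}z$ applied to the standard normal density above gives
\begin{equation}
f_X(x)=f_Z\bigl(\Sigma^{-1/2}(x-\mu)\bigr)\,\bigl|\det\Sigma^{-1/2}\bigr|=\frac{1}{(2\pi)^{n/2}\sqrt{\det\Sigma}}\,e^{-\frac{1}{2}(x-\mu)^*\Sigma^{-1}(x-\mu)}.
\end{equation}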

Best Answer

Let's first consider the implication $(\Rightarrow)$. Note that if $X \sim N(0,I_n)$, then $X=(X_1,\dots,X_n)^*$ where the $X_i$ are independent $N(0,1)$ variables for $i=1,\dots,n$. In particular, for any $x \in \mathbb{R}^n$ we get $$\langle X , x \rangle = x^*X= \sum_{i=1}^n x_iX_i \sim N\Bigl(0,\sum_{i=1}^{n} x_i^2\Bigr)=N(0,\|x\|^2)$$ by properties of the $N(0,1)$ distribution. For general $X \sim N(\mu,\Sigma)$ note that \begin{align*}x^*X-x^*\mu &= x^*(X-\mu) \\ &= x^* \Sigma^{1/2} \Sigma^{-1/2} (X-\mu) \\ &=(\Sigma^{1/2}x)^* \Sigma^{-1/2}(X-\mu) \\ &= \langle Z,\Sigma^{1/2}x\rangle \\ &\sim N(0,\|\Sigma^{1/2}x\|^2) \\ &= N(0,x^*\Sigma x), \end{align*} where we used the result just established for $Z = \Sigma^{-1/2}(X-\mu) \sim N(0,I_n)$. This gives us that $x^* X \sim N(x^*\mu , x^*\Sigma x)$, which concludes the proof of the first implication.
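The claim $\sum_{i=1}^n x_iX_i \sim N(0,\|x\|^2)$ used above is standard; if it is not available as a black box, here is a short sketch via characteristic functions, using the independence of the $X_i$ and $\mathbb{E}[e^{itX_i}]=e^{-t^2/2}$:
$$\mathbb{E}\Bigl[e^{it\sum_{i=1}^n x_iX_i}\Bigr]=\prod_{i=1}^n \mathbb{E}\bigl[e^{itx_iX_i}\bigr]=\prod_{i=1}^n e^{-\frac{t^2x_i^2}{2}}=e^{-\frac{t^2\|x\|^2}{2}},\qquad t\in\mathbb{R},$$
which is the characteristic function of $N(0,\|x\|^2)$.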

Now let's show $(\Leftarrow)$. Let $X$ be a random vector with the property that $\langle X , x \rangle$ has a normal distribution for all $x \in \mathbb{R}^n$, and let $Y \sim N(\mu,\Sigma)$ with $\mu = \mathbb{E}[X]$ and $\Sigma=\operatorname{Cov}(X)$ (these exist, since every coordinate $X_i=\langle X,e_i\rangle$ is normal and hence has finite second moments). Because a one-dimensional normal distribution is determined by its mean and variance, and $\mathbb{E}[\langle X,x\rangle]=x^*\mu$ and $\operatorname{Var}(\langle X,x\rangle)=x^*\Sigma x$, the first implication applied to $Y$ gives for all $x \in \mathbb{R}^n$ that $$ \langle X,x\rangle \sim \langle Y,x \rangle \sim N(x^* \mu , x^*\Sigma x)$$ and thus that $$\mathbb{E}[e^{i\langle X , x\rangle}] = \mathbb{E}[e^{i \langle Y,x \rangle}]$$
for all $x \in \mathbb{R}^n$, which means that $X$ and $Y$ have the same characteristic function and thus the same distribution. This is exactly what we wanted to prove.
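To spell out the last step, which is exactly where the Cramér–Wold device enters: writing $\varphi_W(t)=\mathbb{E}[e^{itW}]$ for the characteristic function of a real random variable $W$, the equality in distribution $\langle X,x\rangle \sim \langle Y,x\rangle$ gives
$$\mathbb{E}\bigl[e^{i\langle X,x\rangle}\bigr]=\varphi_{\langle X,x\rangle}(1)=\varphi_{\langle Y,x\rangle}(1)=\mathbb{E}\bigl[e^{i\langle Y,x\rangle}\bigr]$$
for every $x \in \mathbb{R}^n$, so the multivariate characteristic functions of $X$ and $Y$ agree everywhere, and the uniqueness theorem for characteristic functions yields that $X$ and $Y$ have the same distribution, i.e. $X\sim N(\mu,\Sigma)$.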
