[Math] The characteristic function of a multivariate normally distributed random variable

Tags: normal-distribution, probability-distributions

The characteristic function of a random variable $X$ is defined as $\hat{X}(\theta)=\mathbb{E}(e^{i\theta X})$. If $X$ is a normally distributed random variable with mean $\mu$ and standard deviation $\sigma>0$, then its characteristic function can be found as follows:

$$\hat{X}(\theta)=\mathbb{E}(e^{i\theta X})
=\int_{-\infty}^{\infty}\frac{e^{i\theta x-\frac{(x-\mu)^2}{2\sigma^2}}}{\sigma\sqrt{2\pi}}dx=\ldots=e^{i\mu\theta-\frac{\sigma^2\theta^2}{2}}$$

(to be honest, I have no idea what to put instead of the "$\ldots$"; I've looked here, but that's only for the standard case. Anyway, this is not really my question, even if it is interesting and might be relevant)
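(Edit: for completeness, the usual way to fill in the "$\ldots$" seems to be the substitution $x=\mu+\sigma u$ followed by completing the square; a sketch, taking for granted that the shifted Gaussian integral is still $1$:)

$$\int_{-\infty}^{\infty}\frac{e^{i\theta x-\frac{(x-\mu)^2}{2\sigma^2}}}{\sigma\sqrt{2\pi}}dx
= e^{i\mu\theta}\int_{-\infty}^{\infty}\frac{e^{i\sigma\theta u-\frac{u^2}{2}}}{\sqrt{2\pi}}du
= e^{i\mu\theta-\frac{\sigma^2\theta^2}{2}}\int_{-\infty}^{\infty}\frac{e^{-\frac{(u-i\sigma\theta)^2}{2}}}{\sqrt{2\pi}}du
= e^{i\mu\theta-\frac{\sigma^2\theta^2}{2}},$$

where the last integral equals $1$ (by a contour-shift argument, or by reducing to the standard case).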

Now, if I got it right, a Gaussian random vector $X$ (of dimension $n$) is a vector of the form $X=AY+M$, where $A$ is any real $n\times n$ matrix, $Y$ is a vector of size $n$ whose coordinates are independent standard normally distributed random variables, and $M$ is some (constant) vector of size $n$.

I am trying to find the characteristic function of such an $X$. The generalization of the formula for characteristic functions to higher dimensions is straightforward:

$$\hat{X}(\theta)=\mathbb{E}(e^{i\langle\theta,X\rangle}),$$ where $\langle\cdot,\cdot\rangle$ denotes the inner product. So I can start with the following:

$$\hat{X}(\theta) = \mathbb{E}(e^{i\langle\theta,X\rangle})
= \mathbb{E}(e^{i\langle\theta,AY\rangle}\cdot e^{i\langle\theta,M\rangle})\\
=e^{i\langle\theta,M\rangle}\cdot \mathbb{E}(e^{i\langle\theta,AY\rangle}) $$

And I'm left with the expectation of a complex exponential of a linear combination of random variables. That probably means the covariance matrix of some random variables should be involved, but this is at the boundary of my knowledge of probability.

Best Answer

You wouldn't want to use the bracket notation for the inner product when you're essentially dealing with matrices. Instead, write $\mathbb{E}\left[e^{i\theta^{T}X}\right] = \mathbb{E}\left[e^{i\theta^{T}\left(AY+M\right)}\right] = e^{i\theta^{T}M}\mathbb{E}\left[e^{i\theta^{T}AY}\right]$. You're then left with computing the characteristic function of a multivariate Gaussian distribution:
$$ \begin{align*}X &\sim \mathcal{N}\left(\mu, \Sigma\right)\\ \mathbb{E}\left[e^{is^{T}X}\right] &= \exp \left\{i\mu^{T}s - \frac{1}{2}s^{T}\Sigma s \right\}. \end{align*} $$

Just find the mean vector and the covariance matrix of $AY$. Gaussian vectors have the affine property: they remain Gaussian under affine transformations, and are completely determined by their mean vector and covariance matrix. If $Y \sim \mathcal{N}\left(\mu_{Y}, \Sigma_{Y}\right)$, then $$ \begin{align*} \mathbb{E}\left[AY\right] &= A\mu_{Y} \\ \operatorname{Var}\left[AY\right] &= A\Sigma_{Y} A^{T} . \end{align*} $$
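As a quick numerical sanity check of the affine property, here is a sketch with made-up $\mu_Y$, $\Sigma_Y$, and $A$ (all values are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: Y ~ N(mu_Y, Sigma_Y) in dimension n = 3.
n = 3
mu_Y = np.array([1.0, -2.0, 0.5])
L = rng.normal(size=(n, n))
Sigma_Y = L @ L.T                 # a random positive-definite covariance
A = rng.normal(size=(n, n))       # an arbitrary linear map

# Draw many samples of Y and apply the map A.
Y = rng.multivariate_normal(mu_Y, Sigma_Y, size=1_000_000)
AY = Y @ A.T

# Empirical mean and covariance of AY vs. the closed forms A mu_Y and A Sigma_Y A^T.
print(np.allclose(AY.mean(axis=0), A @ mu_Y, atol=0.05))
print(np.allclose(np.cov(AY.T), A @ Sigma_Y @ A.T, atol=0.3))
```

Both checks pass up to Monte Carlo error, which is consistent with $AY \sim \mathcal{N}(A\mu_Y,\, A\Sigma_Y A^T)$.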

Using the relationship between $X$ and $Y$, $$ \begin{align*} AY &= X-M \\ \mathbb{E}\left[AY\right] &= \mu_{X} - M \\\operatorname{Var}\left[AY\right] &= \Sigma_{X}\\ \mathbb{E}\left[e^{i\theta^{T}AY}\right] &= \exp \left\{i\left(\mu_{X}-M\right)^{T}\theta - \frac{1}{2}\theta^{T}\Sigma_{X} \theta \right\} . \end{align*} $$ This is as far as I can get with the information you gave.
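To tie this together, here is a hedged Monte Carlo check of the resulting formula for the setup in the question, with arbitrary illustrative choices of $A$, $M$, and $\theta$. Since $Y$ is standard normal, $\mu_X = M$ and $\Sigma_X = AA^T$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary illustrative example: X = A Y + M with Y standard normal in R^2.
A = np.array([[2.0, 0.0], [1.0, 1.0]])
M = np.array([0.5, -1.0])
theta = np.array([0.3, -0.7])

Y = rng.standard_normal(size=(500_000, 2))
X = Y @ A.T + M

# Empirical characteristic function E[exp(i theta^T X)] ...
emp = np.mean(np.exp(1j * (X @ theta)))

# ... versus the closed form exp(i M^T theta - theta^T (A A^T) theta / 2),
# i.e. the formula above with mu_X = M and Sigma_X = A A^T.
Sigma_X = A @ A.T
closed = np.exp(1j * (M @ theta) - 0.5 * theta @ Sigma_X @ theta)

print(abs(emp - closed) < 0.01)   # agreement up to Monte Carlo error
```

The empirical and closed-form values agree to within sampling error, matching $\hat{X}(\theta)=e^{i M^T\theta-\frac12\theta^T AA^T\theta}$.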
