Solved – concatenating two normal random variables

distributions, normal distribution, probability

In linear measurement with noise, we denote the measurement as
$$y = Ax+v$$
where $x$ is the quantity we want to estimate, $y$ is the measurement, $A$ characterizes the sensor, and $v$ is sensor noise. We assume that
$$x \sim \mathcal{N}(\bar{x}, \Sigma_x), v \sim \mathcal{N}(\bar{v}, \Sigma_v)$$
and that $x$ and $v$ are independent.
The proof then stacks $x$ and $v$ into a single vector:
$$\begin{bmatrix} x \\ v \end{bmatrix} \sim \mathcal{N} \left( \begin{bmatrix} \bar{x} \\ \bar{v} \end{bmatrix}, \begin{bmatrix} \Sigma_x & 0 \\ 0 & \Sigma_v \end{bmatrix} \right)$$

How do we know that combining the two in such a way results in a normal distribution? What exactly is this operation? Any link to a proof would be appreciated.
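For what it's worth, the first two moments of the stacked vector can be checked numerically. The sketch below is only illustrative (it assumes numpy and scipy are available, and the dimensions and parameter values are made up); it verifies the stacked mean and block-diagonal covariance empirically, but it does not explain why the joint distribution is exactly normal.

```python
# Empirical sanity check of stacking two independent Gaussians.
# (Hypothetical dimensions and parameter values, chosen for illustration.)
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(0)

# Example parameters: x and v are each 2-dimensional.
x_bar, Sigma_x = np.array([1.0, -1.0]), np.array([[2.0, 0.3], [0.3, 1.0]])
v_bar, Sigma_v = np.array([0.0, 0.5]),  np.array([[0.5, 0.0], [0.0, 0.2]])

# Draw independent samples of x and v, then stack them into u = [x; v].
x = rng.multivariate_normal(x_bar, Sigma_x, size=200_000)
v = rng.multivariate_normal(v_bar, Sigma_v, size=200_000)
u = np.hstack([x, v])

# The empirical mean and covariance of u should approach the stacked mean
# [x_bar; v_bar] and the block-diagonal covariance diag(Sigma_x, Sigma_v).
print(u.mean(axis=0))                       # ~ [1, -1, 0, 0.5]
print(np.cov(u, rowvar=False).round(2))     # ~ block_diag(Sigma_x, Sigma_v)
print(block_diag(Sigma_x, Sigma_v))
```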

Best Answer

Here is the general idea: by the independence of $x$ and $v$, we can write $p(x,v)=p(x)p(v)$. Both $x$ and $v$ are multivariate normal (assume for simplicity that each is $n$-dimensional), so we can write:

$p(x)p(v) = \dfrac{1}{\sqrt{(2\pi)^n|\Sigma_x|}}e^{-1/2(x-\bar{x})'\Sigma_x^{-1}(x-\bar{x})} \dfrac{1}{\sqrt{(2\pi)^n|\Sigma_v|}}e^{-1/2(v-\bar{v})'\Sigma_v^{-1}(v-\bar{v})}$

Combining the normalizing constants and the exponents, this product becomes:

$= \dfrac{1}{\sqrt{(2\pi)^{2n}|\Sigma_x||\Sigma_v|}}e^{-\frac{1}{2}(u-\bar{u})'\begin{bmatrix}\Sigma_x^{-1} & 0\\ 0 & \Sigma_v^{-1} \end{bmatrix}(u-\bar{u})}$, where $u = \begin{bmatrix} x \\ v \end{bmatrix}$ and $\bar{u} = \begin{bmatrix} \bar{x} \\ \bar{v} \end{bmatrix}$.

Note that we can now define $\Sigma_u = \begin{bmatrix}\Sigma_x & 0\\ 0 & \Sigma_v \end{bmatrix}$, a $\textit{block diagonal}$ matrix. Its determinant is $|\Sigma_u| = |\Sigma_x||\Sigma_v|$ and its inverse is $\begin{bmatrix}\Sigma_x^{-1} & 0\\ 0 & \Sigma_v^{-1} \end{bmatrix}$, which are exactly what appear in the expression above. Hence $p(x,v)=p(u)$ is the Gaussian density $\mathcal{N}(\bar{u}, \Sigma_u)$.
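As a sanity check, this factorization can also be verified numerically. The following is a minimal sketch (assuming numpy and scipy are available; the example dimensions and parameter values are arbitrary) that compares $p(x)\,p(v)$ with the joint Gaussian density $\mathcal{N}(\bar{u}, \Sigma_u)$ at a few test points.

```python
# Numerical check that p(x) * p(v) equals the joint density N(u_bar, Sigma_u)
# with block-diagonal covariance. Example parameters are made up.
import numpy as np
from scipy.linalg import block_diag
from scipy.stats import multivariate_normal

x_bar, Sigma_x = np.array([1.0, -1.0]), np.array([[2.0, 0.3], [0.3, 1.0]])
v_bar, Sigma_v = np.array([0.0, 0.5]),  np.array([[0.5, 0.0], [0.0, 0.2]])

# Stacked mean and block-diagonal covariance.
u_bar   = np.concatenate([x_bar, v_bar])
Sigma_u = block_diag(Sigma_x, Sigma_v)

p_x = multivariate_normal(x_bar, Sigma_x)
p_v = multivariate_normal(v_bar, Sigma_v)
p_u = multivariate_normal(u_bar, Sigma_u)

# Evaluate both sides at a few arbitrary test points u = [x, v].
rng = np.random.default_rng(1)
pts = rng.normal(size=(5, 4))
lhs = p_x.pdf(pts[:, :2]) * p_v.pdf(pts[:, 2:])   # p(x) p(v)
rhs = p_u.pdf(pts)                                # joint Gaussian density
print(np.allclose(lhs, rhs))                      # True
```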
