Product of Gaussian PDFs for different data points

bayesian, normal distribution, probability, probability distributions

It is well known that the product of normally distributed random variables is not normal, but the product of Gaussian PDFs does have a Gaussian form, meaning that for $\Phi(\cdot)$ denoting the normal PDF:

$$ \Phi(x|\mu_1, \sigma^2_1) \times \Phi(x|\mu_2, \sigma^2_2) \propto \Phi(x|\mu^*, {\sigma^{*}}^2) ,$$ where $\mu^* = \dfrac{\mu_1 \sigma^2_2 + \mu_2 \sigma^2_1}{\sigma^2_1 +\sigma^2_2}$ (the precision-weighted average of the means) and ${\sigma^{*}}^2 = \dfrac{1}{\dfrac{1}{\sigma^2_1}+ \dfrac{1}{\sigma^2_2}}$.
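This identity is easy to verify numerically; here is a minimal sketch (the parameter values are arbitrary, chosen just for illustration):

```python
# Check that the pointwise product of two Gaussian PDFs is proportional to a
# third Gaussian PDF with precision-weighted mean and harmonic-sum variance.
import numpy as np
from scipy.stats import norm

mu1, s1 = 1.0, 0.5   # mean and std of the first Gaussian (arbitrary)
mu2, s2 = 3.0, 1.5   # mean and std of the second Gaussian (arbitrary)

# Combined parameters: precisions add, means are precision-weighted.
var_star = 1.0 / (1.0 / s1**2 + 1.0 / s2**2)
mu_star = var_star * (mu1 / s1**2 + mu2 / s2**2)

x = np.linspace(-5, 10, 1001)
product = norm.pdf(x, mu1, s1) * norm.pdf(x, mu2, s2)
combined = norm.pdf(x, mu_star, np.sqrt(var_star))

# If the proportionality holds, the ratio is a constant independent of x.
ratio = product / combined
assert np.allclose(ratio, ratio[0])
```

The ratio is the normalizing constant of the product, which is itself a Gaussian density in $\mu_1 - \mu_2$ with variance $\sigma_1^2 + \sigma_2^2$.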

However, this holds because $x$ is the same in every factor. My question is: what happens if each normal has its own $x_j$? That is, for different data points, what is

$$ \prod_{j=1}^{N} \Phi(x_j|\mu_j, \sigma^2_j) $$

My guess is that it will be proportional to a normal as well. Is this also a well-known result? It would be particularly useful for Bayesian modelling, when I have $N$ groups with a random sample from each group and want to compute the posterior distribution.

As a starting point, if $\mu_j=\mu$ and $\sigma_j = \sigma$ for all $j$, then the product becomes:

$$ \prod_{j=1}^{N} \Phi(x_j|\mu, \sigma^2) \propto (\sigma^2)^{-N/2} \exp\left\{-\sum_{j=1}^{N}\dfrac{(x_j-\mu)^2}{2\sigma^2}\right\} \propto (\sigma^2)^{-N/2} \exp\left\{- \dfrac{N(\overline{x}-\mu)^2}{2\sigma^2} - \dfrac{\sum_{j=1}^{N}(x_j - \overline{x})^2}{2\sigma^2} \right\}, $$

which, as a function of $(\mu, \sigma^2)$, is proportional to a Gaussian form as well (I believe this is a normal-inverse-gamma kernel). Is there a result where this holds in general?
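The sum-of-squares decomposition used in the last step can be checked directly; a minimal sketch with arbitrary data:

```python
# Verify the decomposition
#   sum_j (x_j - mu)^2 = N*(xbar - mu)^2 + sum_j (x_j - xbar)^2,
# which follows from expanding (x_j - mu) = (x_j - xbar) + (xbar - mu)
# and noting that the cross term sums to zero.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)   # arbitrary data
mu = 0.7                  # arbitrary value of mu

lhs = np.sum((x - mu) ** 2)
xbar = x.mean()
rhs = len(x) * (xbar - mu) ** 2 + np.sum((x - xbar) ** 2)
assert np.isclose(lhs, rhs)
```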

Best Answer

Let $\boldsymbol{\Sigma}_1,\dots,\boldsymbol{\Sigma}_n \in \Bbb R^{N \times N}$ and $\boldsymbol{\mu}_1,\dots,\boldsymbol{\mu}_n \in \Bbb R^{N}$ denote the covariance matrices and mean vectors of $n$ multivariate normal random variables $\boldsymbol{X}_i$, $1 \le i \le n$. The density of $\boldsymbol{X}_i$ can be written $\phi_N(\boldsymbol{x};\boldsymbol{\mu}_i,\boldsymbol{\Sigma}_i)$. We have the following result: $$\prod_{i=1}^{n}\phi_N(\boldsymbol{x};\boldsymbol{\mu}_i,\boldsymbol{\Sigma}_i) \propto \phi_N(\boldsymbol{x};\boldsymbol{\mu},\boldsymbol{\Sigma}) $$ with $$\boldsymbol{\Sigma}^{-1} =\sum_{i=1}^n\boldsymbol{\Sigma}^{-1}_i$$ $$\boldsymbol{\mu}=\boldsymbol{\Sigma} \left(\sum_{i=1}^n\boldsymbol{\Sigma}^{-1}_i \boldsymbol{\mu}_i\right)$$

In other words, the product of the PDFs is proportional to the PDF of a multivariate normal distribution.
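A numerical sanity check of this formula, with randomly generated means and covariances (all values arbitrary):

```python
# Check that the product of n multivariate normal PDFs, evaluated at the same
# point x, is proportional to a multivariate normal PDF with
#   Sigma^{-1} = sum_i Sigma_i^{-1}   and   mu = Sigma * sum_i Sigma_i^{-1} mu_i.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)
N, n = 3, 4  # dimension N, number of factors n

mus = [rng.normal(size=N) for _ in range(n)]
covs = []
for _ in range(n):
    A = rng.normal(size=(N, N))
    covs.append(A @ A.T + N * np.eye(N))  # random symmetric positive definite

# Combined parameters: precision matrices add, mean is precision-weighted.
prec = sum(np.linalg.inv(S) for S in covs)
Sigma = np.linalg.inv(prec)
mu = Sigma @ sum(np.linalg.inv(S) @ m for S, m in zip(covs, mus))

# If the proportionality holds, product(x) / combined(x) does not depend on x.
def ratio(x):
    prod = np.prod([multivariate_normal.pdf(x, m, S) for m, S in zip(mus, covs)])
    return prod / multivariate_normal.pdf(x, mu, Sigma)

r1 = ratio(rng.normal(size=N))
r2 = ratio(rng.normal(size=N))
assert np.isclose(r1, r2)
```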


Remark: Your first example is the particular case where

  • $\boldsymbol{\Sigma}_i$ is the matrix whose only nonzero element is $s_{ii} = \sigma_i^2$
  • $\boldsymbol{\mu}_i$ is the vector whose only nonzero element is $v_{i} = \mu_i$

It suffices to prove that the quadratic forms agree up to an additive constant not depending on $\boldsymbol{x}$: $$\sum_{i=1}^n (\boldsymbol{x}-\boldsymbol{\mu}_i)^T\boldsymbol{\Sigma}_i^{-1}(\boldsymbol{x}-\boldsymbol{\mu}_i) = (\boldsymbol{x}-\boldsymbol{\mu})^T\boldsymbol{\Sigma}^{-1}(\boldsymbol{x}-\boldsymbol{\mu}) + \text{const.} \tag{1}$$

The LHS of $(1)$ is equal to $$ \begin{align} L &= \sum_{i=1}^n (\boldsymbol{x}-\boldsymbol{\mu}_i)^T\boldsymbol{\Sigma}_i^{-1}(\boldsymbol{x}-\boldsymbol{\mu}_i)\\ &= \sum_{i=1}^n \boldsymbol{x}^T\boldsymbol{\Sigma}_i^{-1}\boldsymbol{x} -2\sum_{i=1}^n \boldsymbol{x}^T\boldsymbol{\Sigma}_i^{-1}\boldsymbol{\mu}_i +\sum_{i=1}^n \boldsymbol{\mu}_i^T\boldsymbol{\Sigma}_i^{-1}\boldsymbol{\mu}_i\\ &= \boldsymbol{x}^T\left(\sum_{i=1}^n \boldsymbol{\Sigma}_i^{-1}\right)\boldsymbol{x} -2\boldsymbol{x}^T\left(\sum_{i=1}^n\boldsymbol{\Sigma}_i^{-1}\boldsymbol{\mu}_i\right) +\sum_{i=1}^n \boldsymbol{\mu}_i^T\boldsymbol{\Sigma}_i^{-1}\boldsymbol{\mu}_i \tag{2}\\ \end{align} $$ while the RHS of $(1)$ is equal to $$R = \boldsymbol{x}^T \boldsymbol{\Sigma}^{-1}\boldsymbol{x} -2\boldsymbol{x}^T\boldsymbol{\Sigma}^{-1}\boldsymbol{\mu} +\boldsymbol{\mu}^T\boldsymbol{\Sigma}^{-1}\boldsymbol{\mu} \tag{3}$$

From $(2)$ and $(3)$, matching the quadratic and linear terms in $\boldsymbol{x}$ gives $$\boldsymbol{\Sigma}^{-1} =\sum_{i=1}^n\boldsymbol{\Sigma}^{-1}_i$$ $$\boldsymbol{\Sigma}^{-1}\boldsymbol{\mu}=\sum_{i=1}^n\boldsymbol{\Sigma}^{-1}_i \boldsymbol{\mu}_i$$ Q.E.D.