Bayesian – Jeffreys Prior for Normal Distribution with Unknown Mean and Variance

bayesian, jeffreys-prior, normal-distribution, prior

I am reading up on prior distributions, and I calculated the Jeffreys prior for a sample of normally distributed random variables with unknown mean and unknown variance.
According to my calculations, the following holds for the Jeffreys prior:
$$ p(\mu,\sigma^2)=\sqrt{\det I}=\sqrt{\det\begin{pmatrix}1/\sigma^2 & 0 \\ 0 & 1/(2\sigma^4)\end{pmatrix}}=\sqrt{\frac{1}{2\sigma^6}}\propto\frac{1}{\sigma^3}.$$
Here, $I$ is the Fisher information matrix.
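For completeness, here is a quick sketch of where the entries of $I$ come from, using the log-likelihood of a single observation $x \sim N(\mu,\sigma^2)$:
$$ \ell(\mu,\sigma^2) = -\tfrac{1}{2}\log(2\pi\sigma^2) - \frac{(x-\mu)^2}{2\sigma^2}, $$
$$ E\!\left[-\frac{\partial^2 \ell}{\partial \mu^2}\right] = \frac{1}{\sigma^2}, \qquad E\!\left[-\frac{\partial^2 \ell}{\partial \mu\,\partial \sigma^2}\right] = 0, \qquad E\!\left[-\frac{\partial^2 \ell}{\partial (\sigma^2)^2}\right] = \frac{1}{2\sigma^4}. $$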

However, I have also read publications and documents which state

as the Jeffreys prior for the case of a normal distribution with unknown mean and variance.
What is the 'actual' Jeffreys prior?

Best Answer

I think the discrepancy is explained by whether the authors consider the density over $\sigma$ or the density over $\sigma^2$. Supporting this interpretation, the exact thing that Kass and Wasserman write is $$ \pi(\mu, \sigma) = 1 / \sigma^2, $$ while Yang and Berger write $$ \pi(\mu, \sigma^2) = 1 / \sigma^4. $$
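To make the change of variables explicit (a short sketch using only the result from the question): a density stated on the $\sigma^2$ scale is converted to the $\sigma$ scale by multiplying by the Jacobian $\left|d\sigma^2/d\sigma\right| = 2\sigma$, so the questioner's $1/\sigma^3$ and the Kass and Wasserman form $1/\sigma^2$ describe the same prior,
$$ \pi(\mu,\sigma) = \pi(\mu,\sigma^2)\,\left|\frac{d\sigma^2}{d\sigma}\right| \propto \frac{1}{\sigma^3}\cdot 2\sigma \propto \frac{1}{\sigma^2}. $$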