Solved – Bayesian MCMC Metropolis-Hastings with uniform prior

bayesian, markov-chain-montecarlo, prior

In Bayesian inference, the following relationship holds between the posterior $P(\theta|X)$, the likelihood $P(X|\theta)$, and the prior $P(\theta)$:

$$P(\theta|X) \propto P(X|\theta)P(\theta)$$

If the prior $P(\theta)$ is an improper uniform prior on $(-\infty,\infty)$, does the Bayesian relationship simplify to:

$$P(\theta|X) \propto P(X|\theta)$$

If code is being written for the case of a uniform prior, is it correct that the prior probability does not need to be coded?

In the case that $P(\theta)$ is uniform on $[L,U]$, can it be written as (excuse my poor notation):

$$P(\theta|X) \propto P(X|\theta) \text{ for } L \leq \theta \leq U \text{, otherwise } 0$$

Again, does the prior probability term not need to be considered/coded?

Best Answer

Indeed, in the case of a uniform $p(\theta)$: $$ p(\theta|x) \propto p(x|\theta) $$ since $p(\theta)=\alpha>0$ for all $\theta$ is absorbed into the constant of proportionality (which must not include quantities depending on $\theta$). In other words, as you stated, the posterior can be evaluated up to a constant term as $p(x|\theta)$.
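To make the coding implication concrete, here is a minimal sketch (not from the original post) of random-walk Metropolis–Hastings with an improper flat prior: the prior never appears in the code, and the acceptance ratio uses only the likelihood. The Gaussian model, step size, and function names (`log_likelihood`, `metropolis_hastings`) are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions): random-walk Metropolis-Hastings for a
# single parameter theta with an improper flat prior on (-inf, inf).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=50)   # simulated data

def log_likelihood(theta, x):
    # log p(x | theta) for a Normal(theta, 1) model, up to an additive constant
    return -0.5 * np.sum((x - theta) ** 2)

def metropolis_hastings(x, n_iter=5000, step=0.5, theta0=0.0):
    theta = theta0
    samples = np.empty(n_iter)
    for i in range(n_iter):
        proposal = theta + step * rng.normal()
        # With a flat prior (and a symmetric proposal), the prior cancels in the
        # acceptance ratio, so only the likelihood ratio is needed.
        log_alpha = log_likelihood(proposal, x) - log_likelihood(theta, x)
        if np.log(rng.uniform()) < log_alpha:
            theta = proposal
        samples[i] = theta
    return samples

samples = metropolis_hastings(x)
print(samples[1000:].mean())  # should be close to the sample mean of x
```

Working on the log scale avoids numerical underflow when the likelihood is a product over many observations.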

When the uniform prior is bounded on a fixed interval, we indeed get: $$ p(\theta|x) \propto p(x|\theta) \text{ for } \theta \in [L,U] \text{, and } 0 \text{ otherwise} $$ for the same reason as before.
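For the bounded case, a sketch (again illustrative, reusing `log_likelihood` from the example above) shows that the Uniform$(L,U)$ prior enters only as an indicator on the support: the constant density $1/(U-L)$ cancels in the acceptance ratio, and proposals outside $[L,U]$ are always rejected.

```python
# Minimal sketch (illustrative): with a Uniform(L, U) prior the only change to the
# sampler above is an indicator term on the support.
import numpy as np

L, U = 0.0, 5.0

def log_prior(theta):
    # log of the Uniform(L, U) density, up to the additive constant -log(U - L)
    return 0.0 if L <= theta <= U else -np.inf

def log_posterior(theta, x):
    # Unnormalised log posterior: support indicator plus the log-likelihood
    lp = log_prior(theta)
    if not np.isfinite(lp):
        return -np.inf            # proposal outside [L, U] is always rejected
    return lp + log_likelihood(theta, x)  # log_likelihood as defined above
```

In the sampler above, `log_posterior` would simply replace `log_likelihood` in the acceptance ratio.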
