Solved – MCMC to handle flat likelihood issues

likelihood, markov-chain-montecarlo, posterior

I have a rather flat likelihood, which makes my Metropolis-Hastings sampler move through the parameter space very irregularly, i.e. no convergence can be achieved no matter how I tune the parameters of the proposal distribution (in my case a Gaussian). My model is not complex – just 2 parameters – but it seems that MH cannot handle this task. So, is there any trick around this problem? Is there a sampler that would not produce Markov chains wandering far into the posterior tails?

Update of the problem:
I will try to reformulate my question and give more details. First of all, I will describe the model.
I have a graphical model with two nodes. Each node is governed by an auto-Poisson model (Besag, 1974) as follows:
$$p\left ( X_{j} \mid X_{k}=x_{k},\,\forall k\neq j,\,\Theta \right )\sim \text{Poisson}\left ( e^{\theta _{j}+\sum _{k\neq j}\theta _{kj}x_{k}} \right )$$
Or, since there are just two nodes and assuming equal global intensities:
$$p\left ( X_{1} \mid X_{2}=x_{2},\,\theta, \alpha \right )\sim \text{Poisson}\left ( e^{\theta+\alpha x_{2}} \right )$$
$$p\left ( X_{2} \mid X_{1}=x_{1},\,\theta, \alpha \right )\sim \text{Poisson}\left ( e^{\theta+\alpha x_{1}} \right )$$
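For concreteness, here is a minimal sketch (a hypothetical Python helper, assuming fixed $\theta$ and $\alpha$) of a Gibbs sweep over these two conditionals; note that Besag (1974) requires $\alpha\leq 0$ for the auto-Poisson joint to be well defined, which is worth keeping in mind given the impropriety concern below:

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sweeps(theta, alpha, n_sweeps=500, x_init=(0, 0)):
    """Gibbs-sample (x1, x2) from the two auto-Poisson conditionals.

    Caution: Besag (1974) requires alpha <= 0 for the joint to be
    summable; for alpha > 0 the sweeps can diverge.
    """
    x1, x2 = x_init
    for _ in range(n_sweeps):
        x1 = rng.poisson(np.exp(theta + alpha * x2))  # X1 | X2 = x2
        x2 = rng.poisson(np.exp(theta + alpha * x1))  # X2 | X1 = x1
    return x1, x2
```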

Since it is a Markov random field, the joint distribution (the likelihood of a realization $ X=[x_{1},x_{2}] $) is as follows:
$$ p\left ( X \right )=\frac{\exp\left ( \theta \left ( x_{1}+x_{2} \right )+\alpha x_{1}x_{2} \right )}{x_{1}!\,x_{2}!\;Z\left ( \theta, \alpha \right )}=\frac{\exp\left ( E\left ( \theta, \alpha, X \right ) \right )}{Z\left ( \theta, \alpha \right )}, $$
with $E\left ( \theta, \alpha, X \right )=\theta\left ( x_{1}+x_{2} \right )+\alpha x_{1}x_{2}-\log\left ( x_{1}!\,x_{2}! \right )$; the single $\alpha x_{1}x_{2}$ interaction term and the factorials are what make this joint consistent with the Poisson conditionals above.
Since I assumed flat priors for $\alpha$ and $\theta$, the posterior is then proportional to
$$\pi(\theta, \alpha \mid X)\propto \frac{\exp\left ( E\left ( \theta, \alpha, X \right ) \right )}{Z\left ( \theta, \alpha \right )}$$
Since $Z(\theta, \alpha)$ is in general very hard to evaluate (it involves a huge number of summations), I am using the auxiliary-variable method of Møller et al. (2006). Following this method, at each iteration I draw a proposal $({\theta}', {\alpha}')$ from a Gaussian distribution, draw an auxiliary sample ${X}'$ from the model at the proposed parameters with a Gibbs sampler (since the conditionals are just Poisson distributions), and then compute the corresponding acceptance ratio $H({X}',{\alpha}',{\theta}' \mid X, \alpha, \theta)$, in which $Z(\theta, \alpha)$ cancels.
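To make the update concrete, here is a minimal sketch of one such auxiliary-variable MH iteration. It uses the closely related exchange-algorithm form of the acceptance ratio (Murray et al., 2006) rather than Møller's exact construction, assumes flat priors and a symmetric Gaussian proposal (so all $Z$ and proposal terms cancel), and reuses the hypothetical `gibbs_sweeps` helper sketched above:

```python
from scipy.special import gammaln

def log_E(theta, alpha, x):
    """E(theta, alpha, X) from above: unnormalised log joint, without log Z."""
    x1, x2 = x
    return (theta * (x1 + x2) + alpha * x1 * x2
            - gammaln(x1 + 1) - gammaln(x2 + 1))  # gammaln(n + 1) = log n!

def aux_mh_step(theta, alpha, x_obs, prop_sd=0.1):
    """One exchange-type MH update of (theta, alpha) given data x_obs."""
    theta_p = theta + rng.normal(0.0, prop_sd)  # symmetric Gaussian proposal
    alpha_p = alpha + rng.normal(0.0, prop_sd)
    x_aux = gibbs_sweeps(theta_p, alpha_p)      # auxiliary data at the proposal
    # Log acceptance ratio: the intractable Z(theta, alpha) terms cancel.
    log_H = (log_E(theta_p, alpha_p, x_obs) - log_E(theta, alpha, x_obs)
             + log_E(theta, alpha, x_aux) - log_E(theta_p, alpha_p, x_aux))
    if np.log(rng.uniform()) < log_H:
        return theta_p, alpha_p
    return theta, alpha
```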
And here I get a wild Markov chain. When I impose boundaries within which the chain is allowed to move, the sampler seems to converge to some distribution, but once I move even one boundary, the resulting distribution moves as well and always shows truncation at the boundaries.
I think that @Xi'an is right – the posterior might be improper.

Best Answer

I find it surprising that a flat likelihood produces convergence issues: it is usually the opposite case that causes problems! The usual first check in such situations is to make sure that your posterior is proper: if it is not, that would explain the endless excursions into the "tails". If the posterior is indeed proper, you could use fatter-tailed proposals, like a Cauchy distribution, and an adaptive algorithm à la Roberts and Rosenthal.
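As an illustration of the fatter-tailed suggestion, swapping the Gaussian random-walk increments for Cauchy ones is a one-line change (the proposal stays symmetric, so the acceptance ratio is untouched); a minimal sketch, reusing the hypothetical names above:

```python
def cauchy_proposal(theta, alpha, scale=0.1):
    """Random-walk proposal with Cauchy increments: the heavy tails allow
    occasional long jumps across flat regions of the posterior, while the
    symmetry keeps the Metropolis-Hastings acceptance ratio unchanged."""
    return (theta + scale * rng.standard_cauchy(),
            alpha + scale * rng.standard_cauchy())
```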

If this still "does not work", I suggest considering a reparameterisation of the model, using for instance (i.e. if there is no other natural parametrisation) a logistic transform, $$ \varphi(x) = \exp(x)/\{1+\exp(x)\} $$ (with a possible scale parameter), which brings the parameter pair into the unit square.
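Sampling in the transformed space requires the Jacobian of $\theta=\varphi^{-1}(u)=\log\{u/(1-u)\}$, namely $|\mathrm{d}\theta/\mathrm{d}u|=1/\{u(1-u)\}$. A minimal sketch of the bookkeeping, assuming a hypothetical `log_post` evaluator of the original log posterior (with an intractable $Z$ this correction would instead be folded into the auxiliary-variable ratio above):

```python
def logit(u):
    """Inverse of the logistic transform phi."""
    return np.log(u) - np.log1p(-u)

def log_post_unit(u_theta, u_alpha, x_obs):
    """Log posterior on the unit square: the original log posterior at
    (logit(u_theta), logit(u_alpha)) plus a log-Jacobian term
    -log(u) - log(1 - u) for each coordinate."""
    theta, alpha = logit(u_theta), logit(u_alpha)
    log_jac = (-np.log(u_theta) - np.log1p(-u_theta)
               - np.log(u_alpha) - np.log1p(-u_alpha))
    return log_post(theta, alpha, x_obs) + log_jac  # log_post is hypothetical
```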

Regarding the earlier answers, Gibbs sampling sounds like a more promising solution than accept-reject, which requires finding a bound and scaling a t distribution so that it dominates the posterior, something that does not seem feasible here, or than the more robust Metropolis-Hastings sampler...
