You have asked a number of questions here, but I will only answer the first (the one in the title). The others you should separate into different questions, or work through on your own once you understand this answer. To follow this answer, you need to know what a link function is. Let's say that you have a process where $x$ is related to a function of $\theta$,
$$x \sim f(\theta) $$
The link function is simply the function $f$ that you use to link the parameters $\theta$ to the data $x$. The most common link function is the linear link,
$$ x = \beta\theta + \epsilon$$
This link is extremely versatile, but there are situations where it does not work well. If you use an inappropriate link function, your estimate of the parameter $\beta$ won't represent the real-world situation (your estimate will be biased). The most common alternative is the binary case, where $x$ can take only one of two values (0 and 1, for instance). In this situation, it makes sense to model the probability that $x = 1$, using instead the binomial logit link $\Lambda$,
$$ P(x|\beta \theta) = \Lambda(\beta\theta) = {e^{\beta\theta}\over1 + e^{\beta\theta}}$$
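As a quick numerical sketch (in Python, with made-up values of $\beta\theta$), the logit link maps any real-valued linear predictor into a valid probability in $(0, 1)$:

```python
import math

def logistic(z):
    """Logistic (inverse-logit) link: maps any real z to a probability in (0, 1)."""
    return math.exp(z) / (1.0 + math.exp(z))

# The link returns 0.5 at z = 0 and is symmetric around it
print(logistic(0.0))   # 0.5
print(logistic(2.0))   # ~0.88
print(logistic(-2.0))  # ~0.12
```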
If $x$ can only take small, non-negative integer values (counts), it makes sense to use the Poisson link. In this case, the probability that $x$ takes the value $i$ is
$$ P(x = i \mid \beta \theta) = \dfrac{e^{-\beta\theta}\,(\beta\theta)^i}{i!}$$
Say $x$ is the number of vehicles that a household owns, and $\theta$ are different socioeconomic variables. Some households own zero vehicles, some own one or two, but the probability of a household owning more than five or six is very low. This may be a good example of a Poisson process.
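A numerical sketch of those Poisson probabilities (using an illustrative rate $\beta\theta = 1.2$, a made-up value): most of the mass sits on 0, 1, and 2, and counts above five are vanishingly rare, just as in the vehicle example.

```python
import math

def poisson_pmf(i, lam):
    """P(x = i) for a Poisson distribution with rate lam."""
    return math.exp(-lam) * lam**i / math.factorial(i)

lam = 1.2  # illustrative rate, e.g. expected vehicles per household
for i in range(7):
    print(i, round(poisson_pmf(i, lam), 4))
```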
But there are other processes that might look like a Poisson process, except that the number of zeros is much higher than a Poisson distribution allows. For example, suppose we are modeling the number of times a person visited a doctor in a year. For a large number of people, this count will be zero. The process is zero-inflated, and you should use a zero-inflated link function. If you use a basic Poisson link, your estimate of $\beta$ will be biased.
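One common zero-inflated formulation (a sketch, not the only possible parameterization) mixes a point mass at zero, occurring with probability $\pi$, into an ordinary Poisson:

```python
import math

def poisson_pmf(i, lam):
    return math.exp(-lam) * lam**i / math.factorial(i)

def zip_pmf(i, lam, pi):
    """Zero-inflated Poisson: with probability pi the count is a 'structural'
    zero, otherwise it is drawn from a Poisson(lam). Parameters illustrative."""
    if i == 0:
        return pi + (1 - pi) * poisson_pmf(0, lam)
    return (1 - pi) * poisson_pmf(i, lam)

# With pi = 0.4, P(x = 0) is far larger than the plain Poisson allows
print(zip_pmf(0, lam=1.5, pi=0.4))  # inflated zero probability
print(poisson_pmf(0, 1.5))          # plain Poisson zero probability
```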
There are also Poisson-like processes where zeros are intuitively impossible, like the number of languages spoken by able humans. A few people speak many, many speak two or three, and everyone speaks at least one. In this case, you need to remove the zeros from your link distribution by using a zero-truncated link function. If you don't do this, your estimate of $\beta$ will again be biased (and most software won't even run if it sees zeros in the data). Although the mathematical functions for these two modified Poisson links look somewhat similar, they accomplish entirely opposite purposes.
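The zero-truncated counterpart does the opposite of zero inflation: it removes the zero outcome entirely and rescales the remaining probabilities so they still sum to one (a sketch with an illustrative rate):

```python
import math

def poisson_pmf(i, lam):
    return math.exp(-lam) * lam**i / math.factorial(i)

def ztp_pmf(i, lam):
    """Zero-truncated Poisson: P(x = i | x > 0).
    The zero cell is dropped and the rest is rescaled by 1 / (1 - P(0))."""
    if i == 0:
        return 0.0
    return poisson_pmf(i, lam) / (1.0 - math.exp(-lam))

print(ztp_pmf(0, 1.5))  # 0.0 -- zeros are impossible by construction
print(ztp_pmf(1, 1.5))  # larger than the untruncated P(x = 1)
```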
If you have a process where the zeros are hyper-inflated (or hyper-deflated), you could combine the binary link and the zero-truncated Poisson link by using a hurdle model. One process models the probability of the outcome being positive, and another models the probability of each discrete outcome above zero. I am right now finishing a paper where we used a hurdle model to predict how many times people failed their vehicle emissions tests; 95% of people passed the first time, but others came back four or five times.
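A hurdle model glues those two pieces together: a Bernoulli part decides whether the count clears zero, and a zero-truncated Poisson handles the positive counts. A sketch with made-up parameters, loosely in the spirit of the emissions example (95% of counts are zero):

```python
import math

def poisson_pmf(i, lam):
    return math.exp(-lam) * lam**i / math.factorial(i)

def hurdle_pmf(i, lam, p_pos):
    """Hurdle model: P(x = 0) = 1 - p_pos; positive counts follow a
    zero-truncated Poisson, scaled by p_pos. Parameters illustrative."""
    if i == 0:
        return 1.0 - p_pos
    return p_pos * poisson_pmf(i, lam) / (1.0 - math.exp(-lam))

# e.g. 5% of people fail at least once; failures cluster around lam
print(hurdle_pmf(0, lam=1.8, p_pos=0.05))  # 0.95
print(hurdle_pmf(1, lam=1.8, p_pos=0.05))
```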
I figured out the answer, with help from Martyn Plummer. My code uses the inverse link for the gamma model (and no inverse of the predictors). Also, this code requires the 'glm' module for JAGS.
model{
  # for the "ones trick"
  C <- 10000

  # for every observation
  for(i in 1:N){
    # logistic regression model, where w is the probability of occurrence;
    # logit() applies the logistic transformation exp(z) / (1 + exp(z)) to the
    # linear predictor zeta
    logit(w[i]) <- zeta[i]
    zeta[i] <- gamma0 + gamma1*MPD[i] + gamma2*MTD[i] + gamma3*int[i] + gamma4*MPD[i]*int[i] + gamma5*MTD[i]*int[i]

    # gamma regression model for the mean; use the inverse link (mu = 1 / eta)
    mu[i] <- pow(eta[i], -1)
    eta[i] <- beta0 + beta1*MPD[i] + beta2*MTD[i] + beta3*int[i] + beta4*MPD[i]*int[i] + beta5*MTD[i]*int[i]

    # re-express the mean and sd of the continuous part as the shape and rate parameters
    shape[i] <- pow(mu[i], 2) / pow(sd, 2)
    rate[i] <- mu[i] / pow(sd, 2)

    # for readability, define the log-likelihood of the gamma here
    logGamma[i] <- log(dgamma(y[i], shape[i], rate[i]))

    # total likelihood: (1 - w) when y < 0.0001 (z = 0), and w * gamma likelihood
    # when y >= 0.0001 (z = 1). The indicator (1 - z) is 1 exactly when z = 0, so
    # each observation contributes exactly one of the two terms.
    logLik[i] <- (1 - z[i]) * log(1 - w[i]) + z[i] * ( log(w[i]) + logGamma[i] )
    Lik[i] <- exp(logLik[i])

    # use the ones trick
    p[i] <- Lik[i] / C
    ones[i] ~ dbern(p[i])
  }

  # PRIORS
  beta0 ~ dnorm(0, 0.0001)
  beta1 ~ dnorm(0, 0.0001)
  beta2 ~ dnorm(0, 0.0001)
  beta3 ~ dnorm(0, 0.0001)
  beta4 ~ dnorm(0, 0.0001)
  beta5 ~ dnorm(0, 0.0001)
  gamma0 ~ dnorm(0, 0.0001)
  gamma1 ~ dnorm(0, 0.0001)
  gamma2 ~ dnorm(0, 0.0001)
  gamma3 ~ dnorm(0, 0.0001)
  gamma4 ~ dnorm(0, 0.0001)
  gamma5 ~ dnorm(0, 0.0001)
  sd ~ dgamma(2, 2)
}
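For intuition, the per-observation likelihood that the "ones trick" encodes above can be written out directly. This is a Python sketch of the same hurdle-gamma density, with purely illustrative parameter values; `w` is the occurrence probability and `mu`/`sd` parameterize the gamma part, as in the JAGS model:

```python
import math

def gamma_logpdf(y, shape, rate):
    """Log density of a Gamma(shape, rate) distribution."""
    return (shape * math.log(rate) + (shape - 1) * math.log(y)
            - rate * y - math.lgamma(shape))

def hurdle_gamma_loglik(y, z, w, mu, sd):
    """Mirrors logLik[i] in the JAGS model: (1 - w) mass at zero,
    w * Gamma(shape, rate) density for positive y."""
    shape = mu**2 / sd**2
    rate = mu / sd**2
    if z == 0:                      # y is effectively zero
        return math.log(1 - w)
    return math.log(w) + gamma_logpdf(y, shape, rate)

# Illustrative values only
print(hurdle_gamma_loglik(y=2.5, z=1, w=0.7, mu=3.0, sd=1.5))
print(hurdle_gamma_loglik(y=0.0, z=0, w=0.7, mu=3.0, sd=1.5))
```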
Best Answer
Here is a simple solution using the fact that the Poisson will give you zeros when the lambda parameter is zero. Note, however, that JAGS tends to break if lambda is exactly zero; hence the "+ 0.00001".
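The idea can be sketched in Python (the 0.00001 offset matches the answer above; the rate value is illustrative): multiply the rate by a binary indicator, so the Poisson collapses to a point mass at zero whenever the indicator is zero, while the tiny offset keeps the rate strictly positive.

```python
import math

def poisson_pmf(i, lam):
    return math.exp(-lam) * lam**i / math.factorial(i)

def zi_poisson_pmf(i, z, lam):
    """When z = 0 the effective rate is ~0, so the Poisson puts (almost) all
    of its mass on zero; the small offset avoids an exact-zero rate."""
    eff = z * lam + 0.00001
    return poisson_pmf(i, eff)

print(zi_poisson_pmf(0, z=0, lam=2.0))  # ~1.0: a forced zero
print(zi_poisson_pmf(0, z=1, lam=2.0))  # ~exp(-2): an ordinary Poisson zero
```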