Finding the log-likelihood of Poisson and normal models with a log link function

statistics

I have these two models,
$$
\begin{split}
Y_i &∼ \mathrm{Pois}(λ_i)\\
\log(λ_i) &= β_0 + β_1x_i
\end{split}
$$

and
$$
\begin{split}
Y_i &∼ N(µ_i, σ^2)\\
\log(µ_i) &= γ_0 + γ_1x_i
\end{split},
$$

and I would like to find the log-likelihood of both of these models.
I think the Poisson log-likelihood is usually written as
$$
l(\lambda_i;y_i)=-\sum \lambda_i-\sum \log(y_i!)+\sum y_i\log(\lambda_i)
$$

Would I then just substitute in $\log(λ_i)=β_0+β_1x_i$ or is this incorrect?
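
If so, writing $\lambda_i = e^{\beta_0+\beta_1 x_i}$, I believe the log-likelihood would become
$$
l(\beta_0,\beta_1;y_i)=\sum y_i(\beta_0+\beta_1 x_i)-\sum e^{\beta_0+\beta_1 x_i}-\sum \log(y_i!),
$$
but I am not sure this is the right way to do it.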

Similarly, for the normal distribution
$$
l(\mu_i;y_i)= -\frac{n}{2}\log(2\pi)-\frac{n}{2}\log(\sigma^2)-\frac{1}{2\sigma^2}\sum (y_i-\mu_i)^2
$$

do I also just substitute in $\log(µ_i) = γ_0 + γ_1x_i$?
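
That is, I think it would read
$$
l(\gamma_0,\gamma_1,\sigma^2;y_i)= -\frac{n}{2}\log(2\pi)-\frac{n}{2}\log(\sigma^2)-\frac{1}{2\sigma^2}\sum \big(y_i-e^{\gamma_0+\gamma_1 x_i}\big)^2,
$$
but again I am not certain that this substitution is all that is needed.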

Best Answer

I'll do this for Poisson: The log-likelihood function is

$$ l(\lambda_i;x_1 \ldots x_N ; \beta_0,\beta_1)=\sum_{ i=1 }^N \log \bigg( p(Y_i \vert x_1 \ldots x_N,\beta_0,\beta_1)\,p(x_1\ldots x_N,\beta_0,\beta_1) \bigg). $$

Now, most probably your data $x_1 \ldots x_N$ are non-stochastic, hence

$$ l(\lambda_i;x_1 \ldots x_N ; \beta_0,\beta_1)=\sum_{ i=1 }^N \log \bigg( p(Y_i \vert x_1 \ldots x_N,\beta_0,\beta_1)\,p(\beta_0,\beta_1) \bigg). \tag{1} $$

The question is this: are your $\beta$'s deterministic, like the $x_i$'s, or are they random? If they are deterministic, then you can write

$$ l(\lambda_i;x_1 \ldots x_N ; \beta_0,\beta_1)=\sum_{ i=1 }^N \log p(Y_i \vert x_1 \ldots x_N,\beta_0,\beta_1) = \sum_{i=1}^N \big[ y_i\log(\lambda_i)-\lambda_i-\log(y_i!) \big], \tag{2} $$

where you can just replace $\log(\lambda_i)$ by $\beta_0 + \beta_1 x_i$ and you're done.

On the other hand, if they are stochastic, then according to equation $(1)$ you have to multiply what you have in equation $(2)$ by the joint distribution $p(\beta_0,\beta_1)$ (or, after taking the $\log$, add $\log p(\beta_0,\beta_1)$).
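
For what it's worth, here is a minimal numerical sketch of the deterministic case in equation $(2)$, evaluating both log-likelihoods directly. The function names (`poisson_loglik`, `normal_loglik`) and the toy data are purely illustrative, not part of any particular library:

```python
import numpy as np
from scipy.special import gammaln  # log(y!) = gammaln(y + 1)

def poisson_loglik(beta0, beta1, x, y):
    """Poisson log-likelihood with log link: log(lambda_i) = beta0 + beta1 * x_i,
    treating beta0, beta1 and the x_i as fixed (non-stochastic)."""
    log_lam = beta0 + beta1 * x
    lam = np.exp(log_lam)
    return np.sum(y * log_lam - lam - gammaln(y + 1))

def normal_loglik(gamma0, gamma1, sigma2, x, y):
    """Normal log-likelihood with log link: log(mu_i) = gamma0 + gamma1 * x_i."""
    mu = np.exp(gamma0 + gamma1 * x)
    n = len(y)
    return (-0.5 * n * np.log(2 * np.pi)
            - 0.5 * n * np.log(sigma2)
            - 0.5 * np.sum((y - mu) ** 2) / sigma2)

# Toy data, only to show the calls
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=50)
y = rng.poisson(np.exp(0.3 + 0.8 * x))

print(poisson_loglik(0.3, 0.8, x, y))
print(normal_loglik(0.3, 0.8, 1.0, x, y))
```

Maximizing these functions over their parameters (for instance with `scipy.optimize.minimize` on the negative log-likelihood) should recover the maximum-likelihood estimates.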