Finding the maximum likelihood estimate for a Poisson distribution

estimation, maximum likelihood, poisson distribution, probability distributions

I was given a dataset $\{x_1, x_2, …, x_N \}$ of size $N$, and I need to derive the maximum likelihood estimate for

$a)$ The parameter of a Poisson distribution, whose density is
$$f(x;\lambda)=\begin{cases} e^{-\lambda}\dfrac{\lambda^x}{x!}, & x\ge 0,\\ 0, & x<0,\end{cases}$$

assuming each $x_i\ge0$.

My attempt:

The $N$ observations are independent, so the likelihood function is
$$L(\lambda;x_1,\dots,x_N)=\prod_{j=1}^{N} f(x_j;\lambda)=\prod_{j=1}^{N} e^{-\lambda}\frac{\lambda^{x_j}}{x_j!}.$$

Now, the log likelihood function is

$$\log \bigg(\prod_{j=1}^{N} e^{-\lambda}\cdot\frac{\lambda^{x_j}}{x_j!}\bigg),$$
which, since the log of a product is the sum of the logs, simplifies to

$$l(\lambda;x_1,\dots,x_N)=\sum_{j=1}^N\Big[-\lambda-\log_e(x_j!)+x_j\log_e\lambda\Big].$$
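As a quick numerical sanity check, the log of the product form and the simplified sum form should agree for any data and any $\lambda$. Here is a minimal sketch (the sample `x` and the value of `lam` are made-up values, chosen only for illustration):

```python
import math

# Hypothetical sample and parameter value, for illustration only
x = [2, 0, 3, 1, 4]
lam = 2.5

# Log of the product form: log(prod_j e^{-lam} * lam^{x_j} / x_j!)
log_of_product = math.log(
    math.prod(math.exp(-lam) * lam**xj / math.factorial(xj) for xj in x)
)

# Simplified sum form: sum_j [-lam - log(x_j!) + x_j * log(lam)]
sum_form = sum(
    -lam - math.log(math.factorial(xj)) + xj * math.log(lam) for xj in x
)

print(abs(log_of_product - sum_form) < 1e-9)  # the two forms agree
```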

The maximum likelihood estimate is the solution of the following maximisation problem:

$$\hat\lambda=\arg\max_{\lambda>0}\, l(\lambda;x_1,\dots,x_N).$$

I'm stuck here. Can anyone explain how to solve this?

Best Answer

Use derivatives. As a function of $\lambda$, you want to find a maximum of the function $$l(\lambda) = \sum_{j=1}^N\Big[-\lambda-\log_e(x_j!)+x_j\log_e\lambda\Big].$$ Consider the $x_j$'s to be constants.

If you use calculus, a maximum (if it exists) occurs at a point of zero derivative. The equation $$\frac{\text{d}l}{\text{d}\lambda} = \sum_{j=1}^N \Big(-1 + \frac{x_j}{\lambda}\Big) = 0$$ has the unique solution

$$ \hat\lambda = \frac{1}{N}\sum_{j=1}^N x_j. $$ Since $\frac{\text{d}^2 l}{\text{d}\lambda^2} = -\sum_{j=1}^N \frac{x_j}{\lambda^2} \le 0$, this stationary point is indeed a maximum. Not surprisingly, the estimate is the mean of the numbers $x_j$.
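To see this numerically, here is a short sketch (again with a made-up sample, just for illustration) that compares the sample mean against a brute-force grid search over $\lambda$:

```python
import math

def log_likelihood(lam, xs):
    """Poisson log-likelihood l(lam) = sum_j [-lam - log(x_j!) + x_j * log(lam)]."""
    return sum(-lam - math.log(math.factorial(xj)) + xj * math.log(lam) for xj in xs)

x = [2, 0, 3, 1, 4]    # hypothetical sample
mle = sum(x) / len(x)  # sample mean, the closed-form MLE

# Grid search over lambda in (0, 10]: no grid point should beat the sample mean
grid = [0.01 * k for k in range(1, 1001)]
best = max(grid, key=lambda lam: log_likelihood(lam, x))

print(mle, best)  # best should land (up to grid resolution) on the sample mean
```

Because the log-likelihood is concave in $\lambda$, the grid maximizer falls on the grid point nearest the sample mean.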