[Math] Consistent estimator for Poisson distribution

parameter estimation, poisson distribution

I want to prove that for $S_n=\frac1n\sum_{i=1}^n 1_{\{X_i=0\}}$, $\log(1/S_n)$ is a consistent estimator of $\lambda$, where the $X_i$ are i.i.d. with $P(X_i=k)=\lambda^k e^{-\lambda}/k!$, i.e. Poisson distributed. Does anyone have an idea?

I was thinking of using Markov's inequality. I want to show that $P(\log(1/S_n)-\lambda>0)\to0$. Then
$$P(\log(1/S_n)-\lambda>0)=P(\log(1/S_n)>\lambda)=P(1/S_n>e^\lambda)\le\frac{E(1/S_n)}{e^\lambda}$$
by Markov's inequality. However, I can't show that the last term goes to zero.
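As a quick numerical sanity check (not a substitute for a proof), here is a small simulation sketch of the claimed convergence; the choice $\lambda=2$, the sample sizes, the seed, and the variable names are purely illustrative.

```python
import numpy as np

# Sanity check: for Poisson(lam) samples, S_n = fraction of zeros should
# approach e^{-lam}, so log(1/S_n) should approach lam as n grows.
# Note: if S_n happens to be 0 (probability (1 - e^{-lam})^n), the estimator
# is undefined; for lam = 2 and these sample sizes that is vanishingly rare.
rng = np.random.default_rng(0)
lam = 2.0  # true Poisson parameter (illustrative choice)

for n in [100, 1_000, 10_000, 100_000]:
    x = rng.poisson(lam, size=n)
    s_n = np.mean(x == 0)      # S_n = (1/n) * sum of 1{X_i = 0}
    est = np.log(1.0 / s_n)    # the estimator log(1/S_n)
    print(f"n = {n:>7}: S_n = {s_n:.5f}, log(1/S_n) = {est:.4f} (true lambda = {lam})")
```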

Best Answer

  1. The indicators $1_{\{X_i=0\}}$ are i.i.d. with mean $P(X_1=0)=e^{-\lambda}$, so by the (weak) law of large numbers your $S_n$ converges in probability to $e^{-\lambda}$.
  2. Since $\log$ is continuous at $e^{-\lambda}>0$, the continuous mapping theorem gives $\log S_n\to\log e^{-\lambda}=-\lambda$ in probability, hence $\log(1/S_n)=-\log S_n\to\lambda$ in probability, and you're done! (See the spelled-out version below.)
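In other words, writing out the same two steps with the usual $\epsilon$-definition of consistency (a sketch):

$$S_n=\frac1n\sum_{i=1}^n 1_{\{X_i=0\}}\;\xrightarrow{P}\;E\bigl[1_{\{X_1=0\}}\bigr]=P(X_1=0)=e^{-\lambda},$$

and since $x\mapsto\log(1/x)$ is continuous at $x=e^{-\lambda}>0$,

$$\log(1/S_n)\;\xrightarrow{P}\;\log\bigl(1/e^{-\lambda}\bigr)=\lambda,$$

i.e. $P\bigl(|\log(1/S_n)-\lambda|>\epsilon\bigr)\to0$ for every $\epsilon>0$, which is exactly what consistency requires.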