Show that the estimator $\hat\lambda_n=\frac{1}{n+a}\sum_{i=1}^n X_i$ is consistent for all $a>0$


Let $X_1,X_2,\ldots$ be independent Poisson random variables with parameter $\lambda>0$. Show that for any $a>0$ the sequence $(\hat\lambda_n)_{n \in \mathbb N}$ is consistent, where $$\hat\lambda_n=\frac{1}{n+a}\sum_{i=1}^n X_i.$$

Question: How do I show this estimator is consistent? Am I on the right track with my approach?

My attempt: I am guessing I am supposed to check if $\hat \lambda_n$ is a consistent estimator for the parameter $\lambda$. First I tried to calculate the expected value:

$$\begin{equation*}\begin{split}E[\hat\lambda_n] &= E\left[\frac{1}{n+a}\sum_{i=1}^n X_i
\right]
\\ &=\frac{1}{n+a} \sum_{i=1}^n E[X_i]
\\ &=\frac{n}{n+a} \lambda\end{split}\end{equation*}
$$

(using that $E[X_i]=\lambda$ for each $i$).

So for $n\to\infty$ we have: $$\lim_{n \to \infty} E[\hat \lambda_n]=\lim_{n \to \infty}\frac{n}{n+a} \lambda= \lambda$$
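(As a quick sanity check of this expectation, here is a short simulation; the values $\lambda=3$, $a=2$ and $n=50$ are arbitrary choices for illustration, not part of the problem.)

```python
import numpy as np

# Monte Carlo sanity check of E[lambda_hat_n] = n/(n+a) * lambda.
# lam, a, n are arbitrary illustration values, not part of the problem.
rng = np.random.default_rng(0)
lam, a, n, trials = 3.0, 2.0, 50, 200_000

samples = rng.poisson(lam, size=(trials, n))   # `trials` independent datasets
lam_hat = samples.sum(axis=1) / (n + a)        # the estimator on each dataset

print("empirical mean :", lam_hat.mean())
print("n/(n+a)*lambda :", n / (n + a) * lam)   # = 50/52 * 3, about 2.8846
```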

Therefore this estimator is asymptotically unbiased. I wanted to somehow use this fact to show that it is consistent. In order for the estimator to be consistent, the following must hold for every $\varepsilon>0$:

$$\lim_{n \to \infty}\Pr(\vert \hat \lambda_n-\lambda \vert \ge \varepsilon)=0$$

I was thinking of using Chebyshev's inequality:

$$\Pr(|X-\mu |\geq \varepsilon )\leq \frac{\sigma ^{2}}{\varepsilon ^{2}} \implies \Pr(\vert \hat{\lambda}_n-\lambda\vert \ge \varepsilon ) \le \frac{\text{Var}(\hat{\lambda}_n)}{\varepsilon^2}$$

However, the mean of $\hat \lambda_n$ only equals $\lambda$ in the limit $n\to \infty$ (what I showed above), so I am not sure if I can use Chebyshev's inequality here in this form. Also, I wasn't sure how to get the variance of $\hat \lambda_n$. Maybe someone can help me out with this part.

Best Answer

I'll give you another solution.

Def: An estimator $T_{n}=g_{n}(X_{1},...,X_{n})$ is consistent for the parameter $\theta$ if $\forall \varepsilon>0:$ $\lim_{n\rightarrow+\infty}\mathbb{P}(|T_{n}-\theta|<\varepsilon)=1$, or equivalently: $\lim_{n\rightarrow+\infty}\mathbb{P}(|T_{n}-\theta|\geq\varepsilon)=0$.

Theorem: Consider an estimator $T_{n}$ that is asymptotically unbiased for the parameter $\theta$ and has finite variance for every $n$. If $\lim_{n\rightarrow+\infty}Var(T_{n})=0$, then the estimator $T_{n}$ is consistent.

In fact, using Chebyshev's inequality (more precisely, Markov's inequality applied to $(T_{n}-\theta)^{2}$): $$0\leq\mathbb{P}(|T_{n}-\theta|\geq\varepsilon)\leq\frac{E[(T_{n}-\theta)^2]}{\varepsilon^2}=\frac{Var(T_{n})+(E[T_{n}]-\theta)^2}{\varepsilon^2}\xrightarrow[n\rightarrow+\infty]{}0,$$ since the estimator is asymptotically unbiased and $\lim_{n\rightarrow+\infty}Var(T_{n})=0$.
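The middle equality is the standard bias-variance decomposition of the mean squared error; it follows by expanding the square around $E[T_{n}]$, where the cross term vanishes because $E[T_{n}-E[T_{n}]]=0$:

$$E[(T_{n}-\theta)^2]=E[(T_{n}-E[T_{n}])^2]+2(E[T_{n}]-\theta)E[T_{n}-E[T_{n}]]+(E[T_{n}]-\theta)^2=Var(T_{n})+(E[T_{n}]-\theta)^2.$$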

In your case you've already proved that the estimator is asymptotically unbiased. Let's compute the variance, bearing in mind that the $X_{i}$ are independent and identically distributed with $Var(X_{i})=\lambda$ (the variance of a Poisson random variable equals its parameter): $$Var(\hat{\lambda}_{n})=\frac{1}{(n+a)^2}\cdot\sum_{i=1}^{n}Var(X_{i})=\frac{n\lambda}{(n+a)^2}\xrightarrow[n\rightarrow+\infty]{}0. $$

By the theorem above, we can conclude that the estimator is consistent.
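As an empirical illustration (not part of the proof), here is a minimal Monte Carlo sketch that estimates $\mathbb{P}(|\hat{\lambda}_{n}-\lambda|\geq\varepsilon)$ for growing $n$; the values $\lambda=2$, $a=5$ and $\varepsilon=0.1$ are arbitrary choices. The estimated probability should shrink toward $0$, as the proof predicts.

```python
import numpy as np

# Monte Carlo estimate of P(|lambda_hat_n - lambda| >= eps) for growing n.
# Uses the fact that a sum of n iid Poisson(lam) variables is Poisson(n*lam),
# so each trial needs only one draw. lam, a, eps are arbitrary illustration values.
rng = np.random.default_rng(1)
lam, a, eps, trials = 2.0, 5.0, 0.1, 200_000

for n in (10, 100, 1_000, 10_000):
    lam_hat = rng.poisson(n * lam, size=trials) / (n + a)  # hat{lambda}_n per trial
    p = np.mean(np.abs(lam_hat - lam) >= eps)              # empirical tail probability
    print(f"n={n:6d}: P(|lam_hat - lam| >= {eps}) ~ {p:.4f}")
```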