This question was taken from a practice exam in my statistics course.
Given a random sample $X_1, X_2, \ldots, X_n$ from a Poisson distribution with mean $\lambda$, can you show that $\bar{X}$ is consistent for $\lambda$?
We are told to use Tchebysheff's inequality, which states that for any $k > 0$: $Pr(|X-\mu| \geq k\sigma) \le \frac{1}{k^2}$
Also, as far as I know, consistency of an estimator is the property that as the sample size $n$ increases, the estimator should return values closer and closer to the actual quantity we want to estimate.
So the first thing I did was find the variance of $\bar{X}$, using independence and $Var(X_i)=\lambda$:
$Var(\bar{X})=Var\left(\frac{\sum X_i}{n}\right)=\frac{1}{n^2}Var\left(\sum X_i\right)=\frac{n\lambda}{n^2}=\frac{\lambda}{n}$
I notice that as $n \rightarrow \infty $ the variance decreases to $0$, but how does this help me?
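For intuition, here is a quick simulation (illustrative only; the choices $\lambda = 4$ and these sample sizes are arbitrary, not from the exam problem) showing that the empirical variance of $\bar{X}$ tracks $\lambda/n$:

```python
import math
import random

# Illustrative simulation (not part of the exam problem): empirically
# check that Var(X_bar) = lam/n shrinks as n grows. lam = 4 is arbitrary.

def poisson_draw(lam, rng):
    # Knuth's method for a single Poisson(lam) variate
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def empirical_var_of_mean(lam, n, reps, rng):
    # Variance of X_bar estimated over `reps` independent samples of size n
    means = [sum(poisson_draw(lam, rng) for _ in range(n)) / n
             for _ in range(reps)]
    mu = sum(means) / reps
    return sum((m - mu) ** 2 for m in means) / reps

rng = random.Random(1)
lam = 4.0
for n in (10, 100, 1000):
    # empirical variance vs. the theoretical value lam/n
    print(n, round(empirical_var_of_mean(lam, n, 1000, rng), 4), lam / n)
```

Each row shows the empirical variance sitting next to $\lambda/n$, and both shrink toward $0$ as $n$ grows.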
Now I guess we can use Tchebysheff's inequality, where we need $\lim_{n\to\infty} Pr(|\bar{X}-\lambda| \geq \epsilon) = 0$ for every $\epsilon > 0$, and that is where I get stuck…
Any help is appreciated.
Best Answer
Edit: Since it seems the point didn't get across, I'm going to fill in a few more details; it's been a while, so maybe I can venture a little more.
Start with definitions.
step 1: give a definition of consistency
Like this one from Wikipedia's Consistent estimator article: $T_n$ is a consistent estimator of $\theta$ if $T_n$ converges in probability to $\theta$ as $n \to \infty$.
step 2: Note (hopefully!) that it relies on convergence in probability, so give a definition for that in turn (Wikipedia's Convergence of random variables article): for every $\epsilon > 0$, $\Pr(|T_n - \theta| \geq \epsilon) \to 0$ as $n \to \infty$. $\quad (2)$
step 3: Then write Chebyshev's inequality down (Wikipedia's Chebyshev's inequality article): $\Pr(|X-\mu| \geq k\sigma) \le \frac{1}{k^2}. \quad (3)$
step 4: now look at the rather strong similarity between the two expressions in (2) and (3).
Does that not give you a huge clue about a way to approach this?
So let's start.
From Chebyshev, with $\mu = \lambda$ and $\sigma = \sqrt{Var(\bar{X})} = \sqrt{\lambda/n}$:
$Pr(|\bar{X}-\lambda| \geq k\sqrt{\lambda/n}) \le \frac{1}{k^2}$ (you worked out all the parts of this but never actually wrote it down in your question).
Let $\epsilon=k\sqrt{\lambda/n}$. That's the obvious step from the "huge clue" you were supposed to see by comparing the two things I said to write down: one expression had an $\epsilon$ where the other had a $k\sqrt{\lambda/n}$, and if you write them both side by side the substitution is immediate. Solving for $k$ gives $k^2 = \frac{n\epsilon^2}{\lambda}$, so the bound becomes $\frac{1}{k^2} = \frac{\lambda}{n\epsilon^2}$.
So now, the only thing that's really left is to show that for any fixed $\epsilon > 0$, the RHS $\frac{\lambda}{n\epsilon^2}$ goes to $0$ as $n\to\infty$.
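As a numerical sanity check (illustrative only; $\lambda = 4$ and $\epsilon = 0.5$ are arbitrary choices, not from the problem), the simulation below compares the empirical tail probability $\Pr(|\bar{X}-\lambda| \geq \epsilon)$ against the Chebyshev bound $\lambda/(n\epsilon^2)$ that follows from $Var(\bar{X}) = \lambda/n$:

```python
import math
import random

# Illustrative check: with Var(X_bar) = lam/n, Chebyshev gives
#   P(|X_bar - lam| >= eps) <= lam / (n * eps**2),
# and the right-hand side -> 0 as n -> infinity, which is the
# consistency claim. lam = 4 and eps = 0.5 are arbitrary choices.

def poisson_draw(lam, rng):
    # Knuth's method for a single Poisson(lam) variate
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def tail_prob(lam, n, eps, reps, rng):
    # Empirical estimate of P(|X_bar - lam| >= eps) over `reps` samples
    hits = 0
    for _ in range(reps):
        xbar = sum(poisson_draw(lam, rng) for _ in range(n)) / n
        if abs(xbar - lam) >= eps:
            hits += 1
    return hits / reps

rng = random.Random(0)
lam, eps = 4.0, 0.5
for n in (20, 80, 320):
    # empirical tail probability vs. the Chebyshev bound lam/(n*eps^2)
    print(n, tail_prob(lam, n, eps, 1000, rng), lam / (n * eps**2))
```

Both columns shrink with $n$, and the empirical probability stays below the (loose) Chebyshev bound, as the inequality guarantees.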