Consistent estimator for binomially distributed random variables

estimation, parameter-estimation, solution-verification, statistics

Let $X_1, X_2, \dots$ be independent binomially distributed random variables with success probability $p$ and number of trials $m$. Both parameters $p$ and $m$ are unknown. Find a consistent estimator for $m$.

My approach:

We know that the expected value is $\mathbb{E}(X_i)=mp$, so $\frac{\mathbb{E}(X_i)}{p}=m$. Hence, an intuitive estimator is $\frac{1}{pn}\sum\limits_{i=1}^nX_i$, which is clearly unbiased. As $X_1,X_2, \dots$ are i.i.d. with $\mathbb{E}(X_i)=mp$, we can apply the weak law of large numbers. Hence, for every $\epsilon>0$ (and thus $\epsilon p>0$) we know that
$$
\lim\limits_{n\to\infty}P\left(\left|\frac{1}{n}\sum\limits_{i=1}^n X_i-mp\right|\geq \epsilon p\right)=\lim\limits_{n\to\infty}P\left(\left|\frac{1}{np}\sum\limits_{i=1}^n X_i-m\right|\geq \epsilon \right)=0.
$$
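As a quick numerical sanity check of this estimator (a sketch only, not part of the argument; the values $m=10$, $p=0.5$ and the use of NumPy are my assumptions, and the estimator is only computable here because the simulation plugs in the true $p$):

```python
import numpy as np

rng = np.random.default_rng(seed=42)
m, p = 10, 0.5                  # hypothetical true parameters (for illustration only)
n = 100_000                     # sample size

x = rng.binomial(m, p, size=n)  # n i.i.d. Binomial(m, p) draws
print(x.mean() / p)             # (1/(np)) * sum of X_i; should be close to m = 10
```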

Is this correct?

Best Answer

Consider $$X_{(n)}=\max(X_1,X_2,\dots,X_n).$$ Since each $X_k\leq m$, we always have $X_{(n)}\leq m$. Moreover, by independence, $$P(X_{(n)}< m)=P\left(\bigcap_{k\leq n}\{X_k<m\}\right)=(1-p^m)^n\implies \sum_{n \in \mathbb{N}}{\underbrace{(1-p^m)}_{<1}}^n<\infty.$$ By Borel–Cantelli I we have $P(X_{(n)}<m\textrm{ i.o.})=0$. This means that for almost every $\omega$ there exists $N(\omega)$ such that $X_{(n)}(\omega)=m$ for all $n \geq N(\omega)$. But this means $X_{(n)}$ converges almost surely to $m$; almost sure convergence implies convergence in probability, so the estimator is consistent.
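To see this estimator in action, here is a minimal simulation sketch (my own illustration, not part of the answer; the parameters $m=10$, $p=0.5$ and the use of NumPy are assumptions):

```python
import numpy as np

rng = np.random.default_rng(seed=42)
m, p = 10, 0.5                      # hypothetical true parameters (for illustration only)

for n in (10, 100, 1_000, 10_000, 100_000):
    x = rng.binomial(m, p, size=n)  # n i.i.d. Binomial(m, p) draws
    print(n, x.max())               # X_(n): the sample maximum, the proposed estimator
```

For large $n$ the printed maximum equals $m$. How quickly it locks on depends on $p^m = P(X_k = m)$, so the convergence can be slow when $p$ is small or $m$ is large.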
