Solved – Does asymptotically unbiased mean that the estimator converges in probability?

asymptotics, consistency

I'm arguing with a friend and he doesn't have a solid argument for his statement.
I claim that if an estimator is asymptotically unbiased, then it is consistent, which means it converges in probability:

Example: Suppose I have an estimator $\hat{\theta}$ for a parameter $\theta$.
Suppose that $E(\hat{\theta}) = \frac{n}{n+1} \theta$.

Then, because $\lim_{n\to \infty} E(\hat{\theta}) = \theta$, I claim that the estimator $\hat{\theta}$ converges in probability to $\theta$.

Is that right? Can I show this reasoning to my friend?
Thanks a lot!


Thanks guys! Now I have a better idea:

Suppose that, in addition to what I previously wrote, I can state that the variance of the estimator is finite. Is that sufficient to conclude that there is convergence in probability?

I know a statement that says if the variance converges (numerically) to zero, then the convergence is in quadratic mean, but that is a stronger statement.
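To see why variance going to zero (not merely being finite) is what matters, Chebyshev's inequality gives, for any $\epsilon > 0$,

$$P\!\left(|\hat{\theta} - \theta| > \epsilon\right) \le \frac{E\!\left[(\hat{\theta} - \theta)^2\right]}{\epsilon^2} = \frac{\operatorname{Var}(\hat{\theta}) + \left(E[\hat{\theta}] - \theta\right)^2}{\epsilon^2}.$$

So if both the bias and the variance tend to zero as $n \to \infty$, the right-hand side vanishes and convergence in probability follows. A finite but non-vanishing variance is not sufficient, as the answer below illustrates.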

Best Answer

Let $Y_i \sim \mbox{i.i.d. } N(\mu,1)$. Then $\hat\theta_n=Y_n$ is an unbiased (and hence asymptotically unbiased) estimator of $\mu$. However, it does not converge in probability, and is not consistent: ${\rm Prob}[ |\hat\theta_n-\mu| > a] = 2 \Phi( -a )$ does not depend on $n$ and hence does not converge to zero for any $a > 0$.
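A quick Monte Carlo sketch of this counterexample (the parameter values $\mu = 2$, threshold $a = 1$, and seed are arbitrary choices, not from the answer): the tail probability of the "last observation" estimator $Y_n$ stays near $2\Phi(-1) \approx 0.317$ for every $n$, while the sample mean's tail probability drops to zero.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, a, reps = 2.0, 1.0, 20_000  # illustrative values, not from the answer

def tail_probs(n):
    """Monte Carlo estimates of P(|estimator - mu| > a) for two estimators:
    the last observation Y_n (unbiased but inconsistent) and the sample mean
    (consistent)."""
    samples = rng.normal(mu, 1.0, size=(reps, n))
    last = samples[:, -1]          # theta_hat_n = Y_n from the counterexample
    mean = samples.mean(axis=1)    # the usual consistent estimator, for contrast
    return (np.mean(np.abs(last - mu) > a),
            np.mean(np.abs(mean - mu) > a))

for n in (10, 100, 1000):
    p_last, p_mean = tail_probs(n)
    print(f"n={n}: last-obs tail ~ {p_last:.3f}, sample-mean tail ~ {p_mean:.4f}")
```

The first column hovers around $0.317$ regardless of $n$, matching $2\Phi(-a)$ with $a = 1$; the second shrinks toward zero, which is exactly the difference between an unbiased estimator and a consistent one.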
