Show that $\hat\theta=\frac{2 \bar Y- 1}{1- \bar Y}$ is an unbiased estimator for $\theta$

Tags: method-of-moments, self-study, unbiased-estimator

Let $Y_1,Y_2,…,Y_n$ denote a random sample from the probability density function
$$f(y\mid\theta)= \begin{cases} (\theta+1)y^{\theta}, & 0<y<1,\ \theta>-1 \\ 0, & \text{elsewhere.} \end{cases}$$

Find an estimator for $\theta$ using the method of moments and show that it is consistent.

I have found the estimator, but I am unsure how to show that it is a consistent estimator.

I computed $\mathrm{E}(Y)=\frac{\theta+1}{\theta+2}$, and the first sample moment is $m_1'=\frac{1}{n}\sum_{i=1}^{n}Y_i=\bar Y$.
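For reference, that first population moment comes from a one-line integral against the density:

$$\mathrm{E}(Y)=\int_0^1 y\,(\theta+1)y^{\theta}\,dy=(\theta+1)\int_0^1 y^{\theta+1}\,dy=\frac{\theta+1}{\theta+2}.$$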

Now, equating the population moment to the sample moment: $\mathrm{E}(Y)=\frac{\theta+1}{\theta+2}=m_1'=\bar Y$.

So $\bar Y=\frac{\theta+1}{\theta+2}$, and solving for $\theta$ gives $\hat\theta=\frac{2 \bar Y- 1}{1- \bar Y}$.
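Spelled out, the inversion is just cross-multiplying and isolating $\theta$:

$$\bar Y(\theta+2)=\theta+1 \;\Longrightarrow\; \theta(\bar Y-1)=1-2\bar Y \;\Longrightarrow\; \theta=\frac{1-2\bar Y}{\bar Y-1}=\frac{2\bar Y-1}{1-\bar Y}.$$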

Now I am unsure how to show that $\hat\theta=\frac{2 \bar Y- 1}{1- \bar Y}$ is a consistent estimator for $\theta$.

Can someone please help me?

Best Answer

Given the answers and comments on both sites, you're probably done with your homework. The following is just a clarification for future visitors.

This is your statistical model: you have random variables $Y_1,\dots,Y_n$, which are independent and identically distributed, with $Y_i\sim\mathrm{Beta}(\theta +1,1)$, for $\theta>-1$.

An unbiased estimator $\hat{\theta}_n=\hat{\theta}_n(Y_1,\dots,Y_n)$ of the parameter $\theta$, by definition, must satisfy $\mathrm{E}_\theta[\hat{\theta}_n]=\theta$, for every $\theta$. The original question is wrong, because the estimator obtained by applying the method of moments to this problem is not unbiased.
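A quick simulation makes the bias visible. The following is a minimal sketch, not part of the original answer; the values $\theta=2$, $n=5$, the replication count, and the seed are arbitrary illustrative choices. Since $Y_i\sim\mathrm{Beta}(\theta+1,1)$, we can draw samples directly with NumPy's beta generator.

```python
import numpy as np

rng = np.random.default_rng(0)          # arbitrary seed for reproducibility
theta, n, reps = 2.0, 5, 200_000        # assumed illustrative values

# Y ~ Beta(theta + 1, 1) matches the density f(y | theta) = (theta + 1) y^theta.
y = rng.beta(theta + 1, 1, size=(reps, n))
ybar = y.mean(axis=1)                   # first sample moment, one per replication
theta_hat = (2 * ybar - 1) / (1 - ybar) # method-of-moments estimate

print(theta_hat.mean())                 # noticeably above theta = 2, so E[theta_hat] != theta
```

The direction of the bias is no accident: writing $\hat{\theta}_n = \frac{1}{1-\bar{Y}_n} - 2$ shows the map $\bar{Y}_n \mapsto \hat{\theta}_n$ is strictly convex on $(0,1)$, so Jensen's inequality gives $\mathrm{E}_\theta[\hat{\theta}_n] > \theta$ whenever the expectation is finite.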

An estimator $\hat{\theta}_n$ of the parameter $\theta$ is (weakly) consistent if $\hat{\theta}_n \stackrel{P_\theta}{\longrightarrow} \theta$, for every $\theta$, and strongly consistent if $\hat{\theta}_n \longrightarrow \theta$, a.s. $[P_\theta]$, for every $\theta$. In what follows, we compute the method of moments estimator for $\theta$ and prove that it is strongly consistent (yielding that it is weakly consistent).

To obtain the estimator, note that the first population moment is $\mathrm{E}_\theta[Y_1] =(\theta+1)/(\theta+2)$, and the first sample moment is $\bar{Y}_n=(Y_1+\dots+Y_n)/n$. Equating the two moments, we find the estimator $$ \hat{\theta}_n = \frac{2\bar{Y}_n-1}{1-\bar{Y}_n} \, . $$ Since the function $t\mapsto (2t-1)/(1-t)$ is continuous in the appropriate domain, by the Strong Law of Large Numbers we have $$ \hat{\theta}_n \longrightarrow \frac{2\mathrm{E}[Y_1]-1}{1-\mathrm{E}[Y_1]} = \theta \, , $$ almost surely $[P_\theta]$, for every $\theta$. Therefore, the method of moments estimator $\hat{\theta}_n$ is a consistent estimator of the parameter $\theta$.
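To see the strong consistency numerically, here is a minimal sketch along one simulated sample path (again with $\theta=2$ and an arbitrary seed as assumed values): the running estimates settle near $\theta$ as $n$ grows, exactly as the SLLN argument predicts.

```python
import numpy as np

rng = np.random.default_rng(1)          # arbitrary seed for reproducibility
theta, n_max = 2.0, 100_000             # assumed illustrative values

y = rng.beta(theta + 1, 1, size=n_max)
ybar_n = np.cumsum(y) / np.arange(1, n_max + 1)  # running sample means
theta_hat_n = (2 * ybar_n - 1) / (1 - ybar_n)    # running MoM estimates

for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, round(float(theta_hat_n[n - 1]), 4))  # drifts toward theta = 2
```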
