Solved – Showing Bayes Estimator is Consistent

bayesian, consistency, estimators

$X_1, X_2, \ldots, X_n$ are iid $N(0, \theta)$ random variables with $\theta \in (0, \infty)$. With the prior distribution $\pi(\theta) = \frac{4e^{-2/\theta}}{\theta^3}$, I calculated the posterior distribution and used the squared error loss function to obtain the Bayes estimator for $\theta$ (the mean of the posterior distribution). I found the Bayes estimator to be $\frac{4 + \sum x_i^2}{n+2}$. How do I show that this estimator is consistent?

I know that showing the estimator is consistent involves showing convergence in probability, but I am completely lost on where to start.

Best Answer

You generally use the law of large numbers (https://en.wikipedia.org/wiki/Law_of_large_numbers) to prove consistency. The LLN gives convergence in probability, so if you can use it to show convergence in probability to the target ($\theta$ here), you're essentially done.

Here, since the $X_i \sim \mathcal{N}(0,\theta)$ are iid with finite mean and variance, the law of large numbers applies. In particular, it tells you $$ \frac{1}{n}\sum_{i=1}^{n} X_i \overset{P}{\rightarrow} \mathbb{E}(X)=0, $$ and, applied to the iid sequence $X_i^2$, $$ \frac{1}{n}\sum_{i=1}^{n} X_i^2 \overset{P}{\rightarrow} \mathbb{E}(X^2)=\theta. $$ This is almost exactly what you want. To finish, write the Bayes estimator as $$ \frac{4 + \sum_{i=1}^{n} X_i^2}{n+2} = \frac{n}{n+2}\cdot\frac{1}{n}\sum_{i=1}^{n} X_i^2 + \frac{4}{n+2}, $$ and note that $\frac{n}{n+2}\rightarrow 1$ and $\frac{4}{n+2}\rightarrow 0$ as $n\rightarrow\infty$, so the whole expression converges in probability to $\theta$.
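This isn't part of the proof, but you can sanity-check the convergence numerically. The sketch below (assuming NumPy; the value $\theta = 3$ is an arbitrary choice for the demo) draws samples from $\mathcal{N}(0,\theta)$, where $\theta$ is the variance, and evaluates the Bayes estimator for increasing $n$:

```python
import numpy as np

def bayes_estimate(x):
    """Bayes estimator (posterior mean) from the question:
    (4 + sum of x_i^2) / (n + 2)."""
    return (4 + np.sum(x**2)) / (len(x) + 2)

rng = np.random.default_rng(0)
theta = 3.0  # true variance; arbitrary value for this demo

# As n grows, the estimate should settle near theta = 3.
for n in [10, 1_000, 100_000]:
    x = rng.normal(0.0, np.sqrt(theta), size=n)  # N(0, theta), theta = variance
    print(n, bayes_estimate(x))
```

For small $n$ the estimate fluctuates, but by $n = 100{,}000$ it should sit very close to $3$, consistent with the LLN argument above.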
