[Math] Bayes estimator of $\theta$ when $p_\theta(x) = \theta x^{\theta -1} 1_{[0,1]}(x)$ and $\pi(\theta) = e^{-\theta}$

bayesian, parameter-estimation, probability-distributions, statistical-inference, statistics

Let $X_1,\ldots,X_n$ be iid with density $p_\theta(x) = \theta x^{\theta -1} 1_{[0,1]}(x)$, and let $\pi(\theta) = e^{-\theta}$ be the prior on the parameter space $\Theta$. I have to calculate the posterior measure on $\Theta$ and the Bayes estimator of $\theta$. I did the following:
$$
p_\theta(X)\pi(\theta) = e^{-\theta}\prod_{i=1}^n \theta x_i^{\theta -1}
$$

But I don't see what distribution this might be.

Please help 🙂

Best Answer

The posterior density $g_x$ satisfies $g_x(\theta)\propto p_\theta(x)\pi(\theta)$, where $x=(x_i)_{1\leqslant i\leqslant n}$. Writing each $x_i^{\theta-1}$ as $x_i^{-1}\mathrm e^{\theta\log x_i}$, one gets
$$
g_x(\theta)=c(x)\,\theta^n\,\exp(-a(x)\theta),\qquad a(x)=1-\sum_{i=1}^n\log(x_i).
$$
This is the gamma distribution with shape $n+1$ and rate $a(x)$, hence
$$
c(x)=\frac{a(x)^{n+1}}{n!}.
$$
Irrespective of the value of $c(x)$, the density $g_x$ is maximal at $n/a(x)$ (the posterior mode). On the other hand, the minimum mean square error estimator of $\theta$, equal to the mean of the posterior distribution $g_x$, is
$$
\hat\theta(x)=\int_0^\infty\theta\, g_x(\theta)\,\mathrm d\theta=c(x)\,\frac{(n+1)!}{a(x)^{n+2}}=\frac{n+1}{a(x)}.
$$
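As a numerical sanity check, here is a short Python sketch (using `numpy` and `scipy`; the sample size `n` and the true value `theta_true` are arbitrary choices for illustration). It samples from $p_\theta$, computes $a(x)=1-\sum_i\log x_i$, and verifies that the unnormalized posterior $\theta^n e^{-a(x)\theta}$, once normalized by quadrature, matches the $\mathrm{Gamma}(n+1,\,a(x))$ density, and that the posterior mean equals $(n+1)/a(x)$:

```python
import numpy as np
from scipy import stats, integrate

rng = np.random.default_rng(0)
theta_true = 2.5   # illustrative true parameter
n = 50             # illustrative sample size

# p_theta is the Beta(theta, 1) density, so X = U**(1/theta) with U ~ Uniform(0,1).
x = rng.uniform(size=n) ** (1.0 / theta_true)

a = 1.0 - np.log(x).sum()                        # rate a(x) = 1 - sum_i log x_i
posterior = stats.gamma(a=n + 1, scale=1.0 / a)  # Gamma with shape n+1, rate a(x)

# Normalize the unnormalized posterior theta^n * exp(-a*theta) by quadrature.
unnorm = lambda t: t**n * np.exp(-a * t)
Z, _ = integrate.quad(unnorm, 0, np.inf)

# The normalized kernel should coincide with the Gamma(n+1, a(x)) density.
for t in [1.0, 2.0, 3.0]:
    assert np.isclose(unnorm(t) / Z, posterior.pdf(t), rtol=1e-6)

# Bayes estimator under squared loss = posterior mean = (n+1)/a(x).
assert np.isclose(posterior.mean(), (n + 1) / a)
print("Bayes estimate:", (n + 1) / a)
```

Note that `scipy.stats.gamma` is parameterized by shape `a` and `scale` = 1/rate, which is why the rate $a(x)$ enters as `scale=1.0 / a`.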