Find a sufficient statistic $Y$ for $\theta$, then find the Bayes estimator $w(Y)$

Tags: bayesian, parameter-estimation, statistical-inference, statistics

Let $X_1,\dots,X_n$ be an iid random sample having pdf $\theta x^{\theta-1}\,\mathbf{1}(0 < x \le 1)$.

Find a sufficient statistic $Y$ for $\theta$, then find the Bayes estimator $w(Y)$ based on this statistic, using the loss function $L(\theta,a) = (a-\theta)^2$, where the prior distribution is exponential with mean $\frac{1}{\beta}$.

First, sufficiency:

The likelihood function is $\displaystyle L(\theta) = \prod_{i = 1}^n\theta x_i^{\theta -1} = \theta^n(x_1\cdots x_n)^\theta(x_1\cdots x_n)^{-1}$, so by the factorization theorem we can take $Y = (x_1\cdots x_n)^{-1}$.

Bayes Estimator:

For squared-error loss the estimator is $w(Y) = \hat{\theta} = E[\theta \mid Y\,]$, i.e. the posterior mean.
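
(This is the standard fact that the posterior mean minimizes the posterior expected squared-error loss:

$$E\big[(\theta-a)^2 \mid Y\big] = \operatorname{Var}(\theta \mid Y) + \big(a - E[\theta \mid Y]\big)^2,$$

which is minimized at $a = E[\theta \mid Y]$.)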

For the posterior I first need to compute $m(y) = \displaystyle \int_0^\infty \beta e^{-\beta \theta}y^{1-\theta}\,d\theta$. Is this a well-known integral? I was trying to solve it by $u$-substitution but I am making a mistake somewhere. I tried $u = y^{-\theta}$, $du = -y^{-\theta}\log(y)\,d\theta$, but I can't see how to take care of the $e^{-\beta\theta}$ factor.

Before continuing, I would appreciate knowing whether this is correct:

EDIT: $y^{-\theta} = e^{-\theta \log(y)}$, so the integral can be rewritten as $\displaystyle \beta y \int_0^\infty e^{-\theta(\beta + \log(y))}\,d\theta$; now set $u = -\theta(\beta + \log(y))$.

Then we get $\displaystyle -\frac{\beta y}{\beta + \log(y)}e^{-\theta(\beta + \log(y))} \bigg \vert_{\theta = 0}^{\infty} = \frac{\beta y}{\beta + \log(y)}$, using $\beta + \log(y) > 0$.
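
As a quick sanity check of that value, here is a minimal numerical sketch (Python with `scipy`; the particular $\beta$ and $y$ are arbitrary test values satisfying $\beta + \log(y) > 0$):

```python
# Spot-check the closed form:
#   int_0^inf  beta * exp(-beta*theta) * y**(1 - theta) d(theta)  =  beta*y / (beta + log y)
import numpy as np
from scipy.integrate import quad

beta, y = 2.0, 3.0  # arbitrary test values with beta + log(y) > 0

numeric, _ = quad(lambda t: beta * np.exp(-beta * t) * y ** (1.0 - t), 0, np.inf)
closed_form = beta * y / (beta + np.log(y))

print(numeric, closed_form)  # the two values should agree to quad's tolerance
```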

I would still like to know whether this is a well-known integral.

Now the next step is to solve $\displaystyle E[\theta \mid Y = y] = \int_0^\infty \theta\,\frac{\beta + \log(y)}{\beta y}\,\beta e^{-\beta \theta}\theta^n y^{1-\theta}\,d\theta$, correct? This will give us the estimator we seek.

Best Answer

Using the factorization theorem, the sufficient statistic for $\theta$ is $Y=\prod_i X_i$. This is because the factor $g(\theta,t(\mathbf{x}))$ depends on the data only through the statistic $t(\mathbf{x}) = \prod_i x_i$.

The function $\frac{1}{\prod_{i}X_{i}}$ that you wrongly identified as the sufficient statistic is the factor $h(\mathbf{x})$ that depends on $\mathbf{x}$ alone.
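
Spelled out, the factorization of the joint density is

$$\prod_{i=1}^n \theta x_i^{\theta-1} \;=\; \underbrace{\theta^n\Big(\prod_{i=1}^n x_i\Big)^{\theta}}_{g(\theta,\,t(\mathbf{x}))}\;\cdot\;\underbrace{\Big(\prod_{i=1}^n x_i\Big)^{-1}}_{h(\mathbf{x})},\qquad t(\mathbf{x})=\prod_{i=1}^n x_i.$$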

Then the posterior is the following (hint: when calculating the posterior, discard any factor that does not depend on $\theta$):

$$\pi(\theta|y) \propto e^{-\beta \theta}\theta^n y^{\theta-1}$$

$$\propto e^{-\beta \theta}\theta^n e^{(\theta-1) \log y}$$

$$\propto \theta^n e^{-(\beta-\log y)\theta}$$

...we immediately recognize this posterior as the kernel of a Gamma distribution...
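
Matching this kernel against $\theta^{\alpha-1}e^{-\lambda\theta}$ gives shape $\alpha = n+1$ and rate $\lambda = \beta - \log y$ (positive, since $0 < x_i \le 1$ implies $\log y \le 0$), i.e.

$$\theta \mid y \;\sim\; \mathrm{Gamma}\big(n+1,\; \beta - \log y\big).$$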

Now you can kill the problem by yourself without solving the integral analytically.
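
If you want to double-check this numerically, here is a minimal sketch (Python with `numpy`/`scipy`; the sample size, $\beta$, and the true $\theta$ below are arbitrary test values). It integrates the unnormalized posterior $e^{-\beta\theta}\theta^n y^{\theta-1}$ directly and compares the resulting posterior mean with the Gamma$(n+1,\ \beta-\log y)$ mean $(n+1)/(\beta-\log y)$:

```python
# Numerical check: posterior mean from the unnormalized posterior vs. the Gamma mean.
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)
n, beta, theta_true = 20, 2.0, 3.0   # arbitrary test values

# Sample X_i with pdf theta * x^(theta-1) on (0, 1] via inverse CDF: X = U^(1/theta).
x = rng.uniform(size=n) ** (1.0 / theta_true)
log_y = np.sum(np.log(x))            # log of the sufficient statistic y = prod(x_i)

# Unnormalized posterior: exponential(beta) prior times likelihood theta^n * y^(theta-1).
def unnorm_post(t):
    return np.exp(-beta * t + n * np.log(t) + (t - 1.0) * log_y)

num, _ = quad(lambda t: t * unnorm_post(t), 0, np.inf)
den, _ = quad(unnorm_post, 0, np.inf)

print("posterior mean (numeric):        ", num / den)
print("Gamma mean (n+1)/(beta - log y): ", (n + 1) / (beta - log_y))
```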