Derive Bayes estimator with a gamma prior

bayesian, probability, self-learning, statistical-inference

Question

Finding a Bayes estimator for a parameter $\theta$ with a gamma prior and a Beta-distributed likelihood.

Prior

I am trying to find a Bayes estimator of $\theta$ when the prior $\pi(\theta)$ is a $\mathrm{Gamma}(a,b)$ distribution:
\begin{align}
\pi(\theta) = \frac{1}{\Gamma(a)b^a}\theta^{a-1}e^{-\theta/b}
\end{align}

Likelihood of the data

The observed data $X_1,\dots,X_n$ are sampled i.i.d. from $\mathrm{Beta}(\theta,1)$, so the likelihood is:
\begin{align}
\mathcal{L}(\theta,X_n) &= \theta^n \prod_{i=1}^n X_i^{\theta-1}
\end{align}
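For reference, this follows from the $\mathrm{Beta}(\theta,1)$ density: since $B(\theta,1)=\frac{\Gamma(\theta)\Gamma(1)}{\Gamma(\theta+1)}=\frac{1}{\theta}$, each observation has density
\begin{align}
f(x;\theta) = \frac{x^{\theta-1}(1-x)^{0}}{B(\theta,1)} = \theta\, x^{\theta-1}, \qquad 0<x<1,
\end{align}
and multiplying over the $n$ independent observations gives the expression above.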

Posterior density

The posterior density $p(\theta|X_n)$ is proportional to the product of the prior density times the likelihood of the observed data.
\begin{align}
p(\theta|X_n) &\propto \underbrace{\theta^n \prod_{i=1}^n X_i^{\theta-1}}_{likelihood} \times \underbrace{\frac{1}{\Gamma(a)b^a}\theta^{a-1}e^{-\theta/b}}_{prior} \\
&\propto \theta^{n+a-1} e^{-\theta/b} \prod_{i=1}^n X_i^{\theta-1}
\end{align}

I am now stuck with the term $\prod_{i=1}^n X_i^{\theta-1}$. If this term were not there, the posterior distribution would be Gamma distributed, but with it I don't see how to proceed.

Finding the Bayes estimator

In addition, I am concerned with finding the Bayes estimator.

A Bayes estimator for a parameter $\theta$ is defined, when the prior $\pi$ is continuous, as:
\begin{align}
\hat{\theta}_{n}^B = \frac{\displaystyle\int_{\Theta}\theta \Big[\prod_{i=1}^n f(X_i,\theta) \Big]\pi(\theta)d\theta}{\displaystyle\int_{\Theta} \Big[\prod_{i=1}^n f(X_i,\theta) \Big]\pi(\theta)d\theta}
\end{align}

So in this case:
\begin{align}
\hat{\theta}_{n}^B = \frac{\frac{1}{\Gamma(a)b^a}\displaystyle\int_{0}^{\infty} \theta^{n+1} \Big[\prod_{i=1}^n X_i^{\theta-1}\Big]\theta^{a-1}e^{-\theta/b}\,d\theta}{\frac{1}{\Gamma(a)b^a}\displaystyle\int_{0}^{\infty} \theta^{n} \Big[\prod_{i=1}^n X_i^{\theta-1}\Big]\theta^{a-1}e^{-\theta/b}\, d\theta}
\end{align}

But again I am stuck with the $\Big[\prod_{i=1}^n X_i^{\theta-1}\Big]$ term.

N.B.: I am asking this question for homework, therefore I am looking for hints rather than full solutions.

Best Answer

I am asking this question for homework, therefore I am looking for hints rather than full solutions.

I appreciate your sincerity. Here is a very quick way to find your solution.

First, write down the prior and the likelihood, discarding any quantity not depending on $\theta$:

$$\begin{align} &\pi(\theta)\propto \theta^{a-1}e^{-\theta/b}\\ &p(\mathbf{x}|\theta)\propto \theta^n\prod_i x_i^{\theta}=\theta^n\cdot e^{\theta\sum_i\log x_i} \end{align}$$

and, as you know,

$$\pi(\theta|\mathbf{x})\propto \pi(\theta)\times p(\mathbf{x}|\theta)$$

Now, multiplying $\text{Prior}\times\text{Likelihood}$, you will recognize the kernel of a known density (still a Gamma, but with different parameters).
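For concreteness, carrying out this multiplication with the kernels written above gives

$$\pi(\theta|\mathbf{x})\propto \theta^{a-1}e^{-\theta/b}\cdot\theta^{n}e^{\theta\sum_i\log x_i}=\theta^{n+a-1}\,e^{-\theta\left(\frac{1}{b}-\sum_i\log x_i\right)},$$

the kernel of a Gamma density with shape $n+a$ and rate $\frac{1}{b}-\sum_i\log x_i$ (a positive quantity, since $0<x_i<1$ implies $\log x_i<0$).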

As far as Bayesian estimation is concerned, there is no unique solution, but one possible choice (assuming a quadratic loss function) is the posterior expectation:

$$\hat{\theta}=\mathbb{E}[\theta|\mathbf{x}]$$

Once you have characterized the posterior density, its expectation is a known expression.
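For instance, using the fact that a Gamma density with shape $\alpha$ and rate $\lambda$ has mean $\alpha/\lambda$, the posterior found above yields

$$\hat{\theta}=\mathbb{E}[\theta|\mathbf{x}]=\frac{n+a}{\frac{1}{b}-\sum_{i=1}^n\log x_i}.$$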


Observe that instead of $x^{\theta-1}$ I wrote $x^{\theta}$: the factor $\prod_i x_i^{-1}$ does not depend on $\theta$, so it carries no information about $\theta$ and can be absorbed into the proportionality constant.

Any cancelled quantity is absorbed into the normalizing constant; by recognizing the kernel of the desired density, all of these tedious (and often difficult) calculations can be avoided.

Another trick is to express $\prod_i x_i^{\theta}$ in the form $e^{\{\dots\}}$, so that when you multiply prior and likelihood the Gamma posterior is easy to recognize. A numerical sanity check of the whole derivation is sketched below.
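Here is a minimal numerical sketch of that check (the hyperparameters $a,b$, the true $\theta$, and the simulated sample are arbitrary illustrative choices): it compares the ratio-of-integrals definition of $\hat{\theta}_n^B$ from the question against the closed-form mean of the $\mathrm{Gamma}\big(n+a,\ \text{rate}=1/b-\sum_i\log x_i\big)$ posterior derived above.

```python
import numpy as np
from scipy.integrate import quad

# Arbitrary illustrative choices: prior Gamma(shape a, scale b), true theta, sample size n
a, b = 2.0, 3.0
theta_true, n = 1.5, 50

rng = np.random.default_rng(0)
x = rng.beta(theta_true, 1.0, size=n)  # X_i ~ Beta(theta, 1)
s = np.log(x).sum()                    # sum_i log x_i (negative, since 0 < x_i < 1)

shape = n + a          # posterior shape
rate = 1.0 / b - s     # posterior rate (positive because s < 0)

# Unnormalized posterior kernel theta^(shape-1) * exp(-rate*theta),
# rescaled by its peak height for numerical stability.
t_mode = (shape - 1) / rate
peak = t_mode ** (shape - 1) * np.exp(-rate * t_mode)
kernel = lambda t: t ** (shape - 1) * np.exp(-rate * t) / peak

# Ratio-of-integrals form of the Bayes estimator from the question
num, _ = quad(lambda t: t * kernel(t), 0, np.inf)
den, _ = quad(kernel, 0, np.inf)
print("numerical posterior mean  :", num / den)

# Closed form: a Gamma(shape, rate) density has mean shape/rate
print("closed-form posterior mean:", shape / rate)
```

The two printed values should agree to several decimal places, which confirms the kernel-recognition shortcut without computing any normalizing constant by hand.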
