Let $X_1,\dots,X_n$ be iid from the uniform distribution on $(0,\theta)$
where $\theta>0$ is unknown. Suppose the prior distribution of
$\theta$ is log-normal with parameters $\mu, \sigma^2$, which are
known constants.
Find the posterior density of $\log \theta$.
Suppose the loss function is $L(\delta, \theta)=\begin{cases}0 & \text{if } \delta=\theta\\1 & \text{otherwise}\end{cases}$. Find the Bayes estimator of
$\theta$ under this loss function. Hint: part (1) is related. Under 0-1
loss the Bayes estimator is the Maximum A Posteriori (MAP)
estimator, which in this case is the mode of the posterior.
We have $$\begin{split}\xi(\log\theta|x)&\propto f(x|\log\theta)\xi(\log\theta)\\
&=\frac 1 {\theta^n}\frac 1 {\sigma\sqrt{2\pi}}e^{-\frac 1 2\left(\frac{\theta-\mu}{\sigma}\right)^2}\end{split}$$
Is this correct; what distribution is this?
Best Answer
Reading your posts, I think you are a clever guy, so with a sketch I think you can do the rest by yourself. I am just an amateur, but this is what I would do in this case.
$$p(\mathbf{x}|\theta)=\frac{1}{\theta^n}\cdot\mathbb{1}_{(0;\theta)}(x_{(n)})=\frac{1}{\theta^n}\cdot\mathbb{1}_{(x_{(n)};+\infty)}(\theta)$$
$$\pi(\theta)\propto \frac{1}{\theta}e^{-(\log\theta-\mu)^2/(2\sigma^2)}$$
First I find the posterior of $\theta$
$$\pi(\theta|\mathbf{x})\propto \frac{1}{\theta^{n+1}}e^{-(\log\theta-\mu)^2/(2\sigma^2)}\cdot\mathbb{1}_{(x_{(n)};+\infty)}(\theta)$$
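(Since part (2) ultimately asks for the mode of this posterior of $\theta$ itself, not of its log, here is a short sketch: differentiate the log-posterior on the support $\theta>x_{(n)}$,
$$\frac{d}{d\theta}\log\pi(\theta|\mathbf{x})=-\frac{n+1}{\theta}-\frac{\log\theta-\mu}{\sigma^2\,\theta}=0\iff\log\theta=\mu-(n+1)\sigma^2,$$
and since the log-posterior decreases to the right of this stationary point, the MAP estimator is $\hat\theta=\max\{x_{(n)},\,e^{\mu-(n+1)\sigma^2}\}$.)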
Now I transform the posterior to get the density of its log with the standard change-of-variables theorem:
$$\lambda=\log\theta,\qquad \theta=e^{\lambda},\qquad \frac{d\theta}{d\lambda}=e^{\lambda}$$
Thus
$$\pi(\lambda|\mathbf{x})\propto e^{-\lambda n}e^{-(\lambda-\mu)^2/(2\sigma^2)}$$
Collecting the exponent and completing the square,
$$-n\lambda-\frac{(\lambda-\mu)^2}{2\sigma^2}=-\frac{1}{2\sigma^2}\left[2\sigma^2 n \lambda+\lambda^2-2\lambda\mu+\mu^2\right]=-\frac{\left[\lambda-(\mu-n\sigma^2)\right]^2}{2\sigma^2}+\text{const},$$
so on the support $\lambda>\log x_{(n)}$ the posterior of $\lambda=\log\theta$ is a normal $\mathcal{N}(\mu-n\sigma^2,\sigma^2)$ truncated below at $\log x_{(n)}$.
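As a quick numerical sanity check (not part of the original answer), one can verify on a grid that the unnormalized posterior kernel $e^{-n\lambda}e^{-(\lambda-\mu)^2/(2\sigma^2)}$ is proportional to a $\mathcal{N}(\mu-n\sigma^2,\sigma^2)$ kernel on $\lambda>\log x_{(n)}$; the values of $n$, $\mu$, $\sigma$ and $\log x_{(n)}$ below are arbitrary illustrative choices:

```python
import math

# Arbitrary illustrative values (not from the original problem)
n, mu, sigma = 5, 0.0, 1.0
lam_min = 0.2  # plays the role of log x_(n)

def unnorm_post(lam):
    """Unnormalized posterior kernel of lambda = log(theta)."""
    return math.exp(-n * lam - (lam - mu) ** 2 / (2 * sigma ** 2))

def trunc_normal_kernel(lam):
    """Kernel of N(mu - n*sigma^2, sigma^2), to be restricted to lam > lam_min."""
    m = mu - n * sigma ** 2
    return math.exp(-(lam - m) ** 2 / (2 * sigma ** 2))

# On the support, the two kernels should differ only by a constant factor
grid = [lam_min + 0.1 * k for k in range(1, 30)]
ratios = [unnorm_post(l) / trunc_normal_kernel(l) for l in grid]
spread = max(ratios) / min(ratios)  # should be ~1 up to float error
```

If the completed square above is right, `spread` comes out essentially equal to 1, confirming the truncated-normal identification.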