Finding Bayes estimator for $\theta$ of Unif$(0,\theta)$

Tags: bayesian, order-statistics, self-learning, statistical-inference, statistics


Let $Y = \max_i X_i$, where $(X_1,\ldots,X_n)$ is a random sample from Unif$(0,\theta)$; $Y$ is sufficient for $\theta$. Find the Bayes estimator $w(Y)$ of $\theta$ based on $Y$ under the loss function $L(\theta,a) = \lvert a- \theta\rvert$. The prior density of $\theta$ is $\displaystyle \pi(\theta) = \frac{2}{\theta^3}1_{(1 < \theta < \infty)}$.

I am pretty unfamiliar with Bayesian inference.

From what I understand the posterior is given by $\displaystyle p(\theta \mid \underline{x}) = \frac{L(\theta \mid \underline x)\pi(\theta)}{\int L(\theta \mid \underline x)\pi(\theta) \, d\theta }\, ; $ where

$$
L(\theta \mid \underline{x})\pi(\theta) = \frac{1}{\theta^n}1_{(0 \le \min(x_i))}1_{(y \le \theta)}\frac{2}{\theta^3}1_{(1<\theta<\infty)}
$$
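(Since the data automatically satisfy $0 \le \min_i x_i$ under this model, the indicators combine to give
$$
L(\theta \mid \underline{x})\pi(\theta) = \frac{2}{\theta^{n+3}}\,1_{(\theta > \max\{1,y\})},
$$
where $y = \max_i x_i$.)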

Aside from this, I am not sure how to set this up, where the loss function comes in, or how to base the estimator on $Y$.

Best Answer

\begin{align} L(\theta) & = \begin{cases} \dfrac {\text{constant}} {\theta^n} & \text{if } \theta>y, \\[8pt] \,\,0 & \text{if } 0<\theta<y. \end{cases} \\[12pt] \pi(\theta)\, d\theta & = \begin{cases} \dfrac {\text{constant} \cdot d\theta}{\theta^3} & \text{if } \theta>1, \\[8pt] \,\,0 & \text{if } \theta<1. \end{cases} \\[12pt] \text{Therefore } \pi(\theta\mid y)\, d\theta & \propto \begin{cases} \dfrac{\text{constant}\cdot d\theta}{\theta^{n+3}} & \text{if } \theta> \max\{1,y\}, \\[8pt] \,\,\,0 & \text{otherwise.} \end{cases} \end{align}

(Here I have written $\text{“}{>}\text{”}$ and $\text{“}{<}\text{”}$ rather than $\text{“}{\ge}\text{”}$ and $\text{“}{\le}\text{”}$; if we had been doing maximum-likelihood estimation, I would have written $\theta\ge y$.)

Normalizing:
$$ \int_{\max\{1,y\}}^{+\infty} \frac{d\theta}{\theta^{n+3}} = \frac 1{(n+2)(\max\{ 1,y \})^{n+2}}. $$

So the posterior probability distribution is
$$ \frac{ (n+2)(\max\{ 1,y \})^{n+2}}{\theta^{n+3}} \, d\theta \qquad \text{ for } \theta > \max\{1,y\}. $$

Theorem: With absolute-error loss, the Bayes estimator is the posterior median.

If you know the theorem above, then what remains is to solve the equation below for $m{:}$ $$ \int_{\max\{1,y\}}^m \frac{ (n+2)(\max\{ 1,y \})^{n+2}}{\theta^{n+3}} \, d\theta = \frac 1 2. $$
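For completeness, carrying the integration through (writing $m_0 = \max\{1,y\}$ for brevity), the left-hand side evaluates to $1 - (m_0/m)^{n+2}$, so
$$ 1 - \left(\frac{m_0}{m}\right)^{n+2} = \frac 1 2 \quad\Longrightarrow\quad m = 2^{1/(n+2)}\,\max\{1,y\}, $$
i.e. the Bayes estimator is $w(Y) = 2^{1/(n+2)}\max\{1,Y\}$.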

If you don't know the theorem above, then maybe that's the question you need to post.
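(For reference, a one-line sketch of that theorem, assuming a continuous posterior: differentiating the posterior expected loss gives
$$ \frac{d}{da}\, \operatorname E\bigl[\,\lvert\theta - a\rvert \mid y\,\bigr] = P(\theta < a \mid y) - P(\theta > a \mid y), $$
which vanishes precisely when $a$ leaves half the posterior mass on each side, i.e. at the posterior median.)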