Find the UMVUE of $H(\theta)$

order-statistics, parameter-estimation, statistical-inference, statistics

I've been working on the following problem:

Let $h(\cdot)$ be a given function that is strictly positive, finite and continuous on the real line and define $H(x)=\int_0^xh(y)dy$. Let $X_1,\ldots,X_n$ be a random sample from the distribution whose pdf is given by $f(x\mid\theta)=c(\theta)h(x)I_{(0<x<\theta)}$.

(a) Compute the MLE of $\theta$; (b) find a sufficient and complete statistic for $\theta$ and denote it by $S$; (c) find the UMVUE of $H(\theta)$.

In my solution I found that $\hat{\theta}_{MLE}=X_{(n)}$ and that $S=X_{(n)}$.
I am not sure how to approach part (c). Any suggestions would be appreciated.

Best Answer

By the Lehmann-Scheffé theorem, the UMVUE is an a.s.-unique function of the complete sufficient statistic $X_{(n)}$. Here are some parts of the puzzle that you will need to piece together:

1) The density of $X_{(n)}$ is given by $$g(x|\theta) = n[c(\theta)]^nH(x)^{n-1}h(x), \;\;\; 0 < x < \theta $$
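(Part 1 follows from the standard formula for the density of the sample maximum, since the CDF of each observation is $F(x\mid\theta) = \int_0^x c(\theta)h(y)\,dy = c(\theta)H(x)$ for $0 < x < \theta$:)

$$g(x\mid\theta) = \frac{d}{dx}\left[F(x\mid\theta)\right]^n = n\left[c(\theta)H(x)\right]^{n-1}c(\theta)h(x) = n[c(\theta)]^n H(x)^{n-1}h(x).$$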

2) $H(\theta) = \dfrac{1}{c(\theta)}$, since the density must integrate to one: $c(\theta)\int_0^\theta h(y)\,dy = c(\theta)H(\theta) = 1$.

3) $E[H(X_{(n)})] = a_nH(\theta)$, where $a_n$ is a constant depending ONLY on $n$, and not $\theta$. You can calculate $a_n$ quite easily.
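(If you want to verify your value of $a_n$: substituting $u = H(x)$, so $du = h(x)\,dx$, the expectation becomes

$$E\left[H(X_{(n)})\right] = \int_0^\theta H(x)\,n[c(\theta)]^n H(x)^{n-1}h(x)\,dx = n[c(\theta)]^n \int_0^{H(\theta)} u^n\,du = \frac{n}{n+1}\,[c(\theta)]^n H(\theta)^{n+1} = \frac{n}{n+1}\,H(\theta),$$

where the last step uses $[c(\theta)H(\theta)]^n = 1$ from part 2. Hence $a_n = \dfrac{n}{n+1}$.)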

By Lehmann-Scheffé, you can conclude that $T = \dfrac{1}{a_n}H(X_{(n)})$ is the UMVUE of $H(\theta)$.
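As a quick numerical sanity check, here is a Monte Carlo sketch for the special case $h \equiv 1$, where $H(x) = x$, the sample is Uniform$(0,\theta)$, and the constant works out to $a_n = n/(n+1)$, so $T = \frac{n+1}{n}X_{(n)}$. (The function name and default values are just for illustration.)

```python
import random

def umvue_check(theta=3.0, n=10, reps=200_000, seed=0):
    # Special case h(x) = 1: H(x) = x, so each observation is Uniform(0, theta)
    # and the candidate UMVUE is T = H(X_(n)) / a_n with a_n = n/(n+1).
    rng = random.Random(seed)
    a_n = n / (n + 1)
    total = 0.0
    for _ in range(reps):
        x_max = max(rng.uniform(0, theta) for _ in range(n))  # X_(n)
        total += x_max / a_n                                  # T = (n+1)/n * X_(n)
    return total / reps  # Monte Carlo estimate of E[T]; should be near H(theta) = theta

print(umvue_check())  # ≈ 3.0
```

The average of $T$ over many replications should land very close to $\theta$, consistent with unbiasedness; the raw maximum $X_{(n)}$ alone would be biased low by the factor $n/(n+1)$.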