[Math] UMVUE of $\frac{\theta}{1+\theta}$ and $\frac{e^{\theta}}{\theta}$ from $U(-\theta,\theta)$ distribution

order-statistics, parameter-estimation, probability-distributions, statistical-inference, statistics

Let $X_1,X_2,\dots, X_n$ be i.i.d. random variables with pdf:
$$f(x\mid \theta)=\frac{1}{2\theta}I(-\theta<x<\theta)$$

Find the UMVUE of $(i)\ \dfrac{\theta}{1+\theta}$ and $(ii)\ \dfrac{e^{\theta}}{\theta}$.

Note that $(X_{(1)},X_{(n)})$ is a complete sufficient statistic. If I can find an unbiased estimator of $(i)$ and $(ii)$ of the form $g(X_{(1)},X_{(n)})$, then $g$ will be the UMVUE. But I could not find such a $g$. Thanks for any help.

I tried to find $E(X_{(1)}/X_{(n)})$, but it came out a total mess.

Here $X_{(1)}=\min(X_1,X_2,\dots, X_n)$ and $X_{(n)}=\max(X_1,X_2,\dots, X_n)$.

Best Answer

You have a $U(-\theta,\theta)$ population where $\theta\in\mathbb R^+$.

The joint density of the sample $\mathbf X=(X_1,X_2,\ldots,X_n)$ is

\begin{align} f_{\theta}(\mathbf x)&=\frac{1}{(2\theta)^n}\mathbf1_{-\theta < x_1, \ldots, x_n < \theta} \\&=\frac{1}{(2\theta)^n}\mathbf1_{0<|x_1|,\ldots,|x_n|<\theta} \\&=\frac{1}{(2\theta)^n}\mathbf1_{\max_{1\le i\le n}|x_i|<\theta} \end{align}

It is clear from the factorization theorem that a sufficient statistic for $\theta$ is $$T(\mathbf X)=\max_{1\le i\le n}|X_i|$$

One could verify that $|X_i|\sim U(0,\theta)$, so that the density of $T$ is $$g_{\theta}(t)=\frac{n}{\theta^n}t^{n-1}\mathbf1_{0<t<\theta}$$
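Indeed, since $T$ is the maximum of $n$ independent $U(0,\theta)$ variables, for $0<t<\theta$,

$$P_\theta(T\le t)=\prod_{i=1}^n P_\theta(|X_i|\le t)=\left(\frac{t}{\theta}\right)^n,$$

and differentiating with respect to $t$ recovers $g_{\theta}$.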

That $T$ is a complete statistic for $\theta$ is well-known.

We simply have to find unbiased estimators of the parametric functions of $\theta$ based on the complete sufficient statistic. This would give us the UMVUE by the Lehmann-Scheffé theorem.

As the support of the complete sufficient statistic here depends on the parameter $\theta$, unbiased estimators can be directly obtained through differentiation.

Let $h_1(T)$ and $h_2(T)$ be unbiased estimators of $\theta/(1+\theta)$ and $e^{\theta}/\theta$ respectively, based on the complete sufficient statistic $T$.

That is, for all $\theta>0$,

\begin{align} \qquad\quad\frac{n}{\theta^n}\int_0^{\theta}h_1(t)t^{n-1}\,dt&=\frac{\theta}{1+\theta} \\\implies \int_0^{\theta}h_1(t)t^{n-1}\,dt &= \frac{\theta^{n+1}}{n(1+\theta)} \end{align}

Differentiating both sides with respect to $\theta$ (the left-hand side via the fundamental theorem of calculus),

\begin{align} h_1(\theta)\theta^{n-1}&=\frac{\theta^n(n\theta+n+1)}{n(1+\theta)^2} \\\implies h_1(\theta) &=\frac{\theta(n\theta+n+1)}{n(1+\theta)^2} \end{align}

Hence, $$h_1(T)=\frac{T(nT+n+1)}{n(1+T)^2}$$
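As a quick numerical sanity check (not part of the derivation), one can simulate from $U(-\theta,\theta)$ and compare the Monte Carlo average of $h_1(T)$ with $\theta/(1+\theta)$. The sketch below assumes NumPy; the function name h1 and the values of theta, n and n_reps are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def h1(t, n):
    # UMVUE of theta/(1+theta) derived above
    return t * (n * t + n + 1) / (n * (1 + t) ** 2)

n, n_reps = 5, 400_000
for theta in (0.5, 2.0):
    x = rng.uniform(-theta, theta, size=(n_reps, n))  # n_reps samples of size n
    t = np.abs(x).max(axis=1)                         # T = max |X_i| per sample
    print(theta, h1(t, n).mean(), theta / (1 + theta))
```

For large n_reps the last two printed numbers should agree to two or three decimal places.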

Similarly, for the second problem (take $n\ge 2$; see the remark below), for all $\theta>0$,

\begin{align} \qquad\quad\frac{n}{\theta^n}\int_0^{\theta}h_2(t)t^{n-1}\,dt&=\frac{e^\theta}{\theta} \\\implies \int_0^{\theta}h_2(t)t^{n-1}\,dt &= \frac{\theta^{n-1} e^\theta}{n} \end{align}

Differentiating both sides with respect to $\theta$ yields

\begin{align} h_2(\theta)\theta^{n-1}&=\frac{e^{\theta}\theta^{n-2}(\theta+n-1)}{n} \\\implies h_2(\theta) &=\frac{e^{\theta}(\theta+n-1)}{n\theta} \end{align}

So, $$h_2(T)=\frac{e^{T}(T+n-1)}{nT}$$

(Here $n\ge 2$ is needed: for $n=1$ the requirement $\int_0^\theta h_2(t)\,dt=e^{\theta}$ fails as $\theta\to 0^+$, since the left side tends to $0$ while the right side tends to $1$, so no unbiased estimator of $e^{\theta}/\theta$ based on $T$ exists in that case.)
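The same kind of Monte Carlo check (again a sketch assuming NumPy, with arbitrary illustrative values) can be run for $h_2$; recall that $h_2$ requires $n\ge 2$.

```python
import numpy as np

rng = np.random.default_rng(1)

def h2(t, n):
    # UMVUE of exp(theta)/theta derived above (n >= 2)
    return np.exp(t) * (t + n - 1) / (n * t)

n, n_reps = 5, 400_000
for theta in (0.5, 2.0):
    x = rng.uniform(-theta, theta, size=(n_reps, n))
    t = np.abs(x).max(axis=1)  # T = max |X_i| per sample
    print(theta, h2(t, n).mean(), np.exp(theta) / theta)
```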


In my initial answer, I used the following series calculation for the UMVUE, which is rather roundabout compared to the differentiation argument above; had the support not depended on the parameter, I might have tried this first. I am keeping this part in the answer as an alternative derivation, with the caveat that the series manipulations below (convergence of the series at all realized values of $T$, and the interchange of expectation with infinite summation) are not fully justified:

For $k> -n$, we have

\begin{align} E_\theta(T^k)&=\frac{n}{\theta^n}\int_0^\theta t^{k+n-1}\,dt\\[8pt] & = \frac{n\theta^k}{n+k} \end{align}

This suggests that an unbiased estimator of $\theta^k$ based on $T$ is $$\left(\frac{n+k}{n}\right)T^k$$
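For instance, $k=1$ gives $E_\theta(T)=\frac{n\theta}{n+1}$, so $\frac{n+1}{n}T$ is unbiased for $\theta$, the familiar estimator based on the maximum of a uniform sample.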

For the first problem, one could write

\begin{align} \frac{\theta}{1+\theta}&= \begin{cases}\left(1+\frac{1}{\theta}\right)^{-1}=1-\frac{1}{\theta}+\frac{1}{\theta^2}-\frac{1}{\theta^3}+\cdots&,\text{ if }\theta>1\\\\\theta(1-\theta+\theta^2-\cdots)&,\text{ if }0<\theta<1\end{cases} \end{align}

For $0<\theta<1$, we have

$$E_{\theta}\left[\left(\frac{n+1}{n}\right)T-\left(\frac{n+2}{n}\right)T^2+\cdots\right]=\theta-\theta^2+\cdots$$

Or, $$E_{\theta}\left[\sum_{k=1}^\infty(-1)^{k-1}\left(\frac{n+k}{n}\right)T^k\right]=\frac{\theta}{1+\theta}$$

For $\theta>1$,

$$E_{\theta}\left[1-\left(\frac{n-1}{n}\right)\frac{1}{T}+\left(\frac{n-2}{n}\right)\frac{1}{T^2}-\cdots\right]=1-\frac{1}{\theta}+\frac{1}{\theta^2}-\cdots$$

That is, $$E_{\theta}\left[\sum_{k=0}^\infty\left(\frac{n-k}{n}\right)\frac{(-1)^k}{T^k}\right]=\frac{\theta}{1+\theta}$$

Hence, by the Lehmann-Scheffé theorem, the UMVUE of $\theta/(1+\theta)$ is

\begin{align} h_1(T)&=\begin{cases}\displaystyle\sum_{k=1}^\infty(-1)^{k-1}\left(\frac{n+k}{n}\right)T^k&,\text{ if }0<\theta<1\\\\\displaystyle\sum_{k=0}^\infty\left(\frac{n-k}{n}\right)\frac{(-1)^k}{T^k}&,\text{ if }\theta\ge1 \end{cases} \\\\&=\frac{T(n+1+nT)}{n(T+1)^2}\quad\text{in both cases} \end{align}
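The closed form follows from the geometric series identities $\sum_{k\ge1}x^k=\frac{x}{1-x}$ and $\sum_{k\ge1}kx^k=\frac{x}{(1-x)^2}$, with $x=-T$ for the first branch and $x=-1/T$ for the second. For example, in the first branch,

$$\sum_{k=1}^\infty(-1)^{k-1}\left(\frac{n+k}{n}\right)T^k=\frac{T}{1+T}+\frac{1}{n}\cdot\frac{T}{(1+T)^2}=\frac{T(n+1+nT)}{n(T+1)^2}.$$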

Both cases thus collapse to the single statistic $$h_1(T)=\displaystyle\frac{T(n+1+nT)}{n(T+1)^2},$$ which agrees with the estimator obtained above by differentiation and is unbiased for every $\theta>0$.

For the second problem, we can use the power series expansion of $e^\theta$ (the $k=-1$ term below again requires $n\ge 2$) to obtain

$$E_{\theta}\left[\sum_{k=-1}^{\infty}\left(\frac{n+k}{n}\right)\frac{T^k}{(k+1)!}\right]=\sum_{j=0}^\infty \frac{\theta^{j-1}}{j!}=\frac{e^{\theta}}{\theta}$$

So the UMVUE of $e^{\theta}/\theta$ is

\begin{align} h_2(T)&=\sum_{k=-1}^{\infty}\left(\frac{n+k}{n}\right)\frac{T^k}{(k+1)!} \\\\&=\frac{e^T(n-1+T)}{nT} \end{align}
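The last equality comes from reindexing with $j=k+1$:

$$\sum_{k=-1}^{\infty}\left(\frac{n+k}{n}\right)\frac{T^k}{(k+1)!}=\frac{1}{nT}\sum_{j=0}^{\infty}\frac{(n-1+j)T^j}{j!}=\frac{(n-1)e^T}{nT}+\frac{1}{nT}\sum_{j=1}^{\infty}\frac{T^j}{(j-1)!}=\frac{e^T(n-1+T)}{nT}.$$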