Solved – Finding UMVUE of $\theta$ when $X_i\sim f(x\mid\theta)=\frac{\theta}{2}e^{-\theta|x|}$

estimators, inference, mathematical-statistics, umvue, unbiased-estimator

Let $X_1,X_2,…,X_n$ be a random sample from a population with probability density function
$$f(x\mid\theta)=\frac{\theta}{2}e^{-\theta|x|} \,,\,-\infty < x < \infty\,,\,\theta>0$$

Then a UMVUE of $\theta$ is ?

Can someone tell me whether I did everything right in the following steps? I don't have an answer to this problem in my solution manual.

1. Since the density is an even function of $x$, I replaced the pdf with $$f(x\mid\theta)=\theta e^{-\theta x} \,,\,0 < x < \infty\,,\,\theta>0$$
2. Using the Rao-Blackwell theorem, I found the MVUE $\dfrac{n-1}{\sum_{i=1}^{n} X_i}$.

Now, is the MVUE $\dfrac{n-1}{\sum_{i=1}^{n} X_i}$ or $\dfrac{n-1}{\sum_{i=1}^{n} |X_i|}$? Please share your thoughts and tell me where I went wrong.

Best Answer

Yes, $f(x\mid\theta)$ is an even function of $x$, but how can the pdf 'change' completely in your calculations? You carried out calculations for an exponential density whereas you are given a Laplace distribution.

Indeed, the sample is drawn from a Laplace distribution with scale parameter $1/\theta$ and location parameter zero. We can conclude that the joint density $f_{\theta}$ belongs to the one-parameter exponential family. Using this, we can find a complete sufficient statistic for $\theta$.
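For reference, the standard parametrization of the Laplace density is
$$f(x\mid\mu,b)=\frac{1}{2b}\,e^{-|x-\mu|/b}\,,\quad -\infty<x<\infty\,,$$
so the given density corresponds to location $\mu=0$ and scale $b=1/\theta$.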

Due to independence, joint density of $\mathbf X=(X_1,X_2,\cdots,X_n)$ is

\begin{align} f_{\theta}(\mathbf x)&=\prod_{i=1}^n\frac{\theta}{2}e^{-\theta\,|x_i|} \\&=\left(\frac{\theta}{2}\right)^ne^{-\theta\sum_{i=1}^n|x_i|} \\&=\exp\left[-\theta\sum_{i=1}^n|x_i|+n\ln\left(\frac{\theta}{2}\right)\right]\quad,\,\mathbf x\in\mathbb R^n\,,\,\theta>0 \end{align}

This is a full-rank one-parameter exponential family whose natural parameter $-\theta$ ranges over the open interval $(-\infty,0)$. Hence a complete sufficient statistic for the family of distributions $\{f_{\theta}:\theta>0\}$ is $$T(\mathbf X)=\sum_{i=1}^n|X_i|$$

By the Lehmann-Scheffé theorem, an unbiased estimator of $\theta$ that is a function of $T$ is the UMVUE of $\theta$.

It is a simple exercise to show that $|X_i|\sim\mathcal{Exp}(\theta)$ independently for each $i$, where $\theta$ denotes the rate of the distribution. As such, we have $T\sim\mathcal{Ga}(\theta,n)$, a Gamma distribution with rate $\theta$ and shape $n$, with density $$g(t)=\frac{\theta^n e^{-\theta t}t^{n-1}}{\Gamma(n)}\mathbf1_{t>0}$$
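As a quick sanity check (not part of the formal argument), here is a small Python/NumPy sketch that draws from a Laplace distribution with scale $1/\theta$ and compares the empirical distributions of $|X_i|$ and $T$ with the claimed Exponential and Gamma laws; the values of $\theta$, $n$, and the number of replications are arbitrary choices for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 100_000          # arbitrary illustrative values

# X_i ~ Laplace with location 0 and scale 1/theta
x = rng.laplace(loc=0.0, scale=1/theta, size=(reps, n))

# |X_i| should be Exponential with rate theta (i.e. scale 1/theta)
abs_x = np.abs(x[:, 0])
print(stats.kstest(abs_x, 'expon', args=(0, 1/theta)).pvalue)   # large p-value expected

# T = sum of |X_i| should be Gamma with shape n and rate theta
t = np.abs(x).sum(axis=1)
print(stats.kstest(t, 'gamma', args=(n, 0, 1/theta)).pvalue)    # large p-value expected
```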

We find that

\begin{align} E\left(\frac{1}{T}\right)&=\int_0^{\infty}\frac{1}{t}\frac{\theta^n e^{-\theta t}t^{n-1}}{\Gamma(n)}\,dt \\&=\frac{\theta^n}{\Gamma(n)}\int_0^{\infty}t^{n-2}e^{-\theta t}\,dt \\&=\frac{\theta^n}{\Gamma(n)}\cdot\frac{\Gamma(n-1)}{\theta^{n-1}} \\&=\frac{\theta}{n-1}\quad,\quad n\ge 2 \end{align}
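To double-check this integral numerically (with arbitrary illustrative values of $\theta$ and $n$), one can compare a quadrature result with $\theta/(n-1)$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

theta, n = 2.5, 6                      # arbitrary illustrative values

# Integrand from the derivation: (1/t) times the Gamma(shape n, rate theta) density
integrand = lambda t: theta**n * t**(n - 2) * np.exp(-theta*t) / gamma(n)

val, _ = quad(integrand, 0, np.inf)
print(val, theta/(n - 1))              # both should be approximately 0.5
```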

So the UMVUE of $\theta$ is $$\frac{n-1}{T}=\frac{n-1}{\sum_{i=1}^n|X_i|}$$
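Finally, a small simulation sketch (again with arbitrary $\theta$, $n$, and replication count) suggests that the average of $(n-1)/\sum_i|X_i|$ over many samples is indeed close to $\theta$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 3.0, 8, 200_000       # arbitrary illustrative values

# Draw reps samples of size n from Laplace(0, 1/theta) and compute the estimator each time
x = rng.laplace(loc=0.0, scale=1/theta, size=(reps, n))
umvue = (n - 1) / np.abs(x).sum(axis=1)

print(umvue.mean(), theta)             # the empirical mean should be close to 3.0
```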