[Math] Minimum Variance Unbiased Estimator for exponential distribution cases

probability, probability-distributions, probability-theory, statistical-inference, statistics

Exercise:

Let $X_1, \dots, X_n$ be a random sample from the exponential distribution with unknown rate parameter $\theta > 0$.

i) Find a complete sufficient statistic $T$ for $\theta$.

ii) With the help of the Rao–Blackwell theorem, find a Minimum Variance Unbiased Estimator (MVUE) for $1/\theta$ and for $1/\theta^2$.

iii) Find $\mathbb{E}(1/T^k), \; k \in \{1,2,\dots,n-1\}$.

Attempt:

i) The pdf of each observation is given by:

$$f(x;\theta) = \begin{cases} \theta e^{-\theta x}, \; x \geq 0 \\ 0, \; x < 0 \end{cases}$$

Thus, in a single expression:

$$f(x;\theta) = \theta e^{-\theta x}\mathbb{I}_{[0,+\infty)}(x)$$

Note that the expression is of the form $c(\theta)e^{q(\theta)t(x)}h(x)$, so it belongs to the exponential family of distributions with $c(\theta) = \theta$, $q(\theta) = -\theta$, $t(x) = x$ and $h(x) = \mathbb{I}_{[0,+\infty)}(x)$.

Thus, the statistic $T := T(X) = \sum_{i=1}^n t(X_i) = \sum_{i=1}^n X_i$ is a complete sufficient statistic for $\theta$.
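
As a direct sanity check (not strictly needed once the exponential-family form is identified), the Fisher–Neyman factorization exhibits sufficiency explicitly: the joint density of the sample is

$$f(x_1,\dots,x_n;\theta) = \theta^n e^{-\theta \sum_{i=1}^n x_i}\prod_{i=1}^n \mathbb{I}_{[0,+\infty)}(x_i),$$

which depends on $\theta$ and the data only through $\sum_{i=1}^n x_i$.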

ii) We have $\mathbb{E}[X_i] = 1/\theta$. Moreover:

$$\mathbb{E}(T) = \sum_{i=1}^n\mathbb{E}[X_i]=n\frac{1}{\theta} \implies \mathbb{E}\bigg(\frac{T}{n}\bigg)=\frac{1}{\theta}$$

Thus, $T^*(X) = \frac{1}{n}\sum_{i=1}^n X_i$ is an MVUE for $1/\theta$, since it is unbiased and a function of the complete sufficient statistic $T$ alone.
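
A quick Monte Carlo sketch supporting this (not part of the exercise; the values of `theta`, `n`, and `reps` below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000   # arbitrary illustrative values

# Each row is one sample X_1, ..., X_n ~ Exp(theta); NumPy uses the
# scale parametrization, so scale = 1/theta.
samples = rng.exponential(scale=1/theta, size=(reps, n))

t_star = samples.mean(axis=1)   # T* = T/n, the sample mean
print(t_star.mean())            # should be close to 1/theta = 0.5
```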

How can one continue, in a similar way, to find an MVUE for $1/\theta^2$?

iii) I am asked to calculate:

$$\mathbb{E}\bigg\{\bigg(\sum_{i=1}^n X_i \bigg)^{-k}\bigg\}$$

I don't know how to proceed with this one though.

Any help, tips or thorough solution would be much appreciated.

Best Answer

To find the UMVUE for $\frac{1}{\theta^2}$, one only needs a function $\psi(T)$ of the complete sufficient statistic $T$ that is unbiased for $\frac{1}{\theta^2}$. But instead of trying to find $\psi(T)$ directly from $E(\psi(T))=\frac{1}{\theta^2}$, let us find the moments of $T$. Since the sum of $n$ i.i.d. $\mathrm{Exp}(\theta)$ variables is $\mathrm{Gamma}(n,\theta)$, $T$ is a Gamma variate with pdf $f_T(t)\propto e^{-\theta t}t^{n-1}\mathbf1_{t>0}$, and we have

\begin{align}E(T^r)&=\frac{\theta^n}{\Gamma(n)}\int_0^\infty e^{-\theta t}t^{n+r-1}\,\mathrm{d}t\\&=\frac{\theta^n\,\Gamma(n+r)}{\theta^{n+r}\,\Gamma(n)}\qquad,\,n+r>0\\&=\frac{1}{\theta^r}\frac{\Gamma(n+r)}{\Gamma(n)}\end{align}

So choosing $r=2$ gives the required $1/\theta^2$ on the right-hand side.

Indeed, $\displaystyle E(T^2)=\frac{n(n+1)}{\theta^2}$, so that $\displaystyle E\left(\frac{T^2}{n(n+1)}\right)=\frac{1}{\theta^2}$.
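
As before, a small simulation sketch (same arbitrary parameter choices as earlier) confirms that $T^2/(n(n+1))$ is unbiased for $1/\theta^2$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 10, 200_000   # arbitrary illustrative values

samples = rng.exponential(scale=1/theta, size=(reps, n))
T = samples.sum(axis=1)             # complete sufficient statistic

estimator = T**2 / (n * (n + 1))    # proposed UMVUE for 1/theta^2
print(estimator.mean())             # should be close to 1/theta^2 = 0.25
```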

To be precise, we are actually using the Lehmann–Scheffé theorem here: an unbiased estimator that is a function of a complete sufficient statistic is the UMVUE.

This also answers your last question: taking $r=-k$ in the moment formula above (valid since $n+r = n-k > 0$ for $k \in \{1,2,\dots,n-1\}$) gives

$$\mathbb{E}\left(\frac{1}{T^k}\right) = \theta^k\,\frac{\Gamma(n-k)}{\Gamma(n)} = \frac{\theta^k}{(n-1)(n-2)\cdots(n-k)}.$$
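
A final simulation sketch checking this closed form (with $k=3$ as an arbitrary example):

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(2)
theta, n, k, reps = 2.0, 10, 3, 200_000   # arbitrary illustrative values

samples = rng.exponential(scale=1/theta, size=(reps, n))
T = samples.sum(axis=1)

print((1 / T**k).mean())                   # Monte Carlo estimate of E(1/T^k)
print(theta**k * gamma(n - k) / gamma(n))  # closed form: theta^k * Gamma(n-k)/Gamma(n)
```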
