Translated Exponential Distribution – Completeness of Minimal Sufficient Statistic

complete-statistics, curved-exponential-family, inference, mathematical-statistics, sufficient-statistics

Let $X_1, X_2, \dots, X_n$ be iid from a translated (negative) exponential distribution with pdf

$$f(x) = \frac{1}{\theta^2} \: e^{-\frac{(x-\theta)}{\theta^2}} \: \: I_{(x>\theta)} $$

I have to show whether the minimal sufficient statistic for this family is complete or not.
I have found that the minimal sufficient statistic is $T=\left( X_{(1)}, \sum_{i=1}^{n} (X_i - X_{(1)}) \right)$. If this minimal sufficient statistic is not complete, then there exists a function $h(T)$ of the minimal sufficient statistic such that

$E_\theta [h(T)] =0$ for all $\theta>0$ where $h(T)$ is not identically zero.

Is this minimal sufficient statistic complete or not? How can I find the function $h(T)$ of the minimal sufficient statistic?

Note that $X_{(1)}$ is the first order statistic, i.e., $\min\{X_1,\dots,X_n\}$.

I have calculated the pdf of $X_{(1)}$. Let $Y = X_{(1)}$; then the pdf of $Y$ is given by

$$ f(y) = \frac{n}{\theta^2} \: e^{-\frac{n(y-\theta)}{\theta^2}} \: \: I_{(y>\theta)} $$
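(This follows from $P(X_{(1)} > y) = \{P(X_1 > y)\}^n = e^{-n(y-\theta)/\theta^2}$ for $y > \theta$; differentiating the corresponding cdf in $y$ gives the density above.)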

I have also calculated

$$\mathbb{E}[X]= \theta^2 + \theta $$ and $$\mathbb{E}[Y] = \mathbb{E}[X_{(1)}] = \frac{\theta^2}{n} + \theta$$
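As a sanity check (this simulation is my addition, not part of the original question), both expectations can be confirmed numerically, using the fact that the density above is that of $\theta + \theta^2 E$ with $E \sim \mathcal E(1)$:

```python
# Monte Carlo check of E[X] = theta^2 + theta and E[X_(1)] = theta^2/n + theta.
# The translated exponential is sampled as X = theta + theta^2 * E, E ~ Exp(1).
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 1.5, 5, 200_000

X = theta + theta**2 * rng.exponential(size=(reps, n))   # reps samples of size n

print(X.mean(), theta**2 + theta)                  # empirical vs. theoretical E[X]
print(X.min(axis=1).mean(), theta**2 / n + theta)  # empirical vs. theoretical E[X_(1)]
```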

Now, please help me find such an $h(T)$ with $E_\theta[h(T)] = 0$ for all $\theta>0$ (if the minimal sufficient statistic is indeed not complete), or suggest any other way to prove or disprove its completeness.

Best Answer

Lemma The minimal sufficient statistic $\left(X_{(1)},\sum_{i=2}^n \{X_{(i)}-X_{(1)}\}\right)$ is not complete.

Proof. The joint distribution of $$\left(X_{(1)},\sum_{i=2}^n \{X_{(i)}-X_{(1)}\}\right)$$ is the product of an Exponential $\mathcal E(n/\theta^2)$ translated by $\theta$ and of a $\mathcal Ga(n-1,1/\theta^2)$ [the proof follows from Sukhatme's Theorem, 1937, recalled in Devroye's simulation bible (1986, p. 211)]. This means that $X_{(1)}$ can be represented as $$X_{(1)}=\frac{\theta^2}{n}\varepsilon+\theta,\qquad\varepsilon\sim\mathcal E(1),$$ that $Y:=\sum_{i=2}^n \{X_{(i)}-X_{(1)}\}$ is scaled by $\theta^2$, since $$Y=\theta^2 \eta,\qquad\eta\sim\mathcal Ga(n-1,1),$$ with $X_{(1)}$ and $Y$ independent, and that $$\mathbb E_\theta\left[ Y^{1/2}\right]=\theta \frac{\Gamma(n-1/2)}{\Gamma(n-1)}.$$

Therefore, $$\mathbb E_\theta\left[X_{(1)}-\frac{\Gamma(n-1)}{\Gamma(n-1/2)}Y^{1/2}\right]=\frac{\theta^2}{n}$$ eliminates the location part in $X_{(1)}$ and suggests dividing by $Y$ to remove the scale part: since (for $n>2$, so that these moments exist) $$\mathbb E_\theta\left[ Y^{-1/2}\right]=\theta^{-1} \frac{\Gamma(n-3/2)}{\Gamma(n-1)},\qquad \mathbb E_\theta\left[ Y^{-1}\right]=\theta^{-2} \frac{\Gamma(n-2)}{\Gamma(n-1)},$$ we have (for an arbitrary $\gamma$, using the independence of $X_{(1)}$ and $Y$) that $$\mathbb E_\theta\left[\frac{X_{(1)}-\gamma Y^{1/2}}{Y}\right]=\frac{\Gamma(n-2)}{n\Gamma(n-1)}+\frac{\theta^{-1}\Gamma(n-2)}{\Gamma(n-1)}- \frac{\gamma \theta^{-1}\Gamma(n-3/2)}{\Gamma(n-1)}.$$

Setting $$\gamma=\frac{\Gamma(n-2)}{\Gamma(n-3/2)}$$ leads to $$\mathbb E_\theta\left[\frac{X_{(1)}-\gamma Y^{1/2}}{Y}\right]=\frac{\Gamma(n-2)}{n\Gamma(n-1)}=\frac{1}{n(n-2)},$$ which is constant in $\theta$. Hence $$h(T)=\frac{X_{(1)}-\gamma Y^{1/2}}{Y}-\frac{1}{n(n-2)}$$ satisfies $\mathbb E_\theta[h(T)]=0$ for all $\theta>0$ while not being identically zero, which concludes the proof.
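The constant expectation is easy to check by simulation. The following sketch (my addition, not part of the original answer; $n=5$ is an arbitrary choice) estimates $\mathbb E_\theta[(X_{(1)}-\gamma Y^{1/2})/Y]$ for several values of $\theta$ and compares it with $1/(n(n-2))$:

```python
# Numerical check that E[(X_(1) - gamma*sqrt(Y))/Y] is free of theta,
# where Y = sum_i (X_i - X_(1)) and gamma = Gamma(n-2)/Gamma(n-3/2).
import numpy as np
from scipy.special import gamma as G

rng = np.random.default_rng(1)
n, reps = 5, 500_000
gam = G(n - 2) / G(n - 1.5)                 # the gamma chosen in the proof

for theta in (0.5, 1.0, 2.0):
    X = theta + theta**2 * rng.exponential(size=(reps, n))
    X1 = X.min(axis=1)
    Y = X.sum(axis=1) - n * X1              # = sum_{i >= 2} (X_(i) - X_(1))
    print(theta, ((X1 - gam * np.sqrt(Y)) / Y).mean(), 1 / (n * (n - 2)))
```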

As pointed out by Sextus Empiricus, this is not the only transform of the sufficient statistic with constant expectation. His proposal $$\mathbb E_\theta\left[ X_{(1)} - \frac{1}{n(n-1)}Y- \frac{\Gamma(n-1)}{\Gamma(n-1/2)}Y^{1/2}\right] = 0$$ is an alternative.
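The same kind of simulation (again my addition, reading the first term as $X_{(1)}$) suggests that this alternative indeed has expectation zero for every $\theta$:

```python
# Check that E[X_(1) - Y/(n(n-1)) - Gamma(n-1)/Gamma(n-1/2) * sqrt(Y)] = 0 for all theta.
import numpy as np
from scipy.special import gamma as G

rng = np.random.default_rng(2)
n, reps = 5, 500_000

for theta in (0.5, 1.0, 2.0):
    X = theta + theta**2 * rng.exponential(size=(reps, n))
    X1 = X.min(axis=1)
    Y = X.sum(axis=1) - n * X1
    print(theta, (X1 - Y / (n * (n - 1)) - G(n - 1) / G(n - 0.5) * np.sqrt(Y)).mean())
```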