Solved – Minimal sufficient statistic for location exponential family

Tags: exponential-distribution, sufficient-statistics

Let $X_1,\dots,X_n$ be iid with pdf $$f(x|\theta)=e^{-(x-\theta)},\,\,\,\theta<x<\infty,\,\,\,-\infty<\theta<\infty.$$ Part (b) of Problem 6.9 in Casella and Berger asks for a minimal sufficient statistic for $\theta$.

Theorem 6.2.13 in the book states:

Let $f(\textbf{x}|\theta)$ be the pdf of a sample $\textbf{X}$. Suppose there exists a function $T(\textbf{x})$ such that, for every two sample points $\textbf{x}$ and $\textbf{y}$, the ratio $f(\textbf{x}|\theta)/f(\textbf{y}|\theta)$ is constant as a function of $\theta$ if and only if $T(\textbf{x})=T(\textbf{y})$. Then $T(\textbf{X})$ is a minimal sufficient statistic for $\theta$.

Now by independence of the sample we have $f(\textbf{x}|\theta)=e^{-\sum_i (x_i-\theta)}$. Thus $$\frac{f(\textbf{x}|\theta)}{f(\textbf{y}|\theta)}=e^{-\sum_i (x_i-\theta)+\sum_i (y_i-\theta)}=e^{\sum_i y_i-\sum_i x_i},$$ which is constant in $\theta$ for every pair of sample points $\textbf{x}$ and $\textbf{y}$. This would mean that a constant statistic is minimal sufficient for $\theta$, which intuitively says it is impossible to learn anything about $\theta$ from the sample — and that cannot be right. Is there a mistake in my work?

Best Answer

The mistake is that the support of the pdf depends on $\theta$, so the indicator functions must be carried along when writing the joint density. Doing so gives $$\frac{f(\boldsymbol{x}|\theta)}{f(\boldsymbol{y}|\theta)}=\frac{e^{-\sum_i (x_i-\theta)}\prod_i I(x_i>\theta)}{e^{-\sum_i (y_i-\theta)}\prod_i I(y_i>\theta)}=\frac{e^{-\sum_i x_i}\,I(\min(\boldsymbol{x})>\theta)}{e^{-\sum_i y_i}\,I(\min(\boldsymbol{y})>\theta)},$$ where the factors $e^{n\theta}$ cancel and the products of indicators collapse to a single indicator on the minimum.

This ratio is constant as a function of $\theta$ if and only if $\min(\boldsymbol{x})=\min(\boldsymbol{y})$, so by Theorem 6.2.13, $T(\boldsymbol{X})=\min(\boldsymbol{X})=X_{(1)}$ is a minimal sufficient statistic for $\theta$.
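The effect of the indicators can be checked numerically. The sketch below (sample values are hypothetical, chosen for illustration) evaluates the log of the likelihood ratio for two samples with the same sum but different minima: the ratio is constant for $\theta$ below both minima, but degenerates for $\theta$ between the two minima, so it is not constant in $\theta$.

```python
import numpy as np

def log_lik(sample, theta):
    """Log joint pdf of iid shifted exponentials:
    -sum(x_i - theta) when min(sample) > theta, else -inf (density is zero)."""
    sample = np.asarray(sample, dtype=float)
    if sample.min() <= theta:
        return -np.inf
    return -np.sum(sample - theta)

# Hypothetical samples: same sum (so the naive ratio e^{sum y - sum x} = 1),
# but different minima.
x = [1.0, 2.0, 3.0]   # min = 1.0
y = [0.5, 2.5, 3.0]   # min = 0.5

# For theta below both minima, the log-ratio equals sum(y) - sum(x) = 0:
r_low = [log_lik(x, t) - log_lik(y, t) for t in (-2.0, -1.0, 0.0)]

# But for theta in (0.5, 1.0), y has density zero while x does not,
# so the ratio blows up -- it is NOT constant in theta:
r_mid = log_lik(x, 0.7) - log_lik(y, 0.7)
```

Dropping the indicators, as in the question, silently assumes $\theta < \min(\boldsymbol{x})$ and $\theta < \min(\boldsymbol{y})$ simultaneously, which hides exactly the region where the ratio depends on $\theta$.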
