[Math] Prove the order statistic is a minimal sufficient statistic for the logistic pdf $f(x|\theta)=\frac{e^{-(x-\theta)}}{(1+e^{-(x-\theta)})^2}$

Tags: order-statistics, statistics

I'm going through Statistical Inference by Casella and Berger, and I'm currently on Chapter 6. In particular, I'm doing exercise 6.9 (c), and I'm trying to prove that if $X_1, …, X_n$ is a random sample from a population with pdf $f(x|\theta)=\frac{e^{-(x-\theta)}}{(1+e^{-(x-\theta)})^2}$, then the order statistic $T(X) = (X_{(1)},…,X_{(n)})$ is a minimal sufficient statistic for $\theta$.

The joint pdf of the sample is: $$f_X(\boldsymbol{x}|\theta)=e^{n(\theta-\bar{x})}\prod_{i=1}^n \frac{1}{(1+e^{-(x_{(i)}-\theta)})^2}.$$
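As a sanity check on this factorization (not part of the proof), here is a quick NumPy comparison of the product of the individual densities against the closed form above; the sample values and $\theta$ below are arbitrary:

```python
import numpy as np

def logistic_pdf(x, theta):
    """Density f(x|theta) = e^{-(x-theta)} / (1 + e^{-(x-theta)})^2."""
    z = np.exp(-(x - theta))
    return z / (1.0 + z) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=5)   # arbitrary sample points
theta = 0.7              # arbitrary parameter value

# Direct product of the n individual densities.
direct = np.prod(logistic_pdf(x, theta))

# Closed form: e^{n(theta - xbar)} * prod 1/(1 + e^{-(x_i - theta)})^2.
closed = np.exp(len(x) * (theta - x.mean())) * np.prod(
    1.0 / (1.0 + np.exp(-(x - theta))) ** 2
)

assert np.isclose(direct, closed)
```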

I want to apply Theorem 6.2.13, so I want to verify that for every two sample points $\boldsymbol{x}$ and $\boldsymbol{y}$, the ratio $\frac{f(\boldsymbol{x}|\theta)}{f(\boldsymbol{y}|\theta)}$ is independent of $\theta$ if and only if $T(\boldsymbol{x}) = T(\boldsymbol{y})$. This ratio is:

$$e^{n(\bar{y}-\bar{x})} \prod_{i=1}^n \bigg(\frac{1+e^{-(y_{(i)}-\theta)}}{1+e^{-(x_{(i)}-\theta)}}\bigg)^2.$$

Now the first factor is independent of $\theta$, so we just need to verify (dropping the square, which is harmless since everything is positive) that $$\prod_{i=1}^n \frac{1+e^{-(y_{(i)}-\theta)}}{1+e^{-(x_{(i)}-\theta)}}$$ is independent of $\theta$ if and only if $T(\boldsymbol{x})=T(\boldsymbol{y})$. In the solutions and everywhere else I've looked online this is just stated as true, and while the "if" direction is obvious, the "only if" direction isn't. It certainly looks like the product should be independent of $\theta$ only when the order statistics are equal, and I can't find a counterexample, but I don't see how to prove it. I can't even handle the case $n=2$, where the relevant part is

$$\frac{1+e^{\theta}(e^{-y_{(1)}}+e^{-y_{(2)}}) + e^{2\theta}e^{-y_{(1)}-y_{(2)}}}{1+e^{\theta}(e^{-x_{(1)}}+e^{-x_{(2)}}) + e^{2\theta}e^{-x_{(1)}-x_{(2)}}}.$$
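To convince myself numerically, here is a small NumPy probe of this $n=2$ expression (the helper name `ratio` and the sample values are just for illustration): when $\boldsymbol{y}$ is a permutation of $\boldsymbol{x}$ the ratio is constant in $\theta$, while for distinct order statistics it visibly varies:

```python
import numpy as np

def ratio(y, x, theta):
    """The n = 2 expression: prod(1 + e^theta e^{-y_i}) / prod(1 + e^theta e^{-x_i})."""
    num = np.prod([1.0 + np.exp(theta) * np.exp(-yi) for yi in y])
    den = np.prod([1.0 + np.exp(theta) * np.exp(-xi) for xi in x])
    return num / den

thetas = np.linspace(-3.0, 3.0, 7)

# y is a permutation of x: same order statistics, ratio constant in theta.
same = [ratio((1.0, 2.0), (2.0, 1.0), t) for t in thetas]
# Different order statistics: ratio varies with theta.
diff = [ratio((1.0, 2.0), (0.5, 2.5), t) for t in thetas]

assert np.allclose(same, same[0])
assert not np.allclose(diff, diff[0])
```

Of course this only probes particular points and is no substitute for a proof.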

I'm not sure if I'm missing something obvious, but any help with rigorously proving this would be appreciated (even if you can just answer the question for the special case $n=2$).

Best Answer

Claim: The order statistic $(X_{(1)},\cdots,X_{(n)})$ is equivalent to the vector of likelihood ratios $\left(\frac{f(x|\theta_j)}{f(x|\theta_0)}\right)_{j=1,\cdots,n+1}$ for fixed distinct values $\theta_0,\theta_1,\cdots,\theta_{n+1}$.

First we prove that $\left(\frac{f(x|\theta_j)}{f(x|\theta_0)}\right)_{j=1,\cdots,n+1}$ is minimal sufficient:

The vector $\left(\frac{f(x|\theta_j)}{f(x|\theta_0)}\right)_{j=1,\cdots,n+1}$ is minimal sufficient for the finite family $\mathcal{P}_0=\{f(x|\theta_0),f(x|\theta_1),\cdots,f(x|\theta_{n+1})\}\subset\mathcal{P}=\{f(x|\theta):\theta\in\mathbb{R}\}$ (the standard result for finite families with common support). It is also sufficient for $\mathcal{P}$: by the claim it is equivalent to the order statistic, which is sufficient. Hence it is minimal sufficient for $\mathcal{P}$ by the following lemma:

Lemma: If $T(X)$ is minimal sufficient for $\mathcal{P}_0\subset \mathcal{P}$ and sufficient for $\mathcal{P}$, then $T(X)$ is minimal sufficient for $\mathcal{P}$.

Proof of lemma: Let $U(X)$ be any statistic sufficient for $\mathcal{P}$; then $U(X)$ is also sufficient for $\mathcal{P}_0$. Since $T(X)$ is minimal sufficient for $\mathcal{P}_0$, $T(X)$ is a function of $U(X)$. Thus $T(X)$ is sufficient for $\mathcal{P}$ and a function of every statistic sufficient for $\mathcal{P}$, hence minimal sufficient.

Proof of claim: \begin{align*} (X_{(1)},\cdots,X_{(n)}) &= (X'_{(1)},\cdots,X'_{(n)})\\ \overset{u=e^{-x}}{\iff} (u_{(1)},\cdots,u_{(n)}) &= (u'_{(1)},\cdots,u'_{(n)})\\ \iff \prod_{i=1}^n\frac{1 + u_{(i)}\xi}{1 + u_{(i)}} &= \prod_{i=1}^n\frac{1 + u'_{(i)}\xi}{1 + u'_{(i)}}, \quad\forall\xi\in\mathbb{R}^+\\ \overset{\text{degree-}n\text{ polynomials agreeing at }n+1\text{ points}}{\iff} \prod_{i=1}^n\frac{1 + u_{(i)}\xi_j}{1 + u_{(i)}} &= \prod_{i=1}^n\frac{1 + u'_{(i)}\xi_j}{1 + u'_{(i)}}, \quad\forall j\in\{1,\cdots,n+1\}\\ \overset{\xi=e^{\theta}}{\iff} e^{n\theta_j}\prod_{i=1}^{n}\frac{1+e^{-x_{(i)}+\theta_j}}{1+e^{-x_{(i)}}} &= e^{n\theta_j}\prod_{i=1}^{n}\frac{1+e^{-x'_{(i)}+\theta_j}}{1+e^{-x'_{(i)}}}, \quad\forall j\in\{1,\cdots,n+1\}\\\iff \frac{f(x|\theta_j)}{f(x|\theta_0)} &=\frac{f(x'|\theta_j)}{f(x'|\theta_0)},\quad\forall j\in\{1,\cdots,n+1\}, \end{align*}

where $\theta_0=0$ and $\theta_1,\cdots,\theta_{n+1}\in\mathbb{R}$ are distinct (so the $\xi_j=e^{\theta_j}$ are distinct and positive). Both sides of the polynomial identity have degree $n$ in $\xi$, and two degree-$n$ polynomials that agree at $n+1$ distinct points are identical; this gives the backward direction of the third equivalence. The backward direction of the second equivalence follows because identical polynomials have identical roots, and the roots $\xi=-1/u_{(i)}$ determine the multiset $\{u_{(i)}\}$, hence the order statistic.
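For concreteness, the polynomial step can also be illustrated numerically (the values of $u_i$ and the evaluation points below are arbitrary): $n+1$ values of the degree-$n$ polynomial determine its coefficients by interpolation, and the roots $\xi=-1/u_i$ then recover the multiset $\{u_i\}$:

```python
import numpy as np

u = np.array([0.3, 1.1, 2.5])         # u_i = e^{-x_i} for some sample x
n = len(u)

def P(xi):
    """P(xi) = prod(1 + u_i * xi) / prod(1 + u_i), a degree-n polynomial in xi."""
    return np.prod(1.0 + u * xi) / np.prod(1.0 + u)

xis = np.array([1.0, 2.0, 3.0, 4.0])  # n + 1 distinct evaluation points
vals = np.array([P(t) for t in xis])

# n + 1 values determine the degree-n coefficients uniquely.
coeffs = np.polyfit(xis, vals, n)      # highest degree first
roots = np.roots(coeffs)               # roots are xi = -1/u_i

recovered = np.sort(-1.0 / roots.real)
assert np.allclose(recovered, np.sort(u), atol=1e-6)
```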