Finding an estimator of theta that is sufficient.

Tags: probability, probability-distributions, statistics

I'm posting to ask a question regarding sufficient statistics. Given the distribution $f(x;\theta) = \theta x^{\theta -1}$, where $0<x<1$, for some parameter $\theta > 0$, I am asked to find a sufficient estimator of $\theta$. I am also provided a hint: "Use the Fisher-Neyman factorization theorem".

My thoughts are as follows:

If $X_1,\ldots,X_N$ is a random sample from the population, then $$f_{X_1,\ldots,X_N}=f_{X_1}\cdot f_{X_2}\cdots f_{X_N}=\theta x_1^{\theta - 1}\cdot\theta x_2^{\theta - 1}\cdots\theta x_N^{\theta - 1} = \theta^N\prod_{i=1}^N x_i^{\theta - 1}.$$
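As a quick numerical sanity check (not part of the original question), the sketch below draws a small sample from this density via inverse-CDF sampling (here $F(x)=x^\theta$, so $F^{-1}(u)=u^{1/\theta}$) and confirms that the product of marginals equals the collapsed form $\theta^N(\prod x_i)^{\theta-1}$. The variable names are my own choices.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.5
N = 5

# Inverse-CDF sampling: F(x) = x^theta on (0,1), so x = u^(1/theta)
x = rng.uniform(size=N) ** (1.0 / theta)

# Joint density written as a product of marginals
joint = np.prod(theta * x ** (theta - 1))

# Collapsed form theta^N * (prod x_i)^(theta - 1)
collapsed = theta ** N * np.prod(x) ** (theta - 1)

print(np.isclose(joint, collapsed))  # the two expressions agree
```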

By the Fisher-Neyman factorization theorem, I want to factor $f$ into two functions, $g(\hat{\theta},\theta)\cdot h(x_1,x_2,\ldots,x_N)$, where $g$ depends on the sample only through $\hat{\theta}$ and $h$ does not depend on $\theta$.

If anyone could provide me assistance on how to factor this, that would be greatly appreciated.

Best Answer

$f(x; \theta) = \theta x^{\theta-1}\cdot I_{0 < x < 1}$, hence $$f_{X_1, \ldots, X_N} = \theta^N \Big(\prod_{k=1}^N x_k\Big)^{\theta-1} I_{0 < \min x_k < \max x_k < 1}.$$

Then $\hat{\theta} = \prod_{k=1}^N x_k$ is a sufficient statistic. We may take $h = I_{0 < \min x_k < \max x_k < 1}$ and $g(u,v) = v^N u^{v-1}$, so that $f_{X_1,\ldots,X_N} = g(\hat{\theta},\theta)\, h(x_1,\ldots,x_N)$.
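One way to see what sufficiency buys you, as a numerical illustration (my own addition, with a hypothetical `likelihood` helper): two samples with the same value of $\hat{\theta}=\prod x_k$ produce identical likelihood functions of $\theta$, since the likelihood depends on the data only through that product.

```python
import numpy as np

def likelihood(x, theta):
    # Joint density theta^N * (prod x_k)^(theta - 1) for x_k in (0, 1)
    x = np.asarray(x)
    return theta ** x.size * np.prod(x) ** (theta - 1)

# Two different samples sharing the same sufficient statistic:
# 0.2 * 0.9 = 0.3 * 0.6 = 0.18
a = np.array([0.2, 0.9])
b = np.array([0.3, 0.6])

thetas = np.linspace(0.5, 5.0, 10)
la = [likelihood(a, t) for t in thetas]
lb = [likelihood(b, t) for t in thetas]
print(np.allclose(la, lb))  # the likelihood curves coincide
```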
