[Math] Are there jointly sufficient statistics for the parameters of a gamma distribution

probability-distributions, self-learning, statistics

A random sample $X_{1},\ldots,X_{n}$ is drawn from a gamma distribution. Are there jointly sufficient statistics based on these observations for the two unknown parameters?

The gamma density is $f(x;\alpha,\beta)=\dfrac{x^{\alpha-1}}{\beta^{\alpha}\,\Gamma(\alpha)}\,e^{-x/\beta}$ for $x>0$.

I roughly understand what a jointly sufficient statistic is, but I am not sure how to proceed from here. Perhaps I should take the product $\prod_{i=1}^{n} f(x_i;\alpha,\beta)$ of the densities? Can anybody help? Thanks!

Best Answer

First of all, recall the Fisher–Neyman factorization criterion for a sufficient statistic; according to Wikipedia:

If the probability density function is $f_\theta(\vec{x})$, then $T$ is sufficient for $\theta$ if and only if nonnegative functions $g$ and $h$ can be found such that $$ f_\theta(\vec{x})=h(\vec{x}) \, g_\theta(T(\vec{x})), $$ i.e. the density $f$ can be factored into a product such that one factor, $h$, does not depend on $\theta$ and the other factor, which does depend on $\theta$, depends on $\vec{x}$ only through $T(\vec{x})$.
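
Before the two-parameter gamma case, it may help to see the factorization in a simpler one-parameter case, the exponential density $f_\theta(x)=\theta e^{-\theta x}$: $$\prod_{i=1}^n \theta e^{-\theta x_i} \;=\; \underbrace{1}_{h(\vec{x})}\cdot\underbrace{\theta^{\,n}\, e^{-\theta \sum_{i=1}^n x_i}}_{g_\theta(T(\vec{x}))}, \qquad T(\vec{x})=\sum_{i=1}^n x_i,$$ so the sample enters the likelihood only through $\sum_{i=1}^n x_i$, which is therefore sufficient for $\theta$. The gamma case below follows the same pattern, just with two statistics.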


Here we have $\theta=(\alpha,\beta)$. In our case: $$f(\vec{x})=f(x_1,\ldots,x_n) = \prod_{i=1}^n \left({1 \over \Gamma(\alpha) \beta^{\alpha}} x_i^{\alpha -1} e^{-\frac{x_i}{\beta}} \right)= {1 \over \Gamma(\alpha)^n \beta^{n\alpha}}\left(\prod_{i=1}^n x_i\right)^{\alpha-1} e^{-{1 \over \beta} \sum_{i=1}^n{x_i}}. \tag{1}$$ We can take $h(\vec{x})=1$; then the whole right-hand side of $(1)$ is $g_{\alpha,\beta}(T(\vec{x}))$, i.e. $$g_{\alpha,\beta}(T(\vec{x}))= {1 \over \Gamma(\alpha)^n \beta^{n\alpha}}\left(\prod_{i=1}^n x_i\right)^{\alpha-1} e^{-{1 \over \beta} \sum_{i=1}^n{x_i}}.$$ Since $g_{\alpha,\beta}(T(\vec{x}))$ depends on the sample only through $\prod_{i=1}^n x_i$ and $\sum_{i=1}^n{x_i}$, these two statistics are jointly sufficient for $(\alpha,\beta)$: $$T(\vec{x})=\left(\prod_{i=1}^n x_i, \ \sum_{i=1}^n{x_i}\right).$$
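
As a quick numerical sanity check, here is a minimal Python sketch (assuming NumPy and SciPy are available; the sample size and the values of $\alpha$, $\beta$ below are arbitrary illustrative choices) that rebuilds the log-likelihood from $T(\vec{x})=\left(\prod_i x_i,\ \sum_i x_i\right)$ alone and confirms it matches the full-sample log-likelihood, as the factorization above predicts:

```python
import numpy as np
from scipy import stats
from scipy.special import gammaln

rng = np.random.default_rng(0)
alpha, beta = 2.5, 1.7  # illustrative shape and scale, matching the question's parameterization
x = rng.gamma(shape=alpha, scale=beta, size=50)

# Full log-likelihood: sum of log f(x_i; alpha, beta) over every observation.
full_loglik = stats.gamma.logpdf(x, a=alpha, scale=beta).sum()

# Log-likelihood rebuilt from the sufficient statistics alone,
# T(x) = (prod x_i, sum x_i), stored as (log of the product, the sum).
n = x.size
log_prod, total = np.log(x).sum(), x.sum()
loglik_from_T = (-n * gammaln(alpha)           # log Gamma(alpha)^{-n}
                 - n * alpha * np.log(beta)    # log beta^{-n alpha}
                 + (alpha - 1) * log_prod      # log (prod x_i)^{alpha - 1}
                 - total / beta)               # log exp(-sum x_i / beta)

print(np.isclose(full_loglik, loglik_from_T))  # True: the data enter only through T
```

Working with $\log\prod_i x_i = \sum_i \log x_i$ rather than the raw product avoids overflow or underflow for larger samples; the two forms are mathematically equivalent.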
