Gamma prior and Poisson data $X_i$: why we only need $\sum_i X_i$ for the posterior

Tags: gamma-distribution, poisson-distribution, posterior

I'm looking at problem 7.24 in Casella and Berger (see below)
[image: statement of problem 7.24]

To calculate the posterior of $\lambda$, I think we need to compute $\pi(\lambda|X_1=x_1,\dots,X_n=x_n)$. However, the solution (see below) calculates $\pi(\lambda|\sum_{i}X_i=y)$, where the conditioning event appears to contain less information. Why is it enough to calculate $\pi(\lambda|\sum_{i}X_i=y)$?

[image: textbook solution, which derives $\pi(\lambda|\sum_i X_i=y)$]

Best Answer

It's based on the following two facts:

Fact 1. If $T(X)$ is a sufficient statistic for $\lambda$, then the posterior satisfies $\pi(\lambda|x)=\pi(\lambda|T(x))$, where $x$ is the observed data.

Fact 2. $\sum_i X_i$ is a sufficient statistic for $\lambda$, the parameter of the Poisson distribution.
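For completeness, Fact 2 follows from the factorization theorem: the joint pmf of an i.i.d. Poisson($\lambda$) sample factors as

$f(x|\lambda)=\prod_{i=1}^n \frac{e^{-\lambda}\lambda^{x_i}}{x_i!}=\underbrace{e^{-n\lambda}\lambda^{\sum_i x_i}}_{g(T(x)|\lambda)}\cdot\underbrace{\frac{1}{\prod_i x_i!}}_{h(x)},$

which depends on the data only through $T(x)=\sum_i x_i$.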

Fact 1 holds because, writing $f(x|\lambda)=g(T(x)|\lambda)h(x)$ by the factorization theorem,

$\pi(\lambda|x)=\frac{f(x|\lambda)p(\lambda)}{\int_{\lambda}f(x|\lambda)p(\lambda)d\lambda}=\frac{g(T(x)|\lambda)h(x)p(\lambda)}{\int_{\lambda}g(T(x)|\lambda)h(x)p(\lambda)d\lambda}=\frac{g(T(x)|\lambda)p(\lambda)}{\int_{\lambda}g(T(x)|\lambda)p(\lambda)d\lambda}=\pi(\lambda|T(x)).$

The factor $h(x)$ does not depend on $\lambda$, so it cancels from numerator and denominator; the last equality holds because $g(T(x)|\lambda)$ is, as a function of $\lambda$, proportional to the density of $T(X)$, and the $\lambda$-free proportionality constant cancels in the same way.
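As a quick numerical sanity check (not part of the textbook solution), here is a short Python sketch comparing the posterior computed from the full likelihood $\prod_i f(x_i|\lambda)$ with the conjugate Gamma posterior based only on $y=\sum_i x_i$. The simulated data, the shape-rate parametrization, and the hyperparameters `a, b` are all illustrative assumptions, not taken from the problem.

```python
import numpy as np
from scipy import stats

# Illustrative data and Gamma(a, b) prior (shape a, rate b) -- assumed values.
rng = np.random.default_rng(0)
x = rng.poisson(lam=3.0, size=10)   # simulated Poisson(3) sample
a, b = 2.0, 1.0                      # prior hyperparameters (assumed)

# Posterior density on a grid, using the FULL likelihood prod_i f(x_i | lam).
lam = np.linspace(1e-3, 10, 1000)
log_post_full = stats.gamma.logpdf(lam, a, scale=1 / b)          # log prior
log_post_full += stats.poisson.logpmf(x[:, None], lam).sum(axis=0)  # log likelihood
post_full = np.exp(log_post_full - log_post_full.max())
post_full /= post_full.sum() * (lam[1] - lam[0])  # normalize on the grid

# Posterior using ONLY the sufficient statistic y = sum(x):
# conjugacy gives Gamma(a + y, rate b + n) exactly.
y, n = x.sum(), len(x)
post_suff = stats.gamma.pdf(lam, a + y, scale=1 / (b + n))

print(np.max(np.abs(post_full - post_suff)))  # ~0 up to grid error
```

The agreement is exact analytically; the grid comparison only confirms that conditioning on $\sum_i X_i$ loses nothing for inference about $\lambda$.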
