For $\alpha,\beta>0$, the joint density of $(X_1,X_2,\cdots,X_n)$ is
\begin{align}
f_{\beta}(x_1,x_2,\cdots,x_n)&=\frac{1}{\left(B(\alpha,\beta)\right)^n}\left(\prod_{i=1}^n x_i\right)^{\alpha-1}\left(\prod_{i=1}^n(1-x_i)\right)^{\beta-1}\prod_{i=1}^n\mathbf1_{\{0<x_i<1\}}
\\&=\exp\left[-n\ln B(\alpha,\beta)+(\alpha-1)\sum \ln x_i+(\beta-1)\sum \ln(1-x_i)\right]
\\&=\exp\left[\beta\sum_{i=1}^n \ln(1-x_i)+A(\alpha,\beta)+C(x_1,\cdots,x_n)\right]
\end{align}
for some functions $A$ (free of the data) and $C$ (free of $\beta$): splitting $(\beta-1)\sum\ln(1-x_i)$ leaves the term $-\sum\ln(1-x_i)$, which is absorbed into $C$ along with $(\alpha-1)\sum\ln x_i$, treating $\alpha$ as known.
The family of distributions $\{f_{\beta}:\beta>0\}$ therefore forms a one-parameter exponential family (with $\alpha$ treated as known). Hence a minimal sufficient statistic for $\beta$ is $$H(X_1,\cdots,X_n)=\sum_{i=1}^n\ln (1-X_i)$$
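As a quick numerical illustration (a sketch, not part of the argument above; it assumes NumPy/SciPy, and the two hand-picked samples are hypothetical), two distinct samples with the same value of $H$ should have a likelihood ratio that is constant in $\beta$ when $\alpha$ is held fixed, which is exactly what sufficiency of $H$ predicts:

```python
import numpy as np
from scipy import stats

# Two distinct samples constructed so that H = sum(log(1 - x_i)) agrees:
# 1 - x = [0.8, 0.5] and 1 - y = [0.64, 0.625] both have product 0.4.
x = np.array([0.2, 0.5])
y = np.array([0.36, 0.375])

H = lambda s: np.sum(np.log1p(-s))
assert np.isclose(H(x), H(y))

# With alpha fixed, the log-likelihood ratio log f_beta(x) - log f_beta(y)
# should not depend on beta whenever H(x) = H(y).
alpha = 2.0
diffs = [stats.beta.logpdf(x, alpha, b).sum() - stats.beta.logpdf(y, alpha, b).sum()
         for b in (0.5, 1.0, 3.0, 7.0)]
assert np.ptp(diffs) < 1e-9  # the ratio is constant in beta
```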
Observe that
\begin{align}T(X_1,\cdots,X_n)&=\frac{1}{n}\left[\sum_{i=1}^n\ln\left(\frac{1}{1-X_i}\right)\right]^3
\\&=\frac{-1}{n}\left[\sum_{i=1}^n\ln(1-X_i)\right]^3
\end{align}
is a one-to-one function of $H(X_1,\cdots,X_n)$: the map $t\mapsto -t^3/n$ is strictly monotone, so $H$ can be recovered from $T$ by a cube root.
Since any one-to-one function of a sufficient statistic is itself sufficient, $T$ is a sufficient statistic for $\beta$.
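This relationship between $T$ and $H$ can be checked numerically; a minimal sketch (assuming NumPy; the Beta(2, 3) sample and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.beta(2.0, 3.0, size=10)   # arbitrary Beta(2, 3) sample
n = x.size

H = np.sum(np.log1p(-x))                       # minimal sufficient statistic
T = np.sum(np.log(1.0 / (1.0 - x)))**3 / n     # T as defined above

assert np.isclose(T, -H**3 / n)        # T is a function of H
assert np.isclose(np.cbrt(-n * T), H)  # ...and H is recovered from T (one-to-one)
```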
We can also show directly that the joint density factors as $$f_{\beta}(x_1,\cdots,x_n)=g(\beta, T)h(x_1,\cdots,x_n)$$
where $g$ depends on $x_1,\cdots,x_n$ only through $T$, and $h$ does not involve $\beta$.
You say you can show that $S=\prod_{i=1}^n (1-X_i)$ is sufficient for $\beta$. Note that $T=-\frac{1}{n}\left(\ln S\right)^3$ is a one-to-one function of $S$ as well, so the Factorisation Theorem leads to the same conclusion.
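A sketch of the factorisation itself, treating $\alpha$ as known (hypothetical parameter values; assumes NumPy/SciPy): $g$ is evaluated using only $\beta$ and $T$, $h$ uses only the data, and their product matches the joint Beta density.

```python
import numpy as np
from scipy import stats
from scipy.special import betaln

rng = np.random.default_rng(1)
alpha, beta_ = 2.0, 3.0
x = rng.beta(alpha, beta_, size=8)
n = x.size

T = np.sum(np.log(1.0 / (1.0 - x)))**3 / n

# H is recovered from T by a cube root, so g depends on the data only
# through T (alpha treated as known throughout).
H_from_T = np.cbrt(-n * T)
g = np.exp((beta_ - 1) * H_from_T - n * betaln(alpha, beta_))
h = np.prod(x**(alpha - 1))

# g(beta, T) * h(x) reproduces the joint density.
assert np.isclose(g * h, np.prod(stats.beta.pdf(x, alpha, beta_)))
```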
Best Answer
First of all, about the sufficient statistic. By the Fisher–Neyman Factorisation Theorem (as stated on Wikipedia), a statistic $T(\vec x)$ is sufficient for $\theta$ if and only if the density can be factored as $f_\theta(\vec x)=h(\vec x)\,g_\theta(T(\vec x))$, where $h$ does not depend on $\theta$.
Here we have $\theta=(\alpha,\beta)$. In our case: $$\begin{align}f(\vec{x})=f(x_1,\ldots,x_n) &= \prod_{i=1}^n \left({1 \over \Gamma(\alpha) \beta^{\alpha}} x_i^{\alpha -1} e^{-\frac{x_i} {\beta}} \right)= {1 \over \Gamma(\alpha)^n \beta^{n\alpha}}\left(\prod_{i=1}^n x_i\right)^{\alpha-1} e^{-{1 \over \beta} \sum_{i=1}^n{x_i}}.\end{align} \tag{1}$$ We can take $h(\vec{x})=1$; then the whole right-hand side of $(1)$ is $g_{\alpha,\beta}(T(\vec{x}))$, i.e. $$g_{\alpha,\beta}(T(\vec{x}))= {1 \over \Gamma(\alpha)^n \beta^{n\alpha}}\left(\prod_{i=1}^n x_i\right)^{\alpha-1} e^{-{1 \over \beta} \sum_{i=1}^n{x_i}}.$$ Since $g_{\alpha,\beta}(T(\vec{x}))$ depends on the drawn sample only through $\prod_{i=1}^n x_i$ and $\sum_{i=1}^n{x_i}$, a (two-dimensional) sufficient statistic for $(\alpha,\beta)$ is $$T(\vec{x})=\left(\prod_{i=1}^n x_i, \ \sum_{i=1}^n{x_i}\right).$$
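As a sanity check (a sketch assuming NumPy/SciPy; the shape/scale values and seed are arbitrary), the log of the density in $(1)$ can be evaluated using only $\prod x_i$ and $\sum x_i$, and it agrees with the direct sum of log-densities:

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(2)
alpha, beta_ = 2.5, 1.5            # shape and scale, as in (1)
x = rng.gamma(alpha, beta_, size=12)
n = x.size

prod_x, sum_x = np.prod(x), np.sum(x)

# Log of the density in (1), written purely in terms of the two statistics:
via_T = (-n * gammaln(alpha) - n * alpha * np.log(beta_)
         + (alpha - 1) * np.log(prod_x) - sum_x / beta_)

# Direct sum of the individual log-densities:
direct = np.sum((alpha - 1) * np.log(x) - x / beta_
                - gammaln(alpha) - alpha * np.log(beta_))

assert np.isclose(via_T, direct)
```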