For $\alpha,\beta>0$, joint density of $(X_1,X_2,\cdots,X_n)$ is
\begin{align}
f_{\beta}(x_1,x_2,\cdots,x_n)&=\frac{1}{\left(B(\alpha,\beta)\right)^n}\left(\prod_{i=1}^n x_i\right)^{\alpha-1}\left(\prod_{i=1}^n(1-x_i)\right)^{\beta-1}\mathbf1_{0<x_i<1}
\\&=\exp\left[-n\ln B(\alpha,\beta)+(\alpha-1)\sum \ln x_i+(\beta-1)\sum \ln(1-x_i)\right]
\\&=\exp\left[\beta\sum_{i=1}^n \ln(1-x_i)+A(\alpha,\beta)+B(x_1,\cdots,x_n)\right]
\end{align}
for some function $A$ of $(\alpha,\beta)$ alone and some function $B$ of the data alone (here $\alpha$ is treated as a known constant, so the term $(\alpha-1)\sum\ln x_i$ is absorbed into $B$).
Clearly, the family of distributions $\{f_{\beta}:\beta>0\}$ (with $\alpha$ known) belongs to the one-parameter exponential family. Hence, a minimal sufficient statistic for $\beta$ is $$H(X_1,\cdots,X_n)=\sum_{i=1}^n\ln (1-X_i)$$
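As a numerical sanity check of this sufficiency claim (a sketch using only the Python standard library; the two datasets below are made up, chosen so that $(1-x_1)(1-x_2)$ is the same for both), the log-likelihood difference between two samples with equal $H$ should be constant in $\beta$:

```python
import math

def log_lik(xs, alpha, b):
    """Beta(alpha, b) log-likelihood; ln B(alpha, b) computed via lgamma."""
    n = len(xs)
    lnB = math.lgamma(alpha) + math.lgamma(b) - math.lgamma(alpha + b)
    return (-n * lnB
            + (alpha - 1) * sum(math.log(x) for x in xs)
            + (b - 1) * sum(math.log(1 - x) for x in xs))

alpha = 2.0                 # alpha treated as known
x1 = [0.2, 0.5]             # (1 - 0.2)(1 - 0.5) = 0.4, so H = ln 0.4
x2 = [0.36, 0.375]          # (1 - 0.36)(1 - 0.375) = 0.4 as well: same H

# Since H(x1) = H(x2), the log-likelihood difference l(x1; b) - l(x2; b)
# reduces to (alpha - 1)(sum ln x1_i - sum ln x2_i), which is free of b.
diffs = [log_lik(x1, alpha, b) - log_lik(x2, alpha, b)
         for b in (0.5, 1.0, 3.0, 7.0)]
print(diffs)  # all entries equal up to rounding
```

A constant difference means the likelihood ratio between the two samples carries no information about $\beta$, which is exactly what sufficiency of $H$ asserts.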
Observe that
\begin{align}T(X_1,\cdots,X_n)&=\frac{1}{n}\left[\sum_{i=1}^n\ln\left(\frac{1}{1-X_i}\right)\right]^3
\\&=\frac{-1}{n}\left[\sum_{i=1}^n\ln(1-X_i)\right]^3
\end{align}
is a one-to-one function of $H(X_1,\cdots,X_n)$: the map $h\mapsto -h^3/n$ is strictly monotone, so $H$ can be recovered from $T$.
Since the sufficient statistic $H$ is a function of $T$, the joint density factors through $T$ as well.
So $T$ is a sufficient statistic for $\beta$.
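The recovery of $H$ from $T$ can be sketched numerically (stdlib Python only; the sample values are arbitrary illustrations). Inverting $T=-H^3/n$ requires a signed cube root, since $H<0$ here:

```python
import math

xs = [0.1, 0.42, 0.7, 0.33]                    # arbitrary values in (0, 1)
n = len(xs)
H = sum(math.log(1 - x) for x in xs)           # minimal sufficient statistic
T = -(1.0 / n) * H**3                          # the statistic in question

# Invert T -> H: since -n*T = H^3, take the real (signed) cube root.
cube = -n * T
H_recovered = math.copysign(abs(cube) ** (1 / 3), cube)
print(abs(H - H_recovered))                    # ~0: H is a function of T
```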
We can also show that the joint density can be factored as $$f_{\beta}(x_1,\cdots,x_n)=g(\beta, T)h(x_1,\cdots,x_n)$$
where $g$ depends on the data $x_1,\cdots,x_n$ only through $T$, and $h$ does not depend on $\beta$.
You say you could show that $S=\prod_{i=1}^n (1-X_i)$ is sufficient for $\beta$. Note that $T=-\frac{1}{n}(\ln S)^3$ is a one-to-one function of $S$ as well, so the Factorisation theorem yields the same conclusion.
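The correspondence between $T$ and $S$ can be checked in the same way (a sketch with made-up values; stdlib Python, `math.prod` requires Python 3.8+):

```python
import math

xs = [0.1, 0.42, 0.7, 0.33]                    # arbitrary values in (0, 1)
n = len(xs)
S = math.prod(1 - x for x in xs)               # the product statistic
T = -(1.0 / n) * sum(math.log(1 - x) for x in xs) ** 3

# T is a function of S alone: T = -(ln S)^3 / n ...
T_from_S = -(math.log(S) ** 3) / n

# ... and S is recoverable from T (signed cube root, then exp),
# so the two statistics carry the same information about beta.
cube = -n * T
lnS = math.copysign(abs(cube) ** (1 / 3), cube)
S_recovered = math.exp(lnS)
print(abs(T - T_from_S), abs(S - S_recovered))  # both ~0
```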
Best Answer
A rigorous way to do this is to first show that $\max X_i$ is a minimal sufficient statistic (via a corollary of the factorisation theorem); it then follows immediately that $\frac{2}{n} \sum_{i=1}^n X_i$ is not sufficient, since the minimal sufficient statistic is not a function of it.