For $\alpha,\beta>0$, the joint density of $(X_1,X_2,\cdots,X_n)$ is
\begin{align}
f_{\beta}(x_1,x_2,\cdots,x_n)&=\frac{1}{\left(B(\alpha,\beta)\right)^n}\left(\prod_{i=1}^n x_i\right)^{\alpha-1}\left(\prod_{i=1}^n(1-x_i)\right)^{\beta-1}\mathbf1_{0<x_1,\cdots,x_n<1}
\\&=\exp\left[-n\ln B(\alpha,\beta)+(\alpha-1)\sum \ln x_i+(\beta-1)\sum \ln(1-x_i)\right]
\\&=\exp\left[\beta\sum_{i=1}^n \ln(1-x_i)+A(\alpha,\beta)+C(x_1,\cdots,x_n)\right]
\end{align}
for some functions $A$ and $C$, where $C(x_1,\cdots,x_n)=(\alpha-1)\sum\ln x_i-\sum\ln(1-x_i)$ collects the terms free of $\beta$ (recall that $\alpha$ is known, so $C$ may depend on it).
Clearly, the family of distributions $\{f_{\beta}:\beta>0\}$ belongs to the one-parameter exponential family. Hence, a minimal sufficient statistic for $\beta$ is $$H(X_1,\cdots,X_n)=\sum_{i=1}^n\ln (1-X_i)$$
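As a numerical sanity check (a sketch I am adding, with hypothetical values $\alpha=2$, $\beta=3$), one can verify that the product of the marginal Beta densities matches the exponential-family form above:

```python
import math
import random

random.seed(0)
alpha, b = 2.0, 3.0          # b plays the role of beta; hypothetical known values
n = 5
xs = [random.betavariate(alpha, b) for _ in range(n)]

# beta function B(alpha, b) via gamma functions
Bab = math.gamma(alpha) * math.gamma(b) / math.gamma(alpha + b)

# joint density as a product of Beta(alpha, b) marginal densities
joint = math.prod(x ** (alpha - 1) * (1 - x) ** (b - 1) / Bab for x in xs)

# exponential-family form: exp[b * H + A + (x-only terms)], H = sum log(1 - x_i)
H = sum(math.log(1 - x) for x in xs)
A = -n * math.log(Bab)
x_terms = (alpha - 1) * sum(math.log(x) for x in xs) - H

assert math.isclose(joint, math.exp(b * H + A + x_terms), rel_tol=1e-9)
```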
Observe that
\begin{align}T(X_1,\cdots,X_n)&=\frac{1}{n}\left[\sum_{i=1}^n\ln\left(\frac{1}{1-X_i}\right)\right]^3
\\&=\frac{-1}{n}\left[\sum_{i=1}^n\ln(1-X_i)\right]^3
\end{align}
is a bijective function of $H(X_1,\cdots,X_n)$, since the map $h\mapsto -h^3/n$ is one-to-one.
A one-to-one function of a sufficient statistic is itself sufficient (and a one-to-one function of a minimal sufficient statistic is again minimal sufficient).
So $T$ is a (minimal) sufficient statistic for $\beta$.
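The algebraic identity between the two expressions for $T$ can be checked numerically; here is a minimal sketch using a hypothetical Beta(2,3) sample:

```python
import math
import random

random.seed(1)
n = 8
# hypothetical Beta(2, 3) sample, purely for illustration
xs = [random.betavariate(2.0, 3.0) for _ in range(n)]

# T written directly, and T written as a function of H = sum log(1 - x_i)
T_direct = sum(math.log(1.0 / (1.0 - x)) for x in xs) ** 3 / n
H = sum(math.log(1.0 - x) for x in xs)
T_via_H = -(H ** 3) / n

assert math.isclose(T_direct, T_via_H)
```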
We can also show that the joint density can be factored as $$f_{\beta}(x_1,\cdots,x_n)=g(\beta, T)h(x_1,\cdots,x_n)$$
where $g$ depends on $\beta$, and on $x_1,\cdots,x_n$ only through $T$, while $h$ does not depend on $\beta$.
You say you could show that $S=\prod_{i=1}^n (1-X_i)$ is sufficient for $\beta$. But $T=-\frac{1}{n}\left(\ln S\right)^3$ is a bijective function of $S$ as well, so the Factorisation theorem leads to the same conclusion.
Best Answer
Due to independence, the joint density of the sample $\mathbf X=(X_1,X_2,\cdots,X_n)$ is
\begin{align}
f_{\theta}(\mathbf x)&=\prod_{i=1}^n \frac{\beta}{\theta^{\beta}}x_i^{\beta -1}e^{-x_i^{\beta}/{\theta}^{\beta}}\mathbf1_{x_i>0}
\\&=\frac{e^{-\sum x_i^{\beta}/\theta^{\beta}}}{\theta^{n\beta}}\,\beta^n \left(\prod_{i=1}^n x_i\right)^{\beta-1}\mathbf1_{x_1,\cdots,x_n>0}\quad,\ \theta,\beta>0
\\&=g(\theta,t(\mathbf x))\,h(\mathbf x)
\end{align}
where $g(\theta, t(\mathbf x))=\frac{e^{-\sum x_i^{\beta}/\theta^{\beta}}}{\theta^{n\beta}}$ depends on $\theta$, and on $x_1,x_2,\cdots,x_n$ only through $t(\mathbf x)=\sum_{i=1}^n x_i^{\beta}$, and $h(\mathbf x)=\beta^n \left(\prod_{i=1}^n x_i\right)^{\beta-1}\mathbf1_{x_1,\cdots,x_n>0}$ does not depend on $\theta$.
Assuming $\beta$ is known, by the Factorization theorem, a sufficient statistic for $\theta$ would be
$$T(\mathbf X)=\sum_{i=1}^nX_i^{\beta}$$
Your answer is not quite right. But we can say that $e^{-T}$ is also sufficient for $\theta$, being a bijective function of $T$.
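As a numerical check of the factorization (a sketch with hypothetical values $\theta=1.5$, $\beta=2$):

```python
import math
import random

random.seed(2)
theta, b = 1.5, 2.0   # hypothetical scale theta and known shape beta
n = 6
# random.weibullvariate takes the scale first, then the shape
xs = [random.weibullvariate(theta, b) for _ in range(n)]

def pdf(x):
    # Weibull density with scale theta and shape b
    return (b / theta ** b) * x ** (b - 1) * math.exp(-(x ** b) / theta ** b)

# joint density as a product of marginals
joint = math.prod(pdf(x) for x in xs)

# factored form g(theta, T) * h(x) with T = sum of x_i ** b
T = sum(x ** b for x in xs)
g = math.exp(-T / theta ** b) / theta ** (b * n)
h = b ** n * math.prod(x ** (b - 1) for x in xs)

assert math.isclose(joint, g * h, rel_tol=1e-9)
```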