It is trivial to see the order statistics $T(X) = (X_{(1)},\dots,X_{(n)})$ are sufficient, hence we only need to prove one direction: that if the ratio is constant as a function of $\theta$, then $T(x) = T(y)$. That is, $x$ must be a permutation of $y$.
Suppose $p(x|\theta) \propto_\theta p(y|\theta)$. Since this proportionality holds for all $\theta$, we may divide each side by its value at $\theta=0$, so that the constant of proportionality cancels and we get
$$\frac{p(x|\theta)}{p(x|0)} = \frac{p(y|\theta)}{p(y|0)}$$
Taking reciprocals of both sides gives
$$\prod_i \frac{1+(x_i-\theta)^2}{1+x_i^2} = \prod_i \frac{1+(y_i-\theta)^2}{1+y_i^2}$$
Since the denominators do not involve $\theta$, each side is (up to a nonzero constant) a polynomial in $\theta$ of degree $2n$, and the two sides agree for every real $\theta$; hence they are the same polynomial and must have the same roots, counted with multiplicity. In this form it should be clear that the LHS polynomial has the complex roots
$$x_i \pm i$$
since $\big(x_i-(x_i\pm i)\big)^2 = (\mp i)^2 = -1$, while the RHS polynomial has roots
$$y_i \pm i$$
Each side therefore has exactly $2n$ roots of that form, and since the polynomials share the same roots, the multiset $\{x_i \pm i\}$ must equal the multiset $\{y_i \pm i\}$. It follows that $x$ and $y$ are permutations of one another, and so they have the same order statistics:
$$T(x) = T(y)$$
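As a quick numerical sanity check (a sketch, not part of the proof; the sample values are made up), one can verify that the Cauchy log-likelihood ratio between two samples is constant in $\theta$ exactly when the samples are permutations of one another:

```python
import numpy as np

def cauchy_loglik(data, theta):
    # log-likelihood of the standard Cauchy location family at theta
    # (additive constants in pi drop out of the ratio anyway)
    return -np.sum(np.log(1.0 + (np.asarray(data) - theta) ** 2))

x = [0.3, -1.2, 2.5]
y = [2.5, 0.3, -1.2]   # a permutation of x
z = [0.3, -1.2, 2.6]   # NOT a permutation of x

thetas = np.linspace(-3, 3, 7)
diff_perm  = [cauchy_loglik(x, t) - cauchy_loglik(y, t) for t in thetas]
diff_other = [cauchy_loglik(x, t) - cauchy_loglik(z, t) for t in thetas]

# constant (here identically zero) log-ratio for the permutation ...
print(np.allclose(diff_perm, diff_perm[0]))    # True
# ... but a theta-dependent log-ratio otherwise
print(np.allclose(diff_other, diff_other[0]))  # False
```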
Now I hate to be the one to answer my own question, but I feel that in the time it took me to formulate my question in MathJax, I might have arrived at the answer.
First, let's look at why the reduction from a two-dimensional to a one-dimensional (joint) sufficient statistic vector for $\theta$ of the Uniform distribution works when the arguments are symmetric:
Suppose $X_1,X_2,...,X_n$ is a random sample from the symmetric Uniform distribution $Unif(-\theta,\theta)$. By the factorization theorem, it is easy to verify that the vector $\mathbf Y = (Y_1,Y_n)$ where $Y_1 = X_{(1)}$ and $Y_n=X_{(n)}$ is a joint sufficient vector of degree two for $\theta$, with $$K_1(Y_1,Y_n;\theta)=\left(\frac{1}{2\theta}\right)^n \cdot \mathbf 1_{(-\theta,\theta)}(Y_1) \cdot \mathbf 1_{(-\theta,\theta)}(Y_n)$$
From the two indicator functions and from the definition of order statistics, we have that $$-\theta<Y_1<Y_n<\theta \implies \theta>-Y_1 \land \theta>Y_n$$
This allows us to apply the maximum function to $-Y_1$ and $Y_n$ jointly to put a single restriction on $\theta$: setting $Y^* = \max\{-Y_1,Y_n\}$ (note $Y^* \ge 0$ always), the equality $$\mathbf 1_{(-\theta,\theta)}(Y_1) \cdot \mathbf 1_{(-\theta,\theta)}(Y_n) = \mathbf 1_{(-\theta,\theta)}(Y^*)$$ is valid.
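A small simulation (a sketch; the sample values are made up) illustrates that for $Unif(-\theta,\theta)$ the likelihood depends on the data only through $Y^* = \max\{-Y_1, Y_n\}$, so two samples sharing the same $Y^*$ are indistinguishable to the likelihood:

```python
import numpy as np

def unif_sym_lik(data, theta):
    # likelihood of Unif(-theta, theta): (1/(2 theta))^n on the support, 0 off it
    data = np.asarray(data)
    inside = (data > -theta).all() and (data < theta).all()
    return (1.0 / (2.0 * theta)) ** len(data) if inside else 0.0

x = [-1.5, 0.2, 0.9]   # Y1 = -1.5, Yn = 0.9, so Y* = max{1.5, 0.9} = 1.5
y = [-0.4, 1.5, 0.7]   # Y1 = -0.4, Yn = 1.5, so Y* = max{0.4, 1.5} = 1.5

# same Y* (and same n) => identical likelihood at every theta
for theta in [1.0, 1.4, 1.6, 2.5]:
    assert unif_sym_lik(x, theta) == unif_sym_lik(y, theta)
print("likelihoods agree at all tested theta")
```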
On the other hand, suppose $X_1,X_2,...,X_n$ is a random sample from the Uniform distribution $Unif(\theta-1,\theta+1)$. By the factorization theorem, it is easy to verify that the vector $\mathbf Y = (Y_1,Y_n)$ where $Y_1 = X_{(1)}$ and $Y_n=X_{(n)}$ is a joint sufficient vector of degree two for $\theta$, with $$K_1(Y_1,Y_n;\theta)=\left(\frac{1}{2}\right)^n \cdot \mathbf 1_{(\theta-1,\theta+1)}(Y_1) \cdot \mathbf 1_{(\theta-1,\theta+1)}(Y_n)$$
From the two indicator functions and from the definition of order statistics, we have that $$\theta-1<Y_1<Y_n<\theta+1 \implies Y_1+1>\theta \land Y_n-1<\theta$$
Because now $\theta$ is sandwiched between two restrictions ("variables", for our purposes), and without the benefit of symmetry to appeal to, we have no tools available to condense the information provided by $Y_1$ and $Y_n$ any further. Thus, we must concede that the joint sufficient statistics $Y_1$ and $Y_n$ are joint minimal sufficient statistics for $\theta$ for a non-symmetric Uniform distribution. On the other hand, we have also shown that $Y^*=\max\{-Y_1,Y_n\}$ is the one-dimensional and (thus) minimal sufficient statistic for $\theta$ for a symmetric Uniform distribution.
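To see concretely why no further reduction is possible in the non-symmetric case, here is a small check (a sketch; the sample values are made up) showing that two samples with the same $Y_n$ but different $Y_1$ have different likelihood support in $\theta$, so $Y_n$ alone (and, symmetrically, $Y_1$ alone) cannot be sufficient:

```python
import numpy as np

def unif_shift_lik(data, theta):
    # likelihood of Unif(theta-1, theta+1): (1/2)^n iff theta-1 < all data < theta+1
    data = np.asarray(data)
    inside = (data > theta - 1).all() and (data < theta + 1).all()
    return 0.5 ** len(data) if inside else 0.0

x = [0.1, 0.5, 0.9]   # Y1 = 0.1, Yn = 0.9 => support: -0.1 < theta < 1.1
y = [0.6, 0.7, 0.9]   # same Yn = 0.9, but Y1 = 0.6 => support: -0.1 < theta < 1.6

# theta = 1.3 lies in y's support but not in x's
print(unif_shift_lik(x, 1.3))  # 0.0
print(unif_shift_lik(y, 1.3))  # 0.125
```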
Best Answer
Just figured it out!
Factorize it in another way!
$$\left(\frac{1}{\theta}\right)^n I_{[Y_1,\,\theta]}(y_n)\prod_{i=1}^{n} I_{[0,\infty)}(x_i)$$