In the case where the parametric family is normal with unknown mean $\mu$ and variance $\sigma^2$, we already know that a sufficient statistic (indeed, the joint MLE) is $T = (\hat \mu, \hat \sigma^2)$, where $$\hat \mu = \bar X, \quad \hat \sigma^2 = \frac{1}{n} \sum_{i=1}^n (X_i - \bar X)^2.$$ So when we add the constraint $\mu = \sigma^2$, we immediately know that at least some data reduction is attainable through $T$, which excludes answer choice (d). More trivially still, the sample itself is always sufficient, though it achieves no data reduction.
Moreover, if you found that $\sum X_i^2$ is sufficient for $\theta$, then you already know that (c) must be true in addition to (a), since knowledge of the statistic in (c) gives full knowledge of the one in (a).
The only remaining question is whether $\sum X_i$ alone is sufficient for $\theta$; that is, whether choice (b) is true. The joint density is, as you computed,
$$\begin{align}
f(\boldsymbol x) &= (2\pi)^{-n/2}\, \theta^{-n/2} \exp \left( -\sum_{i=1}^n \frac{(x_i - \theta)^2}{2\theta}\right) \\
&= (2\pi)^{-n/2}\, \theta^{-n/2} \exp \left( - \frac{1}{2\theta} \sum_{i=1}^n (x_i^2 - 2\theta x_i + \theta^2) \right) \\
&= (2\pi)^{-n/2}\, \theta^{-n/2}\, e^{-\sum_i x_i^2/(2\theta)}\, e^{\sum_i x_i}\, e^{-n\theta/2},
\end{align}$$
so for the choice $$h(\boldsymbol x) = (2\pi)^{-n/2} e^{\sum_i x_i}, \qquad T(\boldsymbol x) = \sum_i x_i^2, \qquad g(T \mid \theta) = \theta^{-n/2} e^{-T/(2\theta)} e^{-n\theta/2},$$
the factorization theorem shows that $\sum X_i^2$ is sufficient for $\theta$. Note that this alone does not settle (b): the theorem certifies the sufficiency of the statistic appearing in the factorization, not the insufficiency of any other. However, $\sum X_i^2$ is in fact minimal sufficient here, so every sufficient statistic must determine it.
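To justify the minimality claim, one can use the likelihood-ratio characterization of minimal sufficiency (a standard result; the algebra below uses only the factorization already computed): for two samples $\boldsymbol x$ and $\boldsymbol y$,
$$\frac{f(\boldsymbol x)}{f(\boldsymbol y)} = e^{\sum_i x_i - \sum_i y_i} \exp\left( -\frac{\sum_i x_i^2 - \sum_i y_i^2}{2\theta} \right),$$
and this ratio is free of $\theta$ precisely when $\sum_i x_i^2 = \sum_i y_i^2$; hence $\sum X_i^2$ is minimal sufficient.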
Since $\sum X_i$ does not determine $\sum X_i^2$, it cannot be sufficient; hence (b) is false. In case there is any lingering doubt, we can easily construct two distinct samples whose sample totals agree but whose sums of squares do not. For instance,
$$\boldsymbol x = (1, 2, 3, 2, 6), \quad \boldsymbol x^* = (1, 2, 3, 4, 4)$$ both have a sample total of $14$, but their sums of squares are $54$ and $46$, respectively. Since we already proved that the sum of squares is sufficient for $\theta$, if I told you only that the sample total is $14$, you could not tell me whether the sample's sum of squares is, say, $54$ or $46$, because either value could have arisen from a sample with total $14$. Consequently, knowledge of the sample total alone does not preserve all of the information about $\theta$ that was present in the sample itself.
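As a quick numerical illustration (a minimal Python sketch, assuming the $N(\theta, \theta)$ model above), the likelihood ratio of these two samples varies with $\theta$; if the sample total were sufficient, two samples with equal totals would have a likelihood ratio free of $\theta$:

```python
import numpy as np
from scipy.stats import norm

# The two samples from above: equal totals (14), different sums of squares (54 vs 46).
x      = np.array([1, 2, 3, 2, 6])
x_star = np.array([1, 2, 3, 4, 4])

def log_lik(sample, theta):
    # Log-likelihood under N(theta, theta): mean theta, standard deviation sqrt(theta).
    return norm.logpdf(sample, loc=theta, scale=np.sqrt(theta)).sum()

for theta in (0.5, 1.0, 2.0, 5.0):
    ratio = np.exp(log_lik(x, theta) - log_lik(x_star, theta))
    print(f"theta = {theta}: f(x)/f(x*) = {ratio:.4f}")

# The ratio equals exp(-(54 - 46) / (2 * theta)), which changes with theta.
```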
Best Answer
First observe that, for $X\sim N(0,\theta)$, $\frac{X}{\sqrt{\theta}}\sim N(0,1)$, and thus
$$ \bbox[5px,border:2px solid black] { \frac{X^2}{\theta}\sim \chi_{(1)}^2=\mathrm{Gamma}\Big(\frac{1}{2},\frac{1}{2}\Big) \qquad (1) } $$
where $\mathrm{Gamma}(\alpha,\beta)$ denotes the shape-rate parameterization.
Now it is evident that
$$ \bbox[5px,border:2px solid black] { T=\frac{1}{\theta}\sum_{i=1}^{n}X_i^2\sim \mathrm{Gamma}\Big(\frac{n}{2},\frac{1}{2}\Big) \qquad (2) } $$
Finally, setting $Y=\theta T=\sum_{i=1}^{n}X_i^2$, we get
$$ \bbox[5px,border:2px solid black] { Y\sim \mathrm{Gamma}\Big(\frac{n}{2},\frac{1}{2\theta}\Big) \qquad (3) } $$
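Before the proofs, a quick Monte Carlo sanity check of (3) may be reassuring. This is a minimal Python sketch, not part of the original argument; the values of $n$ and $\theta$ are arbitrary, and note that SciPy parameterizes the Gamma by shape and scale (scale $=1/\text{rate}=2\theta$):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, theta = 5, 2.0          # illustrative values, chosen arbitrarily
reps = 100_000

# Replicate Y = sum of squares of n i.i.d. N(0, theta) draws.
x = rng.normal(loc=0.0, scale=np.sqrt(theta), size=(reps, n))
y = (x ** 2).sum(axis=1)

# Compare with Gamma(shape = n/2, rate = 1/(2*theta)), i.e. scale = 2*theta.
ks = stats.kstest(y, stats.gamma(a=n / 2, scale=2 * theta).cdf)
print(f"KS statistic = {ks.statistic:.4f}, p-value = {ks.pvalue:.3f}")
```

A large p-value here is consistent with result (3).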
To prove (1) and (3), use the fundamental transformation theorem (change of variable).
To prove (2), use the properties of MGFs.
Some hints for the proofs:
For (1), use the change of variable $Z=\frac{X^2}{\theta}$; then
$$F_Z(z)=\mathbb{P}[Z\leq z]=\mathbb{P}[X^2\leq z\theta]=\mathbb{P}[-\sqrt{z\theta}\leq X \leq \sqrt{z\theta}]=F_X(\sqrt{z\theta})-F_X(-\sqrt{z\theta})$$
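Carrying out the differentiation explicitly (chain rule on both terms, then the symmetry of the $N(0,\theta)$ density $f_X$):
$$f_Z(z)=\frac{d}{dz}F_Z(z)=\frac{\sqrt{\theta}}{2\sqrt{z}}\Big[f_X(\sqrt{z\theta})+f_X(-\sqrt{z\theta})\Big]=\frac{\sqrt{\theta}}{\sqrt{z}}\cdot\frac{1}{\sqrt{2\pi\theta}}\,e^{-z/2},$$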
which simplifies to your first PDF:
$$f_Z(z)=\frac{1}{\sqrt{2\pi}}z^{-\frac{1}{2}}e^{-\frac{z}{2}}$$
This can be rewritten in the following way:
$$f_Z(z)=\frac{\Big(\frac{1}{2}\Big)^{\frac{1}{2}}}{\Gamma\Big(\frac{1}{2}\Big)}z^{\frac{1}{2}-1}e^{-\frac{z}{2}}$$
...and the first step is done! $f_Z(z)$ is evidently a $\mathrm{Gamma}\Big(\frac{1}{2},\frac{1}{2}\Big)$ density.
For step (2), simply multiply the $n$ identical MGFs of the i.i.d. terms $X_i^2/\theta$, as sketched below.
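Concretely, since the MGF of a $\mathrm{Gamma}\big(\frac{1}{2},\frac{1}{2}\big)$ variable is $(1-2t)^{-1/2}$ for $t<\frac{1}{2}$ (a standard fact), independence of the $X_i$ gives
$$M_T(t)=\prod_{i=1}^{n}M_{X_i^2/\theta}(t)=(1-2t)^{-n/2},\qquad t<\tfrac{1}{2},$$
which is precisely the MGF of a $\mathrm{Gamma}\big(\frac{n}{2},\frac{1}{2}\big)$ distribution, proving (2).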
For step (3), use the same procedure as in (1), a change of variable, as sketched below.
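Explicitly (a short derivation under the shape-rate convention above, with $c=\theta$, $\alpha=\frac{n}{2}$, $\beta=\frac{1}{2}$): if $T\sim \mathrm{Gamma}(\alpha,\beta)$ and $c>0$, the change of variable $Y=cT$ gives
$$f_Y(y)=\frac{1}{c}\,f_T\Big(\frac{y}{c}\Big)=\frac{(\beta/c)^{\alpha}}{\Gamma(\alpha)}\,y^{\alpha-1}e^{-(\beta/c)y},$$
i.e. $Y\sim \mathrm{Gamma}(\alpha,\beta/c)$, which is exactly (3).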