First, rewrite the density with the new parametrization
$$f(y|\theta)=\frac{ky^{k-1}}{\theta}e^{-\frac{y^k}{\theta}}$$
Calculate the likelihood
$$L(\theta)\propto \theta^{-n}e^{-\frac{\sum_i y_i^k}{\theta}}$$
Taking logs, the log-likelihood is
$$l(\theta)=-n\log\theta-\frac{\sum_i y_i^k}{\theta}+\text{const}$$
so the score function (the derivative of the log-likelihood with respect to $\theta$) is
$$l^*=-\frac{n}{\theta}+\frac{1}{\theta^2}\sum_i y_i^k$$
Setting $l^*=0$ and solving for $\theta$ gives
$$T=\hat{\theta}_{ML}=\frac{\sum_i y_i^k}{n}$$
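As a quick numerical sanity check, here is a minimal simulation sketch (assuming a known shape, with hypothetical values $k=2$, $\theta=3$; note that `numpy` draws from the standard Weibull, so the scale has to be applied by hand):

```python
import numpy as np

rng = np.random.default_rng(0)
k, theta, n = 2.0, 3.0, 100_000  # hypothetical shape, scale, sample size

# If X is a standard Weibull(k), then Y = theta**(1/k) * X satisfies
# Y^k = theta * X^k ~ Exp(rate 1/theta), matching f(y|theta) above.
y = theta ** (1 / k) * rng.weibull(k, size=n)

T = np.mean(y ** k)  # the MLE: sum(y_i^k) / n
print(T)             # should be close to theta = 3
```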
To show that $\mathbb{E}[T]=\theta$, rewrite the score function as
$$l^*=-\frac{n}{\theta}+\frac{nT}{\theta^2}$$
Now, recalling the first Bartlett identity,
$$\mathbb{E}[l^*]=0$$
you get
$$\frac{n}{\theta}=\frac{n\mathbb{E}[T]}{\theta^2}$$
that is
$$\mathbb{E}[T]=\theta$$
To calculate its variance, use the second Bartlett identity,
$$\mathbb{E}[l^{**}]=-\mathbb{E}[(l^*)^2]$$
where $l^{**}$ is the second derivative of the log-likelihood. Since $\mathbb{E}[l^*]=0$, we have $\mathbb{E}[(l^*)^2]=\mathbb{V}[l^*]$, so the identity reads $\mathbb{V}[l^*]=-\mathbb{E}[l^{**}]$, which here gives
$$\mathbb{V}\Bigg[\frac{nT}{\theta^2}-\frac{n}{\theta}\Bigg]=-\mathbb{E}\Bigg[\frac{n}{\theta^2}-\frac{2nT}{\theta^3}\Bigg]$$
that is, using $\mathbb{E}[T]=\theta$ on the right-hand side,
$$\frac{n^2}{\theta^4}\mathbb{V}[T]=\frac{n}{\theta^2}$$
$$\mathbb{V}[T]=\frac{\theta^2}{n}$$
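Both moments can be checked by simulation (a sketch under the same hypothetical values as above, repeated over many samples of size $n$):

```python
import numpy as np

rng = np.random.default_rng(1)
k, theta, n, reps = 2.0, 3.0, 50, 20_000  # hypothetical values

# draw `reps` independent samples of size n and compute T for each
y = theta ** (1 / k) * rng.weibull(k, size=(reps, n))
T = (y ** k).mean(axis=1)

print(T.mean(), theta)        # E[T] should be close to theta
print(T.var(), theta**2 / n)  # V[T] should be close to theta^2 / n
```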
Alternative method to calculate the expectation and variance of $T$
Simply transforming
$$W=Y^k$$
you get that $W\sim Exp\Big(\frac{1}{\theta}\Big)$ (rate $\frac{1}{\theta}$, mean $\theta$). Since $T$ is the mean of $n$ i.i.d. such variables,
$$T\sim Gamma\Big(n,\frac{n}{\theta}\Big)$$
(shape $n$, rate $\frac{n}{\theta}$), and immediately you get
$$\mathbb{E}[T]=\frac{n}{\frac{n}{\theta}}=\theta$$
$$\mathbb{V}[T]=\frac{n}{\Big(\frac{n}{\theta}\Big)^2}=\frac{\theta^2}{n}$$
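The distributional claim itself can also be verified numerically (a sketch; note that `numpy` parametrizes the exponential by its scale $\theta$, and `scipy`'s gamma by shape and scale $=1/\text{rate}$):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
theta, n, reps = 3.0, 50, 20_000  # hypothetical values

# T is the mean of n iid Exp(rate 1/theta) = Exp(scale theta) draws
w = rng.exponential(scale=theta, size=(reps, n))
T = w.mean(axis=1)

# compare against Gamma(shape n, rate n/theta) = Gamma(a=n, scale=theta/n)
print(stats.kstest(T, stats.gamma(a=n, scale=theta / n).cdf))  # large p-value expected
```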
The density for a sample from the exponential distribution is $f(\textbf x; \theta)=\theta^ne^{-\theta\sum_i x_i}$. By the Fisher-Neyman factorization theorem, $\sum_i x_i$ is a sufficient statistic for $\theta$, and any one-to-one transformation of a sufficient statistic is also sufficient, so $\frac n{\sum_i x_i}$ is sufficient as well. The exponential distribution belongs to the exponential family of distributions, and for one-dimensional exponential families, if the parameter space contains an open interval in $\mathbb R$, then the sufficient statistic is complete. Here $\theta\in(0,\infty)$, an open set, so the statistic is complete. Hence the MLE is a complete, sufficient statistic.
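Explicitly, the factorization is trivial here, with $h(\mathbf x)=1$:
$$f(\mathbf x;\theta)=\underbrace{\theta^n e^{-\theta\, T(\mathbf x)}}_{g(T(\mathbf x),\,\theta)}\cdot\underbrace{1}_{h(\mathbf x)},\qquad T(\mathbf x)=\sum_i x_i.$$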
Here is how you would prove completeness directly from the definition. $T(x)=\sum_i x_i$ is gamma-distributed with density $f(t)=\frac{\theta^n}{\Gamma(n)}t^{n-1}e^{-\theta t}$, so $E(g(T))=0$ for all $\theta$ forces $\int_0^\infty g(t)t^{n-1}e^{-\theta t}\,dt=0$ for all $\theta>0$. This integral is the Laplace transform of $g(t)t^{n-1}$, and a Laplace transform vanishes identically only if $g(t)t^{n-1}=0$ almost everywhere, so $g(t)=0$ almost everywhere. Thus, the sufficient statistic is complete.
Yes: if you have some bijective transformation $\tau = \tau(\theta)$, then the MLE for $\tau$ is $\hat \tau = \tau (\hat \theta)$, where $\hat \theta$ is the MLE for $\theta$; this is called the invariance property of the MLE (the comment by @BGM points to a stronger version of it).
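For instance, if $\theta$ is the exponential rate and $\mu=\tau(\theta)=\frac{1}{\theta}$ is the mean, then $\hat\mu=\frac{1}{\hat\theta}=\bar X$.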
Hint: $\hat{q}_p$ is proportional to $n/\hat \theta = \sum_{i=1}^n X_i\sim \Gamma(n,\theta)$.
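To make the hint concrete (a sketch under the rate parametrization used above): the $p$-quantile of $Exp(\theta)$ solves $1-e^{-\theta q_p}=p$, so
$$q_p=\frac{-\ln(1-p)}{\theta},\qquad \hat q_p=\tau(\hat\theta)=\frac{-\ln(1-p)}{\hat\theta}=-\ln(1-p)\,\frac{\sum_{i=1}^n X_i}{n},$$
and the exact distribution of $\hat q_p$ then follows from $\sum_{i=1}^n X_i\sim\Gamma(n,\theta)$ by rescaling.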