As you mentioned, $$P(T=t)=\frac{e^{-n\theta} (n\theta)^t}{t!} $$
Let $I(T)$ be a function such that
\begin{align}
I(T)=\begin{cases}\displaystyle\frac{T!}{(T-k)!} & \text{when } T\geq k\\ 0 & \text{otherwise}\end{cases}
\end{align}
Then,
$$E\left[ I(T) \right]=\sum_{t=k}^{\infty} \frac{t!}{(t-k)!}\cdot\frac{e^{-n\theta} (n\theta)^{t}}{t!} = n^k\theta^k\sum_{t=k}^{\infty} \frac{e^{-n\theta} (n\theta)^{t-k}}{(t-k)!} =n^k\theta^k,$$
since the remaining sum runs over the entire $\text{Poisson}(n\theta)$ mass function and hence equals $1$.
So, $$E\left[ \frac{I(T)}{n^k} \right]=\theta^k$$
And since $T$ is a complete sufficient statistic, the Lehmann–Scheffé theorem tells us that the function of $T$ given by $I(T)/n^k$ is the UMVUE of $\theta^k$.
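This is easy to check by simulation; here is a minimal sketch (arbitrary values of $n$, $\theta$, $k$; assuming numpy):

```python
import numpy as np

# Quick simulation check that E[T!/((T-k)! n^k)] = theta^k for T ~ Poisson(n*theta).
rng = np.random.default_rng(0)
n, theta, k = 10, 0.7, 3
T = rng.poisson(n * theta, size=1_000_000)

# Falling factorial T(T-1)...(T-k+1); for integer T < k one factor is
# exactly zero, so it vanishes automatically, matching I(T).
ff = np.ones_like(T, dtype=float)
for j in range(k):
    ff *= (T - j)

print(ff.mean() / n**k)  # empirical estimate of theta^k
print(theta**k)          # target value, here 0.343
```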
OK, I made some progress on this, and I guess it technically answers my question, although it leaves me somewhat dissatisfied, as I will explain.
The estimator I gave in the OP can indeed be extended to the cases I am interested in, as follows.
Suppose that the Poisson distribution we are interested in can be written as
$$Pr(k|\lambda=Np) = \frac{e^{-Np}(Np)^k}{k!}$$
that is, this is the Poisson limit of $N$ Bernoulli draws with success probability $p$. Now, $p$ is unknown, but we can estimate it via Monte Carlo methods. Suppose these can be described as draws from another Poisson distribution
$$Pr(x|np) = \frac{e^{-np}(np)^x}{x!}$$
where $x$ is the number of 'successes' in the Monte Carlo simulation. We can then use the MC result to obtain an unbiased estimator of $Pr(k|Np)$, following closely the proof in the paper I linked in the question. Essentially, we expand $e^{-Np}$ in a power series and then use the MC samples to estimate every power of $p$ unbiasedly via $p^i \sim x!/(n^i (x-i)!)$, which gives the estimator we want. Here is the proof:
$$\begin{aligned}\frac{e^{-Np}(Np)^k}{k!} &= \frac{(Np)^k}{k!}\sum_{\alpha=0}^\infty \frac{(-Np)^\alpha}{\alpha!} \\
&= \frac{N^k}{k!}\sum_{\alpha=0}^\infty \frac{(-N)^\alpha}{\alpha!}\, p^{\alpha + k} \\
&\sim \frac{N^k}{k!}\sum_{\alpha=0}^\infty \frac{(-N)^\alpha}{\alpha!} \left(\frac{1}{n}\right)^{\alpha+k}\frac{x!}{(x-k-\alpha)!} \\
&= \binom{x}{k} \left(\frac{N}{n}\right)^k \sum_{\alpha=0}^{x-k} \binom{x-k}{\alpha}\left(-\frac{N}{n}\right)^\alpha \\
&= \binom{x}{k} \left(\frac{N}{n}\right)^k \left(1 - \frac{N}{n}\right)^{x-k}
\end{aligned}$$
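A minimal numerical check of the end result (a sketch with arbitrary parameter values; assuming numpy and scipy) might look like this:

```python
import numpy as np
from scipy.special import comb
from scipy.stats import poisson

# Hypothetical parameter values, chosen so that n > N (see below).
rng = np.random.default_rng(1)
N, n, p, k = 50, 200, 0.02, 2
x = rng.poisson(n * p, size=1_000_000)  # MC 'successes', x ~ Poisson(np)

# The estimator just derived: C(x,k) (N/n)^k (1 - N/n)^(x-k); zero when x < k.
est = comb(x, k) * (N / n) ** k * (1.0 - N / n) ** (x - k)

print(est.mean())             # average of the estimator over many MC draws
print(poisson.pmf(k, N * p))  # target: e^{-Np} (Np)^k / k!
```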
And indeed this works out numerically when I test it, so technically that's my question answered. However, the estimator has a couple of unsatisfying properties, and I wonder whether they can be avoided; if not, I'd like to know why.
First, the estimator vanishes for $x<k$ (the binomial coefficient is zero there). This means you need to simulate more successful events than are actually observed, which, if $p$ is small, can require quite a lot of Monte Carlo work. Secondly, it requires $n>N$ in order to be non-negative (and arguably to make sense at all), so again you need more trials in the simulation than in the real experiment, which is a problem if $N$ is big. Neither requirement arises if one simply plugs the maximum likelihood estimate from the simulation, $\hat{p}=x/n$, into the Poisson probability; however, that plug-in estimator of $Pr(k|Np)$ is biased.
Is there no way to have my cake and eat it too here? Is this really the cost of an unbiased estimate, or is there some other estimator with less strict requirements on the MC distribution? I think the estimator derived here is the minimum-variance unbiased estimator given our MC samples, but I would actually be happier with an estimator that has larger variance yet works for $x<k$ and $n<N$ (while still being unbiased).
Best Answer
Hint: Given observed data $x_1,\ldots,x_n$, consider an estimator of the form:
$$\widehat{e^{-\lambda}} = \sum_{k=0}^\infty w_k \Bigg[ \frac{1}{n} \sum_{i=1}^n \mathbb{I}(x_i = k) \Bigg],$$
where $w_0,w_1,w_2,\ldots$ is a sequence of estimator weights corresponding to the possible outcomes of a Poisson random variable. You can see that this estimator estimates the target value as a weighted sum of the proportions of observed values equal to each possible outcome.
Try to find an expression for the expected value of this estimator, and then see if there is any choice of values you could make for $w_0,w_1,w_2,...$ that would lead the expected value of this estimator to be equal to the quantity you are trying to estimate.
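If it helps to experiment numerically while working it out, here is a minimal sketch (assuming numpy and scipy; the choice of the weights $w_k$ is deliberately left open):

```python
import numpy as np
from scipy.stats import poisson

def estimate(x, w):
    """sum_k w_k * (proportion of x_i equal to k), truncated at k = len(w) - 1."""
    x = np.asarray(x)
    return sum(w_k * np.mean(x == k) for k, w_k in enumerate(w))

def expected_value(w, lam):
    """Exact expectation of that estimator: sum_k w_k * P(X = k), X ~ Poisson(lam)."""
    ks = np.arange(len(w))
    return float(np.sum(np.asarray(w) * poisson.pmf(ks, lam)))

# Compare expected_value(w, lam) against the target exp(-lam)
# for your candidate weight sequences w and several values of lam.
```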