Context
I am having difficulty understanding a step of a proof that relies on a property of power series.
Proof
Suppose that $X_1, X_2, \ldots , X_n$ is a random sample of size $n$ from a Poisson distribution with parameter $\lambda > 0$. The goal is to show that $T = \sum_{i=1}^n X_i$ is a complete statistic.
Since we know that $T = \sum_{i=1}^n X_i \sim \mathrm{Poisson}(n\lambda)$, suppose $\mathbb{E}(h(T)) = 0$ for all $\lambda > 0$; then
$$
\mathbb{E}(h(T)) = \sum_{k=0}^{\infty} h(k) \, e^{-n\lambda} \, \frac{(n\lambda)^k}{k!} = 0\Longrightarrow \sum_{k=0}^{\infty} h(k) \, \frac{(n\lambda)^k}{k!} = 0
$$
The textbook I am using and some other sources I've found argue that:
$$
\boxed{\displaystyle\sum_{k=0}^{\infty} h(k) \, \frac{(n\lambda)^k}{k!} = 0 \Longrightarrow h(k) \, \frac{(n\lambda)^k}{k!} = 0 \qquad \forall k}
$$
It is probably an obvious result from calculus, but I am unable to prove it.
If $h(k) \, (n\lambda)^k/k! = 0$ for all $k$, then $T$ is a complete statistic: since $\lambda > 0$, the factor $(n\lambda)^k/k!$ is strictly positive, so $h(k) = 0$ for all $k$.
Best Answer
If $s(\lambda) = \sum_{k=0}^\infty h(k)\frac{(n\lambda)^k}{k!}$ and $s(\lambda) = 0$ for all $\lambda > 0$, then (by continuity) $s(0) = 0$, and since $s(0) = h(0)$ this gives $h(0) = 0$.
Similarly, the $m$th derivative of $s$ at $\lambda = 0$ must also vanish, and it equals $n^m h(m)$; since $n > 0$, this forces $h(m) = 0$ for every $m$.
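To spell out the derivative step: the series is a power series in $\lambda$ converging for all $\lambda$, so $s$ is entire and may be differentiated term by term. A short sketch:

```latex
s(\lambda) = \sum_{k=0}^{\infty} h(k)\,\frac{n^k \lambda^k}{k!}
\quad\Longrightarrow\quad
s^{(m)}(\lambda) = \sum_{k=m}^{\infty} h(k)\,\frac{n^k \lambda^{k-m}}{(k-m)!}.
```

Evaluating at $\lambda = 0$, every term with $k > m$ vanishes and only the $k = m$ term survives, giving $s^{(m)}(0) = n^m h(m)$. Since $s \equiv 0$, all of its derivatives are $0$, so $n^m h(m) = 0$, and $n > 0$ yields $h(m) = 0$. (Equivalently: a power series that vanishes identically on an interval has all coefficients equal to zero.)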