Method 0: The lazy statistician.
Note that for $y \neq 0$ we have $f(y) = (1-\pi) p_y$, where $p_y$ is the probability that a $\mathrm{Poi}(\lambda)$ random variable takes the value $y$. Since the term corresponding to $y = 0$ contributes nothing to the expected value, our knowledge of the Poisson and the linearity of expectation immediately tell us that
$$
\mu = (1-\pi) \lambda
$$
and
$$
\mathbb E Y^2 = (1-\pi) (\lambda^2 + \lambda) \> .
$$
A little algebra and the identity $\mathrm{Var}(Y) = \mathbb E Y^2 - \mu^2$ yield the result.
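Spelling out that algebra:
$$
\mathrm{Var}(Y) = (1-\pi)(\lambda^2 + \lambda) - (1-\pi)^2 \lambda^2 = (1-\pi)\lambda + \pi(1-\pi)\lambda^2 = \mu + \frac{\pi}{1-\pi}\mu^2 \> .
$$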
Method 1: A probabilistic argument.
It's often helpful to have a simple probabilistic model for how a distribution arises. Let $Z \sim \mathrm{Ber}(1-\pi)$ and $Y \sim \mathrm{Poi}(\lambda)$ be independent random variables. Define
$$
X = Z \cdot Y \>.
$$
Then it is easy to see that $X$ has the desired distribution $f$. To check this, note that $\mathbb P(X = 0) = \mathbb P(Z=0) + \mathbb P(Z=1, Y=0) = \pi + (1-\pi) e^{-\lambda}$ by independence. Similarly, $\mathbb P(X = k) = \mathbb P(Z=1, Y=k) = (1-\pi) e^{-\lambda} \lambda^k / k!$ for $k \neq 0$.
From this, the rest is easy, since by the independence of $Z$ and $Y$,
$$
\mu = \mathbb E X = \mathbb E Z Y = (\mathbb E Z) (\mathbb E Y) = (1-\pi)\lambda \>,
$$
and, since $Z^2 = Z$ (as $Z$ takes values in $\{0,1\}$),
$$
\mathrm{Var}(X) = \mathbb E X^2 - \mu^2 = (\mathbb E Z)(\mathbb E Y^2) - \mu^2 = (1-\pi)(\lambda^2 + \lambda) - \mu^2 = \mu + \frac{\pi}{1-\pi}\mu^2 \> .
$$
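As a sanity check, here is a minimal Monte Carlo sketch of this construction in Python (the names `pi_` and `lam` and the parameter values are mine, purely for illustration):

```python
import numpy as np

# Simulate X = Z * Y with Z ~ Ber(1 - pi) and Y ~ Poi(lambda), independent.
rng = np.random.default_rng(0)
pi_, lam, n = 0.3, 2.5, 1_000_000  # illustrative parameter choices

z = rng.binomial(1, 1 - pi_, size=n)  # Z ~ Bernoulli(1 - pi)
y = rng.poisson(lam, size=n)          # Y ~ Poisson(lambda)
x = z * y                             # X follows the zero-inflated law f

mu = (1 - pi_) * lam
print(x.mean(), mu)                           # sample mean vs. (1 - pi) * lambda
print(x.var(), mu + pi_ / (1 - pi_) * mu**2)  # sample variance vs. the formula above
```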
Method 2: Direct calculation.
The mean is easily obtained by a small trick: pull one factor of $\lambda$ out and re-index the sum.
$$
\mu = \sum_{k=1}^\infty (1-\pi) k e^{-\lambda} \frac{\lambda^k}{k!} = (1-\pi) \lambda e^{-\lambda} \sum_{j=0}^\infty \frac{\lambda^j}{j!} = (1-\pi) \lambda \> .
$$
A similar trick works for the second moment:
$$
\mathbb E X^2 = (1-\pi) \sum_{k=1}^\infty k^2 e^{-\lambda} \frac{\lambda^k}{k!} = (1-\pi)\lambda e^{-\lambda} \sum_{j=0}^\infty (j+1) \frac{\lambda^j}{j!} = (1-\pi)(\lambda^2 + \lambda) \>,
$$
from which point we can proceed with the algebra as in the first method.
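These sums are also easy to confirm numerically by truncating them; a small sketch (the cutoff 100 and the parameter values are arbitrary choices of mine):

```python
import math

pi_, lam = 0.3, 2.5  # illustrative parameters

def pois_pmf(k):
    """Poisson(lambda) pmf at k."""
    return math.exp(-lam) * lam**k / math.factorial(k)

mean = (1 - pi_) * sum(k * pois_pmf(k) for k in range(1, 100))
second = (1 - pi_) * sum(k**2 * pois_pmf(k) for k in range(1, 100))
print(mean, (1 - pi_) * lam)               # both ≈ (1 - pi) * lambda
print(second, (1 - pi_) * (lam**2 + lam))  # both ≈ (1 - pi) * (lambda^2 + lambda)
```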
Addendum: This details a couple of tricks used in the calculations above.
First recall that $\sum_{k=0}^\infty \frac{\lambda^k}{k!} = e^\lambda$.
Second, note that
$$
\sum_{k=0}^\infty k \frac{\lambda^k}{k!} = \sum_{k=1}^\infty k \frac{\lambda^k}{k!} = \sum_{k=1}^\infty \frac{\lambda^k}{(k-1)!} = \sum_{k=1}^\infty \frac{\lambda \cdot \lambda^{k-1}}{(k-1)!} = \lambda \sum_{j=0}^\infty \frac{\lambda^j}{j!} = \lambda e^{\lambda} \>,
$$
where the substitution $j = k-1$ was made in the second-to-last step.
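Combining these two facts gives
$$
\sum_{j=0}^\infty (j+1) \frac{\lambda^j}{j!} = \lambda e^\lambda + e^\lambda = (\lambda + 1) e^\lambda \>,
$$
which is exactly the sum that appears in the second-moment calculation of Method 2.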
In general, for $X \sim \mathrm{Poi}(\lambda)$, it is easy to calculate the factorial moments $\mathbb E X^{(n)} = \mathbb E X(X-1)(X-2)\cdots(X-n+1)$, since
$$
e^\lambda \mathbb E X^{(n)} = \sum_{k=n}^\infty k(k-1)\cdots(k-n+1) \frac{\lambda^k}{k!} = \sum_{k=n}^\infty \frac{\lambda^n \lambda^{k-n}}{(k-n)!} = \lambda^n \sum_{j=0}^\infty \frac{\lambda^j}{j!} = \lambda^n e^\lambda \>,
$$
so $\mathbb E X^{(n)} = \lambda^n$. We get to "skip" to the $n$th index for the start of the sum in the first equality since for any $0 \leq k < n$, $k(k-1)\cdots(k-n+1) = 0$ since exactly one term in the product is zero.
Presumably you mean that the sample average is a sufficient statistic for $\lambda$.
If $X_1, X_2$ are independent draws from a Poisson distribution with parameter $\lambda$, then the answer is "yes".
Check here: http://en.wikipedia.org/wiki/Sufficient_statistic
Using the Fisher–Neyman factorization theorem, with $T(x) = (x_1 + x_2)/2$ the sample average:
$$
P\{X = x\} = P\{X_1 = x_1, X_2 = x_2\} = P\{X_1 = x_1\} P\{X_2 = x_2\} = \frac{\lambda^{x_1} e^{-\lambda}}{x_1!} \cdot \frac{\lambda^{x_2} e^{-\lambda}}{x_2!} = \frac{\lambda^{x_1 + x_2} e^{-2\lambda}}{x_1! \, x_2!} = g_\lambda(T(x)) \, h(x) \>,
$$
where $h(x) = (x_1! \, x_2!)^{-1}$ and $g_\lambda(T(x)) = \lambda^{2 T(x)} e^{-2\lambda}$ depends on $x$ only through $T(x)$.
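A quick numerical illustration of this factorization (the helper names `joint`, `T`, `g`, `h` and the parameter value are hypothetical choices of mine):

```python
import math

lam = 1.7  # illustrative parameter

def pois_pmf(k):
    return lam**k * math.exp(-lam) / math.factorial(k)

joint = lambda x1, x2: pois_pmf(x1) * pois_pmf(x2)  # by independence
T = lambda x1, x2: (x1 + x2) / 2                    # sample average
g = lambda t: lam**(2 * t) * math.exp(-2 * lam)     # depends on x only via T(x)
h = lambda x1, x2: 1 / (math.factorial(x1) * math.factorial(x2))

for x1, x2 in [(0, 3), (2, 2), (5, 1)]:
    print(joint(x1, x2), g(T(x1, x2)) * h(x1, x2))  # the two values agree
```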
Best Answer
The last step simply uses the fact that for every real number $t$, $$\exp(t)=\sum_{i=0}^\infty\frac{t^i}{i!}.$$ Here $t=\lambda s$. (The introduction of the factor $\frac{e^{\lambda s}}{e^{\lambda s}}$ does not seem to be needed here.)
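A one-line numerical check of this series (the cutoff 50 and the values of $\lambda$ and $s$ are arbitrary):

```python
import math

lam, s = 2.0, 0.3
t = lam * s
print(sum(t**i / math.factorial(i) for i in range(50)), math.exp(t))  # ≈ equal
```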