Method 0: The lazy statistician.
Note that for $y \neq 0$ we have $f(y) = (1-\pi) p_y$, where $p_y$ is the probability that a Poisson random variable takes the value $y$. Since the term corresponding to $y = 0$ does not affect the expected value, our knowledge of the Poisson moments and the linearity of expectation immediately tell us that
$$
\mu = (1-\pi) \lambda
$$
and
$$
\mathbb E Y^2 = (1-\pi) (\lambda^2 + \lambda) \> .
$$
A little algebra, together with the identity $\mathrm{Var}(Y) = \mathbb E Y^2 - \mu^2$, yields the result.
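As a quick numerical sanity check of this method (the parameter values below are arbitrary, chosen only for illustration), one can sum the zero-inflated pmf directly:

```python
import math

# Arbitrary illustrative parameters for the zero-inflated Poisson.
pi_, lam = 0.3, 2.5

def zip_pmf(y):
    """f(0) = pi + (1 - pi) e^{-lam}; f(y) = (1 - pi) * Poisson pmf at y otherwise."""
    p = (1 - pi_) * math.exp(-lam) * lam ** y / math.factorial(y)
    return p + pi_ if y == 0 else p

# Truncate the series; the Poisson tail beyond k = 100 is negligible here.
mu = sum(y * zip_pmf(y) for y in range(101))
second = sum(y ** 2 * zip_pmf(y) for y in range(101))
var = second - mu ** 2

assert abs(mu - (1 - pi_) * lam) < 1e-9
assert abs(var - (mu + pi_ / (1 - pi_) * mu ** 2)) < 1e-9
```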
Method 1: A probabilistic argument.
It's often helpful to have a simple probabilistic model for how a distribution arises. Let $Z \sim \mathrm{Ber}(1-\pi)$ and $Y \sim \mathrm{Poi}(\lambda)$ be independent random variables. Define
$$
X = Z \cdot Y \>.
$$
Then, it is easy to see that $X$ has the desired distribution $f$. To check this, note that $\renewcommand{\Pr}{\mathbb P}\Pr(X = 0) = \Pr(Z=0) + \Pr(Z=1, Y=0) = \pi + (1-\pi) e^{-\lambda}$ by independence. Similarly $\Pr(X = k) = \Pr(Z=1, Y=k)$ for $k \neq 0$.
From this, the rest is easy, since by the independence of $Z$ and $Y$,
$$
\mu = \mathbb E X = \mathbb E Z Y = (\mathbb E Z) (\mathbb E Y) = (1-\pi)\lambda \>,
$$
and, since $Z^2 = Z$ for a Bernoulli random variable,
$$
\mathrm{Var}(X) = \mathbb E X^2 - \mu^2 = (\mathbb E Z)(\mathbb E Y^2) - \mu^2 = (1-\pi)(\lambda^2 + \lambda) - \mu^2 = \mu + \frac{\pi}{1-\pi}\mu^2 \> .
$$
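The representation $X = Z \cdot Y$ also lends itself to a direct Monte Carlo check; here is a rough sketch with arbitrary illustrative parameters:

```python
import math
import random

random.seed(0)
pi_, lam, n = 0.3, 2.5, 200_000   # arbitrary illustrative values

def poisson_sample(lam):
    """Inverse-transform sampling for a Poisson variate; fine for moderate lam."""
    u, k = random.random(), 0
    p = c = math.exp(-lam)
    while u > c:
        k += 1
        p *= lam / k
        c += p
    return k

# X = Z * Y with Z ~ Ber(1 - pi) and Y ~ Poisson(lam), independent.
xs = [(random.random() >= pi_) * poisson_sample(lam) for _ in range(n)]
mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
```

Both `mean` and `var` should land close to $(1-\pi)\lambda$ and $\mu + \frac{\pi}{1-\pi}\mu^2$ respectively, up to Monte Carlo noise.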
Method 2: Direct calculation.
The mean is easily obtained by a slight trick of pulling one $\lambda$ out and rewriting the limits of the sum.
$$
\mu = \sum_{k=1}^\infty (1-\pi) k e^{-\lambda} \frac{\lambda^k}{k!} = (1-\pi) \lambda e^{-\lambda} \sum_{j=0}^\infty \frac{\lambda^j}{j!} = (1-\pi) \lambda \> .
$$
A similar trick works for the second moment:
$$
\mathbb E X^2 = (1-\pi) \sum_{k=1}^\infty k^2 e^{-\lambda} \frac{\lambda^k}{k!} = (1-\pi)\lambda e^{-\lambda} \sum_{j=0}^\infty (j+1) \frac{\lambda^j}{j!} = (1-\pi)(\lambda^2 + \lambda) \>,
$$
from which point we can proceed with the algebra as in the first method.
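Both series can also be verified numerically by truncation (the parameter values are again arbitrary):

```python
import math

pi_, lam = 0.3, 2.5   # arbitrary illustrative values
K = 101               # truncation point; the Poisson tail beyond this is negligible

mu = sum((1 - pi_) * k * math.exp(-lam) * lam ** k / math.factorial(k)
         for k in range(1, K))
ex2 = sum((1 - pi_) * k ** 2 * math.exp(-lam) * lam ** k / math.factorial(k)
          for k in range(1, K))

assert abs(mu - (1 - pi_) * lam) < 1e-9
assert abs(ex2 - (1 - pi_) * (lam ** 2 + lam)) < 1e-9
```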
Addendum: This section details a couple of tricks used in the calculations above.
First recall that $\sum_{k=0}^\infty \frac{\lambda^k}{k!} = e^\lambda$.
Second, note that
$$
\sum_{k=0}^\infty k \frac{\lambda^k}{k!} = \sum_{k=1}^\infty k \frac{\lambda^k}{k!} = \sum_{k=1}^\infty \frac{\lambda^k}{(k-1)!} = \sum_{k=1}^\infty \frac{\lambda \cdot \lambda^{k-1}}{(k-1)!} = \lambda \sum_{j=0}^\infty \frac{\lambda^j}{j!} = \lambda e^{\lambda} \>,
$$
where the substitution $j = k-1$ was made in the second-to-last step.
In general, for the Poisson, it is easy to calculate the factorial moments $\mathbb E X^{(n)} = \mathbb E X(X-1)(X-2)\cdots(X-n+1)$ since
$$
e^\lambda \mathbb E X^{(n)} = \sum_{k=n}^\infty k(k-1)\cdots(k-n+1) \frac{\lambda^k}{k!} = \sum_{k=n}^\infty \frac{\lambda^n \lambda^{k-n}}{(k-n)!} = \lambda^n \sum_{j=0}^\infty \frac{\lambda^j}{j!} = \lambda^n e^\lambda \>,
$$
so $\mathbb E X^{(n)} = \lambda^n$. We get to "skip" to the $n$th index for the start of the sum in the first equality because, for any $0 \leq k < n$, the product $k(k-1)\cdots(k-n+1)$ contains a zero factor and therefore vanishes.
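A small numerical sanity check of $\mathbb E X^{(n)} = \lambda^n$ (the value of $\lambda$ is arbitrary):

```python
import math

lam, K = 2.5, 101   # arbitrary lambda; truncate the series at K

def falling_factorial(k, n):
    """k (k-1) ... (k-n+1); equals 0 whenever 0 <= k < n."""
    out = 1
    for i in range(n):
        out *= k - i
    return out

def factorial_moment(n):
    return sum(falling_factorial(k, n) * math.exp(-lam) * lam ** k / math.factorial(k)
               for k in range(K))

for n in range(1, 5):
    assert abs(factorial_moment(n) - lam ** n) < 1e-8
```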
Since a Poisson process models 'objects' arriving randomly and independently in time or space, the conditional distribution of an event's arrival time, given that the event occurred within some interval, is uniform over that interval. I believe that's where the proportionality part comes in.
Let me give you another scenario.
Suppose a grocery store opens at 8:00am daily, and customers arrive at the store according to a Poisson process. Mike, the shift manager, sees a customer in aisle three at 8:03am and says to himself, "I wonder what the probability is that they entered the store in the last three minutes?"
To answer Mike's question for him, recall: "the probability of that event occurring in any sub-interval, given you know it has occurred already, is uniform."
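The uniform-conditional-arrival property behind Mike's question can be checked by simulation. The sketch below assumes a hypothetical arrival rate of one customer per minute over the 8:00-8:03 window and conditions on exactly one arrival in that window:

```python
import random

random.seed(1)
rate, T = 1.0, 3.0   # hypothetical: one arrival per minute, a 3-minute window
times = []
for _ in range(200_000):
    # Build arrival times on [0, T] from exponential inter-arrival gaps.
    t, arrivals = 0.0, []
    while True:
        t += random.expovariate(rate)
        if t > T:
            break
        arrivals.append(t)
    if len(arrivals) == 1:   # condition on exactly one arrival in the window
        times.append(arrivals[0])

# If the conditional arrival time is Uniform(0, T), its mean is T/2 and it
# lands in the first half of the window about half the time.
mean_time = sum(times) / len(times)
first_half = sum(t < T / 2 for t in times) / len(times)
```

Up to Monte Carlo noise, `mean_time` comes out near $T/2 = 1.5$ and `first_half` near $0.5$, consistent with a uniform conditional arrival time.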
In general, the Poisson distribution is defined for parameter $\lambda > 0$. For $\lambda < 1$, the smaller $\lambda$ gets, the more mass accumulates around $0$, i.e. $\lim_{\lambda \rightarrow 0} P(K=0\mid\lambda) = 1$ and $\lim_{\lambda \rightarrow 0} P(K=k\mid\lambda) = 0$ for $k>0$ (Said, 1958). Plotting the Poisson pmf for $\lambda = 1/10^0, \ldots, 1/10^{20}$ makes this collapse of mass onto $0$ easy to see.
As noted by you and @ChristophHanck, any number to the zeroth power is one, and the common convention is that $0^0 = 1$. On the other hand, $0^x = 0$ for non-zero $x$, and dividing zero by anything non-zero gives zero. So for $\lambda = 0$ the Poisson pmf simplifies to
$$ \frac{e^{-0} \, 0^k}{k!} = \frac{0^k}{k!} \>, $$
so it is the degenerate distribution with all point mass at zero:
$$ f(k) = \begin{cases} 1 & \text{if }k=0, \\[6pt] 0 & \text {if }k>0.\end{cases} $$
The mean and variance of a Poisson distribution are both equal to $\lambda$; here only zero has non-zero probability, so the expected value is obviously zero and there is no variability (the variance is zero). Also, R has no problem with $\lambda$ defined as a non-negative (rather than strictly positive) value.
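As a stand-in for the R check alluded to above, here is the same computation sketched in Python, whose `**` operator also follows the $0^0 = 1$ convention:

```python
import math

def poisson_pmf(k, lam):
    # 0 ** 0 == 1 in Python and exp(-0) == 1, so lam = 0 needs no special case
    return math.exp(-lam) * lam ** k / math.factorial(k)

print([poisson_pmf(k, 0) for k in range(4)])  # → [1.0, 0.0, 0.0, 0.0]
```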
So the behavior of the Poisson pmf at $\lambda=0$ is coherent with its limiting behavior; moreover, there is no problem with evaluating the Poisson pmf at this value, though it yields a degenerate distribution.
Said, A. S. (1958). Some properties of the Poisson distribution. AIChE Journal, 4(3), 290–292.