Suppose we have a collection of samples $X_1,X_2,\dots,X_n$ that have been drawn from some fixed probability distribution (Poisson, exponential, normal, etc.) that depends on some unknown parameter $\mu$.
We often want to use the samples to estimate the value of $\mu$. To this end we come up with functions that take in $X_1,X_2,\dots,X_n$ and return an estimate of $\mu$. Any such function is called an estimator.
Now, some estimators are better than others. For example,
$$f(X_1,X_2,\dots,X_n) = 0$$
is clearly a terrible estimator (unless it so happens that $\mu=0$), given that it doesn't use any information provided by the samples.
A basic criterion for an estimator to be any good is that it is unbiased, that is, that on average it gets the value of $\mu$ correct. Formally, an estimator $f$ is unbiased iff
$$E[f(X_1,X_2,\dots,X_n)] =\mu.$$
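If it helps to see the definition in action, here is a minimal simulation sketch (using NumPy, with a Poisson distribution and with the parameter, sample size, number of repetitions, and helper names `zero_estimator` and `sample_mean` all chosen purely for illustration) that approximates $E[f(X_1,\dots,X_n)]$ by averaging an estimator over many simulated samples; the zero estimator above lands nowhere near $\mu$, while the sample average discussed next does.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, n, trials = 4.0, 50, 100_000      # illustrative parameter, sample size, repetitions

def zero_estimator(x):
    return 0.0                        # ignores the data entirely

def sample_mean(x):
    return x.mean()                   # the sample average

# Approximate E[f(X_1,...,X_n)] by averaging f over many simulated samples
samples = rng.poisson(mu, size=(trials, n))
print(np.mean([zero_estimator(s) for s in samples]))   # ~0, nowhere near mu
print(np.mean([sample_mean(s) for s in samples]))      # ~mu
```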
In your case the samples are drawn from a Poisson distribution whose parameter $\lambda$ plays the role of $\mu$, and the estimator is the sample average,
$$f(X_1,X_2,\dots,X_n)=\frac{1}{n}\sum_{i=1}^n X_i,$$
which is unbiased since on average it guesses the unknown parameter $\lambda$ correctly. An example of a biased estimator would be
$$f(X_1,X_2,\dots,X_n)=1+\frac{1}{n}\sum_{i=1}^n X_i,$$
since $E[f(X_1,X_2,\dots,X_n)] = 1+\lambda$. On average it gets the value of $\lambda$ wrong by $1$; it has a bias of $1$.
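As a rough sanity check of these two claims, here is a hedged Monte Carlo sketch (NumPy; the values of $\lambda$, $n$, and the number of repetitions are arbitrary illustrative choices) estimating the bias of the sample mean and of the shifted estimator above.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n, trials = 2.5, 30, 200_000     # illustrative lambda, sample size, repetitions

samples = rng.poisson(lam, size=(trials, n))
means = samples.mean(axis=1)          # sample average of each simulated dataset

print("bias of the mean    :", means.mean() - lam)        # ~0 (unbiased)
print("bias of 1 + the mean:", (1 + means).mean() - lam)  # ~1 (biased by 1)
```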
Returning to the sample average, suppose that the samples are drawn from any distribution (not necessarily Poisson) that has an expected value (or mean) of $\mu$. Then
$$E[f(X_1,X_2,\dots,X_n)] = E\left[\frac{1}{n}\sum_{i=1}^n X_i\right] = \frac{1}{n}\sum_{i=1}^n E[X_i] = \frac{1}{n}\sum_{i=1}^n \mu = \mu.$$
So in general, the sample average is an unbiased estimator of the expected value of the distribution from which the samples are drawn.
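Since the argument only uses linearity of expectation, nothing about it is specific to the Poisson case; a quick sketch with exponential samples (again with arbitrary illustrative values for the mean, $n$, and the number of repetitions) shows the same behaviour.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, n, trials = 3.0, 40, 200_000      # illustrative mean, sample size, repetitions

# Exponential samples with mean mu (not Poisson): the sample mean is still unbiased
samples = rng.exponential(scale=mu, size=(trials, n))
print(samples.mean(axis=1).mean())    # ~mu
```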
Best Answer
Recall that for a Poisson random variable $X$ with parameter $\lambda$ one has $E(X)=\lambda$ and $E(X^2)=\lambda^2+\lambda$, hence $\mathrm{var}(X)=\lambda$. Thus, if $(Y_k)$ is an i.i.d. Poisson($\lambda$) sample and $\bar Y=\frac1n\sum\limits_{k=1}^nY_k$, then $\bar Y$ is an unbiased estimator of $\lambda$ since $E(\bar Y)=\lambda$.
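These moment identities can be spot-checked numerically; the sketch below is only a simulation check, using NumPy with an arbitrary $\lambda$ and a large simulated sample.

```python
import numpy as np

rng = np.random.default_rng(3)
lam = 1.7                                 # illustrative lambda
x = rng.poisson(lam, size=1_000_000)

print(x.mean(), "vs", lam)                # E(X)    ~ lambda
print((x**2).mean(), "vs", lam**2 + lam)  # E(X^2)  ~ lambda^2 + lambda
print(x.var(), "vs", lam)                 # var(X)  ~ lambda
```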
Likewise, $\mathrm{var}(\bar Y)=\frac1{n}\mathrm{var}(Y_1)=\frac{\lambda}{n}$, hence $E(\bar Y^2)=\mathrm{var}(\bar Y)+E(\bar Y)^2=\frac{\lambda}{n}+\lambda^2$. In particular $\lambda^2=E(\bar Y^2)-\frac1n E(\bar Y)$, so the target $\theta=3\lambda+\lambda^2$ can be written as $$\theta=3E(\bar Y)+E(\bar Y^2)-\frac1n E(\bar Y)=E(\bar Y^2)+\left(3-\frac1n\right)E(\bar Y),$$ hence an unbiased estimator of $\theta$ is $$ \Theta=\bar Y^2+\left(3-\frac1n\right)\bar Y. $$
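To make the bias correction concrete, here is a hedged simulation sketch (NumPy; $\lambda$, $n$, and the number of repetitions are arbitrary illustrative choices) comparing the naive plug-in $3\bar Y+\bar Y^2$, which overshoots $\theta$ by $\lambda/n$, with the corrected estimator $\Theta$ above.

```python
import numpy as np

rng = np.random.default_rng(4)
lam, n, trials = 2.0, 10, 500_000     # illustrative lambda, sample size, repetitions
theta = 3 * lam + lam**2              # target quantity theta = 3*lambda + lambda^2

ybar = rng.poisson(lam, size=(trials, n)).mean(axis=1)

naive    = 3 * ybar + ybar**2             # plug-in estimator, biased upward by lambda/n
adjusted = ybar**2 + (3 - 1 / n) * ybar   # Theta from above

print("theta           :", theta)
print("E[naive]  approx:", naive.mean())      # ~ theta + lambda/n
print("E[Theta]  approx:", adjusted.mean())   # ~ theta
```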