For the first question, the best unbiased estimator is $\chi\left(\sum_i x_i = n\right)$, as you wrote, because the joint probability function for the $n$ observations is
$$
\mathbb{P}\left( X_1=x_1, \ldots, X_n=x_n \right)=p^{x_1}(1-p)^{1-x_1} \cdots p^{x_n} (1-p)^{1-x_n} = p^{\sum_i x_i} (1-p)^{n - \sum_i x_i}
$$
Thus it factors into $(p^n)^{\chi\left(\sum_i x_i = n\right)} \cdot \left( p^{\sum_i x_i} (1-p)^{n - \sum_i x_i} \right)^{1-\chi\left(\sum_i x_i = n\right)}$.
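As a quick sanity check (not part of the original answer), a short Monte Carlo simulation confirms that the indicator $\chi\left(\sum_i x_i = n\right)$ is unbiased for $p^n$; the function names and the choices $p = 0.7$, $n = 5$ are illustrative:

```python
import random

random.seed(0)

def indicator_estimator(xs):
    # chi(sum_i x_i = n): 1 exactly when every Bernoulli draw is a success
    return 1.0 if sum(xs) == len(xs) else 0.0

p, n, trials = 0.7, 5, 200_000
est = sum(
    indicator_estimator([1 if random.random() < p else 0 for _ in range(n)])
    for _ in range(trials)
) / trials

# E[chi] = P(all n draws are 1) = p^n, so the two values should be close
print(est, p ** n)
```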
For the second question, $\bar{x}=\frac{1}{n} \sum_{i=1}^n x_i$ is the best unbiased estimator for $\mu$. For a sample from $N(\mu, 1)$, the factor of the likelihood that depends on this statistic is $\exp\left(-\frac{n}{2} \left( \mu - \bar{x} \right)^2 \right)$.
The variance of $\bar{x}$ is $\mathrm{Var}(\bar{x}) = \frac{1}{n^2} \sum_i \mathrm{Var}(x_i) = \frac{1}{n^2} \cdot n = \frac{1}{n}$, hence the Fisher information is $\mathcal{I}(\mu) = \frac{1}{\mathrm{Var}(\bar{x})} = n$.
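The claim $\mathrm{Var}(\bar{x}) = \frac{1}{n}$ can be checked empirically; a minimal sketch, assuming unit-variance normal draws (the parameter values are chosen for illustration):

```python
import random
import statistics

random.seed(1)

mu, n, trials = 2.0, 10, 100_000
# sample means of n draws from N(mu, 1)
means = [statistics.fmean(random.gauss(mu, 1.0) for _ in range(n))
         for _ in range(trials)]
var_mean = statistics.variance(means)

# empirical Var(x-bar) vs. the theoretical value 1/n
print(var_mean, 1 / n)
```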
For the third question, the joint density for the sample:
$$
f = 2^n \chi_{\theta-\frac{1}{4} \le \min(x_1,\ldots, x_n)} \chi_{\theta+\frac{1}{4} \ge \max(x_1,\ldots,x_n)} = 2^n \chi_{ \max(x_1,\ldots,x_n) -\frac{1}{4} \le \theta \le \min(x_1, \ldots,x_n) + \frac{1}{4} }
$$
Thus $\theta$ is determined by a two-component statistic consisting of the minimum and the maximum of the sample, suitably shifted, and $\theta$ can lie anywhere in between. The midpoint of these two bounds, $\frac{1}{2}\left(\min_i x_i + \max_i x_i\right)$, is a natural choice of estimator.
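A simulation sketch of this midrange estimator (the names and parameter values here are illustrative, not from the original):

```python
import random
import statistics

random.seed(2)

theta, n, trials = 3.0, 20, 50_000

def midrange(xs):
    # mean of the shifted bounds: ((max - 1/4) + (min + 1/4)) / 2 = (min + max) / 2
    return (min(xs) + max(xs)) / 2

estimates = [
    midrange([random.uniform(theta - 0.25, theta + 0.25) for _ in range(n)])
    for _ in range(trials)
]

# by symmetry the midrange is unbiased, so the average estimate should be near theta
print(statistics.fmean(estimates))
```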
Suppose we have a collection of samples $X_1,X_2,\dots,X_n$ that have been drawn from some fixed probability distribution (Poisson, exponential, normal, etc.) that depends on some unknown parameter $\mu$.
We often want to use the samples to estimate the value of $\mu$. To this end we come up with functions that take in $X_1,X_2,\dots,X_n$ and return an estimate of $\mu$. Any such function is called an estimator.
Now, some estimators are better or worse than others. For example,
$$f(X_1,X_2,\dots,X_n) = 0$$
is clearly a terrible estimator (unless it so happens that $\mu=0$), given that it doesn't use any information provided by the samples.
A basic criterion for an estimator to be any good is that it is unbiased, that is, that on average it gets the value of $\mu$ correct. Formally, an estimator $f$ is unbiased iff
$$E[f(X_1,X_2,\dots,X_n)] =\mu.$$
In your case, the estimator is the sample average, that is,
$$f(X_1,X_2,\dots,X_n)=\frac{1}{n}\sum_{i=1}^n X_i,$$
and it is unbiased since on average it guesses the unknown parameter, $\lambda$, correctly. An example of a biased estimator would be
$$f(X_1,X_2,\dots,X_n)=1+\frac{1}{n}\sum_{i=1}^n X_i,$$
since $E[f(X_1,X_2,\dots,X_n)] = 1+\lambda$. On average it gets the value of $\lambda$ wrong by $1$; it has a bias of $1$.
Returning to the sample average, suppose that the samples are drawn from any distribution (not necessarily Poisson) which has an expected value (or mean) of $\mu$. Then
$$E[f(X_1,X_2,\dots,X_n)] = E\left[\frac{1}{n}\sum_{i=1}^n X_i\right] = \frac{1}{n}\sum_{i=1}^n E[X_i] = \frac{1}{n}\sum_{i=1}^n \mu = \mu.$$
So in general, the sample average is an unbiased estimator of the expected value of the distribution from which the samples are drawn.
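This generality is easy to illustrate numerically; a small sketch using exponential draws (the distribution and the parameter values are arbitrary choices here, since any distribution with mean $\mu$ would do):

```python
import random
import statistics

random.seed(3)

mu, n, trials = 2.0, 8, 100_000
# exponential samples with mean mu (rate 1/mu)
avgs = [statistics.fmean(random.expovariate(1 / mu) for _ in range(n))
        for _ in range(trials)]

# the average of the sample averages should be close to mu
print(statistics.fmean(avgs))
```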
Best Answer
The exercise is correct. As Clarinetest says in a comment, your error seems to be with $\frac{1}{n^2}\mathbb{E}_\theta\left[\left(\sum_{i=1}^n X_i\right)^2\right]$ i.e. with $\mathbb E[\bar{X}^2]$.
You should have $\mathbb E[X_1 \mid \lambda]=\lambda$ and $\operatorname{Var}(X_1 \mid \lambda) = \lambda$ for a Poisson distribution
so $\mathbb E[\bar{X} \mid \lambda]=\lambda$ and $\operatorname{Var}(\bar{X} \mid \lambda) = \frac1n\lambda$
leading to $\mathbb E[\bar{X}^2 \mid \lambda]=\lambda^2+ \frac1n\lambda$
and thus $\mathbb E[\bar{X}^2 - \frac1n\bar{X} \mid \lambda]=\lambda^2$, so $\bar{X}^2 - \frac1n\bar{X}$ is an unbiased estimator of $\lambda^2$.
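A Monte Carlo check of this identity (the Poisson sampler below uses Knuth's product-of-uniforms method, since the Python standard library has no Poisson generator; the parameter values are illustrative):

```python
import math
import random
import statistics

random.seed(4)

lam, n, trials = 3.0, 10, 100_000

def poisson(rate):
    # Knuth's method: count uniforms until their product drops below e^{-rate}
    limit, k, prod = math.exp(-rate), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

vals = []
for _ in range(trials):
    xbar = statistics.fmean(poisson(lam) for _ in range(n))
    vals.append(xbar * xbar - xbar / n)

# E[xbar^2 - xbar/n] = lambda^2, so the average should be near lam**2 = 9.0
print(statistics.fmean(vals), lam ** 2)
```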