Let's assume you have a population of size $N$ with values $x_1,\ldots,x_N$, mean $\bar x=\frac{1}{N}\sum_{i=1}^N x_i$ and variance $\sigma^2=\frac{1}{N}\sum_{i=1}^N(x_i-\bar x)^2$. (Note that I use lower case $x_i$ to indicate these are not random, but fixed values.)
Now, let's take a random sample $Y_1,\ldots,Y_n$ of $n$ elements (without replacement), with all such subsets equally likely. (Now I use capital $Y$ to indicate these are random.)
Now, $\bar Y=\frac{1}{n}\sum_{i=1}^n Y_i$ and let $V=\sum_{i=1}^n (Y_i-\bar Y)^2$ so that the sample variance would be $V/n$ (like the expression for $\sigma^2$). If we write $V$ out in terms of $(Y_i-\bar x)^2$ and $(Y_i-\bar x)(Y_j-\bar x)$, we get
$$
\begin{split}
V
=& \sum_{i=1}^n (Y_i-\bar Y)^2
= \sum_{i=1}^n \left[(Y_i-\bar x)-(\bar Y-\bar x)\right]^2 \\
=& \sum_{i=1}^n \left[(Y_i-\bar x)^2-2(Y_i-\bar x)(\bar Y-\bar x)+(\bar Y-\bar x)^2 \right] \\
=& \sum_{i=1}^n (Y_i-\bar x)^2 - n(\bar Y-\bar x)^2 \\
=& \left(1-\frac{1}{n} \right) \sum_{i=1}^n (Y_i-\bar x)^2
-\frac{2}{n}\sum_{1\le i<j\le n} (Y_i-\bar x)(Y_j-\bar x)
\end{split}
$$
where in the last step we use that
$$
\left(\sum_{i=1}^n (Y_i-\bar x)\right)^2
= \sum_{i=1}^n (Y_i-\bar x)^2 + 2\sum_{1\le i<j\le n} (Y_i-\bar x)(Y_j-\bar x).
$$
We know that $\text{E}[(Y_i-\bar x)^2]=\sigma^2$: this is just the average of $(x_k-\bar x)^2$ over the population values $x_1,\ldots,x_N$, since each $Y_i$ is uniformly distributed on them.
For $i<j$, the pair $(Y_i,Y_j)$ is equally likely to be any pair of distinct population elements, so $\text{E}[(Y_i-\bar x)(Y_j-\bar x)]$ is the average of $(x_k-\bar x)(x_l-\bar x)$ over all $1\le k<l\le N$. Since $\sum_{k=1}^N (x_k-\bar x)=0$, we get
$$
0 = \sum_{1\le k,l\le N} (x_k-\bar x)(x_l-\bar x)
= \sum_{k=1}^N (x_k-\bar x)^2 + 2\sum_{1\le k<l\le N} (x_k-\bar x)(x_l-\bar x),
$$
so $\sum_{1\le k<l\le N}(x_k-\bar x)(x_l-\bar x)=-N\sigma^2/2$, and dividing by the $N(N-1)/2$ pairs gives, for $i<j$,
$$
\text{E}\left[(Y_i-\bar x)(Y_j-\bar x)\right]
= -\frac{\sigma^2}{N-1}.
$$
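This pairwise identity is easy to check numerically by brute force (a sketch with an arbitrary made-up toy population):

```python
import itertools

# Toy population (made-up values) to check the pairwise identity numerically.
x = [2.0, 3.0, 5.0, 7.0, 11.0]
N = len(x)
xbar = sum(x) / N
sigma2 = sum((xi - xbar) ** 2 for xi in x) / N  # population variance

# For sampling without replacement, (Y_i, Y_j) with i != j is uniform over
# ordered pairs of distinct population elements, so the expectation is the
# average of (x_k - xbar)(x_l - xbar) over all such pairs.
pairs = [(x[k] - xbar) * (x[l] - xbar)
         for k, l in itertools.permutations(range(N), 2)]
avg = sum(pairs) / len(pairs)

print(avg, -sigma2 / (N - 1))  # the two values agree
```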
Combining these results, and noting there are $n(n-1)/2$ pairs with $i<j$, we get
$$
\text{E}[V] = (n-1)\sigma^2 + \frac{n-1}{N-1}\sigma^2
= \frac{(n-1)N}{N-1}\sigma^2
$$
giving an unbiased estimator
$$
\hat\sigma^2 = \frac{N-1}{N(n-1)}V
= \frac{N-1}{N(n-1)} \sum_{i=1}^n (Y_i-\bar Y)^2.
$$
As $N\rightarrow\infty$, you get the familiar $s^2$ estimator, which corresponds to independent sampling from a distribution, while $n=N$ gives exactly $\sigma^2$, as it should when the $x_i$ are known for the whole population.
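Since all $\binom{N}{n}$ subsets are equally likely, the unbiasedness of $\hat\sigma^2$ can be verified exactly by enumerating them (a sketch; the population values and sample size are arbitrary):

```python
import itertools

# Toy fixed population (made-up values); any values work.
x = [1.0, 4.0, 4.0, 6.0, 9.0, 12.0]
N = len(x)
xbar = sum(x) / N
sigma2 = sum((xi - xbar) ** 2 for xi in x) / N  # population variance
n = 3  # sample size

# Exhaustive expectation of sigma_hat^2 over all equally likely size-n subsets.
total = 0.0
count = 0
for sample in itertools.combinations(x, n):
    ybar = sum(sample) / n
    V = sum((y - ybar) ** 2 for y in sample)
    total += (N - 1) / (N * (n - 1)) * V
    count += 1

print(total / count, sigma2)  # E[sigma_hat^2] equals sigma^2
```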
(a)
$W=X_1 (1-X_2)$
$E(W)=E(X_1 (1-X_2))=E(X_1)\,E(1-X_2)=p(1-p)$, by independence of $X_1$ and $X_2$, so $W$ is unbiased for $p(1-p)$.
(b)
$T=\sum_{i=1}^{n} X_i$ is a complete and sufficient statistic, so by Lehmann–Scheffé $E(X_1 (1-X_2)\mid T)$ is the UMVUE for $p(1-p)$.
$\begin{array}{c|c|c}
X_1 & X_2 & X_1 (1-X_2) \\ \hline
0 & 0 & 0 \\
0 & 1 & 0 \\
1 & 0 & 1 \\
1 & 1 & 0 \\
\end{array}$
so $X_1 (1-X_2)$ is a Bernoulli random variable.
$P(X_1 (1-X_2)=1\mid T=t)=P(X_1=1,X_2=0\mid \sum_i X_i=t)$
$=\frac{P(X_1=1,X_2=0,\sum_{i=1}^{n} X_i=t)}{P(\sum_{i=1}^{n} X_i=t)}$
$=\frac{P(X_1=1,X_2=0,\sum_{i=3}^{n} X_i=t-1)}{P(\sum_{i=1}^{n} X_i=t)}$
$=\frac{p q\, P(\sum_{i=3}^{n} X_i=t-1)}{P(\sum_{i=1}^{n} X_i=t)}$ (writing $q=1-p$)
$=\frac{p q {n-2 \choose t-1} p^{t-1} q^{n-t-1}}{ {n \choose t} p^{t} q^{n-t}}$
$=\frac{{n-2 \choose t-1} }{ {n \choose t} }$ (for $1\le t\le n-1$; otherwise the probability is zero)
$E(X_1 (1-X_2)|T=t)=0+1\times \frac{{n-2 \choose t-1} }{ {n \choose t} }=\frac{{n-2 \choose t-1} }{ {n \choose t} }=\frac{t(n-t)}{n(n-1)} $
so the UMVUE for $p(1-p)$ is $\frac{\sum X_i(n-\sum X_i)}{n(n-1)}=\frac{n \bar X(n-n \bar X)}{n(n-1)}=\frac{n \bar X(1- \bar X)}{n-1}$.
It is also equal to $\frac{1}{n-1} \sum(X_i -\bar X)^2$, the usual sample variance $S^2$:
$\frac{1}{n-1} \sum(X_i -\bar X)^2=\frac{1}{n-1} (\sum X_i^{2} -n \bar X^2)$
since $X_i\in \{0,1\}$ implies $X_i^{2}=X_i$,
$=\frac{1}{n-1} (\sum X_i -n \bar X^2)$
$=\frac{1}{n-1} (n \bar X -n \bar X^2)=\frac{n}{n-1} \bar X (1- \bar X)$
At this step, since $S^{2}=\frac{n}{n-1} \bar X (1- \bar X)$, $S^{2}$ is a function of $\bar X$ (a complete and sufficient statistic) and is also unbiased, so it is directly the UMVUE for $p(1-p)$. (This is another proof of part (b).)
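Both the combinatorial simplification and the unbiasedness of the resulting estimator can be verified exactly for a small case (a sketch; the values of $n$ and $p$ below are arbitrary):

```python
from itertools import product
from math import comb

# Arbitrary small case; any n >= 2 and 0 < p < 1 work.
n, p = 6, 0.3

# The combinatorial simplification used above, checked for each attainable t.
for t in range(1, n):
    lhs = comb(n - 2, t - 1) / comb(n, t)
    assert abs(lhs - t * (n - t) / (n * (n - 1))) < 1e-12

# Exhaustive expectation of t(n-t)/(n(n-1)) over all 2^n Bernoulli outcomes.
expectation = 0.0
for xs in product([0, 1], repeat=n):
    t = sum(xs)
    expectation += p ** t * (1 - p) ** (n - t) * t * (n - t) / (n * (n - 1))

print(expectation, p * (1 - p))  # both are p(1-p) = 0.21 up to rounding
```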
Best Answer
Of course, if the mean is a known value, it is self-evident that it is better to use it rather than an estimate of the mean. To decide whether the "natural" estimator of $\sigma^2$ is biased or unbiased, it is enough to calculate its expectation:
$$\frac{1}{n}E\left(\sum_iX_i^2-2\mu\sum_iX_i+n\mu^2\right)=E(X^2)-2\mu^2+\mu^2=\sigma^2+\mu^2-2\mu^2+\mu^2=\sigma^2$$
Furthermore, observe that you are in a Gaussian model; thus it is important to realize that
$$\frac{\sum_i(X_i-\overline{X}_n)^2}{\sigma^2}\sim \chi_{(n-1)}^2$$
while
$$\frac{\sum_i(X_i-\mu)^2}{\sigma^2}\sim \chi_{(n)}^2,$$
so replacing $\overline{X}_n$ by the known $\mu$ recovers one degree of freedom.
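A quick Monte Carlo illustration of these two degrees-of-freedom facts (a sketch; $\mu$, $\sigma$, $n$, and the number of replications below are arbitrary choices):

```python
import random

# Hypothetical N(mu, sigma^2) model; parameters are arbitrary.
random.seed(0)
mu, sigma, n, reps = 5.0, 2.0, 10, 20000

s_known, s_unknown = 0.0, 0.0
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    # Scaled sums of squares: around the sample mean vs the known mean.
    s_unknown += sum((x - xbar) ** 2 for x in xs) / sigma ** 2
    s_known += sum((x - mu) ** 2 for x in xs) / sigma ** 2

# Chi-square means equal their degrees of freedom: n-1 and n respectively.
print(s_unknown / reps, s_known / reps)  # approximately n-1 and n
```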