If you wanted to show only that the sample mean has a smaller variance than every other weighted average of the observations, then this would be an exercise in Lagrange multipliers. But if you want to include all unbiased estimators of $\mu$ based on $X_1,\ldots,X_n$ (for example, the sample median is one such estimator, and is not a weighted average of the observations), then this becomes equivalent to the one-to-one nature of the two-sided Laplace transform.
Observe that the conditional distribution of $(X_1,\ldots,X_n)$ given $\bar X = (X_1+\cdots+X_n)/n$ does not depend on $\mu$. (I could add the details of how to find the conditional distribution if necessary.) In other words, the sample mean $\bar X$ is a sufficient statistic for $\mu$. Therefore, the Rao–Blackwell theorem tells us that any minimum-variance estimator is to be found only among functions of $\bar X$.
Therefore it suffices to show that the only function $g(\bar X)$ of $\bar X$ that is an unbiased estimator of $\mu$ is $\bar X$ itself. (Of course, $g$ itself is not allowed to depend on $\mu$; otherwise $g(\bar X)$ would not actually be a statistic.)
The density function of $\bar X$ is
$$
x\mapsto \text{constant}\cdot \exp\left(\frac{-1}{2}\cdot\left(\frac{x-\mu}{1/\sqrt{n}}\right)^2\right).
$$
In order for $g(\bar X)$ to be an unbiased estimator of $\mu$, we need $g(\bar X)-\bar X$ to be an unbiased estimator of $0$. Let $h(x) = g(x)-x$; then we must have
$$
\int_{-\infty}^\infty (\text{same constant})\cdot h(x) \exp\left(\frac{-1}{2}\cdot\left(\frac{x-\mu}{1/\sqrt{n}}\right)^2\right) \, dx = 0
$$
for all values of $\mu$. Hence
$$
\text{same constant}\cdot \exp\left(\frac{-n\mu^2}{2}\right) \cdot \int_{-\infty}^\infty \left(h(x) \exp\left(\frac{-n}{2} x^2\right)\right) \exp\left(nx\mu\right) \, dx = 0
$$
regardless of the value of $\mu$. Thus the two-sided Laplace transform of the function
$$
x\mapsto h(x)\exp\left( \frac{-nx^2}{2} \right)
$$
is $0$ at the point $-n\mu$, for every value of $\mu$; that is, the transform vanishes identically.
Since the two-sided Laplace transform is one-to-one, the only function it maps to the identically zero function is the zero function itself. Hence $h=0$ (almost everywhere), so $g(\bar X)=\bar X$, as claimed.
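The opening remark about the sample median can be checked numerically. Here is a small Monte Carlo sketch (the distribution parameters, seed, and replication count are my own illustrative choices, not part of the argument): for normal data the sample median is also an unbiased estimator of $\mu$, but its variance exceeds $\operatorname{Var}(\bar X)=1/n$.

```python
import numpy as np

# Illustrative check only: compare the variance of the sample mean with that
# of the sample median for N(mu, 1) data. All numbers below are arbitrary
# choices for the sketch.
rng = np.random.default_rng(0)
mu, n, reps = 3.0, 25, 200_000
samples = rng.normal(mu, 1.0, size=(reps, n))

means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

print("Var(sample mean)  ~", means.var())    # close to 1/n = 0.04
print("Var(sample median)~", medians.var())  # larger; asymptotically ~ pi/(2n)
```

Both estimators average out to roughly $\mu$, but the median's variance is noticeably larger, which is exactly why it cannot beat $\bar X$.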
Let's assume you have a population of size $N$ with values $x_1,\ldots,x_N$, mean $\bar x=\frac{1}{N}\sum_{i=1}^N x_i$ and variance $\sigma^2=\frac{1}{N}\sum_{i=1}^N(x_i-\bar x)^2$. (Note that I use lower case $x_i$ to indicate these are not random, but fixed values.)
Now, let's take a random sample $Y_1,\ldots,Y_n$ of $n$ elements (without replacement), with all such subsets equally likely. (Now I use capital $Y$ to indicate these are random.)
Now, $\bar Y=\frac{1}{n}\sum_{i=1}^n Y_i$ and let $V=\sum_{i=1}^n (Y_i-\bar Y)^2$ so that the sample variance would be $V/n$ (like the expression for $\sigma^2$). If we write $V$ out in terms of $(Y_i-\bar x)^2$ and $(Y_i-\bar x)(Y_j-\bar x)$, we get
$$
\begin{split}
V
=& \sum_{i=1}^n (Y_i-\bar Y)^2
= \sum_{i=1}^n \left[(Y_i-\bar x)-(\bar Y-\bar x)\right]^2 \\
=& \sum_{i=1}^n \left[(Y_i-\bar x)^2-2(Y_i-\bar x)(\bar Y-\bar x)+(\bar Y-\bar x)^2 \right] \\
=& \sum_{i=1}^n (Y_i-\bar x)^2 - n(\bar Y-\bar x)^2 \\
=& \left(1-\frac{1}{n} \right) \sum_{i=1}^n (Y_i-\bar x)^2
-\frac{2}{n}\sum_{1\le i<j\le n} (Y_i-\bar x)(Y_j-\bar x)
\end{split}
$$
where in the last step we use that
$$
\left(\sum_{i=1}^n (Y_i-\bar x)\right)^2
= \sum_{i=1}^n (Y_i-\bar x)^2 + 2\sum_{1\le i<j\le n} (Y_i-\bar x)(Y_j-\bar x).
$$
We know that $\text{E}[(Y_i-\bar x)^2]=\sigma^2$: this is just the average of $(x_k-\bar x)^2$ over the population $x_1,\ldots,x_N$, since each $Y_i$ is equally likely to be any of the $x_k$.
For $i<j$, we can compute $\text{E}[(Y_i-\bar x)(Y_j-\bar x)]$ by noting that, by symmetry, it equals the average of $(x_k-\bar x)(x_l-\bar x)$ over all pairs $1\le k<l\le N$. Since $\sum_{i=1}^N (x_i-\bar x)=0$, we get
$$
0 = \sum_{1\le i,j\le N} (x_i-\bar x)(x_j-\bar x)
= \sum_{i=1}^N (x_i-\bar x)^2 + 2\sum_{1\le i<j\le N} (x_i-\bar x)(x_j-\bar x)
$$
so $\sum_{1\le k<l\le N} (x_k-\bar x)(x_l-\bar x) = -\frac{N\sigma^2}{2}$. Dividing by the number of pairs, $\binom{N}{2}=\frac{N(N-1)}{2}$, gives for $i<j$
$$
\text{E}\left[(Y_i-\bar x)(Y_j-\bar x)\right]
= -\frac{\sigma^2}{N-1}.
$$
Combining these results, we get
$$
\text{E}[V] = (n-1)\sigma^2 + \frac{n-1}{N-1}\sigma^2
= \frac{(n-1)N}{N-1}\sigma^2
$$
giving an unbiased estimator
$$
\hat\sigma^2 = \frac{N-1}{N(n-1)}V
= \frac{N-1}{N(n-1)} \sum_{i=1}^n (Y_i-\bar Y)^2.
$$
As $N\rightarrow\infty$, you get the familiar $s^2$ estimator which corresponds to independent sampling from a distribution, while $n=N$ gives just $\sigma^2$ as it should when the $x_i$ are known for the whole population.
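Since sampling without replacement from a small population has only $\binom{N}{n}$ equally likely outcomes, the unbiasedness of $\hat\sigma^2$ can be checked exactly by enumeration. The population values below are again my own toy numbers:

```python
from itertools import combinations

# Toy population (arbitrary values); exhaustively average the estimator
# hat_sigma2 = (N-1)/(N(n-1)) * sum (Y_i - Ybar)^2 over all size-n subsets.
x = [1.0, 4.0, 6.0, 9.0, 10.0, 15.0]
N, n = len(x), 3
xbar = sum(x) / N
sigma2 = sum((v - xbar) ** 2 for v in x) / N

total = 0.0
subsets = list(combinations(x, n))
for y in subsets:
    ybar = sum(y) / n
    V = sum((v - ybar) ** 2 for v in y)
    total += (N - 1) / (N * (n - 1)) * V

avg_est = total / len(subsets)
print(avg_est, sigma2)  # equal up to floating-point rounding
```

The average over all subsets reproduces $\sigma^2$ exactly, which is precisely the statement $\text{E}[\hat\sigma^2]=\sigma^2$.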
Best Answer
Let $X_i$ be i.i.d. with mean $\mu$ and variance $\sigma^2$, and consider the (uncorrected) sample variance $s^2=\frac1n\sum\limits_{i=1}^n (X_i-\overline X)^2$.
Calculating the expected value of $s^2$ shows that $s^2$ is a biased estimator of $\sigma^2$. Adding and subtracting $\mu$ inside the square,
$E\left[s^2\right]=\frac{1}{n}E\left(\sum\limits_{i=1}^n (X_i-\overline X )^2\right)$
$=\frac{1}{n}E\left[\sum_{i=1}^n \left[(X_i-\mu)-(\overline X-\mu) \right]^2 \right] \quad$
multiplying out
$=\frac{1}{n}E\left[\sum_{i=1}^n \left[(X_i-\mu)^2-2(\overline X-\mu)(X_i-\mu)+(\overline X-\mu)^2 \right]\right] \quad$
distributing the sum over the three terms
$=\frac{1}{n}E\left[\sum_{i=1}^n (X_i-\mu)^2-2(\overline X-\mu)\sum_{i=1}^n(X_i-\mu)+\sum_{i=1}^n(\overline X-\mu)^2 \right] \quad$
$=\frac{1}{n}E\left[\sum_{i=1}^n (X_i-\mu)^2-2(\overline X-\mu)\color{blue}{\sum_{i=1}^n(X_i-\mu)}+n(\overline X-\mu)^2 \right] \quad$
transforming the blue term
$\sum_{i=1}^n(X_i-\mu)=n\cdot \overline X-n\cdot \mu$
Thus $2(\overline X-\mu)\color{blue}{\sum_{i=1}^n(X_i-\mu)}=2(\overline X-\mu)\cdot (n\cdot \overline X-n\cdot \mu)=2n( \overline X- \mu)^2$
$=\frac{1}{n}E\left[\sum_{i=1}^n (X_i-\mu)^2-2n( \overline X- \mu)^2+n(\overline X-\mu)^2 \right] \quad$
$=\frac{1}{n}E\left[\sum_{i=1}^n (X_i-\mu)^2-n( \overline X- \mu)^2\right] \quad$
$=\frac{1}{n}\left[\sum_{i=1}^n E\left[(X_i-\mu)^2\right]-nE\left[( \overline X- \mu)^2\right]\right] \quad$
We know that $E\left[(X_i-\mu)^2\right]=\sigma^2$ and $E\left[( \overline X- \mu)^2\right]=\sigma_{\overline X}^2=\frac{\sigma^2}{n}$. Thus we get
$=\frac{1}{n}\left[n \cdot \sigma ^2-n \frac{\sigma ^2}{n}\right]=\sigma^2-\frac{\sigma^2}{n}=\sigma^2\cdot \left(1-\frac1n\right)$
Thus $s^2=\frac1n\sum\limits_{i=1}^n (X_i-\overline X)^2$ is a biased estimator of $\sigma^2$. But it is asymptotically unbiased.
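A quick Monte Carlo sketch confirms the factor $(1-\frac1n)$; the distribution, $\sigma^2$, $n$, and replication count below are all arbitrary choices for the illustration:

```python
import numpy as np

# Simulate many samples and average the uncorrected sample variance
# (1/n) * sum (X_i - Xbar)^2; it should concentrate near sigma^2*(1 - 1/n).
rng = np.random.default_rng(1)
sigma2, n, reps = 4.0, 10, 200_000
X = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

s2 = X.var(axis=1)  # ddof=0 by default: divides by n, i.e. the biased version
print("mean of s^2        :", s2.mean())
print("sigma^2 * (1 - 1/n):", sigma2 * (1 - 1 / n))
```

With $\sigma^2=4$ and $n=10$ the average lands near $3.6$ rather than $4$, showing the downward bias; using `ddof=1` (dividing by $n-1$) would remove it.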