Using Bayes' rule, the posterior distribution of the variance is given by the product of the likelihood and the prior distribution of the variance, divided by the marginal likelihood:
$$p(\sigma^2|t_1,t_2,\cdots,t_n)=\frac{p(t_1,t_2,\cdots,t_n|\sigma^2)p(\sigma^2)}{ p(t_1,t_2,\cdots,t_n)}=\frac{p(t_1,t_2,\cdots,t_n|\sigma^2)p(\sigma^2)}{\int p(t_1,t_2,\cdots,t_n|\sigma^2)p(\sigma^2)d(\sigma^2)} $$
As the observed values $t_i$ are i.i.d. zero-mean normal, the likelihood function $p(t_1,t_2,...,t_n|\sigma^2)$ is given by
$$p(t_1,t_2,...,t_n|\sigma^2)=\prod_{i=1}^n\frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{t_i^2}{2\sigma^2}\right)=\frac{1}{\sigma^n(2\pi)^{n/2}}\exp\left(-\frac{1}{2\sigma^2}\sum_{i=1}^nt_i^2\right)$$
while the prior on the variance is simply the following probability mass function
$$p(\sigma^2=1)=\frac{1}{2},\ p(\sigma^2=4)=\frac{1}{2}$$
Given the discrete nature of the prior of the variance, the marginal likelihood simplifies to
$$\begin{align}p(t_1,t_2,\cdots,t_n)&=p(t_1,t_2,\cdots,t_n|\sigma^2=1)p(\sigma^2=1)+p(t_1,t_2,\cdots,t_n|\sigma^2=4)p(\sigma^2=4)\\&=\frac{1}{2}\left[\frac{1}{(2\pi)^{n/2}}\exp\left(-\frac{1}{2}\sum_{i=1}^nt_i^2\right)+\frac{1}{2^n(2\pi)^{n/2}}\exp\left(-\frac{1}{8}\sum_{i=1}^nt_i^2\right)\right]\end{align}$$
To use the MAP rule to decide that the samples were drawn from the distribution with variance $1$, the following inequality must hold (see Equations $3.4$ and $3.5$ in the linked reference):
$$\frac{p(t_1,t_2\cdots,t_n|\sigma^2=4)p(\sigma^2=4)}{p(t_1,t_2,\cdots,t_n)}<\frac{p(t_1,t_2\cdots,t_n|\sigma^2=1)p(\sigma^2=1)}{p(t_1,t_2,\cdots,t_n)}$$
which (given the prior of the variance) simplifies to
$$p(t_1,t_2\cdots,t_n|\sigma^2=4)<p(t_1,t_2\cdots,t_n|\sigma^2=1)$$
which is the condition that
$$\frac{1}{2^n(2\pi)^{n/2}}\exp\left(-\frac{1}{8}\sum_{i=1}^nt_i^2\right)<\frac{1}{(2\pi)^{n/2}}\exp\left(-\frac{1}{2}\sum_{i=1}^nt_i^2\right)$$
Taking the logarithm of both sides (the $(2\pi)^{n/2}$ factors cancel) gives
$$-n\log 2-\frac{1}{8}\sum_{i=1}^nt_i^2<-\frac{1}{2}\sum_{i=1}^nt_i^2$$
which rearranges to
$$\frac{3}{8n\log 2}\sum_{i=1}^nt_i^2<1$$
Thus, we have $c_1=\frac{3}{8n\log 2}$ and $c_2=0$.
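For a concrete check of this rule, here is a minimal Python sketch (the function name, sample sizes, and simulated data are illustrative, not part of the original problem): it draws $n$ zero-mean normal samples under each candidate variance and applies the threshold derived above.

```python
import numpy as np

rng = np.random.default_rng(0)

def map_decide(t):
    """MAP decision between sigma^2 = 1 and sigma^2 = 4 under the uniform prior.

    Decides sigma^2 = 1 when (3 / (8 n log 2)) * sum(t_i^2) < 1,
    i.e. when the average of t_i^2 falls below 8 log(2) / 3 ~= 1.85.
    """
    n = len(t)
    statistic = 3.0 / (8.0 * n * np.log(2.0)) * np.sum(t**2)
    return 1.0 if statistic < 1.0 else 4.0

n = 100
t_low = rng.normal(0.0, 1.0, size=n)   # true sigma^2 = 1
t_high = rng.normal(0.0, 2.0, size=n)  # true sigma^2 = 4 (std dev 2)
print(map_decide(t_low), map_decide(t_high))  # typically: 1.0 4.0
```

Note that the implied threshold on the mean of $t_i^2$, namely $8\log 2/3\approx 1.85$, sits between the two candidate variances $1$ and $4$, as one would expect of a sensible decision boundary.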
Let
$s^2=\frac{n}{n-1}\left(\overline{x^2}-\left(\overline x\right)^2\right)=\frac{1}{n-1}\sum_{i=1}^n \left(x_i-\overline x\right)^2$. Define the centered r.v.'s $y_i=x_i-\mathbb E x_1$ and rewrite the sample variance in terms of these r.v.'s:
$$
s^2=\frac{1}{n-1}\sum_{i=1}^n \left(y_i-\overline y\right)^2 = \frac{n}{n-1}\left(\overline{y^2}-\left(\overline y\right)^2\right)=\overline{y^2}-\left(\overline y\right)^2+\frac{s^2}{n}.
$$
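The last equality holds because, writing $a=\overline{y^2}-\left(\overline y\right)^2=\frac{n-1}{n}s^2$,
$$\frac{n}{n-1}\,a=a+\frac{a}{n-1}=a+\frac{s^2}{n}.$$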
Note that $\sigma^2=\text{Var}(x_1)=\mathbb E[y_1^2]$.
Find the limiting distribution of $\sqrt{n}\left(s^2-\sigma^2\right)$:
$$\tag{1}\label{1}
\sqrt{n}\left(s^2-\sigma^2\right) = \sqrt{n}\left(\overline{y^2}-\left(\overline y\right)^2+\frac{s^2}{n} -\sigma^2 \right)=\sqrt{n}\left(\overline{y^2}-\sigma^2 \right) -\sqrt{n}\left(\overline y\right)^2+\sqrt{n}\frac{s^2}{n} .
$$
Next we prove that $\sqrt{n}\left(\overline y\right)^2 \xrightarrow{p} 0$ and $\sqrt{n}\,\dfrac{s^2}{n}=\dfrac{s^2}{\sqrt{n}}\xrightarrow{p} 0$ as $n\to\infty$. Indeed, by Slutsky's theorem,
$$\sqrt{n}\left(\overline y\right)^2 = \underbrace{\overline y}_{\xrightarrow{p}\,0} \cdot \underbrace{\sqrt{n}\,\overline y}_{\xrightarrow{d}\,N(0,\sigma^2)}\xrightarrow{d} 0\cdot N(0,\sigma^2)=0$$
Convergence in distribution to a constant implies convergence in probability.
Next,
$$
\dfrac{s^2}{\sqrt{n}} = s^2\cdot \frac{1}{\sqrt{n}}\xrightarrow{p} \sigma^2\cdot 0=0.$$
We obtain that the second and third terms on the r.h.s. of (\ref{1}) tend to zero in probability. Consider the first term:
$$
\sqrt{n}\left(\overline{y^2}-\sigma^2 \right) = \sqrt{n}\left(\overline{y^2}-\mathbb E\left[y_1^2\right] \right) \xrightarrow{d} N(0,\text{Var}(y_1^2))=N(0,\mathbb E\left[y_1^4\right]-\sigma^4).
$$
By Slutsky's theorem,
$$\tag{2}\label{2}
\sqrt{n}(s^2-\sigma^2)\xrightarrow{d}N(0,\mathbb E\left[y_1^4\right]-\sigma^4)= N(0,{\mathbb E}\left[(x_1-\mathbb Ex_1)^4\right]-\sigma^4).
$$
So, informally, you can say that the limiting distribution of $s^2$ is normal with mean $\sigma^2$ and variance $$\dfrac{{\mathbb E}\left[(x_1-\mathbb Ex_1)^4\right]-\sigma^4}{n}.$$ But this wording is extremely non-rigorous; the rigorous statement is (\ref{2}).
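As a quick numerical sanity check of (\ref{2}), here is a minimal Python sketch under an assumed example distribution (Exp(1), for which $\sigma^2=1$ and $\mathbb E[(x_1-\mathbb Ex_1)^4]=9$, so the limiting variance is $9-1=8$); the distribution choice and sample sizes are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed example distribution: Exp(1), so sigma^2 = 1 and the fourth
# central moment is 9, giving a limiting variance of 9 - 1 = 8 in (2).
n, reps = 2_000, 5_000
x = rng.exponential(1.0, size=(reps, n))
s2 = x.var(axis=1, ddof=1)        # sample variance in each replication
z = np.sqrt(n) * (s2 - 1.0)       # sqrt(n) * (s^2 - sigma^2)

print(z.mean())  # close to 0
print(z.var())   # close to 8
```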
Best Answer
I just figured out the answer; it is a good drill for understanding the basic idea of the law of iterated expectations.
1) $E[AB]=E[E[AB|N]]=E[E[A|N]E[B|N]]$ (using the conditional independence of $A$ and $B$ given $N$). Because $E[A|N]=E[B|N]=N$, we get $E[AB]=E[N^2]$. We know $E[N]=\frac{1}{p}$ and $Var(N)=\frac{1-p}{p^2}$, so $E[N^2]=Var(N)+\left(E[N]\right)^2=\frac{2-p}{p^2}$. Similarly, $E[AN]=E[E[AN|N]]=E[E[N|N]E[A|N]]=E[N^2]$.
2) After getting the terms in part 1, this part is pretty smooth sailing:
$$\hat{N}=E[N] + \frac{Cov(N,A)}{Var(A)}(A-E[A])$$
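For completeness, the covariance in the numerator follows from part 1 alone, since $E[A]=E[E[A|N]]=E[N]$ and $E[AN]=E[N^2]$:
$$Cov(N,A)=E[AN]-E[N]E[A]=E[N^2]-\left(E[N]\right)^2=Var(N)=\frac{1-p}{p^2}.$$
($Var(A)$ depends on the particular distribution of $A$ in the original problem and is not determined by the quantities above.)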