I need to find the minimum-variance unbiased estimator for $\sigma^{2}$ when $\mu$ is known. From an earlier exercise, the random sample $Y_{1}, \dots, Y_{n}$ comes from a normal distribution with known mean $\mu$ and unknown variance $\sigma^{2}$.
Also, $\sum^{n}_{i=1} (Y_{i}- \mu)^{2}$ is a sufficient statistic for $\sigma^{2}$.
Since $\mu$ is known, we have
$E[(Y_{i}- \mu)^{2}] = E[(Y_{i}-\mu)(Y_{i}-\mu)] = E[Y_{i}^{2}-2\mu Y_{i}+\mu^{2}]$
$E[(Y_{i}-\mu)^{2}] = E[Y_{i}^{2}]-2\mu E[Y_{i}]+\mu^{2}$
$E[(Y_{i}-\mu)^{2}] = \mu^{2}-2\mu^{2}+\mu^{2}$
$E[(Y_{i}-\mu)^{2}] = 0$
I must have done something wrong because the terms are not supposed to cancel. I'm supposed to get the variance from $E[(Y_{i}-\mu)^{2}]$; the variance is $E[Y_{i}^{2}]-(E[Y_{i}])^{2}$.
I just realized that a sampling distribution related to the normal distribution is $\bar{Y} = \frac{1}{n} \sum^{n}_{i=1} Y_{i}$. Maybe using that would get me to the variance formula?
Best Answer
So, as Henry pointed out, I forgot the $\sigma^{2}$ when I was taking the expected value of one of the terms: since the variance is $E[Y_{i}^{2}]-(E[Y_{i}])^{2}$, we have $E[Y_{i}^{2}] = \sigma^{2} + \mu^{2}$, not just $\mu^{2}$.
Since $\mu$ is known, we have
$E[(Y_{i}- \mu)^{2}] = E[(Y_{i}-\mu)(Y_{i}-\mu)] = E[Y_{i}^{2}-2\mu Y_{i}+\mu^{2}]$
$E[(Y_{i}-\mu)^{2}] = E[Y_{i}^{2}]-2\mu E[Y_{i}]+\mu^{2}$
$E[(Y_{i}-\mu)^{2}] = (\sigma^{2}+\mu^{2})-2\mu^{2}+\mu^{2}$
$E[(Y_{i}-\mu)^{2}] = \sigma^{2}$
Therefore $\hat{\sigma}^{2} = \frac{1}{n}\sum^{n}_{i=1}(Y_{i}-\mu)^{2}$ is unbiased, since $E[\hat{\sigma}^{2}] = \frac{1}{n}\sum^{n}_{i=1}E[(Y_{i}-\mu)^{2}] = \sigma^{2}$. Being an unbiased estimator that is a function of the sufficient statistic $\sum^{n}_{i=1}(Y_{i}-\mu)^{2}$, it is the minimum-variance unbiased estimator.
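As a quick numerical sanity check (not part of the original exercise), a short simulation can confirm the unbiasedness: draw many samples from a normal distribution with a known $\mu$, compute $\frac{1}{n}\sum(Y_{i}-\mu)^{2}$ for each, and see that the estimates average out to $\sigma^{2}$. The values of $\mu$, $\sigma^{2}$, $n$, and the replication count below are arbitrary choices for illustration.

```python
import numpy as np

# Simulate many samples of size n from N(mu, sigma^2), with mu known,
# and compute sigma_hat^2 = (1/n) * sum((Y_i - mu)^2) for each sample.
rng = np.random.default_rng(0)
mu, sigma2, n, reps = 3.0, 4.0, 10, 20000

samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
estimates = np.mean((samples - mu) ** 2, axis=1)  # one estimate per sample

# The average of the estimates should be close to sigma^2 = 4.
print(np.mean(estimates))
```

Averaging over 20,000 replications, the mean of the estimates should land within a few hundredths of $\sigma^{2}=4$, which is what unbiasedness predicts.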