[Math] The Difference Between Variance and MSE

Tags: definition, mean-squared-error, variance

I know that the variance measures the dispersion of an estimator around its mean, i.e. $\sigma^2 = E[(X - \mu)^2]$ (the second central moment, taken about the mean).

But I'm not getting the meaning of the definition below:

The mean squared error measures the dispersion around the true value of the parameter being estimated. If the estimator is unbiased then both are identical.

I know that both the variance and the MSE are related to the second moment, but I'm not seeing the actual difference between them. Can anybody explain the basic difference in simple language?

Best Answer

The main difference is whether you are considering the deviation of the estimator of interest from the true parameter (this is the mean squared error), or the deviation of the estimator from its expected value (this is the variance). Consequently, we can see that when the bias of the estimator is zero, the variance and mean squared error are equal.

Mathematically, if $\hat \theta$ is an estimator for $\theta$, then $$\operatorname{MSE}[\hat \theta] = \operatorname{E}[(\hat\theta - \theta)^2],$$ whereas $$\operatorname{Var}[\hat\theta] = \operatorname{E}[(\hat\theta - \operatorname{E}[\hat\theta])^2].$$ And regarding the previous remark, $$\operatorname{Bias}[\hat\theta] = \operatorname{E}[\hat\theta - \theta],$$ so when the bias is zero, $\operatorname{E}[\hat\theta] = \theta$ and now we easily see how the MSE and variance become equivalent.
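To make this concrete, here is a minimal Python/NumPy sketch (not from the original answer; the normal population, sample size, and variable names are assumptions chosen for illustration). Because the sample mean is an unbiased estimator of the population mean, the simulated MSE and variance come out essentially equal.

```python
# Minimal sketch (assumed setup): estimate the mean of a normal population with
# the sample mean, an unbiased estimator, and check that MSE and variance agree.
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0          # true mean being estimated
n, reps = 30, 100_000

# reps independent samples of size n; each row yields one estimate theta_hat
samples = rng.normal(loc=theta, scale=1.0, size=(reps, n))
theta_hat = samples.mean(axis=1)

mse = np.mean((theta_hat - theta) ** 2)              # dispersion around the true value
var = np.mean((theta_hat - theta_hat.mean()) ** 2)   # dispersion around E[theta_hat]

print(f"MSE      = {mse:.5f}")
print(f"Variance = {var:.5f}")   # both close to 1/n = 0.03333 since the estimator is unbiased
```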

Note, however, we can also write: $$\operatorname{Var}[\hat\theta] = \operatorname{E}[\hat \theta^2 - 2\hat\theta \operatorname{E}[\hat \theta] + \operatorname{E}[\hat\theta]^2] = \operatorname{E}[\hat\theta^2] - \operatorname{E}[\hat\theta]^2,$$ so that $$\begin{align*} \operatorname{Var}[\hat\theta] + \operatorname{Bias}^2[\hat\theta] &= \operatorname{E}[\hat\theta^2] - \operatorname{E}[\hat\theta]^2 + (\operatorname{E}[\hat \theta] - \theta)^2 \\ &= \operatorname{E}[\hat\theta^2] - \operatorname{E}[\hat\theta]^2 + \operatorname{E}[\hat\theta]^2 - 2\theta \operatorname{E}[\hat\theta] + \theta^2 \\ &= \operatorname{E}[\hat\theta^2] - 2\theta \operatorname{E}[\hat\theta] + \theta^2 \\ &= \operatorname{E}[(\hat \theta - \theta)^2] \\ &= \operatorname{MSE}[\hat\theta]. \end{align*}$$
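The decomposition can also be checked numerically. Below is a small sketch (again assuming Python/NumPy; the biased $1/n$ variance estimator is just an illustrative choice, not something from the original answer) where the estimator has nonzero bias, so the simulated MSE exceeds the variance by the squared bias.

```python
# Minimal sketch (assumed setup): the 1/n variance estimator is biased, so here
# MSE exceeds the variance by the squared bias (up to simulation error).
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 4.0         # true variance being estimated
n, reps = 10, 200_000

samples = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=(reps, n))
sigma2_hat = samples.var(axis=1)    # NumPy default ddof=0 divides by n, hence biased

mse  = np.mean((sigma2_hat - sigma2) ** 2)
var  = sigma2_hat.var()
bias = sigma2_hat.mean() - sigma2   # theoretical bias is -sigma2/n = -0.4

print(f"MSE          = {mse:.4f}")
print(f"Var + Bias^2 = {var + bias**2:.4f}")   # matches the MSE
```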
