[Math] Minimum variance of estimator

means, statistics, variance

Consider two processes, both with mean $\mu$. The variance of the first process is $\sigma^2$ (with sample size $n$), and the variance of the second is $4\sigma^2$ (with sample size $m$).
First I proved that $$\hat{\mu}=a\bar{X}+(1-a)\bar{Y}$$ is an unbiased estimator for $\mu$, where $\bar{X}$ and $\bar{Y}$ are the sample means of the two processes.
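For reference, the unbiasedness follows from linearity of expectation:
$$E[\hat{\mu}]=aE[\bar{X}]+(1-a)E[\bar{Y}]=a\mu+(1-a)\mu=\mu.$$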
Now I want to find the value of $a$ that minimizes the variance of $\hat{\mu}$.

My attempt:
Assuming the two samples are independent, I found that the variance of $\hat{\mu}$ is given by
$$a^2\frac{\sigma^2}{n}+(1-a)^2\frac{4\sigma^2}{m}.$$
Next, can I minimize this by setting the derivative with respect to $a$ to $0$?
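As a sanity check on this variance formula, here is a quick Monte Carlo sketch in Python (the normal distributions and the specific values of $\mu$, $\sigma$, $n$, $m$, and $a$ are illustrative assumptions, not given in the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative values (assumptions for this check; the formula holds for any choice)
mu, sigma = 2.0, 1.5
n, m = 10, 25
a = 0.3
reps = 200_000

# Replicates of the two sample means: Var(xbar) = sigma^2/n, Var(ybar) = 4*sigma^2/m
xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
ybar = rng.normal(mu, 2 * sigma, size=(reps, m)).mean(axis=1)  # sd 2*sigma => variance 4*sigma^2
muhat = a * xbar + (1 - a) * ybar

print(muhat.var(ddof=1))                                    # empirical variance of the estimator
print(a**2 * sigma**2 / n + (1 - a)**2 * 4 * sigma**2 / m)  # the formula above
```

The two printed values should agree to a few decimal places.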

Best Answer

Correct. Once you find the critical point, it suffices to confirm that it corresponds to a minimum by checking that the second derivative is positive there. You will also find that the minimum variance attained is $$\frac{4\sigma^2}{m + 4n}.$$
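Filling in the algebra: setting the derivative to zero,
$$\frac{d}{da}\left(a^2\frac{\sigma^2}{n}+(1-a)^2\frac{4\sigma^2}{m}\right)=\frac{2a\sigma^2}{n}-\frac{8(1-a)\sigma^2}{m}=0\implies a=\frac{4n}{m+4n}.$$
The second derivative, $\frac{2\sigma^2}{n}+\frac{8\sigma^2}{m}$, is positive, so this critical point is indeed a minimum. Substituting $a=\frac{4n}{m+4n}$ and $1-a=\frac{m}{m+4n}$ back into the variance gives
$$\frac{16n^2}{(m+4n)^2}\cdot\frac{\sigma^2}{n}+\frac{m^2}{(m+4n)^2}\cdot\frac{4\sigma^2}{m}=\frac{(16n+4m)\sigma^2}{(m+4n)^2}=\frac{4\sigma^2}{m+4n}.$$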
