Choosing a value to minimize variance (can this be done without partial derivatives?)

statistics

Suppose $\hat{\theta}_1$ and $\hat{\theta}_2$ are each unbiased estimators of $\theta$, with $V(\hat{\theta}_1)=\sigma^2_1$ and $V(\hat{\theta}_2)=\sigma^2_2$. A new unbiased estimator for $\theta$ can be formed by

$$\hat{\theta}_3=a\hat{\theta}_1+(1-a)\hat{\theta}_2$$

where $0\le a \le 1$. If $\hat{\theta}_1$ and $\hat{\theta}_2$ are independent, how should $a$ be chosen so as to minimize $V(\hat{\theta}_3)$?

My understanding is that $\operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y)$ if $X$ and $Y$ are independent, which is the case in this problem:

$V(\hat{\theta}_3) = V(a\hat{\theta}_1 + (1-a)\hat{\theta}_2)$

$V(\hat{\theta}_3) = a^2V(\hat{\theta}_1) + (1-a)^2V(\hat{\theta}_2) = a^2\sigma_1^2 + (1-a)^2\sigma_2^2$

$V(\hat{\theta}_3) = a^2\sigma_1^2 + (1 - 2a + a^2)\sigma_2^2$

$V(\hat{\theta}_3) = a^2\sigma_1^2 + a^2\sigma_2^2 - 2a\sigma_2^2 + \sigma_2^2$

I'm not really sure how to proceed from here. The only solution I have been able to find online uses partial derivatives, which I don't know how to do, since differential equations is not a prerequisite for this class and the topic hasn't been covered in my statistics book or by my teacher. I'm also not sure how to "minimize" $V(\hat{\theta}_3)$ without knowing which of $\sigma_1$ and $\sigma_2$ is larger.

My intuition is to set $V(\hat{\theta}_3)$ to $0$ and try to solve for $a$. Substituting $x = \sigma_1^2$ and $y = \sigma_2^2$ to make life easier for myself:

$0 = a^2x + a^2y -2ay + y$

$-y + 2ay = a^2x + a^2y$

$-y + 2ay = a^2(x + y)$

$y(-1 + 2a) = a^2(x + y)$

I'm not quite sure I can isolate $a$. Thanks in advance!

Best Answer

We can find the critical point with respect to $a$ by solving $$0 = \frac{\partial}{\partial a}\operatorname{Var}[\hat \theta_3] = 2a \sigma_1^2 - 2(1-a) \sigma_2^2.$$ Consequently, $$a = \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2},$$ and we have $$\hat \theta_3 = \frac{\hat \theta_1 \sigma_2^2 + \hat \theta_2 \sigma_1^2}{\sigma_1^2 + \sigma_2^2}.$$

We can also do this without calculus, by completing the square in $a$: $$\begin{align}\operatorname{Var}[\hat \theta_3] &= a^2 \sigma_1^2 + (1-a)^2 \sigma_2^2 \\ &= (\sigma_1^2 + \sigma_2^2) a^2 - 2\sigma_2^2 a + \sigma_2^2 \\ &= (\sigma_1^2 + \sigma_2^2) \left( a^2 - \frac{2\sigma_2^2}{\sigma_1^2 + \sigma_2^2} a \right) + \sigma_2^2 \\ &= (\sigma_1^2 + \sigma_2^2) \left( a^2 - \frac{2\sigma_2^2}{\sigma_1^2 + \sigma_2^2} a + \left(\frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}\right)^2 \right) + \sigma_2^2 - \frac{\sigma_2^4}{\sigma_1^2 + \sigma_2^2} \\ &= (\sigma_1^2 + \sigma_2^2) \left( a - \frac{\sigma_2^2}{\sigma_1^2 +\sigma_2^2}\right)^2 + \frac{\sigma_1^2 \sigma_2^2}{\sigma_1^2 + \sigma_2^2}. \end{align}$$ Since no square is negative, the variance is minimized by the choice of $a$ that makes $\left(a - \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}\right)^2 = 0$, namely $$a = \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}.$$ The minimum value thus attained is $\frac{\sigma_1^2 \sigma_2^2}{\sigma_1^2 + \sigma_2^2}$.
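For a concrete check, here is a minimal Python sketch comparing the closed-form minimizer against a brute-force grid search; the variances $\sigma_1^2 = 4$ and $\sigma_2^2 = 1$ are arbitrary illustrative values, not part of the problem:

```python
import numpy as np

# Arbitrary illustrative variances for the two independent estimators.
sigma1_sq, sigma2_sq = 4.0, 1.0

# Closed-form minimizer from the completed square.
a_star = sigma2_sq / (sigma1_sq + sigma2_sq)

# Brute force: evaluate Var[theta3] = a^2*sigma1^2 + (1-a)^2*sigma2^2 on a fine grid.
a_grid = np.linspace(0.0, 1.0, 100_001)
var_grid = a_grid**2 * sigma1_sq + (1 - a_grid)**2 * sigma2_sq
a_numeric = a_grid[np.argmin(var_grid)]

print(f"closed form: a = {a_star:.5f}, Var = {sigma1_sq * sigma2_sq / (sigma1_sq + sigma2_sq):.5f}")
print(f"grid search: a = {a_numeric:.5f}, Var = {var_grid.min():.5f}")
```

Both lines should report $a = 0.2$ and a minimum variance of $0.8 = \sigma_1^2\sigma_2^2/(\sigma_1^2 + \sigma_2^2)$. Note that the minimum is smaller than either $\sigma_1^2$ or $\sigma_2^2$, which is the payoff of combining the two estimators.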
