[Math] For what values of $k_1$ and $k_2$ will the given combined estimator be an unbiased estimator with smallest variance among all such linear combinations?


$\hat \theta_1$ and $\hat \theta_2$ are independent unbiased estimators of $\theta$ with $Var[\hat \theta_2] = 3Var[\hat \theta_1]$.

For what values of $k_1$ and $k_2$ will the combined estimator $k_1\hat \theta_1 + k_2\hat \theta_2$ be an unbiased estimator with smallest variance amongst all such linear combinations?


My attempt:
I started by establishing $$E[\hat \theta_1] = E[\hat \theta_2] = \theta$$
Next, I wrote $$E[k_1\hat \theta_1 + k_2\hat \theta_2] = \theta \quad (*)$$
(*) this summation is correct since they're independent and unbiased estimators, yes?

I then split this using linearity: $$E[k_1\hat \theta_1] + E[k_2\hat \theta_2] = \theta$$
$$k_1(E[\hat \theta_1]) + k_2(E[\hat \theta_2]) = \theta$$
$$k_1(\theta) + k_2(\theta) = \theta$$
$$k_1 + k_2 = 1$$

Now I know that the sum of the $k$ constants is $1$. I'm assuming I need to use the variance relationship given in the question to find their individual values, but when I tried it, I didn't really get anywhere.

I wrote: $$Var[\hat \theta] = E[\hat \theta^2] - (E[\hat \theta])^2$$
Given:
$$ Var[\hat \theta_2] = 3Var[\hat \theta_1]$$
$$E[\hat \theta^2_2] - (E[\hat \theta_2])^2 = 3(E[\hat \theta^2_1] - (E[\hat \theta_1])^2)$$
$$E[\hat \theta^2_2] - \theta^2 = 3E[\hat \theta^2_1] - 3\theta^2$$
$$2\theta^2 = 2E[\hat \theta^2_1] \qquad (**)$$
$$\theta^2 = E[\hat \theta^2_1]$$
???
Where do I go from here?

(**) I'm not even sure this is right, I just assumed since $$E[\hat \theta_1] = E[\hat \theta_2]$$ then $$E[\hat \theta^2_1] = E[\hat \theta^2_2]$$

Even if this assumption is correct, my answer reduces to $$\theta = \theta,$$ which doesn't really get me anywhere.

Where am I going wrong? What am I missing?

Best Answer

You are wrong in (**): if both the first and second moments of the estimators were equal, the variances would be equal too, but that is not the case here. Also, the argument in (*) does not require independence: the expectation of a sum is the sum of the expectations whenever they exist. Only unbiasedness is needed.

Next, I do not understand what you are doing after that, or why. You need to compute the variance of $k_1\hat \theta_1 + k_2\hat \theta_2$ and then minimize it over all $k_1, k_2$ such that $k_1+k_2=1$.

Use the variance properties: for independent random variables with finite variances, $$\text{Var}[X+Y]=\text{Var}[X]+\text{Var}[Y],$$ and, if $c$ is a constant, $$\text{Var}[cX]=c^2\text{Var}[X].$$

Then $$ \text{Var}[k_1\hat \theta_1 + k_2\hat \theta_2]=k_1^2\text{Var}[\hat \theta_1] + k_2^2\text{Var}[\hat \theta_2] = k_1^2 \text{Var}[\hat \theta_1]+(1-k_1)^2 \, 3\text{Var}[\hat \theta_1]. $$ Find the $k_1$ that minimizes this expression, and then $k_2=1-k_1$.
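Spelling out that last step: since $\text{Var}[\hat \theta_1]>0$, it suffices to minimize $g(k_1)=k_1^2+3(1-k_1)^2$. Setting the derivative to zero, $$g'(k_1)=2k_1-6(1-k_1)=8k_1-6=0 \implies k_1=\frac34, \qquad k_2=1-k_1=\frac14,$$ and $g''(k_1)=8>0$ confirms this is a minimum. The resulting variance is $\left(\tfrac{9}{16}+\tfrac{3}{16}\right)\text{Var}[\hat \theta_1]=\tfrac34\text{Var}[\hat \theta_1]$, smaller than the variance of either estimator alone.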

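As a numerical sanity check (a sketch, not part of the original answer): the Python snippet below simulates two independent unbiased estimators with the illustrative values $\theta = 5$, $\text{Var}[\hat \theta_1]=1$, $\text{Var}[\hat \theta_2]=3$ (only the $3{:}1$ variance ratio is given in the problem) and compares the variance of the combination for several weights.

```python
import numpy as np

# Illustrative assumption: theta = 5, Var[theta1_hat] = 1, Var[theta2_hat] = 3
# (only the 3:1 variance ratio comes from the problem statement).
rng = np.random.default_rng(0)
n = 1_000_000
theta = 5.0
t1 = rng.normal(theta, 1.0, size=n)           # unbiased, variance 1
t2 = rng.normal(theta, np.sqrt(3.0), size=n)  # unbiased, variance 3

for k1 in (0.5, 0.6, 0.75, 0.9):
    combo = k1 * t1 + (1 - k1) * t2           # k2 = 1 - k1 keeps it unbiased
    print(f"k1 = {k1:.2f}: mean = {combo.mean():.3f}, var = {combo.var():.3f}")

# The sample variance is smallest at k1 = 0.75, matching the theoretical
# minimum Var = (3/4)^2 * 1 + (1/4)^2 * 3 = 0.75.
```

Every weighting keeps the mean near $\theta = 5$, but only $k_1 = 3/4$, $k_2 = 1/4$ attains the minimum variance found above.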