Finding the condition on $k_1$ and $k_2$ of an unbiased estimator

estimation, maximum likelihood, statistics

I'm taking a statistics course and am asked the following:

Suppose that $X$ and $Y$ are independent Poisson distributed values with
means $\theta$ and $2\theta$, respectively. Consider the combined estimator of $\theta$

$$\hat\theta = k_1 X + k_2 Y$$

where $k_1$ and $k_2$ are arbitrary constants.

(a) Find the condition on $k_1$ and $k_2$ such that $\hat\theta$ is an
unbiased estimator of $\theta$.

How should I approach answering this question?

In order to find an unbiased estimator, do I need to find the expected values of $X$ and $Y$?

From *how to compute an unbiased estimator*:

A basic criterion for an estimator to be any good is that it is
unbiased, that is, that on average it gets the value of $\mu$ correct. Formally, an estimator $f$ is unbiased iff

$$E[f(X_1,X_2,\dots,X_n)] =\mu.$$

If this were a simpler question, say $\hat\theta = k_1 XY$, would the condition on $k_1$ be that $E[f(X_1,X_2,\dots,X_n)] = \mu$?

But how do I find the condition on $k_1$ and $k_2$?

Best Answer

Assuming $\hat\theta = k_1 X + k_2 Y$:

In other words, your $\mu$ in the criterion is $\theta$, because it is the true value. You require that the mean of the estimator equals the true value, namely
$$E(\hat{\theta}) = \theta.$$
Substituting the estimator, we have
$$E(k_1 X + k_2 Y) = \theta,$$
which by linearity gives
$$k_1 E[X] + k_2 E[Y] = \theta.$$
But $E[X] = \theta$ and $E[Y] = 2\theta$, so
$$k_1\theta + 2k_2\theta = \theta,$$
which means that our condition is
$$k_1 + 2k_2 = 1.$$
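
Not part of the derivation, but here is a minimal Monte Carlo sketch (using NumPy; the value $\theta = 1.7$, the sample size, and the particular $(k_1, k_2)$ pairs are arbitrary choices for illustration) that checks the condition numerically: any pair satisfying $k_1 + 2k_2 = 1$ should give a sample mean of $\hat\theta$ close to $\theta$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 1.7        # hypothetical true value, chosen for illustration
n = 200_000        # number of simulated (X, Y) pairs

X = rng.poisson(theta, n)        # E[X] = theta
Y = rng.poisson(2 * theta, n)    # E[Y] = 2*theta

# Each (k1, k2) below satisfies k1 + 2*k2 = 1, so E[theta_hat] = theta.
for k1, k2 in [(1.0, 0.0), (0.0, 0.5), (1/3, 1/3)]:
    theta_hat = k1 * X + k2 * Y
    print(f"k1={k1:.3f}, k2={k2:.3f}, mean = {theta_hat.mean():.4f}")
    # all printed means should be close to 1.7
```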


Why do you say $\hat{\theta} = k_1 XY$ is easier?

Assuming $\hat{\theta} = k_1 XY$, then by independence $E[\hat{\theta}] = k_1 E[XY] = k_1 E[X]E[Y] = 2k_1\theta^2$, which has to equal the true value $\theta$, so you'd require $$k_1 = \frac{1}{2\theta},$$ which means that $k_1$ would depend on the true value! So why estimate it? :)
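
Again just a sketch (same NumPy setup as above; $k_1 = 0.5$ and the $\theta$ values are arbitrary) illustrating why a fixed $k_1$ cannot work for the product estimator: its expectation is $2k_1\theta^2$, which matches $\theta$ only for the single value $\theta = 1/(2k_1)$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
k1 = 0.5   # some fixed constant, chosen arbitrarily

for theta in (0.5, 1.0, 2.0):
    X = rng.poisson(theta, n)
    Y = rng.poisson(2 * theta, n)
    est = k1 * X * Y
    # sample mean of k1*X*Y is close to 2*k1*theta**2, not theta,
    # except at the one value theta = 1/(2*k1) = 1.0
    print(f"theta={theta}, mean(est)={est.mean():.4f}, 2*k1*theta^2={2*k1*theta**2:.4f}")
```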
