"Assume independent random variables $Y_i$~$Poisson(λx_i)$. Supposing that $x_i$ are given, fixed constants, obtain the least squares estimator of $λ$ and compute its variance."
This is the first time I have faced this kind of problem. The first things that come to mind are a regression equation of the form $\log(Y) = a + bx$ and then applying least squares by minimizing $\sum(\log(y_i) - a - b x_i)^2$. From this I would get the least squares estimates of $a, b$, but how do I get $\lambda$?
As I said, this is the first time I have faced such a question. I know point estimation in terms of MLE and the method of moments. I need help and resources (i.e. more knowledge) for solving this kind of question. Could you kindly supply the answer and an explanation, along with perhaps links to some PDFs? Google is not helping me. Thanks in advance.
(EDIT:) While waiting for a response to this question, I dug deeper and found a piece of information I was missing: the least squares estimator of a parameter $\theta$ is obtained by minimising $\sum(Y_i - E(Y_i \mid x_i, \theta))^2$. So I tried to work on the problem using this.
I get the LSE of $\lambda$ as $\hat{\lambda} = \dfrac{\sum x_i Y_i}{\sum x_i^2}$. Please tell me if this is correct.
Then the variance of this LSE is $\operatorname{Var}(\hat{\lambda}) = \dfrac{\lambda \sum x_i^3}{\left(\sum x_i^2\right)^2}$.
Is this right? Thanks for your time.
Best Answer
Log Likelihood Solution
The maximum likelihood estimator is obtained by maximizing the log-likelihood:
$$ \begin{align*} \hat{\lambda} & = \arg \max_{\lambda} p \left( \boldsymbol{y} \mid \boldsymbol{x}, \lambda \right) \\ & = \arg \max_{\lambda} \log \prod_{i = 1}^{n} \frac{ \left( \lambda x_{i} \right)^{ y_{i} } }{ y_{i}! } e^{-\lambda x_{i}} \\ & = \arg \max_{\lambda} \sum_{i = 1}^{n} \log \left( \frac{ \left( \lambda x_{i} \right)^{ y_{i} } }{ y_{i}! } e^{-\lambda x_{i}} \right) \\ & = \arg \max_{\lambda} \log \lambda \sum_{i = 1}^{n} y_{i} + \sum_{i = 1}^{n} y_{i} \log x_{i} - \lambda \sum_{i = 1}^{n} x_{i} - \sum_{i = 1}^{n} \log y_{i}! \\ & \Rightarrow \hat{\lambda} = \frac{ \sum_{i = 1}^{n} y_{i} }{ \sum_{i = 1}^{n} x_{i} } \end{align*} $$
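The last line follows from setting the derivative of the log-likelihood with respect to $\lambda$ to zero (the terms not involving $\lambda$ drop out):
$$ \frac{\mathrm{d}}{\mathrm{d} \lambda} \left( \log \lambda \sum_{i = 1}^{n} y_{i} - \lambda \sum_{i = 1}^{n} x_{i} \right) = \frac{ \sum_{i = 1}^{n} y_{i} }{ \lambda } - \sum_{i = 1}^{n} x_{i} = 0 \quad \Longrightarrow \quad \hat{\lambda} = \frac{ \sum_{i = 1}^{n} y_{i} }{ \sum_{i = 1}^{n} x_{i} } . $$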
This indeed coincides with the classic case where $ {x}_{i} = 1 $, in which the MLE is the empirical average.
Least Squares Solution
The Least Squares solution is given by:
$$ \begin{align*} \hat{\lambda} & = \arg \min_{\lambda} \sum_{i = 1}^{n} \left( \mathbb{E} \left[ {y}_{i} \mid {x}_{i}, \lambda \right] - {y}_{i} \right)^{2} \\ & = \arg \min_{\lambda} \sum_{i = 1}^{n} \left( \lambda {x}_{i} - {y}_{i} \right)^{2} \\ & \Rightarrow \hat{\lambda} = \frac{ \sum_{i = 1}^{n} {x}_{i} {y}_{i} }{ \sum_{i = 1}^{n} {x}_{i}^{2} } \end{align*} $$
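Since the $x_{i}$ are fixed and the $Y_{i}$ are independent with $\operatorname{Var} \left( Y_{i} \right) = \lambda x_{i}$, the variance of this estimator follows directly and confirms the expression in the question:
$$ \operatorname{Var} \left( \hat{\lambda} \right) = \frac{ \sum_{i = 1}^{n} x_{i}^{2} \operatorname{Var} \left( Y_{i} \right) }{ \left( \sum_{i = 1}^{n} x_{i}^{2} \right)^{2} } = \frac{ \lambda \sum_{i = 1}^{n} x_{i}^{3} }{ \left( \sum_{i = 1}^{n} x_{i}^{2} \right)^{2} } . $$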
Small simulation in MATLAB:
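A minimal sketch of the kind of simulation meant here (not the repository code; the true $\lambda$, the $x_{i}$ and the number of trials are illustrative assumptions, and `poissrnd` requires the Statistics Toolbox):

```matlab
% Compare the MLE and the Least Squares estimator of lambda over
% repeated Monte Carlo draws with fixed x_i.
trueLambda = 2.5;                     % assumed true parameter
vX         = 0.5 + 2 * rand(50, 1);   % fixed covariates x_i > 0
numTrials  = 10000;

vLambdaMle = zeros(numTrials, 1);
vLambdaLs  = zeros(numTrials, 1);

for ii = 1:numTrials
    vY = poissrnd(trueLambda * vX);                  % Y_i ~ Poisson(lambda * x_i)
    vLambdaMle(ii) = sum(vY) / sum(vX);              % MLE: sum(y) / sum(x)
    vLambdaLs(ii)  = sum(vX .* vY) / sum(vX .^ 2);   % LS:  sum(x .* y) / sum(x .^ 2)
end

% Empirical variance of the LS estimator vs. the theoretical one above
theoVarLs = trueLambda * sum(vX .^ 3) / (sum(vX .^ 2) ^ 2);
fprintf('MLE: mean = %.4f, var = %.3e\n', mean(vLambdaMle), var(vLambdaMle));
fprintf('LS : mean = %.4f, var = %.3e (theory %.3e)\n', mean(vLambdaLs), var(vLambdaLs), theoVarLs);
```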
The full code is available on my StackExchange Cross Validated Q122153 GitHub Repository.