I have the log-likelihood function:
$$l(p_i,y_i) = \sum_{i = 1}^n \left( \ln(p_i) + y_i \ln(1 - p_i) \right) $$
And I need to calculate the maximum likelihood estimator of $p_i$. When I differentiate, the summation sign vanishes for some reason. Why is this?
EDIT: To calculate the maximum likelihood estimators, I would differentiate my log-likelihood, set that equal to 0, and solve:
$$
\frac{\partial{l}}{\partial{p_i}} = \sum_{i = 1}^n \left( \frac{1}{p_i} - \frac{y_i}{1 - p_i} \right) = 0
$$
Rearranging gives me
$$
\sum \frac{1}{p_i} = \sum \left( \frac{y_i}{1 - p_i} \right)
$$
Oh, does this now become
$$\frac{n}{p_i} = \frac{n y_i}{1 - p_i} $$
And then I can divide through by $n$ and rearrange to get my value for $\hat{p}_i$?
Best Answer
This isn't really a statistics question; it concerns basic properties of calculus and algebra.
It may help to consider a simpler problem, to avoid any confusion about the issue:
$$\frac{\partial{}}{\partial{p_i}} \sum_{i = 1}^n p_i^2$$
Think of the summation written out:
$$ \frac{\partial{}}{\partial{p_i}} (p_1^2 + p_2^2 + ... + p_{i-1}^2 + p_i^2 + p_{i+1}^2 + ... + p_n^2) $$
Take the derivative term by term:
$$ = \frac{\partial{p_1^2}}{\partial{p_i}} + \frac{\partial{p_2^2}}{\partial{p_i}} + ... + \frac{\partial{p_{i-1}^2}}{\partial{p_i}} + \frac{\partial{p_{i}^2}}{\partial{p_i}} + \frac{\partial{p_{i+1}^2}}{\partial{p_i}} + ... + \frac{\partial{p_{n}^2}}{\partial{p_i}} $$
Now take those derivatives (leaving the $i^{\rm{th}}$ term unevaluated for the moment):
$$ = 0 + 0 + ... + 0 + \frac{\partial{p_{i}^2}}{\partial{p_i}} + 0 + ... + 0 $$
and we now see why the summation disappears: only one term is nonzero, namely
$$ = \frac{\partial{p_{i}^2}}{\partial{p_i}} = 2p_i $$
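This collapse can also be checked symbolically. A quick sketch with sympy, using $n = 4$ concrete variables purely for illustration:

```python
import sympy as sp

# Four concrete summands p_1^2 + ... + p_4^2 (n = 4 chosen for illustration)
p = sp.symbols('p1:5')  # the tuple (p1, p2, p3, p4)
total = sum(pj**2 for pj in p)

# Differentiate with respect to p2 alone: every other term is a
# constant in p2, so its derivative is 0 and the sum collapses.
deriv = sp.diff(total, p[1])
print(deriv)  # 2*p2
```

The same collapse happens for any $n$ and any choice of $i$; the index $i = 2$ here is arbitrary.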
Your question is the same but with a different, slightly more complicated function.
Regarding the original problem:
$$ l(p_i,y_i) = \sum_{i = 1}^n \left( \ln(p_i) + y_i \ln(1 - p_i) \right) $$
is fine, but as soon as you took derivatives, you went astray.
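By the argument above, differentiating with respect to a single $p_i$ leaves only the $i^{\rm{th}}$ summand, with no summation sign. As a sketch (not part of the original answer), the surviving term can be handled symbolically, writing generic symbols `p` and `y` for $p_i$ and $y_i$:

```python
import sympy as sp

# Generic i-th summand of the log-likelihood: ln(p_i) + y_i*ln(1 - p_i).
# All other summands are constant in p_i, so only this one survives
# after differentiating with respect to p_i.
p, y = sp.symbols('p y', positive=True)
term = sp.log(p) + y * sp.log(1 - p)

# Derivative of the surviving term: 1/p - y/(1 - p), with no sum.
deriv = sp.diff(term, p)

# Setting the derivative to zero and solving gives p = 1/(1 + y).
sol = sp.solve(sp.Eq(deriv, 0), p)
print(sol)
```

Note there is no $\sum$ in `deriv`: that is exactly why the summation sign vanishes in the correct calculation.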