Solved – Help with Taylor expansion of log likelihood function

mathematical-statistics, maximum-likelihood, taylor-series

I've read the following part of a sketch of the proof that the maximum likelihood estimator is asymptotically normal:

"Sketch of the second part of the proof.
Recall that we may write the likelihood equation as
$$\sum_{i=0}^n U_i(\hat\theta)=0$$ where $ U_i(\phi)$ denotes the derivative of $ \log\! f(Y_i;\phi) $ with respect to $\phi$.
Let $ U_i'(\phi) $ denote the derivative of $ U_i(\phi) $ (that is, the second derivative of $ \log\! f(Y_i;\phi) $) with respect to $ \phi$. Now a Taylor expansion around $ \phi=\theta$ yields:

$$\sum_{i=0}^n U_i(\phi )-\sum_{i=0}^n U_i(\theta)\approx \left( \sum_{i=0}^n U_i'(\theta) \right)(\phi-\theta)$$"

This Taylor expansion does not make any sense to me. I am familiar with the Taylor expansion of $f(x) $ at $a$ as: $$\sum_{n=0}^\infty\frac{f^{(n)}(a)}{n!}(x-a)^n$$
I can see that all terms except the first-order term of the Taylor expansion have been dropped, but I do not see where the displayed expression comes from, since its left-hand side is the difference of two summations.

Best Answer

You should convince yourself that $f(x)-f(y)\approx f'(x)(x-y)$ is just another way of writing the first-order Taylor expansion of $f$ around $x$ (under appropriate regularity assumptions); note that, to first order, $f'(x)$ and $f'(y)$ give the same approximation.
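Concretely (a spelled-out step, not in the original answer): take $f=U_i$, expand each $U_i(\phi)$ around $\theta$, and keep only the linear term,

$$U_i(\phi)\approx U_i(\theta)+U_i'(\theta)(\phi-\theta),$$

so that moving $U_i(\theta)$ to the left-hand side gives $U_i(\phi)-U_i(\theta)\approx U_i'(\theta)(\phi-\theta)$ for each $i$.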

Then, since both differentiation and the first-order approximation are linear, you can sum over $i$ to obtain the stated result.
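As a quick numerical sanity check (my own sketch, not from the thread), consider an exponential model $f(y;\phi)=\phi e^{-\phi y}$, for which the score is $U_i(\phi)=1/\phi-y_i$ and $U_i'(\phi)=-1/\phi^2$:

```python
import numpy as np

# Illustration (hypothetical example): verify that
#   sum_i U_i(phi) - sum_i U_i(theta)  ~  (sum_i U_i'(theta)) * (phi - theta)
# for the exponential model, where U_i(phi) = 1/phi - y_i and
# U_i'(phi) = -1/phi**2.

rng = np.random.default_rng(0)
theta = 2.0                                  # expansion point
y = rng.exponential(1 / theta, size=500)     # sample from the model

def score_sum(phi):
    """sum_i U_i(phi) for the exponential model."""
    return np.sum(1.0 / phi - y)

phi = 2.1                                    # a nearby point
lhs = score_sum(phi) - score_sum(theta)
rhs = len(y) * (-1.0 / theta**2) * (phi - theta)   # sum_i U_i'(theta) * (phi - theta)

print(lhs, rhs)   # the two values agree to first order; the gap shrinks as phi -> theta
```

Note that the sample terms $y_i$ cancel in the difference on the left, so the approximation error here comes purely from the curvature of $1/\phi$, exactly the higher-order Taylor terms being discarded.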
