Is the Second Partial Derivative Equal to the First Partial Derivative Squared?

calculus, derivatives, probability, statistics

In most situations, it is obvious that the second partial derivative is NOT equal to the first partial derivative squared.

However, I was reading the following page and noticed the following statement:

[Image of the statement in question: $E\!\left(-\dfrac{\partial^2 L_i}{\partial \beta_h\,\partial \beta_j}\right) = E\!\left(\dfrac{\partial L_i}{\partial \beta_h}\,\dfrac{\partial L_i}{\partial \beta_j}\right)$]

Here, it seems that the (expected value of the) second partial
derivative is in fact equal to the (expected value of the) first partial derivative squared.

To give some context: we are dealing with the log-likelihood function of a probability distribution, from statistics.

I tried to look into this further and couldn't find any explanation of why this relationship holds. Could someone please comment on this: in this particular case, why is the (expected) second partial derivative equal to the product of the first partial derivatives?

Thanks!

Best Answer

This has already been explained by leonbloy's comment, but I will spell it out. By the definition of $L_i$, we have
$$
L_i(y_i, {\pmb \beta}) = \log p(y_i; {\pmb \beta}), \qquad \hbox{where $p(y_i; {\pmb \beta})$ is the probability density of $y_i$ given $\pmb \beta$,}
$$
so
\begin{eqnarray*}
-\frac{\partial^2 L_i}{\partial \beta_h \partial \beta_j}
&=& -\frac{\partial^2(\log p)}{\partial \beta_h \partial \beta_j}\\
&=& -\frac{\partial}{\partial \beta_h} \left(\frac{1}{p} \frac{\partial p}{\partial \beta_j}\right)\\
&=& \frac{1}{p^2} \frac{\partial p}{\partial \beta_h}\frac{\partial p}{\partial \beta_j} - \frac{1}{p}\frac{\partial^2 p}{\partial \beta_h \partial \beta_j}\\
&=& \left( \frac{1}{p} \frac{\partial p}{\partial \beta_h} \right) \left( \frac{1}{p} \frac{\partial p}{\partial \beta_j} \right) - \frac{1}{p}\frac{\partial^2 p}{\partial \beta_h \partial \beta_j}\\
&=& \frac{\partial (\log p)}{\partial \beta_h} \frac{\partial (\log p)}{\partial \beta_j} - \frac{1}{p}\frac{\partial^2 p}{\partial \beta_h \partial \beta_j}\\
&=& \frac{\partial L_i}{\partial \beta_h} \frac{\partial L_i}{\partial \beta_j} - \frac{1}{p}\frac{\partial^2 p}{\partial \beta_h \partial \beta_j},
\end{eqnarray*}
and taking expected values gives
\begin{eqnarray*}
E\left( -\frac{\partial^2 L_i}{\partial \beta_h \partial \beta_j} \right)
= E\left( \frac{\partial L_i}{\partial \beta_h} \frac{\partial L_i}{\partial \beta_j} \right)
- E\left( \frac{1}{p}\frac{\partial^2 p}{\partial \beta_h \partial \beta_j} \right),
\end{eqnarray*}
which is what we want, except that we must get rid of the second term. But by the definition of the expected value,
$$
E(f) = \int f(y_i, {\pmb \beta})\, p(y_i; {\pmb \beta}) \, dy_i \qquad \hbox{for any function $f$,}
$$
so, assuming the model is regular enough that we may differentiate under the integral sign,
\begin{eqnarray*}
E\left( \frac{1}{p}\frac{\partial^2 p}{\partial \beta_h \partial \beta_j} \right)
&=& \int \frac{1}{p}\frac{\partial^2 p}{\partial \beta_h \partial \beta_j} \, p \, dy_i \\
&=& \int \frac{\partial^2 p}{\partial \beta_h \partial \beta_j} \, dy_i \\
&=& \frac{\partial^2}{\partial \beta_h \partial \beta_j} \int p \, dy_i \\
&=& \frac{\partial^2}{\partial \beta_h \partial \beta_j} (1), \qquad \hbox{since $p$ is a probability density,}\\
&=& 0.
\end{eqnarray*}
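For intuition, here is a small numerical check of the identity (not part of the original answer, and the model and parameter values below are just illustrative): take a one-parameter normal model with known variance, where the score is $(y-\mu)/\sigma^2$ and the negative second derivative is the constant $1/\sigma^2$, so both expectations should come out to $1/\sigma^2$.

```python
# Minimal Monte Carlo sketch, assuming y ~ Normal(mu, sigma^2) with sigma known.
# For one observation, L(y, mu) = -0.5*log(2*pi*sigma^2) - (y - mu)^2 / (2*sigma^2),
# so dL/dmu = (y - mu)/sigma^2 and d2L/dmu2 = -1/sigma^2 (a constant).
# The identity says E[-d2L/dmu2] = E[(dL/dmu)^2]; both should equal 1/sigma^2.

import numpy as np

rng = np.random.default_rng(0)

mu, sigma = 2.0, 1.5           # illustrative "true" parameters (sigma treated as known)
n = 1_000_000                  # Monte Carlo sample size
y = rng.normal(mu, sigma, n)   # draws from p(y; mu)

score = (y - mu) / sigma**2    # dL/dmu evaluated at the true mu
neg_hessian = 1.0 / sigma**2   # -d2L/dmu2, exact for this model

print("E[(dL/dmu)^2]  ~", np.mean(score**2))  # Monte Carlo estimate, ~ 0.444
print("E[-d2L/dmu2]   =", neg_hessian)        # exact value 1/sigma^2 ~ 0.444
```

Both numbers should agree (up to Monte Carlo error) with $1/\sigma^2 \approx 0.444$, illustrating the equality of the two expectations that the derivation above proves in general.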
