Solved – Derivation of Fisher information


The Fisher information is defined as the second moment of the score, i.e. of the first partial derivative of the log-likelihood. I am unable to follow the step below (from the Wikipedia article) that moves from the expectation of the second partial derivative of the log-likelihood to the difference between a second-order partial derivative term and the square of the first-order partial derivative.

If someone is feeling "talkative" and willing to fill in the blanks, it would be appreciated.

https://en.wikipedia.org/wiki/Fisher_information

The step in question:

$$\operatorname{E}\left[\frac{\partial^2}{\partial\theta^2}\log f(X;\theta)\right]
= \operatorname{E}\left[\frac{\frac{\partial^2}{\partial\theta^2} f(X;\theta)}{f(X;\theta)}\right]
- \operatorname{E}\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right]$$
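For what it's worth, the identity does check out numerically. Here is a quick Monte Carlo sketch; the normal model, the value of sigma, and the sample size are my own illustrative choices, not from the article:

```python
import numpy as np

# Monte Carlo check that E[(d/dtheta log f)^2] = -E[d^2/dtheta^2 log f]
# for the illustrative model X ~ Normal(theta, sigma^2) with sigma known.
rng = np.random.default_rng(0)
theta, sigma, n = 2.0, 1.5, 1_000_000

x = rng.normal(theta, sigma, size=n)

# log f(x; theta) = -0.5*log(2*pi*sigma^2) - (x - theta)^2 / (2*sigma^2)
score = (x - theta) / sigma**2   # first derivative of log f w.r.t. theta
second = -1.0 / sigma**2         # second derivative (constant for this model)

print("E[score^2]         ~", np.mean(score**2))  # both should be ~ 1/sigma^2
print("-E[second deriv]   ~", -second)
```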

Best Answer

This is really a product rule problem, with the chain rule used on one of the factors.

Given

$$\frac{d}{dx}\log(x) = \frac{1}{x}$$

and, by the chain rule,

$$\frac{d}{dx}\log\big(f(x)\big) = \frac{f'(x)}{f(x)},$$

we start with the first-order derivative of the log-likelihood:

$$\frac{\partial}{\partial\theta}\log f(X;\theta)
= \frac{\frac{\partial}{\partial\theta} f(X;\theta)}{f(X;\theta)}
= \frac{\partial f(X;\theta)}{\partial\theta}\cdot\frac{1}{f(X;\theta)}.$$
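As a concrete check (the model is my own choice for illustration), take $f(x;\theta)$ to be the $N(\theta,\sigma^2)$ density with $\sigma^2$ known:

$$\log f(x;\theta) = -\tfrac{1}{2}\log(2\pi\sigma^2) - \frac{(x-\theta)^2}{2\sigma^2},
\qquad
\frac{\partial}{\partial\theta}\log f(x;\theta) = \frac{x-\theta}{\sigma^2},$$

which agrees with the general formula, since $\frac{\partial}{\partial\theta} f(x;\theta) = f(x;\theta)\,\frac{x-\theta}{\sigma^2}$.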

Let us now take the second partial derivative by differentiating that first-order expression again. Since it is a product of two functions of $\theta$, this requires the product rule:

$$\frac{d}{d\theta}\big(u(\theta)\,v(\theta)\big)
= \frac{du}{d\theta}\,v(\theta) + u(\theta)\,\frac{dv}{d\theta},$$

where we set

$$u(\theta) = \frac{\partial}{\partial\theta} f(X;\theta)$$

and

$$v(\theta) = \frac{1}{f(X;\theta)} = f(X;\theta)^{-1}.$$
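Differentiating $v(\theta)$ is where the chain rule actually enters: apply the power rule to the outer function and multiply by the derivative of the inner one:

$$\frac{dv}{d\theta} = \frac{d}{d\theta}\,f(X;\theta)^{-1}
= -f(X;\theta)^{-2}\,\frac{\partial}{\partial\theta} f(X;\theta).$$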

Putting the two pieces into the product rule, the final result then follows, as spelled out below.
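Explicitly (the underbraces label the two product-rule terms):

$$\frac{\partial^2}{\partial\theta^2}\log f(X;\theta)
= \underbrace{\frac{\partial^2}{\partial\theta^2} f(X;\theta)\cdot\frac{1}{f(X;\theta)}}_{u'v}
+ \underbrace{\frac{\partial}{\partial\theta} f(X;\theta)\cdot\left(-\frac{\frac{\partial}{\partial\theta} f(X;\theta)}{f(X;\theta)^{2}}\right)}_{uv'}
= \frac{\frac{\partial^2}{\partial\theta^2} f(X;\theta)}{f(X;\theta)}
- \left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2},$$

which is exactly the right-hand side of the step in the question. Taking expectations, the first term vanishes under the usual regularity conditions, since $\operatorname{E}\left[\frac{\partial^2 f/\partial\theta^2}{f}\right] = \frac{\partial^2}{\partial\theta^2}\int f\,dx = 0$, leaving $\operatorname{E}\left[\frac{\partial^2}{\partial\theta^2}\log f(X;\theta)\right] = -\operatorname{E}\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right]$.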