Solved – Conditions for the existence of a Fisher information matrix


Different textbooks cite different conditions for the existence of a Fisher information matrix. Several such conditions are listed below, each of which appears in some, but not all, of the definitions of "Fisher information matrix".

  1. Is there a standard, minimal set of conditions?
  2. Of the 5 conditions below, which can be done away with?
  3. If one of the conditions can be done away with, why do you reckon it was included in the first place?
  4. If one of the conditions cannot be done away with, does it mean that those textbooks that did not specify it gave an erroneous, or at least an incomplete, definition?
  1. Zacks, The Theory of Statistical Inference (1971), p. 194.

    The matrix $\mathcal{I}\left(\theta\right)$ is positive definite for all $\theta\in\Theta$.

  2. Schervish, Theory of Statistics (1997, corr. 2nd printing), Definition 2.78, p. 111

    The set $C=\left\{x:f\left(x;\theta\right)>0\right\}$ is the same for all $\theta$.

  3. Borovkov, Mathematical Statistics (1998). p. 147

    $f\left(x;\theta\right)$ is continuously differentiable w.r.t. each $\theta_i$.

  4. Borovkov, Mathematical Statistics (1998). p. 147

    $\mathcal{I}\left(\theta\right)$ is continuous and invertible.

  5. Gourieroux & Monfort, Statistics and Econometric Models, Vol I (1995). Definition (a), pp. 81-82

    $\frac{\partial^2}{\partial\theta_i\partial\theta_j}f\left(x;\theta\right)$ exist

In comparison, here is the full list of conditions in Lehmann & Casella, Theory of Point Estimation (1998), p. 124:

  1. $\Theta$ is an open interval (finite, infinite, or semi-infinite)
  2. The set $C=\left\{x:f\left(x;\theta\right)>0\right\}$ is the same for all $\theta\in\Theta$.
  3. $\frac{\partial f\left(x;\theta\right)}{\partial\theta_i}$ exists and is finite.

And here is the complete list of conditions in Barra, Notions fondamentales de statistique mathematique (1971). Definition 1, p. 35:

The score is defined for all $\theta\in\Theta$, each of its components is square-integrable and has integral $=0$.

It is interesting to note that neither Lehmann & Casella nor Barra stipulate that $\int f\left(x;\theta\right)\,\mu\left(dx\right)$ be differentiable under the integral sign w.r.t. each $\theta_i$, a condition that occurs in most other textbooks I surveyed.
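Barra's conditions can be checked concretely for a simple model. The sketch below (a Monte Carlo illustration, not part of any of the cited texts) uses the $N(\theta,1)$ location model, whose score is $\partial_\theta \log f(x;\theta) = x-\theta$: its mean is $0$ and its second moment is finite and equals the Fisher information $\mathcal{I}(\theta)=1$.

```python
import numpy as np

# Monte Carlo check of Barra's conditions for the N(theta, 1) location model,
# whose score is d/dtheta log f(x; theta) = x - theta.
rng = np.random.default_rng(0)
theta = 2.0
x = rng.normal(loc=theta, scale=1.0, size=1_000_000)

score = x - theta                  # score evaluated at the true theta
mean_score = score.mean()          # ~ 0: the score integrates to zero
second_moment = (score**2).mean()  # ~ 1 = Fisher information I(theta)

print(mean_score, second_moment)
```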

Best Answer

I do not have access to all the references, but I would like to offer a few remarks on some of your points:

  • Borovkov, Mathematical Statistics (1998), p. 140 presents another assumption, Condition (R), which is quite strong. This condition assumes that $E[(\partial\log f(x;\theta)/\partial\theta)^2]<\infty$. In effect, the author assumes that each entry of the Fisher information matrix (FIM) is well defined.

  • The assumptions of double differentiability and of exchangeability of the integral and differentiation operators are used to derive the equality $E[(\partial\log f(x;\theta)/\partial\theta)^2]=-E[\partial^2\log f(x;\theta)/\partial\theta^2]$. This equality is often helpful, but not strictly necessary.
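The information equality above can be illustrated numerically. This sketch (my own, using the exponential model $f(x;\theta)=\theta e^{-\theta x}$, which satisfies the exchangeability assumptions) compares a Monte Carlo estimate of $E[(\partial_\theta\log f)^2]$ with $-E[\partial_\theta^2\log f]=1/\theta^2$:

```python
import numpy as np

# Check E[(d log f / d theta)^2] = -E[d^2 log f / d theta^2]
# for the exponential model f(x; theta) = theta * exp(-theta * x),
# where log f = log(theta) - theta * x.
rng = np.random.default_rng(1)
theta = 0.5
x = rng.exponential(scale=1.0 / theta, size=1_000_000)

score = 1.0 / theta - x        # d/dtheta log f
hessian = -1.0 / theta**2      # d^2/dtheta^2 log f (constant in x here)

lhs = (score**2).mean()        # ~ 1 / theta^2 = 4
rhs = -hessian                 # exactly 1 / theta^2 = 4
print(lhs, rhs)
```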

  • It is difficult to establish general conditions for the existence of the FIM without discarding some models for which the FIM actually exists. For instance, differentiability is not a necessary condition for the existence of the FIM. An example is the double exponential, or Laplace, model: its FIM is well defined, but the density is not twice differentiable at the mode. Conversely, some models that are twice differentiable have a badly behaved FIM and require additional conditions (see this paper).
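The Laplace counterexample can be made concrete. The sketch below (my own illustration, not from the answer's cited paper) uses the Laplace location model $f(x;\mu)=e^{-|x-\mu|/b}/(2b)$: the score is $\operatorname{sign}(x-\mu)/b$ (undefined only on the null set $\{x=\mu\}$), so $\mathcal{I}(\mu)=E[\text{score}^2]=1/b^2$ exists even though the log-density is not twice differentiable at the mode.

```python
import numpy as np

# Fisher information of the Laplace location model exists despite the
# non-differentiability of log f at x = mu: the score is sign(x - mu) / b,
# so score^2 = 1 / b^2 almost surely and I(mu) = 1 / b^2.
rng = np.random.default_rng(2)
mu, b = 0.0, 2.0
x = rng.laplace(loc=mu, scale=b, size=1_000_000)

score = np.sign(x - mu) / b
fisher_mc = (score**2).mean()  # 1 / b^2 for every sample point with x != mu
print(fisher_mc, 1 / b**2)
```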

It is possible to come up with very general sufficient conditions, but they might be too strict. Necessary conditions for the existence of the FIM have not been fully studied. Hence, the answer to your first question may not be simple.
