Statistics – Calculating Fisher Information for Bernoulli Random Variable

expected-value · probability-distributions · statistics

Let $X_1,…,X_n$ be Bernoulli distributed with unknown parameter $p$.

My objective is to calculate the information contained in the first observation of the sample.

I know that the pmf of $X$ is given by $$f(x\mid p)=p^x(1-p)^{1-x},\qquad x\in\{0,1\},$$ and my book defines the Fisher information about $p$ as

$$I_X(p)=E_p\left[\left(\frac{d}{dp}\log\left(p^X(1-p)^{1-X}\right)\right)^2\right]$$

Differentiating gives the score $\frac{d}{dp}\log f(X\mid p)=\frac{X}{p}-\frac{1-X}{1-p}$; squaring and expanding, I arrive at

$$I_X(p)=E_p\left[\frac{X^2}{p^2}\right]-2E_p\left[\frac{X(1-X)}{p(1-p)}\right]+E_p\left[\frac{(1-X)^2}{(1-p)^2}\right]$$

I know that the Fisher information about $p$ of a Bernoulli RV is $\frac{1}{p(1-p)}$, but I don't know how to get rid of the $X$ terms, since I'm calculating an expectation with respect to $p$, not $X$. Any clues?

Best Answer

Expanding the numerators in your expression,

\begin{equation} I_X(p)=E_p \left[\frac{X^2}{p^2}\right]-2E_p \left[ \frac{X - X^2}{p(1-p)} \right] + E_p \left[ \frac{X^2 - 2X + 1}{(1-p)^2}\right] \tag{1}. \end{equation}

Note that $E_p$ denotes the expectation over $X$ with the parameter fixed at $p$, so the $X$ terms disappear once you compute the moments. For a Bernoulli RV, we know

\begin{align} E(X) &= 0\Pr(X = 0) + 1\Pr(X = 1) = p,\\ E(X^2) &= 0^2\Pr(X = 0) + 1^2\Pr(X = 1) = p. \end{align}

Substituting these into $(1)$, we get

\begin{equation} I_X(p)=\frac{p}{p^2}-2\,\frac{p-p}{p(1-p)}+\frac{p-2p+1}{(1-p)^2} = \frac{1}{p}+\frac{1-p}{(1-p)^2} = \frac{1}{p} + \frac{1}{1-p} = \frac{1}{p(1 - p)}. \end{equation}
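As a quick sanity check (not part of the derivation above), the identity can be verified numerically: the sketch below estimates $E_p\big[\big(\frac{d}{dp}\log f(X\mid p)\big)^2\big]$ by Monte Carlo and compares it to $\frac{1}{p(1-p)}$. The function name `fisher_info_mc` and its parameters are illustrative choices, not from the original post.

```python
import random


def fisher_info_mc(p, n=200_000, seed=0):
    """Monte Carlo estimate of the Fisher information of Bernoulli(p).

    Averages the squared score (d/dp) log f(X|p) over n simulated draws.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = 1 if rng.random() < p else 0
        # Score function for a single Bernoulli observation:
        # d/dp log f(x|p) = x/p - (1 - x)/(1 - p)
        score = x / p - (1 - x) / (1 - p)
        total += score ** 2
    return total / n


if __name__ == "__main__":
    p = 0.3
    print(fisher_info_mc(p))      # Monte Carlo estimate
    print(1 / (p * (1 - p)))      # closed form: 1/(p(1-p))
```

With a few hundred thousand draws the estimate should agree with the closed form to roughly two decimal places.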