[Math] Finding the Fisher’s Information in a normal distribution with known $\mu$ and unknown $\sigma^{2}$

Tags: normal distribution, probability, probability distributions, statistics

I have a sample $x_{1},\ldots,x_{n}$ from $X \sim N(\mu,\sigma^{2})$, where $\mu$ is known.

I have to apply the Cramér-Rao theorem, but while calculating the Fisher information I stumbled upon this problem:

$$I(\sigma)=-E\left(\frac{n}{\sigma^{2}}-\frac{3\sum(x_{i}-\mu)^{2}}{\sigma^{4}}\right)=-\frac{n}{\sigma^{2}}+3\,\frac{E\left(\sum(x_{i}-\mu)^{2}\right)}{\sigma^{4}}=-\frac{n}{\sigma^{2}}+\frac{3}{\sigma^{4}}E\left(\sum(x_{i}-\mu)^{2}\right)$$

$$E(\sum(x_{i}-\mu)^{2})=?$$
${\displaystyle \operatorname{E}[X]=\int_{\mathbb{R}} x f(x)\,dx.}$ But what is $f(x)$ here? Could it possibly be the function itself, ${\displaystyle \operatorname{E}[X]=\int_{\mathbb{R}} x\sum(x_{i}-\mu)^{2}\,dx}$?

From Wikipedia, we know that the Fisher information matrix is:
$$\begin{pmatrix}
\frac{1}{\sigma^{2}} & 0 \\
0 & \frac{1}{2\sigma^{4}}
\end{pmatrix}$$

But I need a number; what is that matrix supposed to mean?

What is $I(\sigma^{2})$ for a normal distribution with $\mu$ known and $\sigma^{2}$ unknown?

Best Answer

Let $\sigma ^ 2 = \theta $, so that $ X \sim N( \mu, \theta)$ and $$ f_X(x; \theta) = \frac{1}{\sqrt{ 2 \pi \theta }} \exp\left( \frac {- (x - \mu ) ^ 2} { 2\theta} \right). $$

The log-likelihood of a single observation is
$$ l(\theta) = - \tfrac 1 2 \ln \theta - \frac {(x - \mu )^2} {2\theta} + \text {constant}, $$
so
$$ l'(\theta) = -\frac{1}{2\theta} + \frac{(x- \mu) ^2}{2\theta ^ 2} $$
and
$$ - \mathbb{E}\, l'' (\theta) = - \mathbb{E}\left[ \frac{1}{2\theta ^ 2} - \frac{(x- \mu) ^2}{\theta ^ 3} \right] = -\frac{1}{2\theta ^ 2} + \frac{1}{\theta^2} = \frac{1}{2 \theta ^ 2}. $$
Use the additive property of Fisher information to get the information for a sample of size $n$, i.e.,
$$ I_{X_1,\ldots,X_n}(\theta) = \frac{n}{2\theta ^ 2} = \frac{n}{2\sigma ^ 4}. $$
For the observed information, replace $\sigma ^2 $ with
$$ S ^ 2 = \frac{\sum_{i=1}^n ( X_i - \mu) ^ 2}{n}. $$
(And note that $\operatorname{var}(X) = \mathbb{E}(X - \mu ) ^2 = \sigma ^ 2$, so $\mathbb{E}\sum_{i=1}^n (x_i - \mu)^2 = n\sigma^2$, which is the expectation you were stuck on.)
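As a quick empirical check of $I_{X_1,\ldots,X_n}(\theta) = n/(2\theta^2)$, one can use the fact that the Fisher information also equals the variance of the score. A simulation sketch (the parameter values, sample size, and replication count are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)

mu, theta = 2.0, 3.0   # arbitrary true values of mu and theta = sigma^2
n = 10                 # sample size
reps = 200_000         # Monte Carlo replications

# Score of a sample of size n with respect to theta:
# l'(theta) = -n/(2*theta) + sum((x_i - mu)^2) / (2*theta^2)
x = rng.normal(mu, np.sqrt(theta), size=(reps, n))
score = -n / (2 * theta) + ((x - mu) ** 2).sum(axis=1) / (2 * theta ** 2)

# Fisher information is the variance of the score; compare with n/(2*theta^2)
print(np.var(score))        # close to the theoretical value below
print(n / (2 * theta ** 2))  # 0.5555...
```

The empirical variance of the score agrees with $n/(2\theta^2)$ to a few decimal places at this number of replications.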