[Math] The Fisher information for a Uniform distribution

fisher information, probability theory, statistics

If $X \sim U[0,\theta]$, then the likelihood is given by $f(x,\theta) = \dfrac{1}{\theta}\mathbb{1}\{0\leq x \leq \theta\}$. The definition of Fisher information is $I(\theta) = \mathbb{E} \left[ \left(\dfrac{d \log f(X,\theta)}{d\theta} \right)^2 \right]$. How can this be calculated when $\log f(X,\theta)$ is not defined for $\theta < X$? I understand that we also have $f(X,\theta) = 0$ for $\theta < X$, but can we ignore this when taking the expectation? If so, why?

Best Answer

For a sample of $n$ i.i.d. observations, it is $n^2/\theta^2$.

We get this by first computing the log-likelihood, which for $\theta \geq \max_i x_i$ is $\log \prod_{i=1}^n \frac{1}{\theta} = -n \log(\theta)$. Taking its derivative with respect to $\theta$ gives the score $\frac{-n}{\theta}$; squaring it and taking the expectation yields $\frac{n^2}{\theta^2}$, since the score does not depend on the data. As for the region $\theta < \max_i X_i$ where $\log f$ is undefined: under the true parameter $\theta$, every $X_i$ lies in $[0,\theta]$ with probability $1$, so that event has probability zero and contributes nothing to the expectation. That is why it can be ignored.
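To connect this with the single-observation form in the question: the expectation is an integral against the density, which is zero off the support, so only $0 \leq x \leq \theta$ contributes. For $n = 1$,

$$
\mathbb{E}\left[\left(\frac{d \log f(X,\theta)}{d\theta}\right)^{2}\right]
= \int_0^{\theta} \left(-\frac{1}{\theta}\right)^{2} \frac{1}{\theta}\, dx
= \frac{1}{\theta^{2}},
$$

which matches the $n$-sample result above with $n = 1$.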
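As a quick numerical sanity check, here is a minimal simulation sketch; the values $\theta = 2$ and $n = 5$ are arbitrary illustrative choices, not anything from the question. Under the true parameter the event $\{\theta < \max_i X_i\}$ never occurs, so the score is defined almost surely and the empirical mean of its square equals $n^2/\theta^2$ exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 2.0     # true parameter (illustrative choice)
n = 5           # sample size (illustrative choice)
reps = 100_000  # Monte Carlo replications

def score(x, theta):
    # Derivative of the log-likelihood -n*log(theta) w.r.t. theta,
    # valid only on the support theta >= max(x); the assert documents
    # that the undefined region {theta < max(x)} is never hit when
    # sampling under the true parameter.
    assert x.max() <= theta, "log-likelihood undefined for theta < max(x)"
    return -len(x) / theta

samples = rng.uniform(0.0, theta, size=(reps, n))  # X_i ~ U[0, theta]
sq_scores = np.array([score(x, theta) ** 2 for x in samples])

print(sq_scores.mean())     # score is constant a.s., so this is exactly 6.25
print(n ** 2 / theta ** 2)  # n^2 / theta^2 = 6.25
```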