Cramér-Rao Bound – Calculating the Variance of Unbiased Estimators for $\theta = \mu/\sigma$


Let $X_1, \ldots, X_n$ be an i.i.d. sample from the $N(\mu, \sigma^2)$ density, where $\mu$ and $\sigma^2$ are unknown.
I want to find a lower bound $L_n$, valid for every sample size $n$, on the variance of unbiased estimators of $\theta = \mu/\sigma$.

I know that for a one-dimensional parameter the Cramér-Rao lower bound for unbiased estimators is $\frac{1}{I(\theta)}$, where $I(\theta)$ is the Fisher information.
Since both $\mu$ and $\sigma^2$ are unknown, the Fisher information here is a $2\times 2$ matrix. So how can the Fisher information matrix help me find the Cramér-Rao lower bound?

Any hints will be appreciated.

Best Answer

Suppose $\theta=(\mu,\sigma^2) \in \Omega$ is the two-dimensional parameter where $\Omega=\mathbb R\times (0,\infty)$.

You are interested in the parametric function $g:\Omega \to \mathbb R$ where $g(\theta)=\frac{\mu}{\sigma}$.

Then $g$ is differentiable and its gradient at $\theta$ is denoted by $\nabla g(\theta)$.

The Cramér-Rao lower bound for the variance of any unbiased estimator of $g(\theta)$ is then given by the quadratic form

$$\text{CRLB}(g(\theta))=\nabla g(\theta)^T(I(\theta))^{-1}\nabla g(\theta)$$
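Concretely, for the normal model this can be evaluated in closed form. Below is a sketch of the standard computation, using the well-known Fisher information matrix for an i.i.d. $N(\mu,\sigma^2)$ sample of size $n$ in the parameterization $\theta=(\mu,\sigma^2)$:

```latex
% Fisher information matrix and its inverse for n i.i.d. N(mu, sigma^2)
% observations, parameterized by theta = (mu, sigma^2):
I(\theta) = n \begin{pmatrix} \dfrac{1}{\sigma^2} & 0 \\[4pt] 0 & \dfrac{1}{2\sigma^4} \end{pmatrix},
\qquad
I(\theta)^{-1} = \frac{1}{n} \begin{pmatrix} \sigma^2 & 0 \\ 0 & 2\sigma^4 \end{pmatrix}.

% Gradient of g(theta) = mu / sqrt(sigma^2) with respect to (mu, sigma^2):
\nabla g(\theta) = \left( \frac{1}{\sigma},\; -\frac{\mu}{2\sigma^3} \right)^{T}.

% Plugging into the quadratic form:
\text{CRLB}(g(\theta))
= \frac{\sigma^2}{n}\cdot\frac{1}{\sigma^2}
+ \frac{2\sigma^4}{n}\cdot\frac{\mu^2}{4\sigma^6}
= \frac{1}{n}\left( 1 + \frac{\mu^2}{2\sigma^2} \right).
```

This gives the bound $L_n = \frac{1}{n}\bigl(1 + \tfrac{1}{2}g(\theta)^2\bigr)$ requested in the question; note that it depends on the unknown parameters only through the ratio $\mu/\sigma$.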
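As a numerical sanity check of the quadratic form, the following sketch evaluates it with NumPy for arbitrary illustrative parameter values (`mu`, `sigma2`, `n` are assumptions, not from the original post) and compares it with the closed-form bound $\frac{1}{n}\bigl(1 + \frac{\mu^2}{2\sigma^2}\bigr)$:

```python
import numpy as np

# Illustrative parameter values (arbitrary assumptions for the check)
mu, sigma2, n = 1.5, 2.0, 50
sigma = np.sqrt(sigma2)

# Fisher information matrix for n i.i.d. N(mu, sigma^2) observations,
# in the parameterization theta = (mu, sigma^2)
I = n * np.array([[1.0 / sigma2, 0.0],
                  [0.0, 1.0 / (2.0 * sigma2 ** 2)]])

# Gradient of g(theta) = mu / sqrt(sigma^2) with respect to (mu, sigma^2)
grad = np.array([1.0 / sigma, -mu / (2.0 * sigma ** 3)])

# Cramer-Rao lower bound via the quadratic form grad^T I^{-1} grad
crlb = grad @ np.linalg.inv(I) @ grad

# Closed form: (1/n) * (1 + mu^2 / (2 sigma^2))
closed = (1.0 + mu ** 2 / (2.0 * sigma2)) / n

print(crlb, closed)  # the two values agree
```

Because $I(\theta)$ is diagonal here, the inverse is trivial and the quadratic form reduces to two terms, which is why the closed form is so simple.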
