Cramér–Rao bound on estimating the parameters of an impulse

estimation, fisher-information, log-likelihood, signal-processing, statistics

Given a noisy discrete-time complex signal that is the sum of an impulse at some time $t_0$ (with amplitude $a_0 e^{i \phi_0}$) and additive white Gaussian noise, what is the Cramér–Rao lower bound on the variance of an unbiased estimator of $t_0$, $a_0$, and $\phi_0$?


Suppose I have a discrete-time signal of $N$ samples (let $N$ be even for simplicity), $z_n$, as described above. Taking its discrete Fourier transform gives:

$$ Z_n = a_0 \exp\left(\frac{-2\pi i t_0 n}{N} + i\phi_0\right) + \mathcal{CN}(0, 2 N \sigma^2)$$

where $t_0$ is the time of the impulse in the time domain (and the parameter to be estimated), $a_0 e^{i \phi_0}$ is the complex amplitude of this impulse, $n = -\frac{N}{2}, \dots, \frac{N}{2} - 1$, and $i$ is the imaginary unit. Here I have assumed a sampling frequency of $1$ without loss of generality. The additive complex Gaussian noise, $\mathcal{CN}(0, 2 N \sigma^2)$, is a complex random variable whose real and imaginary parts each follow a $\mathcal{N}(0, N \sigma^2)$ distribution. The factor of $N$ in the noise variance accounts for the $\frac{1}{N}$ normalization in the inverse discrete Fourier transform, so that each time-domain sample carries complex noise $\mathcal{CN}(0, 2\sigma^2)$, i.e. a constant noise variance of $\sigma^2$ per real and imaginary component.

$a_0 > 0$, $t_0 \in [0, N]$, and $\phi_0 \in [-\pi, \pi)$ are real parameters that describe the impulse in the time domain.
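For concreteness, here is a minimal Python sketch of this signal model. The variable names mirror the symbols above, and the specific values chosen for $N$, $t_0$, $a_0$, $\phi_0$, and $\sigma$ are arbitrary illustration values:

```python
import numpy as np

# Minimal sketch of the frequency-domain signal model described above.
rng = np.random.default_rng(0)

N = 64                           # number of samples (even)
t0, a0, phi0 = 17.3, 1.0, 0.4    # true impulse parameters (arbitrary example values)
sigma = 0.05                     # noise scale, as in the question

n = np.arange(-N // 2, N // 2)   # n = -N/2, ..., N/2 - 1

# Z_n = a0 * exp(-2*pi*i*t0*n/N + i*phi0) + CN(0, 2*N*sigma^2),
# where the real and imaginary noise parts are each N(0, N*sigma^2).
noise = np.sqrt(N) * sigma * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
Z = a0 * np.exp(-2j * np.pi * t0 * n / N + 1j * phi0) + noise
```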


Intuitively, it seems to me that if we take the inverse DFT, evaluated at a continuous time $t$,

$$ f(t) = \frac{1}{N} \sum_{n} Z_n \exp\left(\frac{2 \pi i t n}{N}\right) $$

then an unbiased estimator of $t_0$ is

$$\hat{t} = \underset{t}{\operatorname{argmax}} \left|f(t) \right| $$

and $a_0$ and $\phi_0$ can be estimated similarly via $f(\hat{t}) = \hat{a} e^{i \hat{\phi}}$. I have a hunch that this should be a maximum-likelihood estimator and should achieve the Cramér–Rao lower bound.
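A minimal sketch of this estimator, assuming the `Z` and `n` arrays from the snippet above; the dense grid search over $t$ (and its oversampling factor) is just one simple way to locate the peak of $|f(t)|$:

```python
import numpy as np

def estimate_impulse(Z, n, N, oversample=64):
    """Grid-search estimate of (t0, a0, phi0) from the DFT samples Z."""
    t_grid = np.arange(0, N, 1.0 / oversample)  # candidate impulse times
    # f(t) = (1/N) * sum_n Z_n * exp(2*pi*i*t*n/N), evaluated on the whole grid
    f = (Z[None, :] * np.exp(2j * np.pi * t_grid[:, None] * n[None, :] / N)).sum(axis=1) / N
    k = np.argmax(np.abs(f))                    # t_hat = argmax_t |f(t)|
    return t_grid[k], np.abs(f[k]), np.angle(f[k])  # t_hat, a_hat, phi_hat
```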


To determine the Cramér–Rao lower bounds, we need to derive the likelihood function. Let

$$ p_n = a \cos\left(-2\pi t \frac{n}{N} + \phi\right) \\
q_n = a \sin\left(-2\pi t \frac{n}{N} + \phi\right)$$

With $Z_n = X_n + i Y_n$, we have

$$X_n = a_0 \cos\left(-2\pi t_0 \frac{n}{N} + \phi_0\right) + \mathcal{N}(0, N\sigma^2) \\
Y_n = a_0 \sin\left(-2\pi t_0 \frac{n}{N} + \phi_0\right) + \mathcal{N}(0, N\sigma^2)$$

Then, the likelihood function is

$$ \mathcal{L}(\boldsymbol{Z}) = \left(\frac{1}{2 \pi N \sigma^2}\right)^N \exp\left[-\frac{1}{2 N \sigma^2} \sum_n\left((X_n - p_n)^2 + (Y_n - q_n)^2\right)\right]$$
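In code, the log-likelihood is (up to the constant normalization term) just a Gaussian residual sum. A minimal sketch, following the definitions of $p_n$ and $q_n$ above:

```python
import numpy as np

def log_likelihood(theta, Z, n, N, sigma):
    """Log-likelihood of the DFT samples Z, up to an additive constant."""
    t, a, phi = theta
    p = a * np.cos(-2 * np.pi * t * n / N + phi)  # p_n
    q = a * np.sin(-2 * np.pi * t * n / N + phi)  # q_n
    X, Y = Z.real, Z.imag
    return -np.sum((X - p) ** 2 + (Y - q) ** 2) / (2 * N * sigma ** 2)
```

Maximizing this over $(t, a, \phi)$ reduces to the peak search in the question: for fixed $t$, the residual sum is minimized by the complex amplitude $\hat{a} e^{i \hat{\phi}} = f(t)$, leaving $\sum_n |Z_n|^2 - N |f(t)|^2$, which is smallest where $|f(t)|$ peaks. This confirms the hunch that the peak-search estimator is the maximum-likelihood estimator.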

Now, I must derive the $3 \times 3$ Fisher information matrix for the three unknown parameters $t_0, a_0, \phi_0$ from this likelihood function, and invert it to get the lower bounds on the variances of unbiased estimators of the impulse's parameters.

This is where I am stuck. I have no idea how to proceed in this case.

Best Answer

I believe I have figured this out, and so am posting it as an answer to my own question.
Using the likelihood function provided in the question above:

$$ \mathcal{L}(\boldsymbol{Z}; \boldsymbol{\theta}) = \left(\frac{1}{2 \pi N \sigma^2}\right)^N \exp\left[-\frac{1}{2 N \sigma^2} \sum_n\left((X_n - p_n)^2 + (Y_n - q_n)^2\right)\right]$$

Since there are three unknown parameters, $\boldsymbol{\theta} = \begin{bmatrix}t & a & \phi\end{bmatrix}^\textsf{T}$, we must determine the elements of the $3 \times 3$ Fisher information matrix, $\mathcal{I}$, using:

$$\mathcal{I}_{ij} = -\operatorname{E}\left[ \frac{\partial^2}{\partial\theta_i\, \partial\theta_j} \log \mathcal{L}(\boldsymbol{Z}; \boldsymbol{\theta})\right] = \frac{1}{N \sigma^2} \sum_n \left[\frac{\partial p_n}{\partial \theta_i} \frac{\partial p_n}{\partial \theta_j} + \frac{\partial q_n}{\partial \theta_i} \frac{\partial q_n}{\partial \theta_j}\right]$$

where $n = -\frac{N}{2}, \dots, \frac{N}{2} - 1$. The second equality is the standard simplification for Gaussian observations whose means depend on the parameters: the zero-mean noise terms vanish under the expectation, leaving only products of first derivatives of the means $p_n$ and $q_n$.
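A sketch computing $\mathcal{I}$ numerically from this formula, using the analytic partial derivatives of $p_n$ and $q_n$ with respect to $\boldsymbol{\theta} = (t, a, \phi)$:

```python
import numpy as np

def fisher_information(t, a, phi, N, sigma):
    """3x3 Fisher information matrix for theta = (t, a, phi)."""
    n = np.arange(-N // 2, N // 2)
    theta_n = -2 * np.pi * t * n / N + phi
    c, s = np.cos(theta_n), np.sin(theta_n)
    w = 2 * np.pi * n / N
    dp = np.stack([a * s * w, c, -a * s])  # dp_n/dt, dp_n/da, dp_n/dphi
    dq = np.stack([-a * c * w, s, a * c])  # dq_n/dt, dq_n/da, dq_n/dphi
    return (dp @ dp.T + dq @ dq.T) / (N * sigma ** 2)
```

The diagonal of `np.linalg.inv(fisher_information(t, a, phi, N, sigma))` can then be checked against the closed-form bounds below.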

With this, the $i$th diagonal element of $\mathcal{I}^{-1}$ is the Cramér–Rao lower bound on the variance of an unbiased estimator of $\theta_i$. Using Mathematica to invert $\mathcal{I}$, we get:

$$\operatorname{var}(\hat{t}) = \frac{3 \sigma^2}{\pi^2 a^2} \frac{N^2}{N^2 - 1}$$

$$\operatorname{var}(\hat{a}) = \sigma^2 $$

$$\operatorname{var}(\hat{\phi}) = \frac{\sigma^2}{a^2} \frac{N^2 + 2}{N^2 - 1} $$
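For anyone without Mathematica, here is a SymPy sketch that reproduces the same symbolic inversion; the `simplify` call collapses the $\sin^2 + \cos^2$ terms so that the sums over $n$ reduce to polynomial sums:

```python
import sympy as sp

t, a, N, sigma = sp.symbols('t a N sigma', positive=True)
phi = sp.symbols('phi', real=True)
n = sp.symbols('n', integer=True)

theta_n = -2 * sp.pi * t * n / N + phi
p = a * sp.cos(theta_n)  # p_n
q = a * sp.sin(theta_n)  # q_n
params = (t, a, phi)

def element(i, j):
    # Summand of the Fisher information formula; simplify collapses sin^2 + cos^2.
    summand = sp.simplify(sp.diff(p, params[i]) * sp.diff(p, params[j])
                          + sp.diff(q, params[i]) * sp.diff(q, params[j]))
    return sp.summation(summand, (n, -N / 2, N / 2 - 1)) / (N * sigma ** 2)

I = sp.Matrix(3, 3, lambda i, j: element(i, j))
crlb = [sp.simplify(v) for v in I.inv().diagonal()]
print(crlb)  # should match the three bounds quoted above, in the order (t, a, phi)
```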

I have also confirmed numerically that the maximum-likelihood estimator outlined in the question appears to achieve these lower bounds.
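For reference, the kind of Monte Carlo check I mean, reusing the `estimate_impulse` sketch from the question (trial count, parameter values, and noise level are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
N, t0, a0, phi0, sigma = 64, 17.3, 1.0, 0.4, 0.05
n = np.arange(-N // 2, N // 2)

estimates = []
for _ in range(2000):
    noise = np.sqrt(N) * sigma * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    Z = a0 * np.exp(-2j * np.pi * t0 * n / N + 1j * phi0) + noise
    estimates.append(estimate_impulse(Z, n, N))

t_hat, a_hat, phi_hat = np.array(estimates).T
# Empirical variances vs. the closed-form Cramér–Rao bounds above:
print(np.var(t_hat),   3 * sigma**2 / (np.pi * a0)**2 * N**2 / (N**2 - 1))
print(np.var(a_hat),   sigma**2)
print(np.var(phi_hat), sigma**2 / a0**2 * (N**2 + 2) / (N**2 - 1))
```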
