Solved – UMVU estimator for non-linear transformation of a parameter

estimation, nonlinear, self-study, umvue, unbiased-estimator

Let $X_1, \dots, X_n$ be i.i.d. with $X_1\sim N(\mu,1)$, and let $\gamma(\mu)=e^{t\mu}$ for some fixed $t\neq 0$.

My question is how to find a UMVU estimator for $\gamma(\mu)$.

My concern is not so much about the specific problem, but rather about how to approach this kind of estimation problem in general, when one wants to estimate a non-linear transformation of a parameter. The problem for me is that
$\overline{X}_n$ is the UMVUE for $\mu$, but I think that $E_\theta[e^{t\overline{X}_n}]\neq e^{t\theta}$.

We never covered this kind of problem in our lecture, so I don't really have an idea of how to solve this kind of estimation problem.

EDIT: I think I solved it myself. Let $\frac{1}{k}:=\frac{1}{(2\pi)^{n/2}}$. Then

$$\int e^{t\overline{x}_n} \cdot \frac{1}{k}\exp\left(-\frac{1}{2}\sum_{i=1}^n(x_i-\theta)^2\right)d(x_1,\dots,x_n)$$

$$=\int \frac{1}{k}\exp\left(-\frac{1}{2}\sum_{i=1}^n\left(x_i^2-2\left(\theta+\frac{t}{n}\right)x_i+\theta^2\right)\right)d(x_1,\dots,x_n)$$

$$=\exp\left(t\theta+\frac{t^2}{2n}\right)\int \frac{1}{k}\exp\left(-\frac{1}{2}\sum_{i=1}^n\left(x_i^2-2\left(\theta+\frac{t}{n}\right)x_i+\left(\theta+\frac{t}{n}\right)^2\right)\right)d(x_1,\dots,x_n)$$

$$=\exp\left(t\theta+\frac{t^2}{2n}\right)$$

The UMVU estimator is therefore $T(X_1, \dots, X_n)=\exp(-t^2/2n)\exp(t\overline{X}_n)$. It is UMVU because $\overline{X}_n$ is sufficient, and thereby, by Lehmann–Scheffé, the estimator is UMVU, as $T(X_1,\dots,X_n)=g(\overline{X}_n)$.

Remark: I would still have to prove that $\operatorname{Var}_\theta(g(\overline{X}_n)) < \infty$ for all $\theta$ for Lehmann–Scheffé to be applicable.
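For completeness, this finiteness follows from the same MGF computation with $2t$ in place of $t$ (a short sketch, using only facts already derived above):

$$E_\theta\left(e^{2t\overline{X}_n}\right)=\exp\left(2t\theta+\frac{2t^2}{n}\right),$$

so

$$\operatorname{Var}_\theta\left(g(\overline{X}_n)\right)=e^{-t^2/n}\,E_\theta\left(e^{2t\overline{X}_n}\right)-e^{2t\theta}=e^{2t\theta}\left(e^{t^2/n}-1\right)<\infty \quad\text{for all }\theta.$$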

Best Answer

Your final answer is not quite right, and the conclusion drawn from the sample mean $\bar X$ being only sufficient for $\mu$ also looks faulty: the Lehmann–Scheffé theorem requires a complete sufficient statistic, not mere sufficiency.

Recall that $T(X_1,X_2,\cdots,X_n)=\sum_{i=1}^n X_i$ is a complete sufficient statistic for $\mu$.

It is easy to see this if you work with the exponential family setup.
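To spell that out, the joint density factors as

$$f(x_1,\dots,x_n;\mu)=(2\pi)^{-n/2}\exp\left(-\frac{1}{2}\sum_{i=1}^n x_i^2\right)\exp\left(\mu\sum_{i=1}^n x_i-\frac{n\mu^2}{2}\right),$$

a one-parameter exponential family in the natural parameter $\mu$ with statistic $T=\sum_{i=1}^n X_i$; since the natural parameter space $\mathbb R$ contains an open interval, $T$ is complete (and sufficient by the factorisation theorem).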

Now we know the distribution of $\bar X$, namely $\bar X\sim\mathcal N\left(\mu,\frac{1}{n}\right)$.

Recall that the moment generating function of $Y\sim\mathcal N(m,\sigma^2)$ is $E(e^{sY})=\exp\left(ms+\frac{\sigma^2 s^2}{2}\right)$. Applying this to $\bar X$ (with $m=\mu$ and $\sigma^2=1/n$), it follows that

$$E_{\mu}(e^{t\bar X})=\exp\left(\mu t+\frac{t^2}{2n}\right)$$

That is,

$$E_{\mu}(e^{t\bar X-t^2/2n})=e^{\mu t}$$

So an unbiased estimator of $e^{\mu t}$ is \begin{align}h(T)&=\exp\left(t\bar X-t^2/2n\right) \\&=\exp\left(T\frac{t}{n}-\frac{t^2}{2n}\right) \end{align}

$h(T)$ is a function of the complete sufficient statistic $T$.

Hence, by the Lehmann–Scheffé theorem, $h(T)$ is the UMVUE of $e^{\mu t}$.
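As a quick numerical sanity check, here is a minimal Monte Carlo sketch (assuming NumPy; the values of $\mu$, $t$, and $n$ below are arbitrary illustrative choices, not from the post) comparing the naive plug-in estimator $e^{t\bar X}$ with the bias-corrected $h(T)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative values (assumptions, not from the post).
mu, t, n = 0.5, 1.0, 10
reps = 200_000

# Draw `reps` samples of size n from N(mu, 1) and form the sample means.
xbar = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1)

naive = np.exp(t * xbar)                   # plug-in e^{t xbar}, biased upward
umvue = np.exp(t * xbar - t**2 / (2 * n))  # bias-corrected h(T)

print("target e^{mu t}        :", np.exp(mu * t))  # 1.6487...
print("mean of naive estimator:", naive.mean())    # ~ e^{mu t + t^2/(2n)}
print("mean of h(T)           :", umvue.mean())    # ~ e^{mu t}
```

The naive estimator's average should hover near $e^{\mu t+t^2/(2n)}$, overshooting the target, while $h(T)$'s average should match $e^{\mu t}$ up to Monte Carlo error.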


You ask how to approach problems regarding estimation of a non-linear transformation of the parameter of interest. I think this is pretty much the same as estimating any function of the parameter.

The usual tools for finding a UMVUE (if it exists) are the Rao–Blackwell theorem and/or the Lehmann–Scheffé theorem. One needs to find a complete sufficient statistic ($T$, say), if it exists, for the parameter $\theta$ (which may well be a vector) that parametrises the given population distribution. The next step is to find an unbiased estimator (if it exists) of the function $g(\theta)$ of interest. If this unbiased estimator is a function of the complete sufficient statistic, then it is the UMVUE of $g(\theta)$; this is a corollary of the Lehmann–Scheffé theorem. Even if the unbiased estimator is not a function of the complete sufficient statistic, the Rao–Blackwell theorem is at hand: taking any trivial unbiased estimator ($h$, say) of $g(\theta)$, the conditional expectation $E(h\mid T)$ is unbiased and a function of $T$, and hence the UMVUE of $g(\theta)$. Finding this conditional expectation explicitly may not be an easy task in general.
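As an illustration of the Rao–Blackwellization step in the problem above (a sketch, not part of the original answer): a trivial unbiased estimator of $e^{\mu t}$ is $h=e^{tX_1-t^2/2}$, since $X_1\sim\mathcal N(\mu,1)$. Conditionally on $T=\sum_{i=1}^n X_i$ we have $X_1\mid T\sim\mathcal N\left(T/n,\,\frac{n-1}{n}\right)$, so

$$E\left(e^{tX_1-t^2/2}\,\middle|\,T\right)=\exp\left(\frac{tT}{n}+\frac{t^2(n-1)}{2n}-\frac{t^2}{2}\right)=\exp\left(\frac{tT}{n}-\frac{t^2}{2n}\right),$$

which is exactly the $h(T)$ obtained above.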

The remark by @CagdasOzgenc in the comments is worth noting.

If $U$ is unbiased for $\theta$, we should not expect $g(U)$ to be unbiased for $g(\theta)$ for arbitrary $g$; that approach simply does not work in general. One way to see this is Jensen's inequality: $E(g(U))\ge g(E(U))$ for convex $g$ (the inequality reverses when $g$ is concave). Note that equality holds in Jensen's inequality when $g$ is affine (in particular, constant).
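In the present problem the Jensen gap is explicit: $g(x)=e^{tx}$ is strictly convex and $\bar X$ is non-degenerate, so

$$E_\mu\left(e^{t\bar X}\right)=e^{\mu t+t^2/(2n)}>e^{\mu t},$$

and the factor $e^{t^2/(2n)}$ is precisely what the bias correction in $h(T)$ removes.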