Solved – Find the UMVUE of $\frac{\mu^2}{\sigma}$ where $X_i\sim\mathsf N(\mu,\sigma^2)$

estimation, inference, mathematical-statistics, self-study, umvue

Suppose $X_1, …, X_4$ are i.i.d $\mathsf N(\mu, \sigma^2)$
random variables. Give the UMVUE of $\frac{\mu^2}{\sigma}$ expressed in terms of $\bar{X}$, $S$, integers, and $\pi$.

I first note that if $X_1,…,X_n$ are i.i.d $\mathsf N(\mu,\sigma^2)$ random variables having pdf

$$\begin{align*}
f(x\mid\mu,\sigma^2)
&=\frac{1}{\sqrt{2\pi\sigma^2}}\text{exp}\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)\\\\
&=\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{\mu^2}{2\sigma^2}}\text{exp}\left(-\frac{1}{2\sigma^2}x^2+\frac{\mu}{\sigma^2}x\right)
\end{align*}$$

where $\mu\in\mathbb{R}$ and $\sigma^2\gt0$, then

$$T(\vec{X})=\left(\sum_{i=1}^n X_i^2, \sum_{i=1}^n X_i\right)$$

is a sufficient statistic and is also complete, since the natural parameter space $$\left\{\left(-\frac{1}{2\sigma^2},\frac{\mu}{\sigma^2}\right):\mu\in\mathbb{R},\ \sigma^2\gt0\right\}=(-\infty,0)\times(-\infty,\infty)$$

contains an open set in $\mathbb{R}^2$.

I also note that the sample mean and sample variance are stochastically independent and so letting

$$\overline{X^2}=\frac{1}{n}\sum_{i=1}^n X_i^2$$

$$\overline{X}^2=\left(\frac{1}{n}\sum_{i=1}^n X_i\right)^2$$

we have

$$\mathsf E\left(\frac{\overline{X^2}}{S}\right)=\mathsf E\left(\overline{X^2}\right)\cdot\mathsf E\left(\frac{1}{S}\right)=\overline{X^2}\cdot\mathsf E\left(\frac{1}{S}\right)$$

It remains only to find $\mathsf E\left(\frac{1}{S}\right)$

We know that $$(n-1)\frac{S^2}{\sigma^2}\sim\chi_{n-1}^2$$

Hence

$$\begin{align*}
\mathsf E\left(\frac{\sigma}{S\sqrt{3}}\right)
&=\int_0^{\infty} \frac{1}{\sqrt{x}} \cdot\frac{1}{\Gamma(1.5)2^{1.5}}\cdot\sqrt{x}\cdot e^{-x/2}dx\\\\
&=\frac{4}{\sqrt{\pi}\cdot2^{1.5}}
\end{align*}$$

So $$\mathsf E\left(\frac{1}{S}\right)=\frac{4\sqrt{3}}{\sqrt{\pi}\cdot 2^{1.5}\cdot \sigma}$$
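
A small Monte Carlo sketch to sanity-check this value (the particular $\mu$ and $\sigma$ below are arbitrary illustrative choices; $n=4$ as in the problem):

```python
import numpy as np

# Monte Carlo sanity check of E(1/S) for n = 4 i.i.d. N(mu, sigma^2) observations.
# mu = 2.0 and sigma = 1.5 are arbitrary illustrative choices.
rng = np.random.default_rng(0)
mu, sigma, n, reps = 2.0, 1.5, 4, 200_000

samples = rng.normal(mu, sigma, size=(reps, n))
S = samples.std(axis=1, ddof=1)                 # sample SD with divisor n - 1

print(np.mean(1.0 / S))                                     # Monte Carlo estimate of E(1/S)
print(4 * np.sqrt(3) / (np.sqrt(np.pi) * 2**1.5 * sigma))   # the closed form above
```

The two printed numbers should agree to roughly two or three decimal places.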

But since $\mathsf E(S)\neq\sigma$ I don't think I can just plug in $S$ for $\sigma$ here.

Since $\mathsf E\left(\overline{X^2}\right)=\mathsf{Var}\left(\overline{X}\right)+\mathsf E\left(\bar{X}\right)^2=\frac{\sigma^2}{4}+\mu^2$, I have

$$\sigma=\sqrt{4\left(E\left(\overline{X^2}\right)-E\left(\overline{X}\right)^2\right)}=\sqrt{4\left(\overline{X^2}-\overline{X}^2\right)}$$

Hence the UMVUE of $\frac{\mu^2}{\sigma}$ is

$$\frac{4\sqrt{3}\cdot\overline{X^2}}{\sqrt{\pi}\cdot 2^{1.5}\cdot \sqrt{4\left(\overline{X^2}-\overline{X}^2\right)}}=\frac{\sqrt{\frac{3}{2\pi}}\left(\frac{S^2}{4}+\bar{X}^2\right)}{\sqrt{\frac{S^2}{4}}}$$

Is this a valid solution?

Best Answer

I have skipped some details in the following calculations and would ask you to verify them.

As usual, we have the statistics $$\overline X=\frac{1}{4}\sum_{i=1}^4 X_i\qquad,\qquad S^2=\frac{1}{3}\sum_{i=1}^4(X_i-\overline X)^2$$

Assuming both $\mu$ and $\sigma$ are unknown, we know that $(\overline X,S^2)$ is a complete sufficient statistic for $(\mu,\sigma^2)$. We also know that $\overline X$ and $S$ are independently distributed.

As you say,

\begin{align} E\left(\overline X^2\right)&=\operatorname{Var}(\overline X)+\left(E(\overline X)\right)^2 \\&=\frac{\sigma^2}{4}+\mu^2 \end{align}

Since we are estimating $\mu^2/\sigma$, it is reasonable to assume that a part of our UMVUE is of the form $\overline X^2/S$. And for evaluating $E\left(\frac{\overline X^2}{S}\right)=E(\overline X^2)E\left(\frac{1}{S}\right)$, we have

\begin{align}
E\left(\frac{1}{S}\right)
&=\frac{\sqrt{3}}{\sigma}\, E\left(\sqrt\frac{\sigma^2}{3\,S^2}\right)\\\\
&=\frac{\sqrt{3}}{\sigma}\, E\left(\frac{1}{\sqrt Z}\right)\qquad,\,\text{ where }Z=\frac{3S^2}{\sigma^2}\sim\chi^2_{3}\\\\
&=\frac{\sqrt{3}}{\sigma}\int_0^\infty \frac{1}{\sqrt z}\,\frac{e^{-z/2}\,z^{3/2-1}}{2^{3/2}\,\Gamma(3/2)}\,dz\\\\
&=\frac{1}{\sigma}\sqrt\frac{3}{2\pi}\int_0^\infty e^{-z/2}\,dz\\\\
&=\frac{1}{\sigma}\sqrt\frac{6}{\pi}
\end{align}
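
If you prefer to check the integral numerically rather than by hand, here is a small sketch (assuming scipy is available); the powers of $z$ cancel exactly as in the third line above, leaving a plain exponential integrand:

```python
from math import exp, gamma, pi, sqrt
from scipy.integrate import quad

# E(1/sqrt(Z)) for Z ~ chi-squared_3: after cancelling z powers, only exp(-z/2) remains.
value, _ = quad(lambda z: exp(-z / 2) / (2**1.5 * gamma(1.5)), 0, float("inf"))
print(value, sqrt(2 / pi))   # both ~0.7979, hence E(1/S) = (sqrt(3)/sigma) * value = sqrt(6/pi)/sigma
```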

Next, for an unbiased estimator of $\sigma$, we have $$E\left(\frac{1}{2}\sqrt\frac{3\pi}{2}\,S\right)=\sigma$$
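
This constant comes from the standard formula $E(S)=\sigma\,\sqrt{\frac{2}{n-1}}\,\frac{\Gamma(n/2)}{\Gamma((n-1)/2)}$ with $n=4$; a short sketch to verify it:

```python
from math import gamma, pi, sqrt

# E(S)/sigma for n = 4, using E(S) = sigma * sqrt(2/(n-1)) * Gamma(n/2) / Gamma((n-1)/2).
n = 4
ES_over_sigma = sqrt(2 / (n - 1)) * gamma(n / 2) / gamma((n - 1) / 2)

c = 0.5 * sqrt(3 * pi / 2)      # the unbiasing constant claimed above
print(c * ES_over_sigma)        # prints 1.0 (up to floating point), i.e. E(c * S) = sigma
```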

So,

\begin{align} E\left(\frac{\overline X^2}{S}\right)&=E\left(\overline X^2\right)E\left(\frac{1}{S}\right) \\&=\left(\mu^2+\frac{\sigma^2}{4}\right)\frac{1}{\sigma}\sqrt\frac{6}{\pi} \\&=\sqrt\frac{6}{\pi}\left(\frac{\mu^2}{\sigma}+\frac{\sigma}{4}\right) \end{align}

Or, $$E\left(\sqrt{\frac{\pi}{6}}\,\frac{\overline X^2}{S}-\frac{\frac{1}{2}\sqrt\frac{3\pi}{2}S}{4}\right)=\frac{\mu^2}{\sigma}$$

Hence our unbiased estimator based on the complete sufficient statistic $(\overline X,S^2)$ is

\begin{align} T(X_1,X_2,X_3,X_4)&=\sqrt{\frac{\pi}{6}}\,\frac{\overline X^2}{S}-\frac{1}{8}\sqrt\frac{3\pi}{2}S \end{align}

By the Lehmann-Scheffé theorem, $T$ is the UMVUE of $\mu^2/\sigma$.
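
As a final sanity check, a short Monte Carlo run (with $\mu$ and $\sigma$ chosen arbitrarily for illustration) should reproduce $\mu^2/\sigma$ as the average of $T$:

```python
import numpy as np

# Monte Carlo check that T is unbiased for mu^2 / sigma at n = 4.
# mu = 1.0 and sigma = 2.0 are arbitrary illustrative choices.
rng = np.random.default_rng(1)
mu, sigma, reps = 1.0, 2.0, 500_000

x = rng.normal(mu, sigma, size=(reps, 4))
xbar = x.mean(axis=1)
S = x.std(axis=1, ddof=1)

T = np.sqrt(np.pi / 6) * xbar**2 / S - np.sqrt(3 * np.pi / 2) * S / 8
print(T.mean(), mu**2 / sigma)   # the two values should agree to a couple of decimals
```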