[Math] Basu’s theorem for normal sample mean and variance

independence, probability, statistics

I'm working on the following problem: Suppose that $X_1, \dots, X_n$ are i.i.d. $N(\mu,\sigma^2)$. Let the sample mean and variance, $\overline{X}$ and $S^2$, be defined as usual, so that $\mathbb{E} S^2 = \sigma^2$. Prove that the sample mean is independent of the sample variance. Given that both $\mu$ and $\sigma^2$ are unknown, find the MVUE for $\mu \sigma^2$.
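For definiteness, the usual definitions (with the $n-1$ divisor, so that $\mathbb{E} S^2 = \sigma^2$) are
$$
\overline{X} = \frac{1}{n}\sum_{i=1}^n X_i, \qquad S^2 = \frac{1}{n-1}\sum_{i=1}^n \big( X_i - \overline{X} \big)^2 .
$$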

The second part is simple. Given that $\overline{X}$ and $S^2$ are independent, and that $(\overline{X}, S^2)$ is a jointly complete and sufficient statistic for $(\mu, \sigma^2)$, we have that $\phi(\overline{X}, S^2) = \overline{X} S^2$ is unbiased for $\mu \sigma^2$. By Lehmann–Scheffé, $\phi$ is the MVUE for $\mu \sigma^2$.
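The unbiasedness is exactly where the independence is used:
$$
\mathbb{E}\big[\overline{X} S^2\big] = \mathbb{E}\big[\overline{X}\big]\,\mathbb{E}\big[S^2\big] = \mu \sigma^2 ,
$$
which would not follow otherwise, since $\overline{X}$ and $S^2$ are functions of the same sample.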

To prove independence, I would like to invoke Basu's theorem. This says, briefly, that any boundedly complete (a qualification I will ignore) sufficient statistic is independent of any ancillary statistic. It is not hard to show that the normal distribution belongs to the exponential family. Moreover, with the variance treated as known, $\sum X_i$ is a complete and sufficient statistic for $\mu$. It is clear that $S^2$ is ancillary, since for any $a \in \mathbb{R}$, shifting every observation by $a$ leaves it unchanged:
$$
S^2(X_1 + a, \dots, X_n + a) = \frac{1}{n-1} \sum_{i=1}^n \Big( X_i + a - \frac{1}{n}\sum_{j=1}^n \big( X_j + a \big) \Big)^2 = \frac{1}{n-1} \sum_{i=1}^n \Big( X_i - \overline{X} \Big)^2 = S^2(X_1, \dots, X_n),
$$
so that $S^2$ is location invariant; with $\sigma^2$ fixed, its distribution therefore does not depend on $\mu$, and hence $S^2$ is ancillary.
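To make the ancillarity concrete: writing $X_i = \mu + \sigma Z_i$ with $Z_1, \dots, Z_n$ i.i.d. $N(0,1)$, location invariance gives
$$
S^2 = \frac{\sigma^2}{n-1} \sum_{i=1}^n \big( Z_i - \overline{Z} \big)^2 ,
$$
whose distribution involves $\sigma^2$ but not $\mu$, so $S^2$ is ancillary for the family $\{ N(\mu, \sigma_0^2) : \mu \in \mathbb{R} \}$ with $\sigma_0^2$ fixed.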

So, if we were only interested in estimating $\mu$, we would be done. My question is: how do I apply Basu's theorem when I have a jointly complete and sufficient statistic? Does it suffice to show that $\sum X_i$ is a complete and sufficient statistic for $\mu$, even though I'm estimating a function of both $\mu$ and $\sigma^2$?

Note: there is a similar (mostly unanswered) but not identical question here: UMVUE using complete and sufficient statistic

Best Answer

I hope I understood your question correctly:

The approach you describe is correct: write the joint pdf in exponential-family form and conclude that $\sum X_i$ is a complete and sufficient statistic for the unknown $\mu$. However, to do this we need to fix the variance at some known, arbitrary constant $\sigma_0^2$.
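Concretely, with the variance fixed at $\sigma_0^2$, the density of a single observation factors as
$$
f_{\mu}(x) = \frac{1}{\sqrt{2\pi}\,\sigma_0}\, e^{-x^2/(2\sigma_0^2)}\, \exp\!\Big( -\frac{\mu^2}{2\sigma_0^2} \Big) \exp\!\Big( \frac{\mu}{\sigma_0^2}\, x \Big),
$$
a one-parameter exponential family with natural parameter $\mu/\sigma_0^2$ and natural statistic $x$. For the whole sample the natural statistic is $\sum_{i=1}^n X_i$, and since the natural parameter ranges over all of $\mathbb{R}$, this statistic is complete as well as sufficient.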

Hence, by Basu's theorem, $\overline{X}$ (the complete sufficient statistic) and $S^2$ (the ancillary statistic) are independent in the model with $\sigma^2$ fixed at $\sigma_0^2$. Since $\sigma_0^2$ was an arbitrary positive constant, this holds for every value of $\sigma^2$, and therefore $\overline{X}$ and $S^2$ are independent for all $\mu$ and $\sigma^2$.

Equivalently, $\sqrt{n}\,(\overline{X} - \mu)/\sigma$ is independent of $(n-1) S^2 / \sigma^2$, and $\sqrt{n}\,(\overline{X} - \mu)/\sigma$ follows the standard normal $N(0,1)$.

Since $\mu$ and $\sigma$ are non-random constants, independence of these standardized quantities is equivalent to independence of $\overline{X}$ and $S^2$.
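To spell out the "fix $\sigma^2$, then let it vary" step: for the family $\{ N(\mu, \sigma_0^2) : \mu \in \mathbb{R} \}$, Basu's theorem gives
$$
P_{\mu, \sigma_0^2}\big( \overline{X} \in A,\; S^2 \in B \big) = P_{\mu, \sigma_0^2}\big( \overline{X} \in A \big)\, P_{\mu, \sigma_0^2}\big( S^2 \in B \big)
$$
for all Borel sets $A, B$ and every $\mu$. Since $\sigma_0^2 > 0$ was arbitrary, the factorization holds at every parameter pair $(\mu, \sigma^2)$, which is exactly the statement that $\overline{X}$ and $S^2$ are independent whatever the true parameters are. This also answers the question above: completeness of $\sum X_i$ is only needed within each fixed-$\sigma^2$ subfamily, not for the full two-parameter family.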

