Statistics – Independence of Sample Mean and Variance Through Covariance

probability theory, statistics

I have seen the textbook derivation where independence is established by factoring the joint distribution, but has anyone tried to prove directly that the covariance is zero?

Let $Z_1, \dots, Z_n$ be i.i.d. draws from a standard normal distribution.

Let $X = \bar{Z}$ be the sample mean and let $$Y = \frac{1}{n-1}\sum_{i=1}^n (Z_{i} -\bar{Z})^2$$ be the sample variance.

Prove that $\operatorname{Cov}(X,Y) = 0$.

In the expansion I am getting terms like $\mathrm E(\bar{Z}^3)$, which makes the computation very cumbersome to handle.
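
(As an aside on those terms: each such odd moment can be shown to vanish directly. For instance, expanding the cube and using independence,
$$\mathrm E(\bar{Z}^3)=\frac{1}{n^3}\sum_{i,j,k}\mathrm E(Z_iZ_jZ_k)=0,$$
because every summand factors into moments of the $Z_i$, at least one of which is an odd moment of a centered normal and hence zero.)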

Best Answer

Consider the random vectors $Z=(Z_i)_{1\leqslant i\leqslant n}$ and $-Z=(-Z_i)_{1\leqslant i\leqslant n}$. Then $X=\xi(Z)$ for an odd function $\xi$ and $Y=\eta(Z)$ for an even function $\eta$: replacing $Z$ by $-Z$ flips the sign of $\bar Z$ but leaves each $(Z_i-\bar Z)^2$ unchanged. Since $\xi$ is odd and $\eta$ is even, the product $\zeta=\xi\cdot\eta$ is odd as well, that is, $\xi(-Z)=-\xi(Z)$ and $\zeta(-Z)=-\zeta(Z)$. Now, $Z$ and $-Z$ follow the same distribution, hence $\mathrm E(\xi(Z))=\mathrm E(\xi(-Z))=-\mathrm E(\xi(Z))$, which forces $\mathrm E(\xi(Z))=0$, and likewise $\mathrm E(\zeta(Z))=0$. In particular, the covariance of $X$ and $Y$ is $\mathrm E(\zeta(Z))-\mathrm E(\xi(Z))\,\mathrm E(\eta(Z))=0$.
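
As a quick numerical sanity check of this argument (not part of the proof), one can estimate the covariance by simulation. The following minimal sketch uses numpy; the sample size $n=10$ and the replication count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 10, 200_000  # arbitrary sample size and number of replications

# Each row is one sample of size n from the standard normal.
Z = rng.standard_normal((reps, n))

X = Z.mean(axis=1)         # sample mean of each replication
Y = Z.var(axis=1, ddof=1)  # unbiased sample variance (divides by n - 1)

# Monte Carlo estimate of Cov(X, Y); it should be close to 0.
cov_xy = np.mean(X * Y) - X.mean() * Y.mean()
print(f"estimated Cov(X, Y) = {cov_xy:+.5f}")
```

Up to Monte Carlo noise of order $1/\sqrt{\text{reps}}$, the printed estimate should be indistinguishable from zero.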

To sum up, the result you ask for a simple proof of (that the empirical mean and empirical variance are uncorrelated) has nothing to do with Gaussianity: it holds for every symmetric distribution with enough finite moments for the covariance to exist.
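
To illustrate this last remark, here is a variant of the check above for a symmetric but non-Gaussian law, taking the Laplace distribution as one arbitrary choice. It also estimates $\operatorname{Cov}(X^2, Y)$, which comes out visibly nonzero in the Laplace case: the sample mean and sample variance are uncorrelated there, yet not independent (their independence is a classical characterization of the normal distribution).

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 10, 500_000  # arbitrary choices, as before

# Laplace: symmetric about 0 but not Gaussian.
Z = rng.laplace(loc=0.0, scale=1.0, size=(reps, n))

X = Z.mean(axis=1)
Y = Z.var(axis=1, ddof=1)

# Cov(X, Y) vanishes by the symmetry argument above...
print(f"Cov(X, Y)   = {np.mean(X * Y) - X.mean() * Y.mean():+.5f}")
# ...but Cov(X^2, Y) does not: X and Y are dependent for the Laplace law.
print(f"Cov(X^2, Y) = {np.mean(X**2 * Y) - np.mean(X**2) * Y.mean():+.5f}")
```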
