Distribution of ratio of sample variances

probability, probability-theory, statistical-inference, statistics

I have two independent zero-mean populations (not necessarily normally distributed) and I am trying to estimate the ratio of the variances of the two populations. I have an i.i.d. sample of size $n$ from each population.
I know that the sample variance converges to the population variance almost surely, so by the continuous mapping theorem the ratio of sample variances converges almost surely to the ratio of population variances. Thus, the ratio of sample variances is a consistent estimator.

But what can I say about the asymptotic distribution of the ratio of sample variances, without knowing the distribution of the populations or assuming them to be normal?
I do know that when the populations are normally distributed, the ratio follows an $F$ distribution with $(n-1, n-1)$ degrees of freedom. But what about the general case?

Thanks in Advance

Best Answer

Let

$$ S_X^2 = \frac{1} {n} \sum_{i=1}^n X_i^2, S_Y^2 = \frac{1} {n} \sum_{i=1}^n Y_i^2$$

be the sample variance estimators of $\sigma_X^2$ and $\sigma_Y^2$, respectively (no centering is needed since both populations have zero mean).

By the bivariate CLT,

$$ \sqrt{n} \left(\begin{bmatrix} S_X^2 \\ S_Y^2 \end{bmatrix} - \begin{bmatrix} \sigma_X^2 \\ \sigma_Y^2 \end{bmatrix} \right) \stackrel {d} {\to} \mathcal{N} \left( \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \begin{bmatrix} Var[X_1^2] & 0 \\ 0 & Var[Y_1^2] \end{bmatrix} \right) $$

where

$$ Var[X_1^2] = E[X_1^4] - E[X_1^2]^2 = E[X_1^4] - \sigma_X^4$$ $$ Var[Y_1^2] = E[Y_1^4] - E[Y_1^2]^2 = E[Y_1^4] - \sigma_Y^4$$

Next we need the bivariate delta method to obtain the asymptotic distribution of $\displaystyle \frac {S_X^2} {S_Y^2}$.

See, e.g. https://en.wikipedia.org/wiki/Taylor_expansions_for_the_moments_of_functions_of_random_variables

Consider $\displaystyle f(x, y) = \frac {x} {y}$. Then

$$ \frac {\partial f} {\partial x} = \frac {1} {y}, ~~ \frac {\partial f} {\partial y} = - \frac {x} {y^2}, ~~ \frac {\partial^2 f} {\partial x^2} = 0, ~~ \frac {\partial^2 f} {\partial y^2} = \frac {2x} {y^3}, ~~ \frac {\partial^2 f} {\partial x \partial y} = - \frac {1} {y^2}$$
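These derivatives can be verified symbolically. A minimal sketch using SymPy (the symbol names here are just illustrative):

```python
import sympy as sp

# f(x, y) = x / y, as in the delta-method expansion above
x, y = sp.symbols('x y', positive=True)
f = x / y

# Check each partial derivative against its closed form
assert sp.simplify(sp.diff(f, x) - 1 / y) == 0          # df/dx = 1/y
assert sp.simplify(sp.diff(f, y) + x / y**2) == 0       # df/dy = -x/y^2
assert sp.diff(f, x, 2) == 0                            # d2f/dx2 = 0
assert sp.simplify(sp.diff(f, y, 2) - 2 * x / y**3) == 0  # d2f/dy2 = 2x/y^3
assert sp.simplify(sp.diff(f, x, y) + 1 / y**2) == 0    # d2f/dxdy = -1/y^2
```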

We Taylor expand $\displaystyle \frac {S_X^2} {S_Y^2} = f(S_X^2, S_Y^2)$ about the mean $(\sigma_X^2, \sigma_Y^2)$ up to second order:

$$ \begin{align} f(S_X^2, S_Y^2) \approx &~ f(\sigma_X^2, \sigma_Y^2) + \left.\frac {\partial f} {\partial x}\right|_{(\sigma_X^2, \sigma_Y^2)} (S_X^2 - \sigma_X^2) + \left.\frac {\partial f} {\partial y}\right|_{(\sigma_X^2, \sigma_Y^2)} (S_Y^2 - \sigma_Y^2) \\ &~ + \frac {1} {2} \left.\frac {\partial^2 f} {\partial x^2}\right|_{(\sigma_X^2, \sigma_Y^2)}(S_X^2 - \sigma_X^2)^2 + \frac {1} {2} \left.\frac {\partial^2 f} {\partial y^2}\right|_{(\sigma_X^2, \sigma_Y^2)}(S_Y^2 - \sigma_Y^2)^2 \\ &~ + \left.\frac {\partial^2 f} {\partial x \partial y}\right|_{(\sigma_X^2, \sigma_Y^2)}(S_X^2 - \sigma_X^2)(S_Y^2 - \sigma_Y^2) \\ \Rightarrow \frac {S_X^2} {S_Y^2} \approx &~ \frac {\sigma_X^2} {\sigma_Y^2} + \frac {1} {\sigma_Y^2} (S_X^2 - \sigma_X^2) - \frac {\sigma_X^2} {\sigma_Y^4} (S_Y^2 - \sigma_Y^2) \\ &~ + 0 + \frac {\sigma_X^2} {\sigma_Y^6} (S_Y^2 - \sigma_Y^2)^2 - \frac {1} {\sigma_Y^4} (S_X^2 - \sigma_X^2)(S_Y^2 - \sigma_Y^2) \end{align} $$

Taking expectations, using $Var[S_Y^2] = \frac{Var[Y_1^2]}{n}$ and noting that the cross term has zero expectation by independence,

$$ E\left[\frac {S_X^2} {S_Y^2}\right] \approx \frac {\sigma_X^2} {\sigma_Y^2} + 0 + 0 + \frac {\sigma_X^2} {\sigma_Y^6} Var[S_Y^2] - 0 = \frac {\sigma_X^2} {\sigma_Y^2} + \frac {\sigma_X^2(E[Y_1^4] - \sigma_Y^4)} {n\sigma_Y^6} $$
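This second-order bias term can be checked by simulation. A sketch assuming, purely for illustration, $X \sim \text{Laplace}(0,1)$ (so $\sigma_X^2 = 2$) and $Y \sim \text{Uniform}(-1,1)$ (so $\sigma_Y^2 = 1/3$ and $E[Y_1^4] = 1/5$):

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative zero-mean populations:
# X ~ Laplace(0, 1):  sigma_X^2 = 2
# Y ~ Uniform(-1, 1): sigma_Y^2 = 1/3, E[Y^4] = 1/5
n, reps = 50, 100_000
Sx2 = (rng.laplace(0.0, 1.0, size=(reps, n)) ** 2).mean(axis=1)
Sy2 = (rng.uniform(-1.0, 1.0, size=(reps, n)) ** 2).mean(axis=1)

sx2, sy2, ey4 = 2.0, 1.0 / 3.0, 1.0 / 5.0
# sigma_X^2 (E[Y^4] - sigma_Y^4) / (n sigma_Y^6)
predicted_bias = sx2 * (ey4 - sy2**2) / (n * sy2**3)
empirical_bias = (Sx2 / Sy2).mean() - sx2 / sy2

print(empirical_bias, predicted_bias)  # both should be roughly 0.096 at n = 50
```

The empirical bias will not match exactly, since the Taylor expansion drops higher-order terms, but it should agree to leading order.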

and the variance (keeping first-order terms only), using $Var[S_X^2] = \frac{Var[X_1^2]}{n}$ and $Var[S_Y^2] = \frac{Var[Y_1^2]}{n}$, $$ Var\left[\frac {S_X^2} {S_Y^2}\right] \approx 0 + \frac {1} {\sigma_Y^4}Var[S_X^2] + \frac {\sigma_X^4} {\sigma_Y^8}Var[S_Y^2] = \frac {E[X_1^4] - \sigma_X^4} {n\sigma_Y^4} + \frac {\sigma_X^4(E[Y_1^4] - \sigma_Y^4)} {n\sigma_Y^8}$$

So by Delta method, collecting the terms,

$$ \sqrt{n} \left(\frac {S_X^2} {S_Y^2} - \frac {\sigma_X^2} {\sigma_Y^2} \right) \stackrel {d} {\to} \mathcal{N} \left(0, \frac {E[X_1^4] - \sigma_X^4} {\sigma_Y^4} + \frac {\sigma_X^4(E[Y_1^4] - \sigma_Y^4)} {\sigma_Y^8} \right)$$
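The limiting variance can also be checked by simulation. A sketch using the same illustrative populations as above ($X \sim \text{Laplace}(0,1)$, $Y \sim \text{Uniform}(-1,1)$, so $E[X_1^4] = 24$ and the asymptotic variance works out to $208.8$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative populations: X ~ Laplace(0,1), Y ~ Uniform(-1,1)
# sigma_X^2 = 2, E[X^4] = 24; sigma_Y^2 = 1/3, E[Y^4] = 1/5
n, reps = 1000, 10_000
Sx2 = (rng.laplace(0.0, 1.0, size=(reps, n)) ** 2).mean(axis=1)
Sy2 = (rng.uniform(-1.0, 1.0, size=(reps, n)) ** 2).mean(axis=1)

sx2, sy2 = 2.0, 1.0 / 3.0
ex4, ey4 = 24.0, 1.0 / 5.0
# (E[X^4] - sigma_X^4)/sigma_Y^4 + sigma_X^4 (E[Y^4] - sigma_Y^4)/sigma_Y^8
asy_var = (ex4 - sx2**2) / sy2**2 + sx2**2 * (ey4 - sy2**2) / sy2**4

# sqrt(n) * (ratio - true ratio) should be approximately N(0, asy_var)
z = np.sqrt(n) * (Sx2 / Sy2 - sx2 / sy2)
print(z.var(), asy_var)  # the empirical variance should be close to 208.8
```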

The second-order term here just serves as a correction providing higher accuracy when needed.
