[Math] Showing that an estimator for covariance is consistent

covariance, statistics, variance

I'm having trouble proving that a certain estimator is consistent. I know that one way to show an estimator is consistent is to show that its bias and variance both approach zero as $n$ goes to infinity.

The estimator for $\operatorname{Cov}(X_i, Y_i)$ from a random sample $(X_i, Y_i)$, $i = 1, \dots, n$, is as follows:

$$\frac{1}{n}\sum_{i=1}^{n} (X_{i}-\overline{X})(Y_i-\overline{Y})$$

I let $\hat{\theta}$ be the estimator and thought that doing this would work:

$$V(\hat{\theta}) = V(\frac{1}{n}\sum_{i=1}^{n} (X_{i}Y_{i}-X_{i}\overline{Y}-Y_{i}\overline{X}+\overline{X} \overline{Y}))$$

I expanded the terms but have no idea where to go from here. Any help is greatly appreciated!

Best Answer

We can begin by using the formula $$ \frac{1}{n} \sum_{i=1}^n \left(X_i - \bar{X}\right)\left(Y_i - \bar{Y}\right) = \frac{1}{n} \sum_{i=1}^n X_i Y_i - \bar{X} \bar{Y} $$

Proof. $$ \begin{aligned} \frac{1}{n} \sum_{i=1}^n \left(X_i - \bar{X}\right)\left(Y_i - \bar{Y}\right) &= \frac{1}{n} \sum_{i=1}^n \left(X_i Y_i - X_i \bar{Y} - Y_i \bar{X} + \bar{X} \bar{Y}\right) \\ &= \frac{1}{n} \sum_{i=1}^n X_i Y_i - \frac{1}{n} \sum_{i=1}^n X_i \bar{Y} - \frac{1}{n} \sum_{i=1}^n Y_i \bar{X} + \frac{1}{n} \sum_{i=1}^n \bar{X} \bar{Y} \\ &= \frac{1}{n} \sum_{i=1}^n X_i Y_i - \bar{Y} \frac{1}{n} \sum_{i=1}^n X_i - \bar{X} \frac{1}{n} \sum_{i=1}^n Y_i + \bar{X} \bar{Y} \\ &= \frac{1}{n} \sum_{i=1}^n X_i Y_i - \bar{Y} \bar{X} - \bar{X} \bar{Y} + \bar{X} \bar{Y} \\ &= \frac{1}{n} \sum_{i=1}^n X_i Y_i - \bar{X} \bar{Y} \end{aligned} $$
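As a quick sanity check on this identity (separate from the proof above), here is a small numerical verification in Python; the bivariate model and the seed are arbitrary choices made only for illustration:

```python
import numpy as np

# Arbitrary illustrative model: Y = 0.5*X + noise, with X and the noise standard normal.
rng = np.random.default_rng(0)
n = 1_000
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)

# Left-hand side: (1/n) * sum_i (X_i - Xbar)(Y_i - Ybar)
lhs = np.mean((x - x.mean()) * (y - y.mean()))

# Right-hand side: (1/n) * sum_i X_i Y_i - Xbar * Ybar
rhs = np.mean(x * y) - x.mean() * y.mean()

print(lhs, rhs)
assert np.isclose(lhs, rhs)  # the two expressions agree up to floating-point error
```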

Next, by the weak law of large numbers (the required moments are finite, since $\operatorname{Cov}(X_i, Y_i)$ is assumed to exist), the quantities $\bar{X}$, $\bar{Y}$, and $\frac{1}{n} \sum_{i=1}^n X_i Y_i$ converge in probability to $E[X_i]$, $E[Y_i]$, and $E[X_i Y_i]$, respectively. By two applications of Slutsky's theorem, it follows that $\bar{X} \bar{Y}$ converges in probability to $E[X_i] E[Y_i]$, and hence that $$ \frac{1}{n} \sum_{i=1}^n X_i Y_i - \bar{X} \bar{Y} $$ converges in probability to $E[X_i Y_i] - E[X_i] E[Y_i] = \operatorname{Cov}(X_i, Y_i)$.
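To see this convergence in action, here is a minimal simulation sketch. The bivariate model below, with $\operatorname{Cov}(X_i, Y_i) = 0.5$, is an assumption chosen purely for illustration; the printed estimates settle near the true value as $n$ grows:

```python
import numpy as np

# Assumed model (for illustration only): X ~ N(0, 1), Y = 0.5*X + N(0, 1) noise,
# so Cov(X_i, Y_i) = 0.5 * Var(X_i) = 0.5.
rng = np.random.default_rng(1)
true_cov = 0.5

for n in (100, 1_000, 10_000, 100_000):
    x = rng.normal(size=n)
    y = 0.5 * x + rng.normal(size=n)
    # Estimator in the equivalent form: (1/n) sum X_i Y_i - Xbar * Ybar
    theta_hat = np.mean(x * y) - x.mean() * y.mean()
    print(f"n = {n:>7}: estimate = {theta_hat:.4f}  (true value {true_cov})")
```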
