Central Limit Theorem: $ \mathbb{E}\left|\frac{1}{N} \sum_{i=1}^{N} X_i - \mu\right| = O\left(\frac{1}{\sqrt{N}}\right)$


I am self-studying the book High-Dimensional Probability by Roman Vershynin and came across this problem:

Let $X_1, X_2, \dots$ be a sequence of i.i.d. random variables with mean $\mu$ and finite variance. Show that

$$ \mathbb{E}\left|\frac{1}{N} \sum_{i=1}^{N} X_i - \mu\right| = O\left(\frac{1}{\sqrt{N}}\right)$$
as $N \rightarrow \infty$.

I feel I need to use the central limit theorem somehow, but I am not sure how to deal with the absolute value.

Best Answer

Notice that $\mathbb{E}[|Z|]^2 \leq \mathbb{E}[Z^2]$ for any random variable $Z$ (see Note 1 below). Now by writing $\bar{X}_N = \frac{1}{N}\sum_{i=1}^{N}X_i$ and noting that $\mathbb{E}[\bar{X}_N] = \mu$, it follows that

\begin{align*} \mathbb{E}\big[\left|\bar{X}_N - \mu\right|\big]^2 &= \mathbb{E}\big[ \left|\bar{X}_N - \mathbb{E}[\bar{X}_N]\right| \big]^2 \leq \mathbb{E}\big[\left(\bar{X}_N - \mathbb{E}[\bar{X}_N] \right)^2 \big]\\ &= \mathbf{Var}\left( \bar{X}_N \right) = \frac{1}{N^2} \sum_{i=1}^{N} \mathbf{Var}(X_i) = \frac{\sigma^2}{N}, \end{align*}

where independence of the $X_i$ justifies splitting the variance of the sum, and $\sigma^2 = \mathbf{Var}(X_n)$ denotes the common variance of the $X_n$'s. From this, we get

$$ \mathbb{E}\left[\left|\frac{1}{N}\sum_{i=1}^{N}X_i - \mu\right|\right] \leq \frac{\sigma}{\sqrt{N}} $$

and the desired claim follows.
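
As a quick sanity check (not part of the proof), here is a small Monte Carlo sketch using NumPy; the choice of Exponential(1) samples, so that $\mu = \sigma = 1$, and the trial count are just illustrative assumptions:

```python
import numpy as np

# Monte Carlo sketch: estimate E|X_bar_N - mu| for i.i.d. Exponential(1)
# samples (mu = 1, sigma = 1) and compare with the bound sigma / sqrt(N).
rng = np.random.default_rng(0)
mu, sigma = 1.0, 1.0
trials = 2_000  # independent repetitions of the averaging experiment

for N in [10, 100, 1_000, 10_000]:
    samples = rng.exponential(scale=1.0, size=(trials, N))
    sample_means = samples.mean(axis=1)              # X_bar_N, one per trial
    mean_abs_dev = np.abs(sample_means - mu).mean()  # estimate of E|X_bar_N - mu|
    print(f"N={N:>6}:  E|X_bar_N - mu| ~ {mean_abs_dev:.4f}"
          f"   vs   sigma/sqrt(N) = {sigma / np.sqrt(N):.4f}")
```

The empirical values should track, and stay below, $\sigma/\sqrt{N}$, consistent with the bound above.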


Note 1) If $Z$ has a finite second moment, this inequality is equivalent to saying that $\mathbf{Var}(|Z|) \geq 0$. The inequality itself can be proved in various ways, for example via Jensen's inequality or the Cauchy–Schwarz inequality.
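
For concreteness, the Cauchy–Schwarz route is a one-liner (applying the inequality to $|Z|$ and the constant $1$):

$$ \mathbb{E}[|Z|]^2 = \mathbb{E}[|Z| \cdot 1]^2 \leq \mathbb{E}[Z^2]\,\mathbb{E}[1^2] = \mathbb{E}[Z^2]. $$

The Jensen route gives the same bound: the function $x \mapsto x^2$ is convex, so $\mathbb{E}[|Z|]^2 \leq \mathbb{E}[|Z|^2] = \mathbb{E}[Z^2]$.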
