Functional Analysis – Central Limit Theorem in Subgaussian Orlicz Norm

fa.functional-analysis, pr.probability

A real random variable $X$ is said to be subgaussian if there exists an $a > 0$ such that $\mathbb{E}[e^{\lambda X}] < e^{a^2 \lambda^2}$ for all $\lambda \in \mathbb{R}$. The space of such random variables admits a Banach space structure, with an Orlicz norm given by $$\| X \|_{\psi_2} = \inf\left\{ t > 0 : \mathbb{E}\left[ \psi_2\left( \frac{|X|}{t} \right) \right] \leq 1 \right\},$$ with $\psi_2(x) = e^{x^2} - 1$.
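To make the infimum in the Orlicz-norm definition concrete, here is a minimal Python sketch (my own illustration, not part of the post) that recovers $\|X\|_{\psi_2}$ for a standard normal $X$ by bisection, using the closed form $\mathbb{E}[e^{sX^2}] = (1-2s)^{-1/2}$ for $s < 1/2$; the function names are mine.

```python
import math

def psi2_moment_normal(t):
    # E[psi_2(|X|/t)] = E[exp(X^2/t^2)] - 1 for X ~ N(0,1), via the
    # closed form E[exp(s X^2)] = (1 - 2s)^(-1/2), valid for s < 1/2.
    s = 1.0 / t**2
    if s >= 0.5:
        return math.inf
    return (1.0 - 2.0 * s) ** -0.5 - 1.0

def psi2_norm_normal():
    # ||X||_{psi_2} = inf{t > 0 : E[psi_2(|X|/t)] <= 1}.
    # The moment is decreasing in t, so the set is a ray and
    # bisection on t finds its left endpoint.
    lo, hi = 1.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if psi2_moment_normal(mid) <= 1.0:
            hi = mid
        else:
            lo = mid
    return hi

# Solving (1 - 2/t^2)^(-1/2) = 2 exactly gives t = sqrt(8/3) ~ 1.633.
print(psi2_norm_normal())
```

The same bisection works for any $X$ whose moment $\mathbb{E}[\psi_2(|X|/t)]$ you can evaluate (or Monte Carlo estimate), since that moment is always nonincreasing in $t$.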

The main question here is a simple one to ask, but I've been unable to find an answer. Suppose I have a sequence of i.i.d. subgaussian random variables $X_1, X_2, \dots$. Is it known whether or not the normalized sums $$S_n = \frac{1}{\sqrt{n}} \sum_{i=1}^n X_i$$ converge in this norm to a normally distributed random variable? And if they do not in general, is a sufficient condition known?

Best Answer

$\newcommand\ep\varepsilon$No. If this were so, then (by Lemma 1 below) $S_n$ would converge to a normally distributed random variable $Y$ in probability, which is false for any iid $X_i$'s. Indeed, convergence in probability would imply $S_{4n}-S_n\to0$ in probability, whereas $$S_{4n}-S_n=-\tfrac12\,S_n+\frac1{2\sqrt n}\sum_{i=n+1}^{4n}X_i$$ is the sum of two independent terms and converges in distribution to a nondegenerate normal law whenever $\operatorname{Var}X_1>0$ (and if $\operatorname{Var}X_1=0$, then $S_n$ cannot converge to a normally distributed $Y$ either).


Lemma 1: If $\|S_n-Y\|_{\psi_2}\to0$ (as $n\to\infty$), then $S_n\to Y$ in probability.

Proof: Suppose that $\|S_n-Y\|_{\psi_2}\to0$. Then $$E\psi_2(|S_n-Y|/t_n)\le1$$ for all natural $n$, where $t_n:=\|S_n-Y\|_{\psi_2}+1/n\to0$. So, by Markov's inequality, for each real $\ep>0$, $$P(|S_n-Y|\ge\ep)\le\frac{E\psi_2(|S_n-Y|/t_n)}{\psi_2(\ep/t_n)} \le\frac1{\psi_2(\ep/t_n)}\to0.$$ So, $S_n\to Y$ in probability. $\quad\Box$
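The failure of convergence in probability is also easy to see numerically. The following sketch (my own illustration, with Rademacher $X_i$ as a stand-in for a generic subgaussian distribution) estimates $P(|S_{4n}-S_n|\ge\tfrac12)$ along a shared sample path; if $S_n$ converged in probability this would vanish as $n\to\infty$, but since $S_{4n}-S_n$ converges in distribution to $N(0,1)$, the estimate stays near $P(|Z|\ge\tfrac12)\approx0.62$.

```python
import random

random.seed(0)

def gap_probability(n, trials, eps=0.5):
    # Estimate P(|S_{4n} - S_n| >= eps) for Rademacher X_i, where
    # S_m = (1/sqrt(m)) * sum_{i=1}^m X_i is built from ONE shared
    # sequence of X_i's, so S_n and S_{4n} are coupled as in the answer.
    hits = 0
    for _ in range(trials):
        x = [random.choice((-1.0, 1.0)) for _ in range(4 * n)]
        s_n = sum(x[:n]) / n ** 0.5
        s_4n = sum(x) / (4 * n) ** 0.5
        if abs(s_4n - s_n) >= eps:
            hits += 1
    return hits / trials

# Stays bounded away from 0 (near 0.62) rather than tending to 0,
# so S_n is not Cauchy in probability.
print(gap_probability(n=200, trials=1000))
```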
