Why isn’t a uniform distribution on a bounded set subgaussian?

concentration-of-measure, probability

In High-Dimensional Probability by Vershynin, there is an exercise that asks you to prove that the uniform distribution on the $\ell_1$ ball of radius $n$, $X \sim \mathrm{Unif}\{x \in \mathbb{R}^n : \|x\|_1 \le n\}$, is not subgaussian. However, since this set is bounded, all the marginals $\langle X,x \rangle$ are bounded, so shouldn't $X$ be subgaussian? In fact, by Cauchy–Schwarz, $\|X\|_{\psi_2}$ should be at most a constant times $\sup_{x \in K} \|x\|_2$ whenever the support $K$ is bounded.
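For concreteness, the boundedness argument can be written out in full; the sketch below uses the standard definition $\|Z\|_{\psi_2} = \inf\{t > 0 : \mathbb{E}\exp(Z^2/t^2) \le 2\}$, which is where the constant $1/\sqrt{\ln 2}$ comes from:

```latex
% Cauchy–Schwarz bounds every marginal almost surely:
\[
  |\langle X, x\rangle| \le \|X\|_2\,\|x\|_2 \le \sup_{y \in K}\|y\|_2 =: R
  \quad \text{a.s. for every unit vector } x.
\]
% Plugging t = R/\sqrt{\ln 2} into the \psi_2 definition:
\[
  \mathbb{E}\exp\!\big(\langle X, x\rangle^2 / t^2\big) \le \exp(R^2/t^2) = 2,
  \qquad \text{hence} \qquad \|X\|_{\psi_2} \le \frac{R}{\sqrt{\ln 2}}.
\]
% For the \ell_1 ball of radius n, the sup is attained at a corner y = n e_1:
\[
  R = \sup\{\|y\|_2 : \|y\|_1 \le n\} = n,
\]
so the bound obtained this way is of order $n$ and grows with the dimension.
```

This is consistent with the corrected exercise: the argument gives a finite subgaussian norm for each fixed $n$, but not one bounded by an absolute constant.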

Where is my reasoning failing?

Best Answer

Check the latest version of the book (it is available on his website); he corrected this problem there. The exercise now asks you to "Show that the subgaussian norm of this distribution is not bounded by an absolute constant as the dimension $n$ grows." So there is nothing wrong with your argument.

But it is worth noticing that you need a finer bound whenever your bound on $\langle X,x \rangle$ depends on $n$, because the Orlicz norm may not actually grow as the dimension $n$ and the size of the convex body grow. That is the case for $X \sim \mathrm{Unif}(\sqrt{n}\, S^{n-1})$: there you need a better bound than boundedness gives. But if you are only interested in $\mathrm{Unif}(S^{n-1})$ (say), using boundedness is just fine.
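To see how lossy the boundedness bound can be, compare the two estimates for $X \sim \mathrm{Unif}(\sqrt{n}\, S^{n-1})$. The constant-order bound is the sphere-concentration result in Vershynin's book (the exact theorem numbering may differ between editions):

```latex
% Boundedness (Cauchy–Schwarz) only gives a dimension-dependent bound:
\[
  \|X\|_{\psi_2} \lesssim \sup_{y \in \sqrt{n}\, S^{n-1}} \|y\|_2 = \sqrt{n}.
\]
% Concentration of measure on the sphere gives the sharp answer:
\[
  \|\langle X, x\rangle\|_{\psi_2} \le C
  \quad \text{for every unit vector } x,
  \qquad \text{hence} \qquad \|X\|_{\psi_2} \le C,
\]
% with an absolute constant C, uniformly in the dimension n.
```

So for the rescaled sphere the true subgaussian norm stays bounded even though the body grows, which is exactly why a bound depending on $n$ needs to be refined there.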