$\newcommand{\si}{\sigma}$
Let us prove a stronger estimate of $EY$, under less restrictive conditions. Namely, let us prove that
\begin{equation*}
EY-\sqrt n=O(1/\sqrt n) \tag{1}
\end{equation*}
assuming only that $EX_1^4<\infty$ (instead of the $X_i$'s being sub-Gaussian).
Substituting
$$U:=\frac{Y^2}n=\frac1n\,\sum_1^n X_i^2$$
for $u$ in the inequalities
$$\frac{1+u-(u-1)^2}2\le\sqrt u\le\frac{1+u}2$$
for $u\ge0$, taking the expectations, and using that $EU=1$ and $E(U-1)^2=\operatorname{Var}\,U=\sigma^2/n$, where $\si^2:=\operatorname{Var}(X_1^2)<\infty$,
we have
$$1-\frac{\sigma^2}{2n}\le\frac{EY}{\sqrt n}\le1,$$
so that (1) indeed follows.
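Both the elementary inequality and the resulting two-sided bound on $EY/\sqrt n$ are easy to check numerically. The following is a sketch of such a check, not part of the proof; the Gaussian choice $X_i\sim N(0,1)$ (for which $\sigma^2=\operatorname{Var}(X_1^2)=2$), the grid, the seed, and the sample sizes are illustrative assumptions.

```python
import numpy as np

# Check the pointwise inequality (1 + u - (u-1)^2)/2 <= sqrt(u) <= (1 + u)/2
# on a grid of u >= 0 (a numerical sanity check, not a proof).
u = np.linspace(0.0, 10.0, 10001)
lower = (1 + u - (u - 1) ** 2) / 2
upper = (1 + u) / 2
ineq_ok = bool(np.all(lower <= np.sqrt(u) + 1e-12)
               and np.all(np.sqrt(u) <= upper + 1e-12))

# Monte Carlo check of 1 - sigma^2/(2n) <= E Y / sqrt(n) <= 1
# with X_i ~ N(0,1), for which sigma^2 = Var(X_1^2) = 2.
rng = np.random.default_rng(0)
n, reps = 100, 200_000
X = rng.standard_normal((reps, n))
Y = np.sqrt((X ** 2).sum(axis=1))   # Y = sqrt(sum of X_i^2)
ratio = Y.mean() / np.sqrt(n)       # Monte Carlo estimate of E Y / sqrt(n)
sigma2 = 2.0
```

For $n=100$ the exact ratio is $E\chi_{100}/10\approx0.9975$, comfortably inside $[1-\sigma^2/(2n),\,1]=[0.99,\,1]$.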
Let
\begin{equation}
\mu_X=\tfrac12\,\mu_{aZ}+\tfrac12\,\mu_{bZ},
\end{equation}
where $\mu_U$ denotes the probability distribution of a random vector $U$, $Z\sim N(0,I_n)$,
and $a,b$ are constants such that
\begin{equation}
0<a<1<b\quad\text{and}\quad \tfrac12\,a^2+\tfrac12\,b^2=1.
\end{equation}
Then $EXX^T=I_n$. Also, for any unit vector $u$ and real $s>0$
\begin{equation}
E\exp\{\left<X,u\right>^2/s^2\}=\frac1{2\sqrt{1-2a^2/s^2}}+\frac1{2\sqrt{1-2b^2/s^2}}<2
\end{equation}
if $s$ is large enough (depending only on $a,b$),
so that, by the definition of $\|\cdot\|_{\psi_2}$, we have $\|\left<X,u\right>\|_{\psi_2}\le s$. For instance, here we can take $a=1/5$, $b=7/5$, $s=3$.
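With the stated values $a=1/5$, $b=7/5$, $s=3$, both the normalization $\tfrac12 a^2+\tfrac12 b^2=1$ and the bound $E\exp\{\left<X,u\right>^2/s^2\}<2$ can be verified numerically. A minimal sketch (the Monte Carlo part, seed, and sample size are illustrative choices; it only cross-checks the closed form):

```python
import numpy as np

a, b, s = 0.2, 1.4, 3.0
# Normalization making X isotropic: a^2/2 + b^2/2 = 1.
norm_ok = abs(0.5 * a**2 + 0.5 * b**2 - 1.0) < 1e-12

# Closed form: for g ~ N(0,1), E exp(c g^2) = 1/sqrt(1 - 2c) when c < 1/2;
# <X,u> equals (scale)*g with scale in {a, b}, each with probability 1/2.
closed = 0.5 / np.sqrt(1 - 2 * a**2 / s**2) + 0.5 / np.sqrt(1 - 2 * b**2 / s**2)

# Monte Carlo consistency check of the same expectation.
rng = np.random.default_rng(1)
g = rng.standard_normal(1_000_000)
scale = rng.choice([a, b], size=g.shape)
mc = np.exp((scale * g) ** 2 / s**2).mean()
```

The closed form evaluates to about $1.168$, well below $2$.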
On the other hand, for
\begin{equation}
t:=(b-1)\sqrt{n}/2,
\end{equation}
\begin{multline}
2\,Ee^{(\|X\|-\sqrt n)^2/t^2}>Ee^{(\|bZ\|-\sqrt n)^2/t^2}
>Ee^{(\|bZ\|-\sqrt n)^2/t^2}1_{\|Z\|^2>n} \\
>e^{(b\sqrt n-\sqrt n)^2/t^2}\,P(\|Z\|^2>n)=e^4\,P(\|Z\|^2>n)\to e^4/2>4,
\end{multline}
because, by the central limit theorem, $P(\|Z\|^2>n)\to1/2$.
So, for all large enough $n$,
\begin{equation}
\|\|X\|-\sqrt n\|_{\psi_2}\ge t=(b-1)\sqrt{n}/2\to\infty,
\end{equation}
as desired.
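The only asymptotic ingredient above is $P(\|Z\|^2>n)\to1/2$; since $\|Z\|^2$ has the $\chi^2_n$ distribution, this is easy to confirm numerically. A sketch (the particular $n$, seed, and sample size are arbitrary choices made here):

```python
import numpy as np

# ||Z||^2 with Z ~ N(0, I_n) has the chi-square distribution with n degrees
# of freedom; by the CLT, P(||Z||^2 > n) -> 1/2 as n -> infinity.
rng = np.random.default_rng(2)
n, reps = 10_000, 200_000
chi2 = rng.chisquare(df=n, size=reps)
p = (chi2 > n).mean()          # Monte Carlo estimate of P(||Z||^2 > n)

# The exponent in the display: with t = (b-1) sqrt(n)/2,
# ((b-1) sqrt(n))^2 / t^2 = 4, whence the factor e^4.
b = 1.4
t = (b - 1) * np.sqrt(n) / 2
exponent = ((b - 1) * np.sqrt(n)) ** 2 / t**2
```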
Best Answer
If $(Z_i)_{1 \leq i \leq N}$ are scalar random variables with $\|Z_i\|_{\Psi_2} \leq C$, then $\mathbf{E} \max_i |Z_i| \leq CC'\sqrt{\log N}$ for a universal constant $C'$, by the usual union bound argument.
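For completeness, the union bound argument can be sketched as follows, under the convention that $\|Z\|_{\Psi_2}\le C$ means $\mathbf{E}\,e^{Z^2/C^2}\le 2$ (constants are not optimized). Markov's inequality applied to $e^{Z_i^2/C^2}$ gives $\mathbf{P}(|Z_i|>u)\le 2e^{-u^2/C^2}$, and with $u_0:=C\sqrt{\log(2N)}$, using $e^{-u^2/C^2}\le e^{-u_0u/C^2}$ for $u\ge u_0$:

```latex
\begin{align*}
\mathbf{E}\max_i |Z_i|
  &\le u_0 + \int_{u_0}^{\infty} \sum_{i=1}^{N} \mathbf{P}(|Z_i| > u)\,du
   \le u_0 + 2N \int_{u_0}^{\infty} e^{-u_0 u / C^2}\,du \\
  &= u_0 + 2N\,\frac{C^2}{u_0}\,e^{-u_0^2/C^2}
   = C\sqrt{\log(2N)} + \frac{C}{\sqrt{\log(2N)}}
   \le C C' \sqrt{\log N} \quad (N \ge 2).
\end{align*}
```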
Now let $X$ be an isotropic random vector in $\mathbf{R}^n$ such that $\|\langle X,\theta \rangle \|_{\Psi_2} \leq C $ for every unit vector $\theta$, and let $S$ be the support of $X$, assumed finite. Choose a number $A$ such that $S$ is contained in the ball of radius $A\sqrt{n}$. Denoting by $\|\cdot\|$ the Euclidean norm, we have $$ ||X||^2 \leq \sup_{x \in S} |\langle X,x\rangle| $$ since $X \in S$ almost surely, and $\|\langle X,x\rangle\|_{\Psi_2} \leq C\|x\| \leq CA\sqrt{n}$ for each $x \in S$, so the aforementioned estimate yields $$ n = \mathbf{E} ||X||^2 \leq A \sqrt{n}\, CC'\sqrt{\log |S|}.$$ Rearranging gives $\log|S| \geq n/(ACC')^2$; in particular, $|S| \geq \exp(c(A,C)\, n)$.
We are going to reduce to the situation where $A$ is bounded. To that end, take an isotropic subgaussian random vector $X$ with $||\langle X,\theta \rangle||_{\Psi_2} \leq C$. This implies in particular $\mathbf{E} \langle X,\theta \rangle^4 \leq 4C^2$ (possibly change $4$ into another number depending on which definition of $\|\cdot\|_{\psi_2}$ you use). Define a new random vector as $Y = X {\bf 1}_{\{ ||X|| \leq 4C \sqrt{n}\} }$. The vector $Y$ is not exactly isotropic but satisfies $\frac 12 I_n \leq \mathbf{E} YY^T \leq I_n$. This is because, by the Cauchy--Schwarz and Markov inequalities (the latter giving $\mathbf{P}(||X||^2 > 16C^2n) \leq \mathbf{E}||X||^2/(16C^2n) = 1/(16C^2)$), $$ \mathbf{E} \left[ \langle X,\theta \rangle^2 {\bf 1}_{\{||X|| > 4C \sqrt{n}\}} \right] \leq \left( \mathbf{E} \left[ \langle X,\theta \rangle^4 \right] \cdot \mathbf{P}(||X||^2 > 16C^2n)\right)^{1/2} \leq \sqrt{\frac{4C^2}{16C^2}} = \frac{1}{2},$$ so $\mathbf{E}\langle Y,\theta\rangle^2 \geq 1 - \tfrac12 = \tfrac12$ for every unit vector $\theta$.
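The truncation bound is easy to illustrate numerically. A sketch, with assumptions made here for concreteness: $X=Z\sim N(0,I_n)$ (isotropic and subgaussian, with $C=2$ a valid though non-optimal $\Psi_2$ bound), a fixed coordinate vector for $\theta$, and arbitrary seed and sample sizes.

```python
import numpy as np

# Illustration with X = Z ~ N(0, I_n); C = 2 is a valid (non-optimal)
# bound on ||<X, theta>||_{Psi_2} for standard Gaussian marginals.
rng = np.random.default_rng(3)
n, reps, C = 50, 200_000, 2.0
X = rng.standard_normal((reps, n))
theta = np.zeros(n)
theta[0] = 1.0                                    # a fixed unit vector

keep = (X ** 2).sum(axis=1) <= (4 * C) ** 2 * n   # event {||X|| <= 4C sqrt(n)}
proj_Y_sq = (X @ theta) ** 2 * keep               # <Y, theta>^2, Y = X 1_event
est = proj_Y_sq.mean()                            # estimate of E <Y, theta>^2
```

As expected, the estimate lands in $[\tfrac12, 1]$ (here it is close to $1$, since the truncation event is very rare for a Gaussian vector).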
We are now going to apply the first part of the argument to the isotropic linear image $\tilde{Y} := (\mathbf{E} YY^T)^{-1/2}\, Y$ of $Y$. Since the operator norm of $(\mathbf{E} YY^T)^{-1/2}$ is at most $\sqrt 2$, the random vector $\tilde{Y}$ is supported in the ball of radius $A\sqrt{n}$ for $A=8C$, and satisfies $\|\langle \tilde{Y},\theta \rangle \|_{\Psi_2} \leq 2C$. Hence the support of $\tilde Y$, which has the same cardinality as the support of $Y$ (itself contained in $S \cup \{0\}$), contains at least $\exp(c(8C,2C)\,n)$ points.
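The whitening step $\tilde Y = (\mathbf{E} YY^T)^{-1/2} Y$ can be sketched in code. The toy anisotropic Gaussian $Y$ below is an illustrative assumption (it is not the truncated vector from the argument); the point is the mechanics: whitening restores isotropy, and when $\tfrac12 I_n \leq \mathbf{E} YY^T \leq I_n$ the whitening map has operator norm at most $\sqrt2$.

```python
import numpy as np

# Toy anisotropic Y with covariance diag(scales^2), eigenvalues in [1/2, 1].
rng = np.random.default_rng(4)
n, reps = 5, 500_000
scales = np.linspace(np.sqrt(0.5), 1.0, n)
Y = rng.standard_normal((reps, n)) * scales

M = (Y.T @ Y) / reps                         # empirical covariance E YY^T
evals, evecs = np.linalg.eigh(M)
M_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
Ytilde = Y @ M_inv_sqrt                      # whitened (isotropic) samples

M_tilde = (Ytilde.T @ Ytilde) / reps         # should equal I_n (up to fp error)
iso_err = np.abs(M_tilde - np.eye(n)).max()
op_norm = np.linalg.norm(M_inv_sqrt, 2)      # should be about sqrt(2) here
```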