Weak convergence of partial sums in a Hilbert space

analysis, convergence-divergence, functional-analysis, measure-theory, real-analysis

Let $\{e_1,e_2,\ldots\}$ be an orthonormal basis of a Hilbert space $H$. Prove that the sequence defined by
$$x_N=\frac{1}{\sqrt{N}}\sum_{n=1}^N e_n$$
converges to zero weakly in $H$.

So I'm pretty sure it suffices to show that
$\langle x_N,y\rangle \rightarrow 0$ for every $y\in H$. But these problems about weak convergence have always been tricky for me; they feel so unintuitive. I've been working on this one for about a week and haven't made any progress, so if somebody could help me out you'd be a lifesaver.
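
For intuition, here is a quick numerical sketch in the concrete case $H=\ell^2$ (an added illustration, with the arbitrary square-summable choice $y_n = 1/n$): the inner products $\langle x_N, y\rangle$ do shrink, even though every $x_N$ is a unit vector, which is exactly the "weak but not norm" behaviour in play.

```python
import numpy as np

# Concrete sketch in H = l^2, truncated to the first M coordinates.
# Illustrative choice (not from the problem statement): y has coefficients y_n = 1/n.
M = 100_000
y = 1.0 / np.arange(1, M + 1)

for N in (10, 100, 1_000, 10_000, 100_000):
    x_N = np.zeros(M)
    x_N[:N] = 1.0 / np.sqrt(N)   # x_N = (e_1 + ... + e_N) / sqrt(N)
    print(f"N={N:>6}   <x_N, y> = {x_N @ y:.4f}   ||x_N|| = {np.linalg.norm(x_N):.4f}")
```

The inner products decay (slowly, for this particular $y$), while $\|x_N\| = 1$ for every $N$, so there is no norm convergence.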

Best Answer

Partial Answer: Here is an attempt that I started writing.

Your statement is exactly right: by definition, $x_N$ converges weakly to zero precisely when $\langle x_N, y \rangle \to 0$ for every $y \in H$.

Now, $\{e_1,e_2,\dots\}$ is an orthonormal basis, which means that we can write $y = \sum_{n=1}^\infty y_ne_n$, where the $y_n$ are the coordinates of $y$ in this basis. Since the $e_n$ are orthonormal, pairing $x_N$ against this expansion gives $$ \langle x_N,y \rangle = \frac{1}{\sqrt{N}}\sum_{n=1}^N y_n $$ (up to a complex conjugation of the $y_n$, which will not matter here). We want to show that this sequence of sums converges to zero as $N \to \infty$.
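
As a concrete illustration (an added example with the arbitrary square-summable choice $y_n = 1/n$), $$ \langle x_N,y \rangle = \frac{1}{\sqrt{N}}\sum_{n=1}^N \frac{1}{n} \leq \frac{1+\ln N}{\sqrt{N}} \longrightarrow 0, $$ even though $\sum_n y_n$ itself diverges; the point of the general argument below is that square-summability of the coefficients is already enough.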

Note that $\left| \frac{1}{\sqrt{N}}\sum_{n=1}^N y_n\right| \leq \frac{1}{\sqrt{N}}\sum_{n=1}^N |y_n|$, and that $\|y\|^2 = \langle y,y \rangle = \sum_{n=1}^\infty |y_n|^2 < \infty$ (Parseval's identity). With all that, applying the statement below to $a_n = |y_n|$, we see that it suffices to show the following:

Claim: If $(a_n)_{n \geq 1}$ is a sequence with $a_n \geq 0$ such that $\sum_{n} a_n^2$ converges, then $\lim_{N \to \infty} \frac{1}{\sqrt{N}}\sum_{n=1}^N a_n = 0$.

Proof: Suppose, for the purpose of contradiction, that the sequence does not converge to zero. By the definition of a limit, it follows that there exist an $\epsilon > 0$ and infinitely many integers $N_1<N_2<\dots$ for which $$ \frac1{\sqrt{N_k}}\sum_{n=1}^{N_k} a_n \geq \epsilon \implies S_k :=\sum_{n=1}^{N_k} a_n \geq \epsilon \sqrt{N_k}. $$ On the other hand, Cauchy-Schwarz gives the upper bound $S_k \leq C\sqrt{N_k}$ with $C := \left(\sum_n a_n^2\right)^{1/2}$, so by passing to a sparser subsequence we may also assume that $C\sqrt{N_k} \leq \frac{\epsilon}{2}\sqrt{N_{k+1}}$ for every $k$. It follows that for $k = 1,2,\dots$, we have $$ S_{k+1} - S_k = \sum_{n=N_k+1}^{N_{k+1}} a_n \geq \epsilon\sqrt{N_{k+1}} - C\sqrt{N_k} \geq \frac{\epsilon}{2} \left(\sqrt{N_{k+1}} - \sqrt{N_k}\right). $$ Now, we note that $\sum_{n=1}^N a_n^2 \geq \frac 1N \left(\sum_{n=1}^N a_n\right)^2$ (as can be seen by Cauchy-Schwarz), and the same holds over any block of consecutive indices. Thus, we have $$ \begin{align} \sum_{n=N_k+1}^{N_{k+1}} a_n^2 &\geq \frac 1{N_{k+1} - N_k}\left(\sum_{n=N_k+1}^{N_{k+1}} a_n\right)^2 \\ & \geq \frac{\epsilon^2}{4} \cdot \frac{(\sqrt{N_{k+1}} - \sqrt{N_k})^2}{N_{k+1} - N_k} = \frac{\epsilon^2}{4} \left(\frac{2 \sqrt{N_{k+1}}}{\sqrt{N_{k+1}}+ \sqrt{N_k}} - 1\right). \end{align} $$
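
The claim itself can also be sanity-checked numerically. In the sketch below (illustrative sequences of my own choosing, not taken from the argument above), $a_n = n^{-0.75}$ is square-summable and the normalized partial sums drift toward $0$, while the borderline sequence $a_n = n^{-0.5}$ is not square-summable and its normalized partial sums approach $2$ instead, so the hypothesis $\sum_n a_n^2 < \infty$ is genuinely needed.

```python
import numpy as np

# Illustrative sequences (my choice, not from the proof above):
#   a_n = n^(-0.75): sum of a_n^2 converges, so the claim predicts the ratio -> 0.
#   a_n = n^(-0.5) : sum of a_n^2 diverges, and the ratio tends to 2 instead.
for s, label in [(0.75, "square-summable"), (0.5, "not square-summable")]:
    print(f"a_n = n^(-{s})   ({label})")
    for N in (10, 100, 10_000, 1_000_000):
        a = np.arange(1, N + 1) ** (-s)
        print(f"  N={N:>9}   (1/sqrt(N)) * sum_{{n<=N}} a_n = {a.sum() / np.sqrt(N):.4f}")
```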


Another idea: write $\beta_N := \langle x_N,y \rangle$. Note that $x_{N+1} = \frac{1}{\sqrt{N+1}}(\sqrt{N}x_N + e_{N+1})$. It follows that $$ \beta_{N+1} = \frac 1{\sqrt{N+1}}(\sqrt{N}\beta_N + y_{N+1}) = \sqrt{\frac{N}{N+1}} \beta_N + \frac 1{\sqrt{N+1}}y_{N+1} $$
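
As a quick consistency check on this recurrence (again with the illustrative, added choice $y_n = 1/n$), iterating it reproduces the direct formula $\beta_N = \frac{1}{\sqrt{N}}\sum_{n=1}^N y_n$:

```python
import numpy as np

# Compare the recurrence beta_{N+1} = sqrt(N/(N+1)) * beta_N + y_{N+1} / sqrt(N+1)
# against the direct formula beta_N = (1/sqrt(N)) * (y_1 + ... + y_N).
# Illustrative coefficients (my choice): y_n = 1/n.
M = 1000
y = 1.0 / np.arange(1, M + 1)

beta = y[0]                       # beta_1 = y_1 / sqrt(1)
for N in range(1, M):             # step from beta_N to beta_{N+1}
    beta = np.sqrt(N / (N + 1)) * beta + y[N] / np.sqrt(N + 1)

direct = y.sum() / np.sqrt(M)     # beta_M computed directly
print(beta, direct)               # the two values agree up to rounding
```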