A random series has infinitely many zeros in $[0,1)$ almost surely.

brownian-motion, probability-theory, roots, stochastic-processes

These days I have been learning the properties of Brownian sample paths (Chapter 2 of Le Gall's Brownian Motion, Martingales, and Stochastic Calculus). Proposition 2.14 there states:

If $B=(B_t)_{t\geq0}$ is a Brownian motion, then we have a.s. for every $\epsilon>0$,
$$\sup_{0\leq s\leq\epsilon}B_s>0,\qquad\inf_{0\leq s\leq\epsilon}B_s<0,$$
which implies, by continuity and the intermediate value theorem, that $B$ vanishes infinitely many times on $[0,\epsilon]$ almost surely.

This property reminds me of a problem I encountered several weeks ago, which goes as follows:

Suppose $(\epsilon_n)_{n\geq1}$ is a sequence of i.i.d. random variables and the common law is Bernoulli:
$$\mathbb{P}[\epsilon_1=1]=\mathbb{P}[\epsilon_1=-1]=1/2.$$
Consider the random series $f(x)=\sum_{n=1}^\infty \epsilon_nx^n$. Show that the random series attains zero infinitely many times on $x\in[0,1)$ almost surely.

I have some idea about this problem: the series $f(x)$ must oscillate a lot to the left of $x=1$. All we need to prove is that for every $0<c<1$, $f$ almost surely has a zero in the interval $[c,1)$.
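As a quick sanity check (not part of any proof), one can sample a sign sequence and evaluate a truncation of $f$ on a grid approaching $1$, counting sign changes. The truncation level `N`, the grid, and the seed below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 5000                                  # illustrative truncation level
eps = rng.choice([-1.0, 1.0], size=N)     # i.i.d. signs, P(+1) = P(-1) = 1/2

def f_trunc(x, eps):
    """Partial sum sum_{n=1}^N eps_n x^n of the random series."""
    return float(np.dot(eps, x ** np.arange(1, len(eps) + 1)))

# Evaluate on a grid in [0.5, 0.999) and count sign changes.
xs = np.linspace(0.5, 0.999, 1000)
vals = np.array([f_trunc(x, eps) for x in xs])
sign_changes = int(np.sum(np.diff(np.sign(vals)) != 0))
print(sign_changes)   # sign changes tend to accumulate as x approaches 1
```

Each sign change of the partial sum witnesses (numerically) a zero of the truncated series, in line with the intuition above.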

Any help would be appreciated.

Best Answer

Lemma 1: One of the following three statements holds true:

  • $\lim_{x \uparrow 1} f(x)=\infty$ a.s.
  • $\lim_{x \uparrow 1} f(x) = - \infty$ a.s.
  • $\limsup_{x \uparrow 1} f(x) = \infty$ and $\liminf_{x \uparrow 1} f(x) = -\infty$ a.s.

Proof: Denote by $\mu$ the distribution of $\limsup_{x \uparrow 1} f(x)$, i.e. $$\mu(B) := \mathbb{P} \left( \limsup_{x \uparrow 1} \sum_{n=1}^{\infty} \epsilon_n x^n \in B \right).$$

If we set

$$\tau := \inf\{N \in \mathbb{N}; \sum_{n=1}^N \epsilon_n = 1\}$$

then $\tau< \infty$ almost surely (the simple random walk $\sum_{n=1}^N \epsilon_n$ is recurrent, so it hits $1$ in finite time) and
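As an aside, the almost-sure finiteness of $\tau$ is easy to check empirically. The cap and the sample count below are arbitrary simulation parameters (note $\tau$ has infinite expectation, so occasional very large values are expected):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_tau(rng, cap=10**5):
    """First N with eps_1 + ... + eps_N = 1, i.e. the first passage of the
    simple random walk to +1; cap is only a safety bound for the simulation."""
    walk = np.cumsum(rng.choice([-1, 1], size=cap))
    hits = np.flatnonzero(walk == 1)
    return int(hits[0]) + 1 if hits.size else None

taus = [sample_tau(rng) for _ in range(200)]
finite = sum(t is not None for t in taus)
print(finite)   # nearly all samples hit +1 within the cap, reflecting recurrence
```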

$$\xi_n := \epsilon_{n + \tau(\omega)}, \qquad n \geq 1$$

defines a sequence of i.i.d. Bernoulli random variables; this follows from the strong Markov property of the random walk. In particular, $(\xi_n)_{n \in \mathbb{N}}$ equals in distribution $(\epsilon_n)_{n \in \mathbb{N}}$, and so

$$\mu(B) = \mathbb{P} \left( \limsup_{x \uparrow 1} \sum_{n=1}^{\infty} \xi_n x^n \in B \right) \tag{1}$$

for all $B$. Moreover, since $x^{-\tau} \to 1$ as $x \uparrow 1$ and the finite sum $\sum_{n=1}^{\tau} \epsilon_n x^n$ converges to $\sum_{n=1}^{\tau} \epsilon_n = 1$, we have

\begin{align*} \limsup_{x \uparrow 1} \sum_{n =1}^{\infty} \xi_n x^n &= \limsup_{x \uparrow 1} x^{-\tau} \sum_{n=\tau+1}^{\infty} \epsilon_n x^n = \limsup_{x \uparrow 1} \sum_{n=\tau+1}^{\infty} \epsilon_n x^n \\ &= - \sum_{n=1}^{\tau} \epsilon_n + \limsup_{x \uparrow 1} \sum_{n=1}^{\infty} \epsilon_n x^n \\ &= -1 + \limsup_{x \uparrow 1} \sum_{n=1}^{\infty} \epsilon_n x^n. \end{align*}

Combining this with $(1)$ we get

$$\mu(B) = \mu(B+1)$$

for any Borel set $B$. The only finite measure on $\mathcal{B}(\mathbb{R})$ which is invariant under a non-trivial translation is the zero measure, and therefore $\mu(\mathbb{R})=0$; that is, $\limsup_{x \uparrow 1} f(x) \in \{-\infty, +\infty\}$ almost surely. The same reasoning works for $\liminf_{x \uparrow 1} f(x)$ (because of symmetry). Finally, $\{\limsup_{x \uparrow 1} f(x) = +\infty\}$ is a tail event (changing finitely many $\epsilon_n$ alters $f$ only by a polynomial, which stays bounded as $x \uparrow 1$), so Kolmogorov's $0$-$1$ law gives it probability $0$ or $1$, and likewise for the $\liminf$. Since $\limsup_{x \uparrow 1} f(x) = -\infty$ forces $\liminf_{x \uparrow 1} f(x) = -\infty$, and $\liminf_{x \uparrow 1} f(x) = +\infty$ forces $\limsup_{x \uparrow 1} f(x) = +\infty$, only the three alternatives listed above remain, and this finishes the proof of the lemma.
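The translation-invariance step can be spelled out: partitioning $\mathbb{R}$ into unit intervals and applying $\mu(B) = \mu(B+1)$ repeatedly,

$$\mu(\mathbb{R}) = \sum_{k \in \mathbb{Z}} \mu\big([k,k+1)\big) = \sum_{k \in \mathbb{Z}} \mu\big([0,1)\big),$$

and the left-hand side is finite, so $\mu([0,1))=0$ and hence $\mu(\mathbb{R})=0$.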


Lemma 2: $\limsup_{x \uparrow 1} f(x) = \infty$ and $\liminf_{x \uparrow 1} f(x)= - \infty$ almost surely.

Proof: The sequence $(-\epsilon_n)_{n \in \mathbb{N}}$ equals in distribution $(\epsilon_n)_{n \in \mathbb{N}}$, and therefore the random variables

$$\limsup_{x \uparrow 1} \sum_{n =1}^{\infty} \epsilon_n x^n$$

and

$$\limsup_{x \uparrow 1} \sum_{n=1}^{\infty} (-\epsilon_n) x^n = - \liminf_{x \uparrow 1} \sum_{n=1}^{\infty} \epsilon_n x^n$$

have the same distribution. Now the assertion follows from Lemma 1.

Corollary: $f$ has infinitely many zeros in $(0,1)$ with probability $1$.

Proof: As already noted by the OP, it suffices to show that for any $c \in (0,1)$ there exists with probability $1$ some $x^* \in (c,1)$ such that $f(x^*)=0$. Fix $c \in (0,1)$. By Lemma 2, we can find (with probability $1$) points $x_1, x_2 \in (c,1)$ such that $f(x_1)>1$ and $f(x_2)<-1$. Since $f$ is continuous on $(0,1)$ (the series has radius of convergence $1$), the intermediate value theorem yields some $x^*$ between $x_1$ and $x_2$, hence $x^* \in (c,1)$, with $f(x^*)=0$.
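The corollary's argument translates directly into a numerical sketch: scan a grid in $(c,1)$ for a sign change of a truncated series, then bisect it down, which is the numerical counterpart of the intermediate value theorem. The truncation level, grid, tolerance, and seed below are arbitrary illustrative choices, and a sign change need not appear on every grid for a given truncation:

```python
import numpy as np

rng = np.random.default_rng(2)

N = 5000                                  # illustrative truncation level
eps = rng.choice([-1.0, 1.0], size=N)

def f_trunc(x):
    """Partial sum sum_{n=1}^N eps_n x^n standing in for f."""
    return float(np.dot(eps, x ** np.arange(1, N + 1)))

def find_zero(c, grid=1000, tol=1e-10):
    """Look for a sign change of f_trunc on (c, 1) and bisect it."""
    xs = np.linspace(c, 0.999, grid)
    vals = [f_trunc(x) for x in xs]
    for a, b, fa, fb in zip(xs, xs[1:], vals, vals[1:]):
        if fa * fb <= 0:                  # sign change (or an exact zero)
            lo, hi = a, b
            while hi - lo > tol:          # standard bisection
                mid = 0.5 * (lo + hi)
                if f_trunc(lo) * f_trunc(mid) <= 0:
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)
    return None   # no sign change on this grid (possible for a truncation)

x_star = find_zero(0.5)
```

When a sign change is found, `x_star` is an approximate zero of the truncated series in $(0.5, 1)$.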

Remark: In this paper you can find some more general statements on the behaviour of random series.
