If a characteristic function is infinitely divisible, it is nonzero everywhere

Tags: complex-analysis, probability-theory

Let $E$ be an $\mathbb R$-Banach space and $\mathcal M_1(E)$ denote the set of probability measures on $(E,\mathcal B(E))$. Remember that if $\mu\in\mathcal M_1(E)$, then $$\varphi_\mu:E'\to\mathbb C\;,\;\;\;x'\mapsto\int\mu({\rm d}x)e^{{\rm i}\langle x',\:x\rangle}$$ is the characteristic function of $\mu$. Let $$C_1(E):=\left\{\varphi_\mu:\mu\in\mathcal M_1(E)\right\}.$$ Remember that $\varphi\in C_1(E)$ is called infinitely divisible if $$\forall k\in\mathbb N:\exists\psi\in C_1(E):\varphi=\psi^k\tag1.$$

Now let $\varphi\in C_1(E)$ be infinitely divisible. How can we show that $\varphi(x')\ne0$ for all $x'\in E'$?

The idea should be to take $(\psi_n)_{n\in\mathbb N}$ with $$\varphi=\psi_n^n\;\;\;\text{for all }n\in\mathbb N\tag2$$ and observe that $$\left|\psi_n(x')\right|=\left|\varphi(x')\right|^{\frac1n}\xrightarrow{n\to\infty}\begin{cases}1&\text{if }\varphi(x')\ne0,\\0&\text{otherwise}\end{cases}\tag3$$ for all $x'\in E'$, but how should we argue from there?
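Purely as a numerical illustration of $(3)$ in the scalar case $E=E'=\mathbb R$ (a sketch; the Poisson and uniform laws below are illustrative choices, not part of the question): the Poisson law is infinitely divisible and its characteristic function never vanishes, while the uniform law on $[-1,1]$ has $\varphi(t)=\frac{\sin t}{t}$ with zeros at $t=k\pi$, so $|\varphi|^{1/n}$ converges pointwise to the indicator of $\{\varphi\ne0\}$.

```python
import numpy as np

# Illustrates (3) on E = R (so E' = R): |phi(t)|^(1/n) -> 1 where phi(t) != 0,
# and -> 0 at a zero of phi.
lam = 1.0

def phi_poisson(t):
    # Poisson(lam): infinitely divisible, phi(t) = exp(lam*(e^{it} - 1)) != 0.
    return np.exp(lam * (np.exp(1j * t) - 1.0))

def phi_uniform(t):
    # Uniform[-1,1]: phi(t) = sin(t)/t, vanishes at t = k*pi (k != 0).
    return 1.0 if t == 0 else np.sin(t) / t

for name, phi in [("Poisson", phi_poisson), ("Uniform", phi_uniform)]:
    for t in (0.5, np.pi, 10.0):
        a = abs(phi(t))
        a = 0.0 if a < 1e-12 else a  # clamp floating-point noise at a true zero
        roots = [a ** (1.0 / n) for n in (1, 10, 1000)]
        print(f"{name:8s} t={t:6.3f}  |phi(t)|^(1/n) for n=1,10,1000:",
              ", ".join(f"{r:.4f}" for r in roots))
```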

Best Answer

Let $(E,\| \cdot \|)$ be a Banach space.

If $\varphi$ is infinitely divisible, then for any $n \in \mathbb N$ there is a characteristic function $\varphi_n$ such that $\varphi_n^n \equiv \varphi$.

Now, since $\varphi$ is a characteristic function, it is continuous and $\varphi(0)=1$. In particular, $ | \varphi |^2 \ge \frac{1}{2} $ on some small ball $B(0,\delta)$ in $E'$.

Hence $$ |\varphi_n(x)|^2 = (\varphi_n \cdot \bar{\varphi_n})(x) = |\varphi(x)|^{\frac2n} \to 1 $$ for $x \in B(0,\delta)$, since $|\varphi(x)|^2 \ge \frac12 > 0$ there. But given a characteristic function $\psi$ of a probability measure $\mu$, the function $\psi \cdot \bar{\psi}$ is again a characteristic function (it corresponds to $X-Y$, where $X,Y$ are independent with distribution $\mu$).
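A quick Monte Carlo sanity check of this fact in one dimension (a sketch; the Poisson law, sample size, and evaluation point are arbitrary choices): the empirical characteristic function of $X-Y$, with $X,Y$ i.i.d., should match $\varphi\bar\varphi = |\varphi|^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, n = 1.0, 0.7, 200_000

# X, Y independent Poisson(lam); phi * conj(phi) should be the
# characteristic function of X - Y.
x = rng.poisson(lam, size=n)
y = rng.poisson(lam, size=n)

emp = np.mean(np.exp(1j * t * (x - y)))     # empirical char. function of X - Y
phi = np.exp(lam * (np.exp(1j * t) - 1.0))  # char. function of Poisson(lam)

print(emp)            # approximately real and positive
print(abs(phi) ** 2)  # = (phi * conj(phi))(t); should match the line above
```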

Hence we have shown that $(\varphi_n \cdot \bar{\varphi_n})_n$ is a sequence of characteristic functions converging to a function that is continuous on $B(0,\delta)$, and by the Lévy continuity theorem, $\varphi_n \cdot \bar{\varphi_n}$ converges to a characteristic function $\psi$ such that $\psi(x)= 1$ in a neighbourhood of $0$ (namely for $x \in B(0,\delta)$).

Now, the point is that the only characteristic function equal to $1$ on some neighbourhood of $0$ is the characteristic function of the measure $\delta_0$. Indeed, for any $z \in E' \setminus \{0\}$ we can define the function $\eta_z:\mathbb R \to \mathbb C$ by $\eta_z(t) = \psi(tz)$. It is easy to see that if $\psi$ corresponds to a random variable $X$ on $(E,\mathcal B(E))$, then $\eta_z$ corresponds to the random variable $z(X)$ on $(\mathbb R,\mathcal B(\mathbb R))$, since $\eta_z(t) = \psi(tz) = \mathbb E[\exp(i\cdot (tz)(X))] = \mathbb E[\exp(it\cdot z(X))] = \varphi_{z(X)}(t)$. But then $\eta_z$ is the characteristic function of a real random variable, and $\eta_z(t) = 1$ for all $t$ in a neighbourhood of $0$ (if $\psi(x) = 1$ for $\|x\| \le r$, then $\eta_z(t) = 1$ for $|t| \le \frac{r}{\|z\|}$). Hence it is enough to know that a real characteristic function equal to $1$ near $0$ belongs to $\delta_0$, which is well known (a short proof is sketched below). In particular $\eta_z \equiv 1$ for every $z$, so $\psi(z) = \eta_z(1) = 1$ for all $z \in E'$, i.e. $\psi \equiv 1$ on all of $E'$.
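For completeness, here is the standard one-dimensional argument alluded to above (a sketch). Suppose $\eta(t) = \mathbb E[e^{itY}] = 1$ for all $|t| < \delta$. Taking real parts gives $$\mathbb E\left[1-\cos(tY)\right] = 0\;\;\;\text{for all }|t|<\delta,$$ and since $1-\cos(tY)\ge0$, this forces $\cos(tY)=1$ almost surely, i.e. $Y\in\frac{2\pi}{t}\mathbb Z$ a.s. for every $0<t<\delta$. Picking $t_1,t_2\in(0,\delta)$ with $t_1/t_2$ irrational, the lattices $\frac{2\pi}{t_1}\mathbb Z$ and $\frac{2\pi}{t_2}\mathbb Z$ intersect only in $\{0\}$, so $Y=0$ a.s.; hence the law of $Y$ is $\delta_0$ and $\eta\equiv1$.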

But if $\varphi(x) = 0$ for some $x \in E'$, then $|\varphi_n(x)| = |\varphi(x)|^{\frac1n} = 0$ for every $n$, contradicting the fact that $|\varphi_n(x)|^2$ must converge to $\psi(x) = 1$.
