If $\sum a_n^k$ converges for all $k \geq 1$, does $\prod (1 + a_n)$ converge?

conditional-convergence, convergence-divergence, infinite-product

By definition, an infinite product $\prod (1 + a_n)$ converges iff the sum $\sum \log(1 + a_n)$ converges. This lets us combine the standard convergence tests for infinite sums with the Taylor expansion

$$
\log(1 + x) = x - \frac{x^2}{2} + \frac{x^3}{3} - \frac{x^4}{4} + \cdots = \sum_{k=1}^\infty (-1)^{k-1}\frac{x^k}{k},
$$

to find various simple convergence tests for the original product:

Absolute convergence – if $\sum |a_n|$ converges, then $\prod (1 + a_n)$ converges, and converges to the same value regardless of how the factors are rearranged.

Conditional convergence – if $\sum a_n$ converges and $\sum |a_n|^2$ converges, then $\prod (1 + a_n)$ converges.

We can clearly generalize this last one as follows: "If there exists some power $p$ so that $\sum a_n^k$ converges for all $1 \leq k < p$, and $\sum |a_n|^p$ converges, then $\prod (1 + a_n)$ converges."
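As a numerical sanity check of the conditional-convergence test (an illustrative sketch with an example of my own choosing, not part of the argument): take $a_n = (-1)^n/(n+1)^{0.6}$, so that $\sum a_n$ converges by the alternating series test and $\sum |a_n|^2 = \sum (n+1)^{-1.2}$ converges, while $\sum |a_n|$ diverges, so only the conditional test applies.

```python
import math

# Hypothetical example sequence: a_n = (-1)^n / (n+1)**0.6.
# sum a_n converges (alternating series) and sum |a_n|^2 converges
# (exponent 1.2 > 1), but sum |a_n| diverges.

def partial_product(N, p=0.6):
    """Compute prod_{n=1}^N (1 + (-1)**n / (n+1)**p)."""
    prod = 1.0
    for n in range(1, N + 1):
        prod *= 1.0 + (-1) ** n / (n + 1) ** p
    return prod

# The partial products settle (slowly) toward a positive limit as N grows,
# consistent with the test's prediction.
for N in (10**3, 10**4, 10**5):
    print(N, partial_product(N))
```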

Question: Can we weaken the assumption above that $\sum |a_n|^p$ converges (absolutely) for some $p \geq 1$, to the assumption that $\sum a_n^k$ converges (possibly conditionally) for all $k \geq 1$?
That is, may we conclude $\prod (1 + a_n)$ converges if $\sum a_n^k$ converges for all $k \geq 1$?

I'm specifically interested in convergence of the infinite product
$$
\prod_{n=1}^\infty \left( 1 + \frac{e^{i n \theta}}{\log(n+1)^s}\right)
$$

for $\theta \in \mathbb{R} \setminus (\mathbb{Q} \pi)$ and $s > 0$. Indeed, when $a_n = \frac{e^{i n \theta}}{\log(n+1)^s}$, we have $a_n^k = \frac{e^{ikn\theta}}{\log(n+1)^{ks}}$; since the exponential sums $\sum_{n \le N} e^{ikn\theta}$ are bounded (as $k\theta \notin 2\pi\mathbb{Z}$) while $\log(n+1)^{-ks}$ decreases to $0$, Dirichlet's test shows that $\sum a_n^k$ converges for all $k \geq 1$.
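To get a numerical feel for this (a sketch with the arbitrary choices $\theta = 1$, which is not a rational multiple of $\pi$, and $s = 1$), one can check that the partial sums of $\sum a_n^k$ do stay bounded for small $k$, as Dirichlet's test guarantees:

```python
import cmath
import math

# Arbitrary illustrative parameters: theta = 1.0 (not a rational multiple
# of pi) and s = 1.  For each k, track the largest modulus attained by the
# partial sums of sum_n a_n^k.

theta, s = 1.0, 1.0

def a(n):
    return cmath.exp(1j * n * theta) / math.log(n + 1) ** s

max_partial = {}
for k in (1, 2, 3):
    partial, worst = 0j, 0.0
    for n in range(1, 100001):
        partial += a(n) ** k
        worst = max(worst, abs(partial))
    max_partial[k] = worst
    print(f"k={k}: max |partial sum of a_n^k| for N <= 100000 is {worst:.3f}")
```

The partial sums remain small because the oscillation of $e^{ikn\theta}$ cancels the slowly decaying $\log$ factor, exactly as in Dirichlet's test.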

Best Answer

I believe the following construction shows that the answer is no.

Lemma: There exists a sequence $(s_1,s_2,\dots)$, where each $s_j = (r_{j,1},\dots,r_{j,2^j})$ is a permutation of the $2^j$th roots of unity, with the following property: for every integer $k\ge1$, there exists a constant $C(k)$ such that all of the partial sums $$ r_{j,1}^k+\cdots+r^k_{j,m} \quad (j\ge1,\, 1\le m\le 2^j) $$ are bounded in absolute value by $C(k)$.

Proof: Fix once and for all an irrational number $\alpha$. Let $\|t\|$ denote the distance from $t$ to the nearest integer, and set $d_k = \min\{\|\alpha\|,\|2\alpha\|,\dots,\|k\alpha\|\}>0$. Choose $J_k$ large enough that $2^{-(J_k-1)} < d_k/(2k)$.

For every $j\ge J_k$, choose a rational number $b_j/2^j$ with $b_j$ odd such that $\|b_j/2^j - \alpha\| \le 2^{-(J_k-1)}$; this is possible since the intervals $[b/2^j - 2^{-(J_k-1)},\, b/2^j + 2^{-(J_k-1)}]$, taken over odd integers $b$, cover the reals (consecutive odd multiples of $2^{-j}$ are only $2^{-(j-1)} \le 2^{-(J_k-1)}$ apart). (Note that this is not actually infinitely many constraints on $b_j$, but rather the single constraint corresponding to the largest $k$ such that $J_k \le j$. If $j<J_1$ then just choose $b_j/2^j = 1/2^j$.) It follows from the triangle inequality (since $\|{\cdot}\|$ is a metric on $\Bbb R/\Bbb Z$) that $\|kb_j/2^j\| \ge \|k\alpha\| - k \|b_j/2^j-\alpha\| \ge d_k - k 2^{-(J_k-1)} \ge d_k/2$ for $j\ge J_k$.

We now choose the permutation $(r_{j,1},\dots,r_{j,2^j})$ defined by $r_{j,m} = \exp(2\pi i m b_j/2^j)$ for all $1\le m\le 2^j$. We must verify the statement of the lemma for this sequence of permutations.

For fixed $k$, it suffices to prove the statement for $j$ sufficiently large in terms of $k$; so we assume $j\ge J_k$. The partial sums $r_{j,1}^k+\cdots+r^k_{j,m}$ are partial sums of a geometric series with common ratio $\exp(2\pi i k b_j/2^j)$, and are therefore $\ll \|k b_j/2^j\|^{-1} \ll 1/d_k$, as needed.
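The lemma can be checked numerically for a concrete choice of $\alpha$ (a sketch, with $\alpha = \sqrt 2$ and $b_j$ taken to be an odd integer nearest to $\alpha\,2^j$; for simplicity the threshold $J_k$ is replaced by just starting at $j = 6$, which is large enough for $k \le 3$):

```python
import cmath
import math

# Concrete choice for illustration: alpha = sqrt(2).
alpha = math.sqrt(2)

def max_partial_sum(j, k):
    """Largest |r_{j,1}^k + ... + r_{j,m}^k| over 1 <= m <= 2^j,
    where r_{j,m} = exp(2 pi i m b_j / 2^j) and b_j is an odd integer
    nearest to alpha * 2^j."""
    b = round(alpha * 2 ** j)
    if b % 2 == 0:
        b += 1
    partial, worst = 0j, 0.0
    for m in range(1, 2 ** j + 1):
        partial += cmath.exp(2j * cmath.pi * m * k * b / 2 ** j)
        worst = max(worst, abs(partial))
    return worst

# For each fixed k the worst partial sum stays bounded uniformly in j,
# matching the bound C(k) ~ ||k b_j / 2^j||^{-1} from the proof.
worst_by_k = {k: max(max_partial_sum(j, k) for j in range(6, 13))
              for k in (1, 2, 3)}
print(worst_by_k)
```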

Using the above notation, we make the following construction.

Construction: For any positive integers $g_1,g_2,\dots$ and any positive real numbers $y_1,y_2,\dots$, let $(a_1,a_2,\dots)$ be the concatenation of infinitely many finite sequences:

  • first, $g_1$ copies of $(y_1r_{1,1},y_1r_{1,2})$,
  • next, $g_2$ copies of $(y_2r_{2,1},y_2r_{2,2},y_2r_{2,3},y_2r_{2,4})$,
  • and so on, at each stage including $g_j$ copies of $(y_jr_{j,1},\dots,y_jr_{j,2^j})$.
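A tiny concrete instance of the construction can be computed directly (a sketch with the hypothetical parameters $y_j = 0.9$ and $g_j$ chosen minimally so that $(1 - y_j^{2^j})^{g_j} < 1/2$; the product over a full copy is independent of the permutation used, so the roots are listed in their natural order):

```python
import cmath

def build_sequence(gs, ys):
    """Concatenate g_j copies of (y_j r_{j,1}, ..., y_j r_{j,2^j})."""
    seq = []
    for j, (g, y) in enumerate(zip(gs, ys), start=1):
        block = [y * cmath.exp(2j * cmath.pi * m / 2 ** j)
                 for m in range(1, 2 ** j + 1)]
        seq.extend(block * g)
    return seq

# Take y_j = 0.9 and the smallest g_j with (1 - y_j**(2**j))**g_j < 1/2.
ys = [0.9, 0.9, 0.9]
gs = []
for j, y in enumerate(ys, start=1):
    factor = 1 - y ** (2 ** j)
    g = 1
    while factor ** g >= 0.5:
        g += 1
    gs.append(g)

prod = 1 + 0j
for term in build_sequence(gs, ys):
    prod *= 1 + term

# After three stages the partial product has shrunk below (1/2)^3.
print(gs, abs(prod))
```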

Claim 1: If $\lim_{j\to\infty} y_j = 0$, then for every $k\ge1$, the series $\sum_{n=1}^\infty a_n^k$ converges.

Proof: It suffices to consider the sum with finitely many terms deleted; so we may start with the $g_{J_k}$ copies of $(y_{J_k}r_{J_k,1},\dots,y_{J_k}r_{{J_k},2^{J_k}})$. Note that the sum of $a_n^k$ over every individual copy at stage $j\ge J_k$ equals $0$ exactly (the full geometric sum $\sum_{m=1}^{2^j} r_{j,m}^k$ vanishes, since $b_j$ is odd and $2^j > k$), while by the lemma the partial sums within a single copy are bounded by $y_j^k C(k)$ in absolute value. Therefore the partial sums of the tail deviate from $0$ by at most $y_j^k C(k)$ while passing through stage $j$, and return to $0$ exactly at the end of each copy. Since $y_j\to0$, we have $y_j^k C(k)\to0$ for each fixed $k$, so the partial sums of the tail actually tend to $0$, which establishes convergence.

Claim 2: For any fixed real numbers $y_1,y_2,\dots$ with $0<y_j<1$, if $(1-y_j^{2^j})^{g_j} < 1/2$ for each $j\ge2$, then the product $\prod_{n=1}^\infty (1+a_n)$ diverges to $0$.

Proof: We look at the partial product over each copy of a permutation, noting that $$ \prod_{m=1}^{2^j} (1+y_jr_{j,m}) = 1-y_j^{2^j}, $$ which follows by evaluating the factorization $z^{2^j}-1 = \prod_{m=1}^{2^j} (z - r_{j,m})$ at $z = -1/y_j$ (since $b_j$ is odd, the $r_{j,m}$ do run over all $2^j$th roots of unity). Therefore the partial products over the $g_j$ copies of that permutation contribute $$ \bigg( \prod_{m=1}^{2^j} (1+y_jr_{j,m}) \bigg)^{g_j} = (1-y_j^{2^j})^{g_j} \in (0, \tfrac12) $$ to the overall product; since there are infinitely many such factors, the overall product diverges to $0$. (Technically this proves that the lim inf of the partial products equals $0$, which is enough for divergence, but a proof similar to that of Claim 1 should establish the full limit.)
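The product evaluation at the heart of the proof can be verified numerically (a sketch; the product over a full set of $2^j$th roots of unity does not depend on their order, so the natural order is used):

```python
import cmath

# Check that prod over all 2^j-th roots of unity r of (1 + y*r)
# equals 1 - y^(2^j), for 0 < y < 1.

def product_over_roots(j, y):
    prod = 1 + 0j
    for m in range(1, 2 ** j + 1):
        prod *= 1 + y * cmath.exp(2j * cmath.pi * m / 2 ** j)
    return prod

# Absolute error between the numerical product and 1 - y^(2^j).
errors = {j: abs(product_over_roots(j, 0.5) - (1 - 0.5 ** (2 ** j)))
          for j in (2, 3, 5, 8)}
print(errors)
```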

One can modify the construction (using odd moduli in place of $2^j$) to produce similar examples where the product diverges to $+\infty$.