I believe the following construction shows that the answer is no.
Lemma: There exists a sequence $(s_1,s_2,\dots)$, where each $s_j = (r_{j,1},\dots,r_{j,2^j})$ is a permutation of the $2^j$th roots of unity, with the following property: for every integer $k\ge1$, there exists a constant $C(k)$ such that all of the partial sums
$$
r_{j,1}^k+\cdots+r^k_{j,m} \quad (j\ge1,\, 1\le m\le 2^j)
$$
are bounded in absolute value by $C(k)$.
Proof: Fix once and for all an irrational number $\alpha$. Let $\|t\|$ denote the distance from $t$ to the nearest integer, and set $d_k = \min\{\|\alpha\|,\|2\alpha\|,\dots,\|k\alpha\|\}>0$. Choose $J_k$ large enough that $2^{-(J_k-1)} < d_k/(2k)$.
For every $j\ge J_k$, choose a rational number $b_j/2^j$ with $b_j$ odd such that $\|b_j/2^j - \alpha\| \le 2^{-(J_k-1)}$; this is possible since the intervals $[b/2^j - 2^{-(j-1)},\,b/2^j + 2^{-(j-1)}]$ over odd $b$ cover the reals, and $2^{-(j-1)} \le 2^{-(J_k-1)}$ when $j\ge J_k$. (Note that this is not actually infinitely many constraints on $b_j$, but rather the single constraint corresponding to the largest $k$ such that $J_k \le j$. If $j<J_1$ then just take $b_j = 1$.)
It follows from the triangle inequality (since $\|{\cdot}\|$ is a metric on $\Bbb R/\Bbb Z$ satisfying $\|kt\| \le k\|t\|$) that $\|kb_j/2^j\| \ge \|k\alpha\| - k \|b_j/2^j-\alpha\| \ge d_k - k\,2^{-(J_k-1)} \ge d_k/2$ for $j\ge J_k$.
We now choose the permutation $(r_{j,1},\dots,r_{j,2^j})$ defined by $r_{j,m} = \exp(2\pi i m b_j/2^j)$ for all $1\le m\le 2^j$; since $b_j$ is odd, $m\mapsto mb_j \pmod{2^j}$ is a bijection, so this is indeed a permutation of the $2^j$th roots of unity. We must verify the statement of the lemma for this sequence of permutations.
For fixed $k$, it suffices to prove the statement for $j$ sufficiently large in terms of $k$; so we assume $j\ge J_k$.
The partial sums $r_{j,1}^k+\cdots+r^k_{j,m}$ are geometric series with common ratio $\exp(2\pi i k b_j/2^j) \ne 1$, and therefore are $\ll \|k b_j/2^j\|^{-1} \le 2/d_k$, as needed.
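The bound in the lemma can be sanity-checked numerically. The sketch below uses the arbitrary choice $\alpha=\sqrt2$ and takes $b_j$ to be an odd numerator with $b_j/2^j$ near $\alpha$ (a simplified stand-in for the proof's exact choice); for each fixed $k$, the peak partial-sum modulus stays small even as $j$ grows:

```python
# Numerical sanity check of the lemma (a sketch: alpha = sqrt(2) is an
# arbitrary irrational, and b_j is just the odd numerator nearest
# alpha * 2^j, in the spirit of the proof's choice).
import cmath

ALPHA = 2 ** 0.5

def odd_b(j):
    """An odd b_j with b_j / 2^j close to ALPHA."""
    b = round(ALPHA * 2 ** j)
    return b if b % 2 else b + 1

worst = {}  # worst[k] = largest |partial sum of k-th powers| seen
for k in range(1, 5):
    peak = 0.0
    for j in range(3, 12):
        b = odd_b(j)
        s = 0 + 0j
        for m in range(1, 2 ** j + 1):
            s += cmath.exp(2j * cmath.pi * k * m * b / 2 ** j)
            peak = max(peak, abs(s))
    worst[k] = peak

print(worst)  # stays bounded, uniformly in j, for each fixed k
```

The peaks here hover around $1/\sin(\pi\|k\alpha\|)$, consistent with the $\ll d_k^{-1}$ bound in the proof.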
Using the above notation, we make the following construction.
Construction: For any positive integers $g_1,g_2,\dots$ and any positive real numbers $y_1,y_2,\dots$, let $(a_1,a_2,\dots)$ be the concatenation of infinitely many finite sequences:
- first, $g_1$ copies of $(y_1r_{1,1},y_1r_{1,2})$,
- next, $g_2$ copies of $(y_2r_{2,1},y_2r_{2,2},y_2r_{2,3},y_2r_{2,4})$,
- and so on, at each stage including $g_j$ copies of $(y_jr_{j,1},\dots,y_jr_{j,2^j})$.
Claim 1: If $\lim_{j\to\infty} y_j = 0$, then for any $k\ge1$, the series $\sum_{n=1}^\infty a_n^k$ converges.
Proof: It suffices to consider the sum with finitely many terms deleted, so we may start at the $g_{J_k}$ copies of $(y_{J_k}r_{J_k,1},\dots,y_{J_k}r_{{J_k},2^{J_k}})$. For every stage $j\ge J_k$, the sum of $k$th powers over each individual copy equals $0$ exactly: it is a full geometric series with ratio $z = \exp(2\pi i k b_j/2^j)$ satisfying $z^{2^j}=1$ and $z\ne1$. Meanwhile, the partial sums within the copies at stage $j$ are bounded by $y_j^k C(k)$. Since $y_j\to0$, these bounds tend to $0$; hence the partial sums (after the finitely many omitted terms) tend to $0$, which establishes convergence.
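As a quick sanity check of Claim 1, the following sketch builds a finite prefix of $(a_n)$ with the illustrative (not canonical) choices $\alpha=\sqrt2$, $y_j = 1/j$, $g_j = 1$, and verifies that for $k=1$ the partial sums of $\sum_n a_n^k$ return to (numerically) zero at the end of every stage while staying bounded in between:

```python
# Sketch: finite prefix of (a_n) with illustrative choices y_j = 1/j,
# g_j = 1, and b_j the odd numerator nearest sqrt(2) * 2^j.
import cmath

ALPHA = 2 ** 0.5

def odd_b(j):
    b = round(ALPHA * 2 ** j)
    return b if b % 2 else b + 1

a = []
ends = []  # indices at which a full copy (one stage here) ends
for j in range(1, 9):
    b, y = odd_b(j), 1.0 / j
    a.extend(y * cmath.exp(2j * cmath.pi * m * b / 2 ** j)
             for m in range(1, 2 ** j + 1))
    ends.append(len(a))

k = 1
partial, peak, at_ends = 0 + 0j, 0.0, []
for n, term in enumerate(a, start=1):
    partial += term ** k
    peak = max(peak, abs(partial))
    if n in ends:
        at_ends.append(abs(partial))  # ~0: each full copy cancels

print(peak, max(at_ends))
```

The `at_ends` values are zero up to floating-point error, and `peak` is controlled by the first stage (where $y_1 = 1$), mirroring the proof's stage-by-stage bound $y_j^k C(k)$.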
Claim 2: For any fixed positive real numbers $y_1,y_2,\dots$ with each $y_j<1$, if $(1-y_j^{2^j})^{g_j} < 1/2$ for each $j\ge2$, then the product $\prod_{n=1}^\infty (1+a_n)$ diverges to $0$.
Proof: We look at the partial product over each copy of a permutation, noting that
$$
\prod_{m=1}^{2^j} (1+y_jr_{j,m}) = 1-y_j^{2^j}
$$
which follows from the factorization $\prod_{m=1}^{2^j}(x+r_{j,m}) = x^{2^j}-1$ (negation permutes the $2^j$th roots of unity, as $2^j$ is even) upon setting $x = 1/y_j$ and multiplying through by $y_j^{2^j}$. Therefore the partial product over the $g_j$ copies of that permutation contributes
$$
\bigg( \prod_{m=1}^{2^j} (1+y_jr_{j,m}) \bigg)^{g_j} = (1-y_j^{2^j})^{g_j} \in (0, \tfrac12)
$$
to the overall product; since there are infinitely many such factors, the overall product diverges to $0$. (Technically this proves that the lim inf of the partial products equals $0$, which is enough for divergence, but a proof similar to that of Claim 1 should establish the full limit.)
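The evaluation used above is easy to confirm numerically; the sketch below checks $\prod_{m=1}^{2^j}(1+y\,r_{j,m}) = 1-y^{2^j}$ over all $2^j$th roots of unity (the ordering is irrelevant for the full product):

```python
# Check prod_m (1 + y r_m) = 1 - y^(2^j), r_m the 2^j-th roots of unity.
import cmath

max_err = 0.0
for j in range(1, 7):
    N = 2 ** j
    for y in (0.1, 0.5, 0.9):
        prod = 1 + 0j
        for m in range(N):
            prod *= 1 + y * cmath.exp(2j * cmath.pi * m / N)
        max_err = max(max_err, abs(prod - (1 - y ** N)))

print(max_err)  # floating-point error only
```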
One can modify the construction (using odd moduli in place of $2^j$) to produce similar examples where the product diverges to $+\infty$.
Best Answer
First, we'd like to take logarithms to relate a sum and a product. So assume first that none of the $a_n$ lie on the ray $(-\infty, -1]$, so that we may apply the branch of the logarithm $\log(1+z)$ defined on $\mathbb{C}\setminus (-\infty, -1]$ to each $a_n$. Since $|a_n|\rightarrow 0$ by the convergence of the series, only finitely many of the terms lie on $(-\infty, -1]$, and removing those finitely many bad terms from the product does not affect its convergence, so this assumption suffices for the general case.
If $\sum_n |a_n|$ converges, then so does $\sum_n \log(1+a_n).$ This is really the crucial claim: since the exponential of each partial sum of the logarithms is the corresponding partial product, convergence of this sum implies convergence of the product.
This fortunately follows easily from the limit comparison test. Note that $$\lim_{n\rightarrow\infty} \frac{\log(1+a_n)}{a_n} = 1,$$ since $a_n\rightarrow 0$, so we are really looking at difference quotients $$\frac{\log(1+a_n)}{a_n} = \frac{\log(1+a_n) - \log 1}{a_n - 0}$$ of the function $f(z) = \log(1+z)$ at $z=0.$ The derivative there is $f'(0) = 1,$ and so the limit follows.
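A small numerical illustration, using the arbitrary absolutely summable choice $a_n = (-1)^n/n^2$ (starting at $n=2$ so that $1+a_n \ne 0$): exponentiating the partial sums of $\log(1+a_n)$ reproduces the partial products, and the ratio $\log(1+a_n)/a_n$ approaches $1$:

```python
# a_n = (-1)^n / n^2 is absolutely summable; here 1 + a_n > 0 for all
# n >= 2, so the real logarithm suffices.
import math

terms = [(-1) ** n / n ** 2 for n in range(2, 2000)]

log_sum, product = 0.0, 1.0
for t in terms:
    log_sum += math.log(1 + t)
    product *= 1 + t

ratio = math.log(1 + terms[-1]) / terms[-1]  # close to f'(0) = 1
print(product, math.exp(log_sum), ratio)
```

The first two printed values agree up to floating-point error, which is the sum-product correspondence used above.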