Simply having infinitely many non-zero terms is insufficient. There are many cases of infinite series with infinitely many non-zero terms that converge, e.g. $\displaystyle \sum_{n = 1}^\infty \frac{1}{n^p}$ for $p > 1$.
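As a quick numerical sanity check (a Python sketch, not part of the argument itself), compare partial sums of the $p$-series for $p=2$ against the harmonic series $p=1$:

```python
import math

# Partial sums of the p-series sum 1/n^p: convergent for p = 2, divergent for p = 1.
def partial_sum(p, N):
    return sum(1.0 / n**p for n in range(1, N + 1))

for N in (10**2, 10**4, 10**6):
    print(N, partial_sum(2, N), partial_sum(1, N))

# The p = 2 column approaches pi^2/6 ~ 1.6449, while the p = 1 column
# keeps growing like log(N), even though every term is non-zero.
```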
When you say that $\displaystyle \sum_{n = 1}^\infty a_n$ converges conditionally, I presume you mean that it converges but does not converge absolutely. (As an aside, this already forces infinitely many non-zero terms: if there were only finitely many, the sum of the absolute values would be a finite sum, so the series would converge absolutely.)
Let's suppose that $\displaystyle \sum_{n= 1}^\infty a_n^-$ converges for a moment, so that we can say that
$$\sum_{n= 1}^\infty a_n^- = -L$$
for some number $-L$. We know that $\displaystyle \sum |a_n|$ diverges, and since $\displaystyle \sum |a_n| = \sum a_n^+ - \sum a_n^-$, we must have that $\displaystyle \sum a_n^+$ diverges. But if $\displaystyle \sum a_n^+$ diverges, then $\displaystyle \sum a_n = \sum a_n^+ + \sum a_n^-$ will look like $\displaystyle \left( \sum a_n^+\right) - L$, and will still diverge. (Note that I'm being a bit loose and omitting details, such as the fact that these are really statements about partial sums; you cannot reorder the terms of a conditionally convergent series and expect the sum to be unchanged.)
This contradicts our initial condition that $\displaystyle \sum a_n$ converges. So we must actually have that $\displaystyle \sum a_n^-$ was divergent. $\diamondsuit$
To boil down the essence of the proof: absolute convergence means the terms get small enough, fast enough, for the series to converge on size alone. Mere conditional convergence means that the positive and negative parts cancel out a lot, and without that cancellation you get divergence. By 'cancel out a lot' I really mean infinite cancellation, since a finite contribution cannot affect convergence. So if either the positive or the negative contribution were finite, there would not be enough cancellation.
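To see this numerically (a Python sketch, using the alternating harmonic series as the standard example of a conditionally convergent series):

```python
import math

# a_n = (-1)^(n+1)/n: the series converges to log 2, but its positive
# and negative parts each diverge.
def split_sums(N):
    s = pos = neg = 0.0
    for n in range(1, N + 1):
        a = (-1) ** (n + 1) / n
        s += a
        pos += max(a, 0.0)   # partial sum of a_n^+
        neg += min(a, 0.0)   # partial sum of a_n^-
    return s, pos, neg

for N in (10**3, 10**5):
    print(N, split_sums(N))

# s stays near log(2) ~ 0.6931, while pos -> +infinity and
# neg -> -infinity (each grows roughly like (1/2) log N).
```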
If $a_j$ are nonnegative real numbers, then we're free to change the order $\sum\limits_{k=0}^\infty\sum\limits_{j=0}^k=\sum\limits_{j=0}^\infty\sum\limits_{k=j}^\infty$: $$\sum_{k=0}^\infty 2^{-k-1}\sum_{j=0}^k 2^j a_j=\sum_{j=0}^\infty 2^j a_j\underbrace{\sum_{k=j}^\infty 2^{-k-1}}_{=2^{-j}}=\sum_{j=0}^\infty a_j.$$
For the general case, we use $|b_k|\leqslant 2^{-k-1}\sum\limits_{j=0}^k 2^j|a_j|$ and the above to get $\sum\limits_{k=0}^\infty|b_k|\leqslant\sum\limits_{j=0}^\infty|a_j|$.
If the series $\sum_{n=1}^\infty |a_n| \sum_{k=n}^\infty |b_k| $ converges, then you can do whatever you want with $a_n b_k$; any rearrangement will converge to the same sum. More generally: if $I$ is an index set and $\sum_{i\in I} |c_i|<\infty$, then any rearrangement of $c_i$ converges to the same sum. This is because for every $\epsilon>0$ there is a finite set $S\subset I$ such that $\sum_{i\in I\setminus S}|c_i|<\epsilon $; any method of summation will use up $S$ at some point, and after that the sum is guaranteed to be within $\epsilon$ of $\sum_{i\in S} c_i$.
But if you only know that $$\sum |b_k|\tag{1}$$ and $$\sum_{n=1}^\infty \left|a_n \sum_{k=n}^\infty b_k \right|\tag{2} $$ converge, then rearrangements can go wrong. For example, take the series $\sum b_k $ to be $\frac{1}{2} -\frac{1}{2}+\frac{1}{4}-\frac{1}{4}+\frac{1}{8}-\frac{1}{8}+\dots$; then every other tail $\sum_{k\ge n} b_k$ is zero, and the corresponding $a_n$ can be arbitrarily large without disturbing the convergence of (2). In this situation you could even have infinitely many terms $a_nb_k$ that are greater than $1$ in absolute value; clearly such a series cannot be rearranged at will.
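The zero tails can be checked concretely (a Python sketch; truncating after $K$ complete pairs introduces no error, because the discarded terms cancel in pairs):

```python
from fractions import Fraction

# b_k (1-indexed): 1/2, -1/2, 1/4, -1/4, 1/8, -1/8, ...
K = 6
b = []
for m in range(1, K + 1):
    b += [Fraction(1, 2**m), -Fraction(1, 2**m)]

# Tail sums sum_{k >= n} b_k for each starting index n.
tails = [sum(b[n - 1:]) for n in range(1, len(b) + 1)]
print(tails)
# Tails starting at odd n are exactly 0; tails starting at n = 2m are -1/2^m.
```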