Confusion about whether the series converges or not (alternating series test)

absolute-convergence, calculus, convergence-divergence

I have to test for convergence and absolute convergence for the following series:

$$\sum_{k=1}^{\infty} (-1)^k \frac{k}{1+2k^2}$$

To apply the alternating series test, I have to verify that the terms decrease monotonically and then show that their limit is zero.
I don't have any problems showing that they decrease monotonically, but I have trouble showing that the limit is zero. My attempt:

$$\lim_{k\to\infty} \frac{k}{1+2k^2} = \lim_{k\to\infty} \frac{1}{\frac{1}{k}+2k} = 0$$
Therefore the series converges by the alternating series test.

But with the direct comparison test I get, for $k \geq 1$:
$$ \left| (-1)^k \frac{k}{1+2k^2}\right| = \frac{k}{1+2k^2} =\frac{1}{\frac{1}{k}+2k}\geq \frac{1}{k+2k} = \frac{1}{3k}$$

This is comparable to the harmonic series $\sum_{k=1}^{\infty}\frac{1}{k}$, hence the series should diverge.
So my question is, does it diverge or converge? And how do I know if it converges absolutely?
Thank you for your time and help!

Best Answer

You're doing nothing wrong.

Since $|a_k|$ decreases monotonically and $a_k \to 0$, you can conclude that $\sum a_k$ converges by the alternating series test.
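
For concreteness, here is one way the two hypotheses can be checked explicitly (just a sketch of a standard argument, with $|a_k| = \frac{k}{1+2k^2}$). For monotonicity,

$$\frac{k+1}{1+2(k+1)^2} \le \frac{k}{1+2k^2} \iff (k+1)(1+2k^2) \le k\bigl(1+2(k+1)^2\bigr) \iff 1 \le 2k^2+2k,$$

which holds for every $k \ge 1$; and for the limit,

$$0 \le \frac{k}{1+2k^2} \le \frac{k}{2k^2} = \frac{1}{2k} \xrightarrow[k\to\infty]{} 0.$$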

Now, we say that a series $\sum a_k$ converges absolutely if $\sum |a_k|$ converges. But as you showed, $\sum |a_k|$ doesn't converge, so we have a conditionally convergent series: one that converges, but not absolutely.
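
To spell out that last point in the same spirit as your comparison (again just a sketch): for every $n$,

$$\sum_{k=1}^{n} \left|(-1)^k \frac{k}{1+2k^2}\right| \;\ge\; \sum_{k=1}^{n} \frac{1}{3k} \;=\; \frac{1}{3}\sum_{k=1}^{n} \frac{1}{k} \;\xrightarrow[n\to\infty]{}\; \infty,$$

since the harmonic series diverges. So $\sum |a_k|$ diverges while $\sum a_k$ converges, i.e. the series converges conditionally.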