Given $a_n > 0$ and $\sum a_n$ diverges, is this a valid proof that $\sum \frac{a_n}{1+a_n}$ diverges?

sequences-and-series

I'm aware that similar questions have been posted before, for example here. But my question is not a duplicate since I am asking about my specific proof. Is it correct, and could it have been made simpler while retaining the same basic idea?

If $a_n\not\to 0$, let $\epsilon > 0$ be such that $a_n > \epsilon$ for infinitely many values of $n$. Then $$\frac{a_n}{1+a_n} = \frac{1}{1/a_n+1} > \frac{1}{\epsilon+1}$$ for infinitely many values of $n$. Thus $\frac{a_n}{1+a_n}\not\to 0$, and the series diverges.

On the other hand, assume that $a_n\to 0$. If $\sum\frac{a_n}{1+a_n}$ converges, then so does $\sum\frac{a_n^2}{1+a_n}$, by the comparison test. But then $$\sum a_n = \sum\left(\frac{a_n}{1+a_n} + \frac{a_n^2}{1+a_n}\right) = \sum\frac{a_n}{1+a_n} + \sum\frac{a_n^2}{1+a_n}$$ converges, a contradiction.
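
To spell out the comparison step: since $a_n\to 0$, we have $a_n\le 1$ for all sufficiently large $n$, and hence $$\frac{a_n^2}{1+a_n} = a_n\cdot\frac{a_n}{1+a_n} \le \frac{a_n}{1+a_n}$$ for all such $n$.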

Source: This is problem 11(a) in chapter 3 of Rudin's Principles of Mathematical Analysis.

Best Answer

Yes, your arguments are correct, with the only caveat that your first inequality should have been $$ \frac{a_n}{a_n+1}>\frac1{\frac1\epsilon+1}. $$
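
To spell out why: from $a_n > \epsilon$ we get $\frac1{a_n} < \frac1\epsilon$, so the inequality runs the other way, $$\frac{a_n}{1+a_n} = \frac{1}{\frac1{a_n}+1} > \frac{1}{\frac1\epsilon+1} = \frac{\epsilon}{1+\epsilon},$$ which still bounds the terms away from $0$ for infinitely many $n$, so the divergence conclusion stands.

As a purely numerical illustration (not part of the proof), here is a small sanity check in Python with the sample choice $a_n = 1/n$, for which $\frac{a_n}{1+a_n} = \frac{1}{n+1}$; both partial sums grow like $\log N$:

```python
import math

# Illustration only: a_n = 1/n is an assumed example of a positive
# series with divergent sum. Both partial sums should grow without
# bound, roughly like log(N) for this choice.
for N in (10**2, 10**4, 10**6):
    s = sum(1 / n for n in range(1, N + 1))                  # partial sum of a_n
    t = sum((1 / n) / (1 + 1 / n) for n in range(1, N + 1))  # partial sum of a_n/(1+a_n)
    print(f"N={N:>7}: sum a_n = {s:8.4f}, "
          f"sum a_n/(1+a_n) = {t:8.4f}, log N = {math.log(N):7.4f}")
```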