Real Analysis – Convergence of a_n/b_n When ∑b_n Diverges and ∑a_n Converges

real-analysis

$a_n,b_n>0$

I've determined that if $b_n$ is monotonically increasing, then $\frac{1}{b_n}$ is monotonically decreasing and bounded, so $\sum \frac{a_n}{b_n}$ converges by comparison, and hence the limit exists and equals zero.
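(To spell out the comparison step in the increasing case: since $b_n$ is increasing and positive,
$$b_n \ge b_1 > 0 \quad\Longrightarrow\quad 0 \le \frac{a_n}{b_n} \le \frac{a_n}{b_1},$$
and $\sum \frac{a_n}{b_1} = \frac{1}{b_1}\sum a_n$ converges, so $\sum \frac{a_n}{b_n}$ converges by the comparison test; in particular $\frac{a_n}{b_n} \to 0$.)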

However, I'm not sure how to handle the case where $b_n$ is monotonically decreasing (or not monotone at all). Intuitively, $a_n$ must decrease faster than $b_n$, since $\sum a_n$ converges while $\sum b_n$ diverges, but I don't know how to prove that the limit must then exist.

Best Answer

The main flaw in the intuition here is the implicit assumption that every sequence must be either increasing or decreasing. Many sequences are neither, and one can construct plenty of counterexamples this way.

Consider the following: define $b_n=\frac1n$ for all $n$. Define $$ a_n = \begin{cases} \frac1n, &\text{if $n$ is a power of $2$}, \\ 0, &\text{otherwise}. \end{cases} $$ Then $\sum b_n$ diverges, but $\sum a_n = \sum_{k=0}^\infty \frac1{2^k} = 2$ converges. However, $\frac{a_n}{b_n}$ equals $1$ infinitely often and $0$ infinitely often, and thus $\lim_{n\to\infty} \frac{a_n}{b_n}$ does not exist. (If one requires that $a_n$ is strictly positive, one can just change the $0$s to incredibly small values—even $\frac1{n^2}$ is still enough to preserve the counterexample.)
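As a quick numerical sanity check of this counterexample, here is a short Python sketch (the names `a`, `b`, `N` are my own, not from the answer) that computes the partial sums and the ratios $\frac{a_n}{b_n}$:

```python
def b(n):
    # b_n = 1/n: the harmonic sequence, whose series diverges
    return 1.0 / n

def a(n):
    # a_n = 1/n when n is a power of 2, else 0;
    # n & (n - 1) == 0 tests "power of 2" for n >= 1
    return 1.0 / n if n & (n - 1) == 0 else 0.0

N = 10_000
sum_a = sum(a(n) for n in range(1, N + 1))   # geometric tail: approaches 2
sum_b = sum(b(n) for n in range(1, N + 1))   # harmonic partial sum: grows like ln N
ratios = [a(n) / b(n) for n in range(1, N + 1)]

print(sum_a)                        # close to 2
print(sum_b)                        # already large and still growing
print(ratios[1023], ratios[1024])   # n = 1024 gives ratio 1.0, n = 1025 gives 0.0
```

The printed ratios show $\frac{a_n}{b_n}$ hitting $1$ at every power of $2$ and $0$ elsewhere, so no limit can exist even though $\sum a_n$ stays bounded.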